WorldWideScience

Sample records for making statistical comparisons

  1. Statistical Group Comparison

    CERN Document Server

    Liao, Tim Futing

    2011-01-01

An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde

  2. Using statistical inference for decision making in best estimate analyses

    International Nuclear Information System (INIS)

    Sermer, P.; Weaver, K.; Hoppe, F.; Olive, C.; Quach, D.

    2008-01-01

    For broad classes of safety analysis problems, one needs to make decisions when faced with randomly varying quantities which are also subject to errors. The means for doing this involves a statistical approach which takes into account the nature of the physical problems, and the statistical constraints they impose. We describe the methodology for doing this which has been developed at Nuclear Safety Solutions, and we draw some comparisons to other methods which are commonly used in Canada and internationally. Our methodology has the advantages of being robust and accurate and compares favourably to other best estimate methods. (author)

  3. Data Mining and Statistics for Decision Making

    CERN Document Server

    Tufféry, Stéphane

    2011-01-01

    Data mining is the process of automatically searching large volumes of data for models and patterns using computational techniques from statistics, machine learning and information theory; it is the ideal tool for such an extraction of knowledge. Data mining is usually associated with a business or an organization's need to identify trends and profiles, allowing, for example, retailers to discover patterns on which to base marketing objectives. This book looks at both classical and recent techniques of data mining, such as clustering, discriminant analysis, logistic regression, generalized lin

  4. Essays on microeconomics and statistical decision making

    OpenAIRE

    Nieto Barthaburu, Augusto

    2006-01-01

    Chapter I of this dissertation addresses the problem of optimally forecasting a binary variable based on a vector of covariates in the context of two different decision making environments. First we consider a single decision maker with given preferences, who has to choose between two actions on the basis of an unobserved binary outcome. Previous research has shown that traditional prediction methods, such as a logit regression estimated by maximum likelihood and combined with a cutoff, may p...

  5. cocor: a comprehensive solution for the statistical comparison of correlations.

    Directory of Open Access Journals (Sweden)

    Birk Diedenhofen

Full Text Available A valid comparison of the magnitude of two correlations requires researchers to directly contrast the correlations using an appropriate statistical test. In many popular statistics packages, however, tests for the significance of the difference between correlations are missing. To close this gap, we introduce cocor, a free software package for the R programming language. The cocor package covers a broad range of tests including the comparisons of independent and dependent correlations with either overlapping or nonoverlapping variables. The package also includes an implementation of Zou's confidence interval for all of these comparisons. The platform-independent cocor package enhances the R statistical computing environment and is available for scripting. Two different graphical user interfaces (a plugin for RKWard and a web interface) make cocor a convenient and user-friendly tool.
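One of the comparisons cocor covers is between two correlations measured in independent samples. As a rough illustration of the underlying statistics (a minimal Python sketch, not cocor's R implementation; the function name and example values are made up), the classic Fisher r-to-z test looks like this:

```python
import numpy as np
from scipy import stats

def compare_independent_correlations(r1, n1, r2, n2):
    """Fisher r-to-z test for two correlations from independent samples."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher z-transform of each correlation
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # standard error of the difference
    z = (z1 - z2) / se
    p = 2 * stats.norm.sf(abs(z))                  # two-sided p-value
    return z, p

# Hypothetical example: r = .55 (n = 120) versus r = .35 (n = 150)
print(compare_independent_correlations(0.55, 120, 0.35, 150))
```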

  6. Fuzzy statistical decision-making theory and applications

    CERN Document Server

    Kabak, Özgür

    2016-01-01

    This book offers a comprehensive reference guide to fuzzy statistics and fuzzy decision-making techniques. It provides readers with all the necessary tools for making statistical inference in the case of incomplete information or insufficient data, where classical statistics cannot be applied. The respective chapters, written by prominent researchers, explain a wealth of both basic and advanced concepts including: fuzzy probability distributions, fuzzy frequency distributions, fuzzy Bayesian inference, fuzzy mean, mode and median, fuzzy dispersion, fuzzy p-value, and many others. To foster a better understanding, all the chapters include relevant numerical examples or case studies. Taken together, they form an excellent reference guide for researchers, lecturers and postgraduate students pursuing research on fuzzy statistics. Moreover, by extending all the main aspects of classical statistical decision-making to its fuzzy counterpart, the book presents a dynamic snapshot of the field that is expected to stimu...

  7. Contribution statistics can make to "strengthening forensic science"

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2009-08-01

    Full Text Available draw on inputs from other countries and much of the report is relevant to forensic science in other countries. The report makes thirteen detailed recommendations, several of which will require statistics and statisticians for their implementation...

  8. Robot Trajectories Comparison: A Statistical Approach

    Directory of Open Access Journals (Sweden)

    A. Ansuategui

    2014-01-01

    Full Text Available The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners.

  9. Robot Trajectories Comparison: A Statistical Approach

    Science.gov (United States)

    Ansuategui, A.; Arruti, A.; Susperregi, L.; Yurramendi, Y.; Jauregi, E.; Lazkano, E.; Sierra, B.

    2014-01-01

    The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion algorithm. This paper presents a new approach to perform robot trajectories comparison that could be applied to any kind of trajectories and in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization named polygraph which helps to better understand the obtained results is provided. The proposed method has been applied, as an example, to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners. PMID:25525618

  10. Statistical framework for decision making in mine action

    DEFF Research Database (Denmark)

    2007-01-01

The lecture discusses the basics of statistical decision making in connection with humanitarian mine action. There is special focus on: 1) requirements for mine detection; 2) design and evaluation and confidence of mine equipment; 3) efficient mine action by hierarchical approaches; 4) performance improvement by statistical learning and information fusion; 5) the advantage of using combined methods.

  11. Statistical methods for decision making in mine action

    DEFF Research Database (Denmark)

    Larsen, Jan

    The lecture discusses the basics of statistical decision making in connection with humanitarian mine action. There is special focus on: 1) requirements for mine detection; 2) design and evaluation of mine equipment; 3) performance improvement by statistical learning and information fusion; 4...

  12. Decision-making in probability and statistics Chilean curriculum

    DEFF Research Database (Denmark)

    Elicer, Raimundo

    2018-01-01

Probability and statistics have become prominent subjects in school mathematics curricula. As an exemplary case, I investigate the role of decision making in the justification for probability and statistics in the current Chilean upper secondary mathematics curriculum. For addressing this concern, I draw upon Fairclough’s model for Critical Discourse Analysis to analyse selected texts as examples of discourse practices. The texts are interconnected with politically driven ideas of stochastics “for all”, the notion of statistical literacy coined by statisticians’ communities, schooling...

  13. What Does Average Really Mean? Making Sense of Statistics

    Science.gov (United States)

    DeAngelis, Karen J.; Ayers, Steven

    2009-01-01

    The recent shift toward greater accountability has put many educational leaders in a position where they are expected to collect and use increasing amounts of data to inform their decision making. Yet, because many programs that prepare administrators, including school business officials, do not require a statistics course or a course that is more…

  14. Statistical framework for decision making in mine action

    DEFF Research Database (Denmark)

    Larsen, Jan

    The lecture discusses the basics of statistical decision making in connection with humanitarian mine action. There is special focus on: 1) requirements for mine detection; 2) design and evaluation and confidence of mine equipment; 3) efficient mine action by hierarchical approaches; 4) performance...

  15. Statistical decision making with a dual-detector probe

    International Nuclear Information System (INIS)

    Hickernell, T.S.

    1988-01-01

    Conventional imaging techniques for cancer detection have difficulty finding small, deep tumors. Single-detector radiation probes have been developed to search for deep lesions in a patient who has been given a tumor-seeking radiopharmaceutical. These probes perform poorly, however, when the background activity in the patient varies greatly from site to site. We have developed a surgical dual-detector probe that solves the problem of background activity variation, by simultaneously monitoring counts from a region of interest and counts from adjacent normal tissue. A comparison of counts from the detectors can reveal the class of tissue, tumor or normal, in the region of interest. In this study, we apply methods from statistical decision theory and derive a suitable comparison of counts to help us decide whether a tumor is present in the region of interest. We use the Hotelling trace criterion with a few assumptions to find a linear discriminant function, which can be reduced to a normalized subtraction of the counts for large background count-rate variations. Using a spatial response map of the dual probe, a computer torso phantom, and estimates of activity distribution, we simulate a surgical staging procedure to test the dual probe and the discriminant functions
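The "normalized subtraction of the counts" mentioned above can be illustrated with a simple Poisson-count sketch. This is an assumption-laden toy version (the calibration factor, threshold and count values are invented), not the authors' Hotelling-derived discriminant:

```python
import numpy as np

def normalized_count_difference(n_roi, n_bkg, k=1.0):
    """Illustrative normalized subtraction of dual-probe counts.

    n_roi : counts from the detector viewing the region of interest
    n_bkg : counts from the detector viewing adjacent normal tissue
    k     : assumed calibration factor matching the two detectors' sensitivities

    For Poisson counts, var(n_roi - k*n_bkg) = n_roi + k**2 * n_bkg, so the
    statistic below is approximately standard normal when no tumor is present.
    """
    return (n_roi - k * n_bkg) / np.sqrt(n_roi + k**2 * n_bkg)

# Flag "tumor present" if the statistic exceeds a one-sided threshold, e.g. 3.
z = normalized_count_difference(n_roi=540, n_bkg=480)
print(z, z > 3.0)
```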

  16. Teaching statistics to social science students: Making it valuable ...

    African Journals Online (AJOL)

    In this age of rapid information expansion and technology, statistics is playing an ever increasing role in education, particularly also in the training of social scientists. Statistics enables the social scientist to obtain a quantitative awareness of socioeconomic phenomena hence is essential in their training. Statistics, however ...

  17. Data Warehousing: How To Make Your Statistics Meaningful.

    Science.gov (United States)

    Flaherty, William

    2001-01-01

    Examines how one school district found a way to turn data collection from a disparate mountain of statistics into more useful information by using their Instructional Decision Support System. System software is explained as is how the district solved some data management challenges. (GR)

  18. Statistical methods for decision making in mine action

    DEFF Research Database (Denmark)

    Larsen, Jan

    The design and evaluation of mine clearance equipment – the problem of reliability * Detection probability – tossing a coin * Requirements in mine action * Detection probability and confidence in MA * Using statistics in area reduction Improving performance by information fusion and combination...

  19. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    Science.gov (United States)

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  20. Teaching Statistics in Middle School Mathematics Classrooms: Making Links with Mathematics but Avoiding Statistical Reasoning

    Science.gov (United States)

    Savard, Annie; Manuel, Dominic

    2015-01-01

    Statistics is a domain that is taught in Mathematics in all school levels. We suggest a potential in using an interdisciplinary approach with this concept. Thus the development of the understanding of a situation might mean to use both mathematical and statistical reasoning. In this paper, we present two case studies where two middle school…

  1. Statistical comparison of the geometry of second-phase particles

    Energy Technology Data Exchange (ETDEWEB)

    Benes, Viktor, E-mail: benesv@karlin.mff.cuni.cz [Charles University in Prague, Faculty of Mathematics and Physics, Department of Probability and Mathematical Statistics, Sokolovska 83, 186 75 Prague 8-Karlin (Czech Republic); Lechnerova, Radka, E-mail: radka.lech@seznam.cz [Private College on Economical Studies, Ltd., Lindnerova 575/1, 180 00 Prague 8-Liben (Czech Republic); Klebanov, Lev [Charles University in Prague, Faculty of Mathematics and Physics, Department of Probability and Mathematical Statistics, Sokolovska 83, 186 75 Prague 8-Karlin (Czech Republic); Slamova, Margarita, E-mail: slamova@vyzkum-kovu.cz [Research Institute for Metals, Ltd., Panenske Brezany 50, 250 70 Odolena Voda (Czech Republic); Slama, Peter [Research Institute for Metals, Ltd., Panenske Brezany 50, 250 70 Odolena Voda (Czech Republic)

    2009-10-15

    In microscopic studies of materials, there is often a need to provide a statistical test as to whether two microstructures are different or not. Typically, there are some random objects (particles, grains, pores) and the comparison concerns their density, individual geometrical parameters and their spatial distribution. The problem is that neighbouring objects observed in a single window cannot be assumed to be stochastically independent, therefore classical statistical testing based on random sampling is not applicable. The aim of the present paper is to develop a test based on N-distances in probability theory. Using the measurements from a few independent windows, we consider a two-sample test, which involves a large amount of information collected from each window. An application is presented consisting in a comparison of metallographic samples of aluminium alloys, and the results are interpreted.
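N-distances include the well-known energy distance as a special case. The sketch below is a generic energy-distance two-sample statistic with a permutation p-value; it is only a rough analogue, since it treats objects as exchangeable and ignores the within-window dependence that the paper's test is designed to handle. The particle descriptors are simulated.

```python
import numpy as np
from scipy.spatial.distance import cdist

def energy_distance(x, y):
    """Energy-distance statistic between two multivariate samples (rows = objects)."""
    return 2 * cdist(x, y).mean() - cdist(x, x).mean() - cdist(y, y).mean()

def permutation_pvalue(x, y, n_perm=999, seed=0):
    """Permutation p-value; valid only if the pooled objects are exchangeable."""
    rng = np.random.default_rng(seed)
    observed = energy_distance(x, y)
    pooled, n = np.vstack([x, y]), len(x)
    hits = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if energy_distance(pooled[idx[:n]], pooled[idx[n:]]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Hypothetical particle descriptors (e.g. size, shape factor) from two specimens.
rng = np.random.default_rng(1)
a = rng.normal(size=(80, 2))
b = rng.normal(loc=0.3, size=(90, 2))
print(permutation_pvalue(a, b))
```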

  2. Statistical comparison of the geometry of second-phase particles

    International Nuclear Information System (INIS)

    Benes, Viktor; Lechnerova, Radka; Klebanov, Lev; Slamova, Margarita; Slama, Peter

    2009-01-01

    In microscopic studies of materials, there is often a need to provide a statistical test as to whether two microstructures are different or not. Typically, there are some random objects (particles, grains, pores) and the comparison concerns their density, individual geometrical parameters and their spatial distribution. The problem is that neighbouring objects observed in a single window cannot be assumed to be stochastically independent, therefore classical statistical testing based on random sampling is not applicable. The aim of the present paper is to develop a test based on N-distances in probability theory. Using the measurements from a few independent windows, we consider a two-sample test, which involves a large amount of information collected from each window. An application is presented consisting in a comparison of metallographic samples of aluminium alloys, and the results are interpreted.

  3. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    Science.gov (United States)

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in British Journal of Ophthalmology ( BJO ) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at ocular level, one study (1%) analysed the overall summary of ocular finding per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with in 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with in 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
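One standard way to use data from both eyes without ignoring the intereye correlation is a mixed-effects model with a patient-level random intercept. The sketch below uses hypothetical column names and toy data (not taken from the reviewed papers) with statsmodels:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format ocular data: one row per eye, two rows per patient.
df = pd.DataFrame({
    "patient": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "iop":     [16.2, 15.8, 21.0, 20.1, 18.4, 17.9, 14.9, 15.3, 19.8, 20.4],
    "treated": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
})

# The random intercept per patient absorbs the correlation between fellow eyes,
# instead of treating the two eyes as independent observations.
model = smf.mixedlm("iop ~ treated", data=df, groups=df["patient"])
print(model.fit().summary())
```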

  4. An ANOVA approach for statistical comparisons of brain networks.

    Science.gov (United States)

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kind of networks such as protein interaction networks, gene networks or social networks.

  5. More 'mapping' in brain mapping: statistical comparison of effects

    DEFF Research Database (Denmark)

    Jernigan, Terry Lynne; Gamst, Anthony C.; Fennema-Notestine, Christine

    2003-01-01

The term 'mapping' in the context of brain imaging conveys to most the concept of localization; that is, a brain map is meant to reveal a relationship between some condition or parameter and specific sites within the brain. However, in reality, conventional voxel-based maps of brain function, or for that matter of brain structure, are generally constructed using analyses that yield no basis for inferences regarding the spatial nonuniformity of the effects. In the normal analysis path for functional images, for example, there is nowhere a statistical comparison of the observed effect in any voxel relative to that in any other voxel. Under these circumstances, strictly speaking, the presence of significant activation serves as a legitimate basis only for inferences about the brain as a unit. In their discussion of results, investigators rarely are content to confirm the brain's role, and instead generally prefer...

  6. Why am I not disabled? Making state subjects, making statistics in post--Mao China.

    Science.gov (United States)

    Kohrman, Matthew

    2003-03-01

    In this article I examine how and why disability was defined and statistically quantified by China's party-state in the late 1980s. I describe the unfolding of a particular epidemiological undertaking--China's 1987 National Sample Survey of Disabled Persons--as well as the ways the survey was an extension of what Ian Hacking has called modernity's "avalanche of numbers." I argue that, to a large degree, what fueled and shaped the 1987 survey's codification and quantification of disability was how Chinese officials were incited to shape their own identities as they negotiated an array of social, political, and ethical forces, which were at once national and transnational in orientation.

  7. Instrumental and statistical methods for the comparison of class evidence

    Science.gov (United States)

    Liszewski, Elisa Anne

    Trace evidence is a major field within forensic science. Association of trace evidence samples can be problematic due to sample heterogeneity and a lack of quantitative criteria for comparing spectra or chromatograms. The aim of this study is to evaluate different types of instrumentation for their ability to discriminate among samples of various types of trace evidence. Chemometric analysis, including techniques such as Agglomerative Hierarchical Clustering, Principal Components Analysis, and Discriminant Analysis, was employed to evaluate instrumental data. First, automotive clear coats were analyzed by using microspectrophotometry to collect UV absorption data. In total, 71 samples were analyzed with classification accuracy of 91.61%. An external validation was performed, resulting in a prediction accuracy of 81.11%. Next, fiber dyes were analyzed using UV-Visible microspectrophotometry. While several physical characteristics of cotton fiber can be identified and compared, fiber color is considered to be an excellent source of variation, and thus was examined in this study. Twelve dyes were employed, some being visually indistinguishable. Several different analyses and comparisons were done, including an inter-laboratory comparison and external validations. Lastly, common plastic samples and other polymers were analyzed using pyrolysis-gas chromatography/mass spectrometry, and their pyrolysis products were then analyzed using multivariate statistics. The classification accuracy varied dependent upon the number of classes chosen, but the plastics were grouped based on composition. The polymers were used as an external validation and misclassifications occurred with chlorinated samples all being placed into the category containing PVC.
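As a generic illustration of the chemometric workflow described above (dimension reduction followed by discriminant analysis with cross-validation), the sketch below uses scikit-learn on a made-up spectra matrix; it is not the study's actual data or settings.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical data: 60 absorbance spectra (200 wavelengths), 6 classes of 10 samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = np.repeat(np.arange(6), 10)

# Scale, reduce to 10 principal components, then classify by discriminant analysis.
pipeline = make_pipeline(StandardScaler(), PCA(n_components=10),
                         LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(pipeline, X, y, cv=5).mean())
```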

  8. MISR Aerosol Product Attributes and Statistical Comparisons with MODIS

    Science.gov (United States)

    Kahn, Ralph A.; Nelson, David L.; Garay, Michael J.; Levy, Robert C.; Bull, Michael A.; Diner, David J.; Martonchik, John V.; Paradise, Susan R.; Hansen, Earl G.; Remer, Lorraine A.

    2009-01-01

    In this paper, Multi-angle Imaging SpectroRadiometer (MISR) aerosol product attributes are described, including geometry and algorithm performance flags. Actual retrieval coverage is mapped and explained in detail using representative global monthly data. Statistical comparisons are made with coincident aerosol optical depth (AOD) and Angstrom exponent (ANG) retrieval results from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument. The relationship between these results and the ones previously obtained for MISR and MODIS individually, based on comparisons with coincident ground-truth observations, is established. For the data examined, MISR and MODIS each obtain successful aerosol retrievals about 15% of the time, and coincident MISR-MODIS aerosol retrievals are obtained for about 6%-7% of the total overlap region. Cloud avoidance, glint and oblique-Sun exclusions, and other algorithm physical limitations account for these results. For both MISR and MODIS, successful retrievals are obtained for over 75% of locations where attempts are made. Where coincident AOD retrievals are obtained over ocean, the MISR-MODIS correlation coefficient is about 0.9; over land, the correlation coefficient is about 0.7. Differences are traced to specific known algorithm issues or conditions. Over-ocean ANG comparisons yield a correlation of 0.67, showing consistency in distinguishing aerosol air masses dominated by coarse-mode versus fine-mode particles. Sampling considerations imply that care must be taken when assessing monthly global aerosol direct radiative forcing and AOD trends with these products, but they can be used directly for many other applications, such as regional AOD gradient and aerosol air mass type mapping and aerosol transport model validation. Users are urged to take seriously the published product data-quality statements.

  9. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  10. ASYMPTOTIC COMPARISONS OF U-STATISTICS, V-STATISTICS AND LIMITS OF BAYES ESTIMATES BY DEFICIENCIES

    OpenAIRE

    Toshifumi, Nomachi; Hajime, Yamato; Graduate School of Science and Engineering, Kagoshima University:Miyakonojo College of Technology; Faculty of Science, Kagoshima University

    2001-01-01

    As estimators of estimable parameters, we consider three statistics which are U-statistic, V-statistic and limit of Bayes estimate. This limit of Bayes estimate, called LB-statistic in this paper, is obtained from Bayes estimate of estimable parameter based on Dirichlet process, by letting its parameter tend to zero. For the estimable parameter with non-degenerate kernel, the asymptotic relative efficiencies of LB-statistic with respect to U-statistic and V-statistic and that of V-statistic w...

  11. Urban pavement surface temperature. Comparison of numerical and statistical approach

    Science.gov (United States)

    Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia

    2015-04-01

The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, where it is used to manage snow plowing and salting of roads. Such a forecast mainly relies on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. This traffic was originally treated as a constant. Many changes were made to a numerical model to describe as accurately as possible the traffic effects on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. The results indicated good agreement between the field measurements and the forecast from the numerical model based on this energy balance approach. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, using data from thermal mapping with infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained in the specific case of an urban configuration, with traffic taken into account in the measurements used for the statistical analysis. A comparison between the results from the numerical model based on the energy balance and those from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
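As a rough sketch of the statistical side of this comparison, the snippet below fits a partial least-squares regression to made-up predictors (air temperature and the other variables are only assumed here, not taken from the study) and reports held-out accuracy:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical predictors: air temperature, dew point, wind speed, traffic, hour of day.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.3, size=500)  # surface-temperature proxy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
pls = PLSRegression(n_components=3).fit(X_train, y_train)
print("held-out R^2:", pls.score(X_test, y_test))
```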

  12. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    Science.gov (United States)

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, then we interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the
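The core of the process above is running several grouping methods on the same assemblage matrix and checking whether their partitions converge. A minimal sketch of that idea, on a made-up site-by-species matrix and with generic scikit-learn/SciPy tools rather than the authors' exact procedures, could look like this:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.manifold import MDS
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Hypothetical site-by-species matrix (40 sites x 25 species), reduced to presence/absence.
rng = np.random.default_rng(2)
presence = (rng.poisson(lam=3, size=(40, 25)) > 0).astype(float)

# Grouping 1: hierarchical cluster analysis cut into three groups.
groups_cl = fcluster(linkage(presence, method="average"), t=3, criterion="maxclust")

# Grouping 2: k-means on a two-dimensional non-metric MDS ordination.
ordination = MDS(n_components=2, metric=False, dissimilarity="euclidean",
                 random_state=2).fit_transform(presence)
groups_nmds = KMeans(n_clusters=3, n_init=10, random_state=2).fit_predict(ordination)

# Convergence between the two partitions (1.0 means identical groupings).
print("adjusted Rand index:", adjusted_rand_score(groups_cl, groups_nmds))
```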

  13. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    Science.gov (United States)

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  14. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    Science.gov (United States)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  15. Comparison of Tsallis statistics with the Tsallis-factorized statistics in the ultrarelativistic pp collisions

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2016-01-01

    The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)

  16. Model Accuracy Comparison for High Resolution Insar Coherence Statistics Over Urban Areas

    Science.gov (United States)

    Zhang, Yue; Fu, Kun; Sun, Xian; Xu, Guangluan; Wang, Hongqi

    2016-06-01

The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there are far fewer studies on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. First, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.
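The model comparison described here can be mimicked with a small maximum-likelihood exercise: fit each candidate distribution to the coherence values of one class and rank them by AIC. The sketch below uses simulated coherence values, not the TanDEM-X data, and standard SciPy fitters rather than the authors' estimation procedure.

```python
import numpy as np
from scipy import stats

# Simulated coherence samples for one land-cover class (values in (0, 1)).
rng = np.random.default_rng(3)
coherence = rng.beta(a=5.0, b=2.0, size=2000)

# Candidate models and any fixed parameters used to stabilise the fits.
candidates = {
    "normal":   (stats.norm, {}),
    "rayleigh": (stats.rayleigh, {}),
    "weibull":  (stats.weibull_min, {"floc": 0}),
    "beta":     (stats.beta, {"floc": 0, "fscale": 1}),
}

for name, (dist, fixed) in candidates.items():
    params = dist.fit(coherence, **fixed)
    loglik = np.sum(dist.logpdf(coherence, *params))
    k = len(params) - len(fixed)             # number of freely estimated parameters
    aic = 2 * k - 2 * loglik                 # lower AIC indicates the better model
    print(f"{name:8s}  AIC = {aic:.1f}")
```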

  17. MODEL ACCURACY COMPARISON FOR HIGH RESOLUTION INSAR COHERENCE STATISTICS OVER URBAN AREAS

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2016-06-01

Full Text Available The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there are far fewer studies on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. First, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.

  18. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    Science.gov (United States)

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
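The reference change value mentioned as the decisive factor has a commonly used form, RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2), combining analytical and within-subject biological variation. A small sketch with illustrative coefficients of variation (not values from Sam's project):

```python
import math

def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
    """Reference change value (%) in its commonly used form.

    Two serial results on the same patient that differ by more than the RCV are
    unlikely to be explained by analytical plus within-subject biological
    variation alone (here at roughly the 95% level for a two-sided comparison).
    """
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

# Illustrative values: 3% analytical CV, 6% within-subject biological CV.
print(reference_change_value(3.0, 6.0))   # about 18.6%
```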

  19. Comparison of stability statistics for yield in barley (Hordeum vulgare ...

    African Journals Online (AJOL)

    2010-03-15

Mar 15, 2010 ... statistics and yield indicated that only TOP method would be useful for simultaneously selecting for high yield and ... metric stability methods; i) they reduce the bias caused by outliers, ii) ...

  20. A simple statistical method for catch comparison studies

    DEFF Research Database (Denmark)

    Holst, René; Revill, Andrew

    2009-01-01

    For analysing catch comparison data, we propose a simple method based on Generalised Linear Mixed Models (GLMM) and use polynomial approximations to fit the proportions caught in the test codend. The method provides comparisons of fish catch at length by the two gears through a continuous curve...... with a realistic confidence band. We demonstrate the versatility of this method, on field data obtained from the first known testing in European waters of the Rhode Island (USA) 'Eliminator' trawl. These data are interesting as they include a range of species with different selective patterns. Crown Copyright (C...
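A stripped-down version of this kind of catch-comparison model is a binomial regression of the proportion caught in the test codend on a polynomial in length. The sketch below uses simulated counts and a plain fixed-effects GLM; the paper's GLMM additionally uses random effects to absorb haul-to-haul variation.

```python
import numpy as np
import statsmodels.api as sm

# Simulated counts at length in the test and control codends.
rng = np.random.default_rng(4)
length = np.arange(20, 60, 2)
n_test = rng.poisson(30, length.size)
n_control = rng.poisson(25, length.size)

# Response: [successes, failures] = [caught in test codend, caught in control codend].
endog = np.column_stack([n_test, n_control])
# Quadratic polynomial in length as the linear predictor (fixed effects only).
exog = sm.add_constant(np.column_stack([length, length ** 2]))

fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.summary())
```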

  1. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

A survey of research on the consistent dynamical and statistical description of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).

  2. Introducing StatHand: A cross-platform mobile application to support students’ statistical decision making

    Directory of Open Access Journals (Sweden)

    Peter James Allen

    2016-02-01

    Full Text Available Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students’ statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  3. Introducing StatHand: A Cross-Platform Mobile Application to Support Students' Statistical Decision Making.

    Science.gov (United States)

    Allen, Peter J; Roberts, Lynne D; Baughman, Frank D; Loxton, Natalie J; Van Rooy, Dirk; Rock, Adam J; Finlay, James

    2016-01-01

    Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students' statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  4. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

    Energy Technology Data Exchange (ETDEWEB)

    AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.

    1999-05-01

    In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

  5. Petroleum research: a statistical comparison and economic outlook

    Energy Technology Data Exchange (ETDEWEB)

    Perrodon, A

    1965-10-01

The oil wealth of a country or of a sedimentary basin is quite variable. The cumulative quantities drawn to the ''useful'' sedimentary surface may vary from some 10 cu m of oil to many hundreds of thousands per sq km in the richest petroleum-bearing provinces. These results have been obtained after years of exploratory work, especially drilling, which have been carried out for the last 10 or 15 yr. It varies from a few wells to more than one thousand per 10,000 km². This overall task may give a first approximation of the amount of investments devoted to exploration. A comparison between the results obtained and the means employed reveals different criteria such as the amount of oil or gas discovered per exploratory well, thus giving an idea of the productive capacity of a basin, of the cost of exploration, and of its variation as prospecting proceeds. Thus, the general evolution of oil exploration sets forth, successively, a period of expansion, a phase of maturity, and a period of decline. Such comparisons, which enable various petroliferous provinces to be situated more accurately, must be completed later on by more serious analyses of the geological factors in relation to the origin of these different riches. (18 refs.)

  6. Knowledge fusion: Comparison of fuzzy curve smoothers to statistically motivated curve smoothers

    International Nuclear Information System (INIS)

    Burr, T.; Strittmatter, R.B.

    1996-03-01

    This report describes work during FY 95 that was sponsored by the Department of Energy, Office of Nonproliferation and National Security (NN) Knowledge Fusion (KF) Project. The project team selected satellite sensor data to use as the one main example to which its analysis algorithms would be applied. The specific sensor-fusion problem has many generic features, which make it a worthwhile problem to attempt to solve in a general way. The generic problem is to recognize events of interest from multiple time series that define a possibly noisy background. By implementing a suite of time series modeling and forecasting methods and using well-chosen alarm criteria, we reduce the number of false alarms. We then further reduce the number of false alarms by analyzing all suspicious sections of data, as judged by the alarm criteria, with pattern recognition methods. This report gives a detailed comparison of two of the forecasting methods (fuzzy forecaster and statistically motivated curve smoothers as forecasters). The two methods are compared on five simulated and five real data sets. One of the five real data sets is satellite sensor data. The conclusion is the statistically motivated curve smoother is superior on simulated data of the type we studied. The statistically motivated method is also superior on most real data. In defense of the fuzzy-logic motivated methods, we point out that fuzzy-logic methods were never intended to compete with statistical methods on numeric data. Fuzzy logic was developed to handle real-world situations where either real data was not available or was supplemented with either ''expert opinion'' or some sort of linguistic information

  7. Statistical Learning and Adaptive Decision-Making Underlie Human Response Time Variability in Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Ning eMa

    2015-08-01

Full Text Available Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task, in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n=20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.

  8. Statistical learning and adaptive decision-making underlie human response time variability in inhibitory control.

    Science.gov (United States)

    Ma, Ning; Yu, Angela J

    2015-01-01

    Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.
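The across-trial learning component can be illustrated with a much simpler, stationary stand-in for the paper's Bayesian hidden Markov model: a Beta-Bernoulli learner whose predictive P(stop) before each trial uses only the preceding outcomes. This sketch is an assumption-level simplification, not the authors' model.

```python
import numpy as np

def sequential_pstop_estimates(stop_trials, a0=1.0, b0=1.0):
    """Trial-by-trial predictive P(stop) under a Beta-Bernoulli model.

    stop_trials : sequence of 0/1 outcomes (1 = stop-signal trial)
    a0, b0      : Beta prior pseudo-counts for stop and go trials
    """
    a, b = a0, b0
    predictions = []
    for is_stop in stop_trials:
        predictions.append(a / (a + b))   # prediction made before observing the trial
        a += is_stop
        b += 1 - is_stop
    return np.array(predictions)

# Example sequence of trials (1 = stop, 0 = go); RT would then be modelled as
# increasing with these predictive P(stop) values.
print(sequential_pstop_estimates(np.array([0, 0, 1, 0, 0, 0, 1, 0, 1, 0])))
```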

  9. Comparison of de novo assembly statistics of Cucumis sativus L.

    Science.gov (United States)

Wojcieszek, Michał; Kuśmirek, Wiktor; Pawełkowicz, Magdalena; Pląder, Wojciech; Nowak, Robert M.

    2017-08-01

Genome sequencing is the core of genomic research. With the development of NGS and the falling cost of the procedure, another tight gap remains: genome assembly. Developing the proper tool for this task is essential, as the quality of the genome has an important impact on further research. Here we present a comparison of several de Bruijn assemblers tested on C. sativus genomic reads. The assessment shows that the newly developed software, dnaasm, provides better results in terms of quantity and quality. The number of generated sequences is lower by 5-33%, with an even two-fold higher N50. A quality check showed that reliable results were generated by dnaasm. This provides us with a very strong base for future genomic analysis.
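Since the comparison leans on assembly statistics such as N50, a small reference implementation may help; this is the standard definition, not code from dnaasm or the paper.

```python
def n50(contig_lengths):
    """N50: the largest contig length L such that contigs of length >= L
    together cover at least half of the total assembly length."""
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2.0
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length
    return 0

# Example: total length 290, half 145; cumulative 100, 180 -> N50 = 80.
print(n50([100, 80, 60, 30, 20]))
```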

  10. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

Full Text Available Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
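For the logistic-regression side of this comparison, odds ratios are simply exponentiated coefficients. A minimal sketch with simulated case-control data (the exposure names and effect sizes are invented) is shown below; the paper's contribution is making the kernel/perceptron SL model yield the analogous quantities.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated case-control data with two binary exposures and their interaction.
rng = np.random.default_rng(6)
n = 1000
df = pd.DataFrame({"x1": rng.integers(0, 2, n), "x2": rng.integers(0, 2, n)})
logit_p = -1.0 + 0.7 * df.x1 + 0.4 * df.x2 + 0.5 * df.x1 * df.x2
df["case"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Fit the logistic model and report odds ratios with 95% confidence intervals.
fit = smf.logit("case ~ x1 * x2", data=df).fit(disp=False)
print(pd.concat([np.exp(fit.params).rename("OR"), np.exp(fit.conf_int())], axis=1))
```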

  11. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  12. Comparison of accuracy in predicting emotional instability from MMPI data: fisherian versus contingent probability statistics

    Energy Technology Data Exchange (ETDEWEB)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with dirty MMPIs are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based, predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper presents a comparison of predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one derived using contingent probability techniques.

  13. Comparison of accuracy in predicting emotional instability from MMPI data: fisherian versus contingent probability statistics

    International Nuclear Information System (INIS)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with dirty MMPIs are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based, predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper presents a comparison of predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one derived using contingent probability techniques.

  14. LHCb: Statistical Comparison of CPU performance for LHCb applications on the Grid

    CERN Multimedia

    Graciani, R

    2009-01-01

    The usage of CPU resources by LHCb on the Grid is dominated by two different applications: Gauss and Brunel. Gauss is the application performing the Monte Carlo simulation of proton-proton collisions. Brunel is the application responsible for the reconstruction of the signals recorded by the detector, converting them into objects that can be used for later physics analysis of the data (tracks, clusters, …). Both applications are based on the Gaudi and LHCb software frameworks. Gauss uses Pythia and Geant as underlying libraries for the simulation of the collision and the subsequent passage of the generated particles through the LHCb detector, while Brunel makes use of LHCb-specific code to process the data from each sub-detector. Both applications are CPU bound. Large Monte Carlo productions or data reconstructions running on the Grid are an ideal benchmark to compare the performance of the different CPU models for each case. Since the processed events are only statistically comparable, only statistical comparison of the...

  15. Are Statistics Labs Worth the Effort?--Comparison of Introductory Statistics Courses Using Different Teaching Methods

    Directory of Open Access Journals (Sweden)

    Jose H. Guardiola

    2010-01-01

    Full Text Available This paper compares the academic performance of students in three similar elementary statistics courses taught by the same instructor, but with the lab component differing among the three. One course is traditionally taught without a lab component; the second with a lab component using scenarios and an extensive use of technology, but without explicit coordination between lab and lecture; and the third using a lab component with an extensive use of technology that carefully coordinates the lab with the lecture. Extensive use of technology means, in this context, using Minitab software in the lab section, doing homework and quizzes using MyMathLab©, and emphasizing interpretation of computer output during lectures. Initially, an online instrument based on Gardner’s multiple intelligences theory is given to students to try to identify students’ learning styles and intelligence types as covariates. An analysis of covariance is performed in order to compare differences in achievement. In this study there is no attempt to measure differences in student performance across the different treatments. The purpose of this study is to find indications of associations among variables that support the claim that statistics labs could be associated with superior academic achievement in one of these three instructional environments. Also, this study tries to identify individual student characteristics that could be associated with superior academic performance. This study did not find evidence of any individual student characteristics that could be associated with superior achievement. The response variable was computed as the percentage of correct answers for the three exams during the semester added together. The results of this study indicate a significant difference across these three different instructional methods, showing significantly higher mean scores for the response variable for students taking the lab component that was carefully coordinated with
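    The analysis of covariance mentioned above can be set up along the following lines; this is only a sketch with hypothetical column names and scores, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data: one row per student, exam total, lab format and a learning-style covariate.
df = pd.DataFrame({
    "score":  [62, 71, 68, 80, 77, 85, 74, 90, 88],
    "method": ["no_lab", "no_lab", "no_lab",
               "lab", "lab", "lab",
               "coordinated_lab", "coordinated_lab", "coordinated_lab"],
    "style":  [0.2, 0.5, 0.4, 0.6, 0.3, 0.7, 0.5, 0.8, 0.6],
})

ancova = smf.ols("score ~ C(method) + style", data=df).fit()
print(anova_lm(ancova, typ=2))   # method effect adjusted for the covariate
```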

  16. Comparison of Statistical Methods for Detector Testing Programs

    Energy Technology Data Exchange (ETDEWEB)

    Rennie, John Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Abhold, Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    A typical goal for any detector testing program is to ascertain not only the performance of the detector systems under test, but also the confidence that systems accepted using that testing program’s acceptance criteria will exceed a minimum acceptable performance (which is usually expressed as the minimum acceptable success probability, p). A similar problem often arises in statistics, where we would like to ascertain the fraction, p, of a population of items that possess a property that may take one of two possible values. Typically, the problem is approached by drawing a fixed sample of size n, with the number of items out of n that possess the desired property, x, being termed successes. The sample mean gives an estimate of the population mean p ≈ x/n, although usually it is desirable to accompany such an estimate with a statement concerning the range within which p may fall and the confidence associated with that range. Procedures for establishing such ranges and confidence limits are described in detail by Clopper, Brown, and Agresti for two-sided symmetric confidence intervals.
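    For example, the exact (Clopper-Pearson) two-sided confidence interval for the success probability p after observing x successes in n trials can be obtained from beta quantiles; the numbers below are purely illustrative.

```python
from scipy.stats import beta

def clopper_pearson(x, n, conf=0.95):
    """Exact two-sided confidence interval for a binomial proportion p."""
    alpha = 1.0 - conf
    lower = 0.0 if x == 0 else beta.ppf(alpha / 2, x, n - x + 1)
    upper = 1.0 if x == n else beta.ppf(1 - alpha / 2, x + 1, n - x)
    return lower, upper

# e.g. 98 detector systems passing out of 100 tested
print(clopper_pearson(98, 100))   # roughly (0.930, 0.998)
```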

  17. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  18. Statistical Monitoring of Innovation Capacities of the Serbian Firms as Decision- Making Tool

    Directory of Open Access Journals (Sweden)

    Marija Mosurović Ružičić

    2016-12-01

    Full Text Available The subject of this paper is to underline the importance of using data obtained via official statistical reports based on the Oslo Manual methodology (Community Innovation Survey) for strategic decision making both at the national level and at the level of the company. These data enable monitoring and evaluating the innovation capacity of firms with the aim of improving it. The paper also points out the importance of the firm's innovation capacity assessment as an impeller of economic development based on knowledge. With the data obtained by the presented methodology, national decision makers can clearly comprehend and improve the direction of innovation policy and its integration into the wider policy framework that encourages economic development based on innovation. At the firm level, the use of data implies development of professional management of the innovative firm that will be able to respond to problem situations of the modern economy through the formulation of appropriate strategies. The paper analyzed data from three statistical periods during which the Oslo Manual methodology had been applied in Serbia. Analysis has shown that the data obtained in this way are not sufficiently used by decision-makers when rating the innovation capacity of enterprises.

  19. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  20. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  3. On the statistical comparison of climate model output and climate data

    International Nuclear Information System (INIS)

    Solow, A.R.

    1991-01-01

    Some broad issues arising in the statistical comparison of the output of climate models with the corresponding climate data are reviewed. Particular attention is paid to the question of detecting climate change. The purpose of this paper is to review some statistical approaches to the comparison of the output of climate models with climate data. There are many statistical issues arising in such a comparison. The author will focus on some of the broader issues, although some specific methodological questions will arise along the way. One important potential application of the approaches discussed in this paper is the detection of climate change. Although much of the discussion will be fairly general, he will try to point out the appropriate connections to the detection question. 9 refs

  4. On the statistical comparison of climate model output and climate data

    International Nuclear Information System (INIS)

    Solow, A.R.

    1990-01-01

    Some broad issues arising in the statistical comparison of the output of climate models with the corresponding climate data are reviewed. Particular attention is paid to the question of detecting climate change. The purpose of this paper is to review some statistical approaches to the comparison of the output of climate models with climate data. There are many statistical issues arising in such a comparison. The author will focus on some of the broader issues, although some specific methodological questions will arise along the way. One important potential application of the approaches discussed in this paper is the detection of climate change. Although much of the discussion will be fairly general, he will try to point out the appropriate connections to the detection question

  5. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  6. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  7. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  8. Statistical comparison of two or more SAGE libraries: one tag at a time

    NARCIS (Netherlands)

    Schaaf, Gerben J.; van Ruissen, Fred; van Kampen, Antoine; Kool, Marcel; Ruijter, Jan M.

    2008-01-01

    Several statistical tests have been introduced for the comparison of serial analysis of gene expression (SAGE) libraries to quantitatively analyze the differential expression of genes. As each SAGE library is only one measurement, the necessary information on biological variation or experimental

  9. Statistical evaluation of an interlaboratory comparison for the determination of uranium by potentiometric titration

    International Nuclear Information System (INIS)

    Ketema, D.J.; Harry, R.J.S.; Zijp, W.L.

    1990-09-01

    Upon request of the ESARDA working group 'Low enriched uranium conversion - and fuel fabrication plants' an interlaboratory comparison was organized, to assess the precision and accuracy concerning the determination of uranium by the potentiometric titration method. This report presents the results of a statistical evaluation on the data of the first phase of this exercise. (author). 9 refs.; 5 figs.; 24 tabs

  10. The Influence of Social Comparison and Peer Group Size on Risky Decision-Making.

    Science.gov (United States)

    Wang, Dawei; Zhu, Liping; Maguire, Phil; Liu, Yixin; Pang, Kaiyuan; Li, Zhenying; Hu, Yixin

    2016-01-01

    This study explores the influence of different social reference points and different comparison group sizes on risky decision-making. Participants were presented with a scenario describing an exam, and presented with the opportunity of making a risky decision in the context of different information provided about the performance of their peers. We found that behavior was influenced, not only by comparison with peers, but also by the size of the comparison group. Specifically, the larger the reference group, the more polarized the behavior it prompted. In situations describing social loss, participants were led to make riskier decisions after comparing themselves against larger groups, while in situations describing social gain, they become more risk averse. These results indicate that decision making is influenced both by social comparison and the number of people making up the social reference group.

  11. An economic and neuroscientific comparison of strategic decision making

    NARCIS (Netherlands)

    Michl, T.; Taing, S.; Day, M.; Stanton, A.; Welpe, I.M.

    2010-01-01

    Economic decision- making is traditionally based on the assumptions of the predominant existence of Homo economicus (Latin: economic human being). The framework of this theory is known as the Rational Choice Theory (RC Theory). It includes assumptions such as utility maximization, opportunism,

  12. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
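    One of the tools mentioned above, the individuals (XmR) control chart, sets its limits from the average moving range; a minimal sketch with invented behavioral counts:

```python
import numpy as np

# Hypothetical daily counts of a target behavior.
data = np.array([12, 15, 11, 14, 13, 16, 12, 10, 15, 14, 13, 27, 12, 11])

center = data.mean()
mr_bar = np.abs(np.diff(data)).mean()   # average moving range
ucl = center + 2.66 * mr_bar            # standard XmR chart constant
lcl = max(center - 2.66 * mr_bar, 0.0)

signals = np.where((data > ucl) | (data < lcl))[0]
print(f"CL={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}, out-of-control points at {signals}")
```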

  13. Humans make efficient use of natural image statistics when performing spatial interpolation.

    Science.gov (United States)

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.

  14. A method for statistical comparison of data sets and its uses in analysis of nuclear physics data

    International Nuclear Information System (INIS)

    Bityukov, S.I.; Smirnova, V.V.; Krasnikov, N.V.; Maksimushkina, A.V.; Nikitenko, A.N.

    2014-01-01

    The authors propose a method for the statistical comparison of two data sets, based on the statistical comparison of histograms. As an estimator of the quality of the decision made, they propose a value that can be interpreted as the probability that the decision (that the data sets are different) is correct.
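    The abstract does not spell out the estimator, but a standard way to compare two data sets through their histograms is a chi-square homogeneity test on the binned counts; the sketch below shows that generic approach, not the authors' specific method:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
sample_a = rng.normal(0.0, 1.0, 5000)
sample_b = rng.normal(0.1, 1.0, 5000)

# Bin both samples on a common grid.
edges = np.histogram_bin_edges(np.concatenate([sample_a, sample_b]), bins=20)
counts_a, _ = np.histogram(sample_a, bins=edges)
counts_b, _ = np.histogram(sample_b, bins=edges)

# Chi-square homogeneity statistic for the 2 x k table of binned counts.
total = counts_a + counts_b
mask = total > 0
expected_a = total[mask] * counts_a.sum() / total.sum()
expected_b = total[mask] * counts_b.sum() / total.sum()
stat = (((counts_a[mask] - expected_a) ** 2 / expected_a)
        + ((counts_b[mask] - expected_b) ** 2 / expected_b)).sum()
dof = mask.sum() - 1
print(stat, chi2.sf(stat, dof))   # small p-value: the two data sets likely differ
```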

  15. Testing statistical significance scores of sequence comparison methods with structure similarity

    Directory of Open Access Journals (Sweden)

    Leunissen Jack AM

    2006-10-01

    Full Text Available Abstract Background In recent years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.

  16. Comparisons between physics-based, engineering, and statistical learning models for outdoor sound propagation.

    Science.gov (United States)

    Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T

    2016-05-01

    Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicholson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
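    The skill scores quoted above compare a model's error with that of a fixed reference prediction. A hedged sketch with synthetic data (the CNPE benchmark data are not reproduced here; the training-set mean stands in for the homogeneous-atmosphere reference):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.uniform(size=(2000, 4))                                # stand-in propagation conditions
y = 20 * X[:, 0] - 10 * X[:, 1] ** 2 + rng.normal(0, 1, 2000)  # stand-in transmission loss (dB)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Skill relative to a fixed reference prediction (here the training mean, playing
# the role of the homogeneous-atmosphere baseline used in the paper).
mse_model = mean_squared_error(y_test, forest.predict(X_test))
mse_reference = mean_squared_error(y_test, np.full_like(y_test, y_train.mean()))
print(f"skill score: {1 - mse_model / mse_reference:.1%}")
```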

  17. Rigor Mortis: Statistical thoroughness in reporting and the making of truth.

    Science.gov (United States)

    Tal, Aner

    2016-02-01

    Should a uniform checklist be adopted for methodological and statistical reporting? The current article discusses this notion, with particular attention to the use of old versus new statistics, and a consideration of the arguments brought up by Von Roten. The article argues that an overly exhaustive checklist that is uniformly applied to all submitted papers may be unsuitable for multidisciplinary work, and would further result in undue clutter and potentially distract reviewers from pertinent considerations in their evaluation of research articles. © The Author(s) 2015.

  18. Making Women Count: Gender-Typing, Technology and Path Dependencies in Dutch Statistical Data Processing

    NARCIS (Netherlands)

    van den Ende, Jan; van Oost, Elizabeth C.J.

    2001-01-01

    This article is a longitudinal analysis of the relation between gendered labour divisions and new data processing technologies at the Dutch Central Bureau of Statistics (CBS). Following social-constructivist and evolutionary economic approaches, the authors hold that the relation between technology

  19. The choice of statistical methods for comparisons of dosimetric data in radiotherapy.

    Science.gov (United States)

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-09-18

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the security and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the newer methods calculate the dose with tissue density correction in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively, then non-parametric statistical tests were performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's rank and Kendall's rank tests. The Friedman test showed a significant effect of the calculation method on the delivered dose for lung cancer patients. The Wilcoxon signed-rank test of paired comparisons indicated that the delivered dose was significantly reduced using density-corrected methods as compared to the reference method. Spearman's and Kendall's rank tests indicated a positive correlation between the doses calculated with the different methods.
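    The sequence of tests described above maps directly onto standard library calls; a sketch with invented dose arrays standing in for the three calculation methods:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
dose_pbc = rng.normal(100, 5, 62)             # reference method, no density correction
dose_mb = dose_pbc - rng.normal(2, 1, 62)     # Modified Batho (1D correction)
dose_etar = dose_pbc - rng.normal(3, 1, 62)   # ETAR (3D correction)

# Normality and homogeneity of variance.
print(stats.shapiro(dose_pbc))
print(stats.levene(dose_pbc, dose_mb, dose_etar))

# Non-parametric comparison of the three paired methods, then a pairwise follow-up.
print(stats.friedmanchisquare(dose_pbc, dose_mb, dose_etar))
print(stats.wilcoxon(dose_pbc, dose_mb))

# Rank correlations between two of the methods.
print(stats.spearmanr(dose_pbc, dose_mb))
print(stats.kendalltau(dose_pbc, dose_mb))
```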

  20. Difficult decisions: A qualitative exploration of the statistical decision making process from the perspectives of psychology students and academics

    Directory of Open Access Journals (Sweden)

    Peter James Allen

    2016-02-01

    Full Text Available Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these ‘experts’ were able to describe a far more systematic, comprehensive, flexible and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in

  1. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics.

    Science.gov (United States)

    Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

  2. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics

    Science.gov (United States)

    Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

  3. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  4. Implicit Statistical Learning in Real-World Environments Leads to Ecologically Rational Decision Making.

    Science.gov (United States)

    Perkovic, Sonja; Orquin, Jacob Lund

    2018-01-01

    Ecological rationality results from matching decision strategies to appropriate environmental structures, but how does the matching happen? We propose that people learn the statistical structure of the environment through observation and use this learned structure to guide ecologically rational behavior. We tested this hypothesis in the context of organic foods. In Study 1, we found that products from healthful food categories are more likely to be organic than products from nonhealthful food categories. In Study 2, we found that consumers' perceptions of the healthfulness and prevalence of organic products in many food categories are accurate. Finally, in Study 3, we found that people perceive organic products as more healthful than nonorganic products when the statistical structure justifies this inference. Our findings suggest that people believe organic foods are more healthful than nonorganic foods and use an organic-food cue to guide their behavior because organic foods are, on average, 30% more healthful.

  5. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    Science.gov (United States)

    Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas

    2015-04-01

    Wind is the result of complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production coming from wind turbines. In the literature there is a considerable number of models, either physical or statistical ones, dealing with the problem of simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for the purpose of wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto-Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds coming from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to successfully simulate the wind speed at a significant point (target). Goodness-of-fit statistics are performed for the comparison of the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.
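    A hedged sketch of this kind of comparison on a toy hourly wind series; the ARIMA order, network size and lag depth are arbitrary choices, not those of the study:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
t = np.arange(24 * 365)
wind = 6 + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.8, t.size)  # toy hourly wind speed (m/s)

train, test = wind[:-168], wind[-168:]   # hold out the final week

# ARIMA: fit on the training period and forecast the whole held-out week.
arima_pred = ARIMA(train, order=(2, 0, 1)).fit().forecast(steps=test.size)

# Small feed-forward network on 24 lagged values, predicting one hour ahead from
# observed lags (note: this is one-step-ahead, so the comparison with the
# 168-step ARIMA forecast is only a rough illustration).
lags = 24
X_all = np.column_stack([wind[lag:wind.size - lags + lag] for lag in range(lags)])
y_all = wind[lags:]
X_train, X_test = X_all[:-168], X_all[-168:]
y_train, y_test = y_all[:-168], y_all[-168:]
mlp_pred = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_train, y_train).predict(X_test)

rmse = lambda pred, obs: float(np.sqrt(np.mean((pred - obs) ** 2)))
print("ARIMA RMSE:", rmse(arima_pred, test))
print("ANN   RMSE:", rmse(mlp_pred, y_test))
```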

  6. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms

    International Nuclear Information System (INIS)

    Tang Jie; Nett, Brian E; Chen Guanghong

    2009-01-01

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms for a constant undersampling factor comparing different algorithms at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.

  7. Financial Management and Control for Decision Making in Urban Local Bodies in India Using Statistical Techniques

    Science.gov (United States)

    Bhattacharyya, Sidhakam; Bandyopadhyay, Gautam

    2010-10-01

    The council of most of the Urban Local Bodies (ULBs) has a limited scope for decision making in the absence of an appropriate financial control mechanism. The information about the expected amount of own fund during a particular period is of great importance for decision making. Therefore, in this paper, efforts are being made to present a set of findings and to establish a model for estimating receipts from own sources and payments thereof using multiple regression analysis. Data for sixty months from a reputed ULB in West Bengal have been considered for ascertaining the regression models. This can be used as a part of the financial management and control procedure by the council to estimate the effect on own fund. In our study we have considered two models using multiple regression analysis. "Model I" comprises total adjusted receipts as the dependent variable and selected individual receipts as the independent variables. Similarly "Model II" consists of total adjusted payments as the dependent variable and selected individual payments as independent variables. The resultant of Model I and Model II is the surplus or deficit affecting own fund. This may be applied for decision-making purposes by the council.
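    Estimating total receipts from selected individual receipt heads (Model I) amounts to a multiple linear regression; a sketch with hypothetical monthly figures and invented column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly figures (column names invented; amounts in lakh rupees).
df = pd.DataFrame({
    "total_receipts": [410, 455, 430, 500, 470, 520, 495, 540],
    "property_tax":   [150, 170, 160, 190, 175, 200, 185, 210],
    "water_charges":  [ 90, 100,  95, 110, 100, 115, 108, 120],
    "license_fees":   [ 60,  65,  62,  70,  68,  75,  72,  78],
})

model_one = smf.ols("total_receipts ~ property_tax + water_charges + license_fees", data=df).fit()
print(model_one.params)     # fitted coefficients of a Model I style regression
print(model_one.rsquared)   # projected receipts, less projected payments, estimate the own fund
```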

  8. Effects of statistical models and items difficulties on making trait-level inferences: A simulation study

    Directory of Open Access Journals (Sweden)

    Nelson Hauck Filho

    2014-12-01

    Full Text Available Researchers dealing with the task of estimating locations of individuals on continuous latent variables may rely on several statistical models described in the literature. However, weighing the costs and benefits of using one specific model over alternative models depends on empirical information that is not always clearly available. Therefore, the aim of this simulation study was to compare the performance of seven popular statistical models in providing adequate latent trait estimates under conditions of item difficulties targeted at the sample mean or at the tails of the latent trait distribution. Results suggested an overall tendency of models to provide more accurate estimates of true latent scores when using items targeted at the sample mean of the latent trait distribution. Rating Scale Model, Graded Response Model, and Weighted Least Squares Mean- and Variance-adjusted Confirmatory Factor Analysis yielded the most reliable latent trait estimates, even when applied to items inadequate for the sample distribution of the latent variable. These findings have important implications concerning some popular methodological practices in Psychology and related areas.

  9. Statistically derived factors of varied importance to audiologists when making a hearing aid brand preference decision.

    Science.gov (United States)

    Johnson, Earl E; Mueller, H Gustav; Ricketts, Todd A

    2009-01-01

    To determine the amount of importance audiologists place on various items related to their selection of a preferred hearing aid brand manufacturer. Three hundred forty-three hearing aid-dispensing audiologists rated a total of 32 randomized items by survey methodology. Principal component analysis identified seven orthogonal statistical factors of importance. In rank order, these factors were Aptitude of the Brand, Image, Cost, Sales and Speed of Delivery, Exposure, Colleague Recommendations, and Contracts and Incentives. While it was hypothesized that differences among audiologists in the importance ratings of these factors would dictate their preference for a given brand, that was not our finding. Specifically, mean ratings for the six most important factors did not differ among audiologists preferring different brands. A statistically significant difference among audiologists preferring different brands was present, however, for one factor: Contracts and Incentives. Its assigned importance, though, was always lower than that for the other six factors.
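    A hedged sketch of this kind of dimension reduction, using scikit-learn's factor analysis with varimax rotation as a stand-in for the principal component procedure; the ratings matrix is random placeholder data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
# Placeholder ratings: 343 respondents x 32 survey items on a 1-7 scale.
ratings = rng.integers(1, 8, size=(343, 32)).astype(float)

fa = FactorAnalysis(n_components=7, rotation="varimax", random_state=0).fit(ratings)
loadings = fa.components_                               # 7 x 32 matrix of item loadings
top_items = np.argsort(-np.abs(loadings), axis=1)[:, :3]
print(top_items)                                        # three highest-loading items per factor
```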

  10. On Extrapolating Past the Range of Observed Data When Making Statistical Predictions in Ecology.

    Directory of Open Access Journals (Sweden)

    Paul B Conn

    Full Text Available Ecologists are increasingly using statistical models to predict animal abundance and occurrence in unsampled locations. The reliability of such predictions depends on a number of factors, including sample size, how far prediction locations are from the observed data, and similarity of predictive covariates in locations where data are gathered to locations where predictions are desired. In this paper, we propose extending Cook's notion of an independent variable hull (IVH), developed originally for application with linear regression models, to generalized regression models as a way to help assess the potential reliability of predictions in unsampled areas. Predictions occurring inside the generalized independent variable hull (gIVH) can be regarded as interpolations, while predictions occurring outside the gIVH can be regarded as extrapolations worthy of additional investigation or skepticism. We conduct a simulation study to demonstrate the usefulness of this metric for limiting the scope of spatial inference when conducting model-based abundance estimation from survey counts. In this case, limiting inference to the gIVH substantially reduces bias, especially when survey designs are spatially imbalanced. We also demonstrate the utility of the gIVH in diagnosing problematic extrapolations when estimating the relative abundance of ribbon seals in the Bering Sea as a function of predictive covariates. We suggest that ecologists routinely use diagnostics such as the gIVH to help gauge the reliability of predictions from statistical models (such as generalized linear, generalized additive, and spatio-temporal regression models).
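    For an ordinary linear model, Cook's independent variable hull can be checked through leverage: a prediction point counts as an interpolation if its leverage does not exceed the largest leverage in the observed design. A numpy sketch of that linear special case (the paper's gIVH generalizes this idea to other regression families):

```python
import numpy as np

rng = np.random.default_rng(6)

# Design matrix (with intercept) of covariates at the surveyed locations.
covariates = rng.uniform(0, 1, size=(100, 2))
X = np.column_stack([np.ones(100), covariates])

xtx_inv = np.linalg.inv(X.T @ X)
max_leverage = np.einsum("ij,jk,ik->i", X, xtx_inv, X).max()

def inside_ivh(x0):
    """True if a prediction point lies inside the (linear) independent variable hull."""
    x0 = np.concatenate([[1.0], np.asarray(x0, dtype=float)])
    return float(x0 @ xtx_inv @ x0) <= max_leverage

print(inside_ivh([0.5, 0.5]))   # central covariate values: interpolation
print(inside_ivh([2.0, 2.0]))   # far outside the sampled range: extrapolation
```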

  11. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    Science.gov (United States)

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  12. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    Science.gov (United States)

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  13. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    Science.gov (United States)

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters caused by ASIR as compared to filtered back projection (FBP) may influence quantification of coronary calcium. To investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone, FBP combined with ASIR 30%, 50%, 70%, and ASIR 100% based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP. ASIR reduced Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, adaptive statistical iterative reconstruction (ASIR) may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  14. Food consumption and the actual statistics of cardiovascular diseases: an epidemiological comparison of 42 European countries

    Directory of Open Access Journals (Sweden)

    Pavel Grasgruber

    2016-09-01

    Full Text Available Background: The aim of this ecological study was to identify the main nutritional factors related to the prevalence of cardiovascular diseases (CVDs) in Europe, based on a comparison of international statistics. Design: The mean consumption of 62 food items from the FAOSTAT database (1993–2008) was compared with the actual statistics of five CVD indicators in 42 European countries. Several other exogenous factors (health expenditure, smoking, body mass index) and the historical stability of results were also examined. Results: We found exceptionally strong relationships between some of the examined factors, the highest being a correlation between raised cholesterol in men and the combined consumption of animal fat and animal protein (r = 0.92, p < 0.001). The most significant dietary correlate of low CVD risk was high total fat and animal protein consumption. Additional statistical analyses further highlighted citrus fruits, high-fat dairy (cheese) and tree nuts. Among other non-dietary factors, health expenditure showed by far the highest correlation coefficients. The major correlate of high CVD risk was the proportion of energy from carbohydrates and alcohol, or from potato and cereal carbohydrates. Similar patterns were observed between food consumption and CVD statistics from the period 1980–2000, which shows that these relationships are stable over time. However, we found striking discrepancies in men's CVD statistics from 1980 and 1990, which can probably explain the origin of the ‘saturated fat hypothesis’ that influenced public health policies in the following decades. Conclusion: Our results do not support the association between CVDs and saturated fat, which is still contained in official dietary guidelines. Instead, they agree with data accumulated from recent studies that link CVD risk with the high glycaemic index/load of carbohydrate-based diets. In the absence of any scientific evidence connecting saturated fat with CVDs, these

  15. Social indicators and other income statistics using the EUROMOD baseline: a comparison with Eurostat and National Statistics

    OpenAIRE

    Mantovani, Daniela; Sutherland, Holly

    2003-01-01

    This paper reports an exercise to validate EUROMOD output for 1998 by comparing income statistics calculated from the baseline micro-output with comparable statistics from other sources, including the European Community Household Panel. The main potential reasons for discrepancies are identified. While there are some specific national issues that arise, there are two main general points to consider in interpreting EUROMOD estimates of social indicators across EU member States: (a) the method ...

  16. Statistical Network Analysis for Functional MRI: Mean Networks and Group Comparisons.

    Directory of Open Access Journals (Sweden)

    Cedric E Ginestet

    2014-05-01

    Full Text Available Comparing networks in neuroscience is hard, because the topological properties of a given network are necessarily dependent on the number of edges of that network. This problem arises in the analysis of both weighted and unweighted networks. The term density is often used in this context, in order to refer to the mean edge weight of a weighted network, or to the number of edges in an unweighted one. Comparing families of networks is therefore statistically difficult because differences in topology are necessarily associated with differences in density. In this review paper, we consider this problem from two different perspectives, which include (i) the construction of summary networks, such as how to compute and visualize the mean network from a sample of network-valued data points; and (ii) how to test for topological differences, when two families of networks also exhibit significant differences in density. In the first instance, we show that the issue of summarizing a family of networks can be addressed by either adopting a mass-univariate approach, which produces a statistical parametric network (SPN), or by directly computing the mean network, provided that a metric has been specified on the space of all networks with a given number of nodes. In the second part of this review, we then highlight the inherent problems associated with the comparison of topological functions of families of networks that differ in density. In particular, we show that a wide range of topological summaries, such as global efficiency and network modularity, are highly sensitive to differences in density. Moreover, these problems are not restricted to unweighted metrics, as we demonstrate that the same issues remain present when considering the weighted versions of these metrics. We conclude by encouraging caution when reporting such statistical comparisons, and by emphasizing the importance of constructing summary networks.
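
    The density confound described above is easy to reproduce. The short sketch below (using networkx, with arbitrary random graphs) shows that two graphs differing only in edge probability also differ substantially in global efficiency, which is why raw topological summaries should not be compared across densities.

    ```python
    # Illustrative sketch of the density confound: two random graphs that differ only in
    # edge probability also differ markedly in global efficiency, so a raw comparison of
    # efficiency mixes topology with density.
    import networkx as nx

    g_sparse = nx.gnp_random_graph(60, 0.10, seed=0)
    g_dense = nx.gnp_random_graph(60, 0.30, seed=0)

    for name, g in [("sparse", g_sparse), ("dense", g_dense)]:
        print(f"{name}: density = {nx.density(g):.2f}, "
              f"global efficiency = {nx.global_efficiency(g):.2f}")
    ```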

  17. Statistical analysis supporting decision-making about opening a university library on Saturdays

    Directory of Open Access Journals (Sweden)

    Lisandra Maria Kovaliczn Nadal

    2017-06-01

    Full Text Available The concern with the welfare of employees and the significant reduction in the demand for book loans by postgraduate students on Saturdays led to a change in operating days at the information units of the Central Library Professor Faris Michaele (BICEN) at the State University of Ponta Grossa (UEPG), in Ponta Grossa, PR. Therefore, the study intended to support the decision of closing the university library on Saturdays in 2016. It was verified whether there is statistical significance in the relationship between the type of library user and the number of books borrowed on Saturdays, and whether the loan of books by postgraduate students was relevant compared to that of other user types. Based on the loan data between February 2014 and December 2015, it was determined that there is a significant relationship between the type of library user and the number of borrowed books, and that the loan of books by undergraduate students is the most relevant. Also considering the saving of resources such as electricity and overtime, and the maintenance of compliance with the norms of the Ministry of Education (MEC) for the approval of undergraduate courses, closing the units on Saturdays during the academic year of 2016 was the right decision.

  18. The choice of statistical methods for comparisons of dosimetric data in radiotherapy

    International Nuclear Information System (INIS)

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-01-01

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the security and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the new methods calculate the dose with tissue density corrections in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; non-parametric statistical tests were then performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman’s test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman’s rank and Kendall’s rank tests. Friedman’s test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p < 0.001). The density correction methods yielded lower doses compared to PBC, by on average −5 ± 4.4 (SD) for MB and −4.7 ± 5 (SD) for ETAR. Post-hoc Wilcoxon signed-rank tests of paired comparisons indicated that the delivered dose was significantly reduced using density
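
    For readers who want to reproduce this style of analysis, the sketch below runs the same test sequence (Friedman, pairwise Wilcoxon signed-rank, Spearman rank correlation) in SciPy on simulated monitor-unit values; the numbers are invented and only loosely echo the magnitudes quoted above.

    ```python
    # Minimal sketch (simulated monitor-unit values, not the study's data) of the test
    # sequence described above: Friedman's test across the three calculation methods,
    # followed by a pairwise Wilcoxon signed-rank test and a Spearman rank correlation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    pbc = rng.normal(100, 10, size=62)          # reference method (no density correction)
    mb = pbc - rng.normal(5, 4.4, size=62)      # Modified Batho
    etar = pbc - rng.normal(4.7, 5.0, size=62)  # Equivalent Tissue Air Ratio

    print("Friedman:", stats.friedmanchisquare(pbc, mb, etar))
    print("Wilcoxon PBC vs MB:", stats.wilcoxon(pbc, mb))
    print("Spearman PBC vs MB:", stats.spearmanr(pbc, mb))
    ```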

  19. Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite

    Energy Technology Data Exchange (ETDEWEB)

    Mark C. Carroll; David T. Rohrbaugh

    2013-08-01

    High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that capture the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller grain, petroleum coke, extruded graphite from GrafTech, was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear-grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.

  20. StAR: a simple tool for the statistical comparison of ROC curves

    Directory of Open Access Journals (Sweden)

    Melo Francisco

    2008-06-01

    Full Text Available Abstract Background As in many different areas of science and technology, most important problems in bioinformatics rely on the proper development and assessment of binary classifiers. A generalized assessment of the performance of binary classifiers is typically carried out through the analysis of their receiver operating characteristic (ROC) curves. The area under the ROC curve (AUC) constitutes a popular indicator of the performance of a binary classifier. However, the assessment of the statistical significance of the difference between any two classifiers based on this measure is not a straightforward task, since not many freely available tools exist. Most existing software is either not free, difficult to use or not easy to automate when a comparative assessment of the performance of many binary classifiers is intended. This constitutes the typical scenario for the optimization of parameters when developing new classifiers and also for their performance validation through the comparison to previous art. Results In this work we describe and release new software to assess the statistical significance of the observed difference between the AUCs of any two classifiers for a common task estimated from paired data or unpaired balanced data. The software is able to perform a pairwise comparison of many classifiers in a single run, without requiring any expert or advanced knowledge to use it. The software relies on a non-parametric test for the difference of the AUCs that accounts for the correlation of the ROC curves. The results are displayed graphically and can be easily customized by the user. A human-readable report is generated and the complete data resulting from the analysis are also available for download, which can be used for further analysis with other software. The software is released as a web server that can be used in any client platform and also as a standalone application for the Linux operating system. Conclusion A new software for
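
    StAR itself implements a non-parametric test for correlated ROC curves; as a rough, hedged illustration of the underlying question, the sketch below instead uses a simple paired bootstrap over shared test cases to put a confidence interval on the AUC difference between two simulated classifiers. It is not the StAR procedure.

    ```python
    # Illustration only: a paired bootstrap over shared test cases to gauge whether the
    # AUC difference between two classifiers could plausibly be zero (simulated scores).
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    y = rng.integers(0, 2, size=300)
    score_a = y + rng.normal(0, 1.0, size=300)   # stronger classifier
    score_b = y + rng.normal(0, 1.5, size=300)   # weaker classifier

    obs_diff = roc_auc_score(y, score_a) - roc_auc_score(y, score_b)
    diffs = []
    for _ in range(2000):
        idx = rng.integers(0, len(y), size=len(y))   # resample cases, keeping the pairing
        if len(np.unique(y[idx])) < 2:
            continue                                  # skip degenerate resamples
        diffs.append(roc_auc_score(y[idx], score_a[idx]) - roc_auc_score(y[idx], score_b[idx]))
    ci = np.percentile(diffs, [2.5, 97.5])
    print(f"AUC difference = {obs_diff:.3f}, 95% bootstrap CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
    ```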

  1. Making sense of war : Using the interpretation comparison model to understand the Iraq conflict

    NARCIS (Netherlands)

    Stapel, Diederik A.; Marx, David M.

    2007-01-01

    The current research addressed the issue of how people use the past to compare and interpret the present. Using the logic of the Interpretation Comparison Model (ICM) we examined two factors (distinctness of past events and ambiguity of target event) that may influence how people make sense of a

  2. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  3. Learning Statistics at the Farmers Market? A Comparison of Academic Service Learning and Case Studies in an Introductory Statistics Course

    Science.gov (United States)

    Hiedemann, Bridget; Jones, Stacey M.

    2010-01-01

    We compare the effectiveness of academic service learning to that of case studies in an undergraduate introductory business statistics course. Students in six sections of the course were assigned either an academic service learning project (ASL) or business case studies (CS). We examine two learning outcomes: students' performance on the final…

  4. Possible future changes in South East Australian frost frequency: an inter-comparison of statistical downscaling approaches

    Science.gov (United States)

    Crimp, Steven; Jin, Huidong; Kokic, Philip; Bakar, Shuvo; Nicholls, Neville

    2018-04-01

    Anthropogenic climate change has already been shown to affect the frequency, intensity, spatial extent, duration and seasonality of extreme climate events. Understanding these changes is an important step in determining exposure, vulnerability and focus for adaptation. In an attempt to support adaptation decision-making we have examined statistical modelling techniques to improve the representation of global climate model (GCM) derived projections of minimum temperature extremes (frosts) in Australia. We examine the spatial changes in minimum temperature extreme metrics (e.g. monthly and seasonal frost frequency) for a region exhibiting the strongest station trends in Australia, and compare these changes with minimum temperature extreme metrics derived from 10 GCMs from the Coupled Model Inter-comparison Project Phase 5 (CMIP 5) datasets, and via statistical downscaling. We compare the observed trends with those derived from the "raw" GCM minimum temperature data, and examine whether quantile matching (QM) or spatio-temporal modelling with quantile matching (spTimerQM) can be used to improve the correlation between observed and simulated extreme minimum temperatures. We demonstrate that the spTimerQM modelling approach provides correlations with observed daily minimum temperatures for the period August to November of 0.22. This represents an almost fourfold improvement over either the "raw" GCM or QM results. The spTimerQM modelling approach also improves correlations with observed monthly frost frequency statistics to 0.84, as opposed to 0.37 and 0.81 for the "raw" GCM and QM results, respectively. We apply the spatio-temporal model to examine future extreme minimum temperature projections for the period 2016 to 2048. The spTimerQM modelling results suggest the persistence of current levels of frost risk out to 2030, with evidence of continuing decadal variation.
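
    The quantile-matching (QM) step referred to above can be sketched as a simple empirical quantile mapping: each model value is mapped to the observed value at the same empirical quantile. The snippet below uses synthetic temperatures; the paper's spTimerQM additionally places a spatio-temporal model on top of this correction.

    ```python
    # Minimal sketch of empirical quantile matching on synthetic data: each GCM value is
    # mapped to the observation at the same empirical quantile, correcting mean and spread.
    import numpy as np

    rng = np.random.default_rng(5)
    obs = rng.normal(2.0, 4.0, size=3000)      # observed daily minimum temperatures
    gcm = rng.normal(4.0, 3.0, size=3000)      # biased GCM minimum temperatures

    def quantile_match(x, model_ref, obs_ref, n_q=100):
        q = np.linspace(0, 1, n_q)
        model_q = np.quantile(model_ref, q)
        obs_q = np.quantile(obs_ref, q)
        # interpolate each value's position in the model distribution onto the observed one
        return np.interp(x, model_q, obs_q)

    corrected = quantile_match(gcm, gcm, obs)
    print("raw GCM mean/std:  ", gcm.mean().round(2), gcm.std().round(2))
    print("corrected mean/std:", corrected.mean().round(2), corrected.std().round(2))
    print("observed mean/std: ", obs.mean().round(2), obs.std().round(2))
    ```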

  5. EPA/NMED/LANL 1998 water quality results: Statistical analysis and comparison to regulatory standards

    International Nuclear Information System (INIS)

    Gallaher, B.; Mercier, T.; Black, P.; Mullen, K.

    2000-01-01

    Four governmental agencies conducted a round of groundwater, surface water, and spring water sampling at the Los Alamos National Laboratory during 1998. Samples were split among the four parties and sent to independent analytical laboratories. Results from three of the agencies were available for this study. Comparisons of analytical results that were paired by location and date were made between the various analytical laboratories. The results for over 50 split samples analyzed for inorganic chemicals, metals, and radionuclides were compared. Statistical analyses included non-parametric (sign test and signed-ranks test) and parametric (paired t-test and linear regression) methods. The data pairs were tested for statistically significant differences, defined by an observed significance level, or p-value, less than 0.05. The main conclusion is that the laboratories' performances are similar across most of the analytes that were measured. In some 95% of the laboratory measurements there was agreement on whether contaminant levels exceeded regulatory limits. The most significant differences in performance were noted for the radioactive suite, particularly for gross alpha particle activity and Sr-90
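
    A minimal sketch of the paired, split-sample comparisons described above is given below, using invented laboratory results: a sign test (via a binomial test), a Wilcoxon signed-rank test and a paired t-test, all from SciPy (the binomtest call assumes a reasonably recent SciPy).

    ```python
    # Hedged sketch of paired comparisons between two laboratories on invented
    # split-sample results: sign test, Wilcoxon signed-rank test, and paired t-test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    lab_a = rng.lognormal(mean=1.0, sigma=0.4, size=50)   # lab A results for 50 splits
    lab_b = lab_a * rng.normal(1.02, 0.10, size=50)       # lab B results, slight bias

    d = lab_b - lab_a
    n_pos = int((d > 0).sum())
    sign_p = stats.binomtest(n_pos, n=len(d), p=0.5).pvalue  # sign test via binomial test
    print("sign test p:", round(sign_p, 3))
    print("Wilcoxon signed-rank:", stats.wilcoxon(lab_a, lab_b))
    print("paired t-test:", stats.ttest_rel(lab_a, lab_b))
    ```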

  6. Comparison of long-term Moscow and Danish NLC observations: statistical results

    Directory of Open Access Journals (Sweden)

    P. Dalin

    2006-11-01

    Full Text Available Noctilucent clouds (NLC) are the highest clouds in the Earth's atmosphere, observed close to the mesopause at 80–90 km altitudes. Systematic NLC observations conducted in Moscow for the period of 1962–2005 and in Denmark for 1983–2005 are compared, and statistical results both for seasonally summarized NLC parameters and for individual NLC appearances are described. Careful attention is paid to the weather conditions during each season of observations. This turns out to be a very important factor both for the NLC case study and for long-term data set analysis. Time series of seasonal values show moderate similarity (taking into account the weather conditions) but, at the same time, the comparison of individual cases of NLC occurrence reveals substantial differences. There are positive trends in the Moscow and Danish normalized NLC brightness as well as a nearly zero trend in the Moscow normalized NLC occurrence frequency, but these long-term changes are not statistically significant. The quasi-ten-year cycle in NLC parameters is about 1 year shorter than the solar cycle during the same period. The characteristic scale of NLC fields is estimated for the first time and it is found to be less than 800 km.

  7. Comparison of statistical sampling methods with ScannerBit, the GAMBIT scanning module

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); McKay, James; Scott, Pat [Imperial College London, Department of Physics, Blackett Laboratory, London (United Kingdom); Farmer, Ben; Conrad, Jan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Roebber, Elinore [McGill University, Department of Physics, Montreal, QC (Canada); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Collaboration: The GAMBIT Scanner Workgroup

    2017-11-15

    We introduce ScannerBit, the statistics and sampling module of the public, open-source global fitting framework GAMBIT. ScannerBit provides a standardised interface to different sampling algorithms, enabling the use and comparison of multiple computational methods for inferring profile likelihoods, Bayesian posteriors, and other statistical quantities. The current version offers random, grid, raster, nested sampling, differential evolution, Markov Chain Monte Carlo (MCMC) and ensemble Monte Carlo samplers. We also announce the release of a new standalone differential evolution sampler, Diver, and describe its design, usage and interface to ScannerBit. We subject Diver and three other samplers (the nested sampler MultiNest, the MCMC GreAT, and the native ScannerBit implementation of the ensemble Monte Carlo algorithm T-Walk) to a battery of statistical tests. For this we use a realistic physical likelihood function, based on the scalar singlet model of dark matter. We examine the performance of each sampler as a function of its adjustable settings, and the dimensionality of the sampling problem. We evaluate performance on four metrics: optimality of the best fit found, completeness in exploring the best-fit region, number of likelihood evaluations, and total runtime. For Bayesian posterior estimation at high resolution, T-Walk provides the most accurate and timely mapping of the full parameter space. For profile likelihood analysis in less than about ten dimensions, we find that Diver and MultiNest score similarly in terms of best fit and speed, outperforming GreAT and T-Walk; in ten or more dimensions, Diver substantially outperforms the other three samplers on all metrics. (orig.)

  8. Consumer Decision-Making Styles Extension to Trust-Based Product Comparison Site Usage Model

    Directory of Open Access Journals (Sweden)

    Radoslaw Macik

    2016-09-01

    Full Text Available The paper describes an implementation of an extended consumer decision-making styles concept in explaining consumer choices made in a product comparison site environment, in the context of a trust-based information technology acceptance model. Previous research proved that a trust-based acceptance model is useful in explaining purchase intention and anticipated satisfaction in the product comparison site environment, as an example of online decision shopping aids. Trust in such aids is important in explaining their usage by consumers. The connections between consumer decision-making styles, usage of product and seller opinions, cognitive and affective trust toward the online product comparison site, as well as choice outcomes (purchase intention and brand choice) are explored through structural equation models using a PLS-SEM approach, on a sample of 461 young consumers. The research confirmed the validity of the research model in explaining product comparison site usage, and some consumer decision-making styles influenced consumers’ choices and purchase intention. Product and seller review usage partially mediated the aforementioned relationships.

  9. Importance of risk comparison for individual and societal decision-making after the Fukushima disaster.

    Science.gov (United States)

    Murakami, Michio

    2018-01-30

    Risk comparison is essential for effective societal and individual decision-making. After the Fukushima disaster, studies compared radiation and other disaster-related risks to determine the effective prioritizing of measures for response. Evaluating the value of risk comparison information can enable effective risk communication. In this review, the value of risk comparison after the Fukushima disaster for societal and individual decision-making is discussed while clarifying the concept of radiation risk assessment at low doses. The objectives of radiation risk assessment are explained within a regulatory science framework, including the historical adoption of the linear non-threshold theory. An example of risk comparison (i.e. radiation risk versus evacuation-related risk in nursing homes) is used to discuss the prioritization of pre-disaster measures. The effective communication of risk information by authorities is discussed with respect to group-based and face-to-face approaches. Furthermore, future perspectives regarding radiation risk comparisons are discussed. © The Author(s) 2018. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.

  10. Social rank and social cooperation: Impact of social comparison processes on cooperative decision-making.

    Directory of Open Access Journals (Sweden)

    Xu Gong

    Full Text Available Successful navigation of our complex social world requires the capability to recognize and judge the relative status of others. Hence, social comparison processes are of great importance in our interactions, informing us of our relative standing and in turn potentially motivating our behavior. However, so far few studies have examined in detail how social comparison can influence interpersonal decision-making. One aspect of social decision-making that is of particular importance is cooperative behavior, and identifying means of maintaining and promoting cooperation in the provision of public goods is of vital interest to society. Here, we manipulated social comparison by grading performance rankings on a reaction time task, and then measured cooperative decisions via a modified Public Goods Game (PGG). Findings revealed that individuals ranked highest tended to be more cooperative as compared to those who placed in the bottom rank. Interestingly, this effect held regardless of whether the comparison group members were the subsequent players in the PGG or not, and the effect was stronger in those with higher social orientation. In summary, the present research shows how different social comparison processes (assessed via social rankings) can operate in our daily interaction with others, demonstrating an important effect on cooperative behavior.

  11. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.

  12. The Comparison of Risky Decision Making in Opium Abuser and Healthy Matched Individuals

    Directory of Open Access Journals (Sweden)

    Vahid Nejati

    2013-07-01

    Full Text Available Objective: Risky decision making is one of the most basic mechanisms of impulsive and addictive behaviors. The purpose of the present study was the comparison of risky decision making in opium abusers and healthy matched individuals. Method: In the present cross-sectional study, 50 opium abusers were compared to 50 healthy individuals matched on age and gender. The Balloon Analogue Risk Taking Task was used to evaluate risk taking in participants of both groups. Results: The results showed that opium abusers had higher scores for the number of balloon pumps and exploded balloons on the BART task than normal individuals. Conclusion: Opium abusers show higher risk taking than normal individuals.

  13. Comparison of four statistical and machine learning methods for crash severity prediction.

    Science.gov (United States)

    Iranitalab, Amirfarrokh; Khattak, Aemal

    2017-11-01

    Crash severity prediction models enable different agencies to predict the severity of a reported crash with unknown severity or the severity of crashes that may be expected to occur sometime in the future. This paper had three main objectives: comparison of the performance of four statistical and machine learning methods including Multinomial Logit (MNL), Nearest Neighbor Classification (NNC), Support Vector Machines (SVM) and Random Forests (RF), in predicting traffic crash severity; developing a crash costs-based approach for comparison of crash severity prediction methods; and investigating the effects of data clustering methods comprising K-means Clustering (KC) and Latent Class Clustering (LCC), on the performance of crash severity prediction models. The 2012-2015 reported crash data from Nebraska, United States were obtained and two-vehicle crashes were extracted as the analysis data. The dataset was split into training/estimation (2012-2014) and validation (2015) subsets. The four prediction methods were trained/estimated using the training/estimation dataset, and the correct prediction rates for each crash severity level, the overall correct prediction rate and a proposed crash costs-based accuracy measure were obtained for the validation dataset. The correct prediction rates and the proposed approach showed that NNC had the best prediction performance overall and for more severe crashes. RF and SVM had the next best performances, and MNL was the weakest method. Data clustering did not affect the prediction results of SVM, but KC improved the prediction performance of MNL, NNC and RF, while LCC caused improvement in MNL and RF but weakened the performance of NNC. The overall correct prediction rate gave almost the exact opposite results compared to the proposed approach, showing that neglecting the crash costs can lead to misjudgment in choosing the right prediction method. Copyright © 2017 Elsevier Ltd. All rights reserved.
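
    The cost-weighting argument above can be illustrated with a small, purely synthetic sketch: several scikit-learn classifiers (rough stand-ins for the MNL, NNC, SVM and RF models of the study) are compared on plain accuracy and on a hypothetical cost-weighted accuracy, showing how the two criteria can be computed side by side.

    ```python
    # Illustrative sketch (synthetic data, not the Nebraska crash records): comparing
    # classifiers on overall accuracy and on a cost-weighted accuracy, echoing the point
    # above that ignoring crash costs can change which method looks best.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=3000, n_classes=3, n_informative=6, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    cost = np.array([1.0, 5.0, 20.0])   # hypothetical cost weight per severity level
    models = {
        "logit": LogisticRegression(max_iter=1000),
        "kNN": KNeighborsClassifier(),
        "SVM": SVC(),
        "RF": RandomForestClassifier(random_state=0),
    }
    for name, model in models.items():
        pred = model.fit(X_tr, y_tr).predict(X_te)
        acc = (pred == y_te).mean()
        cost_acc = (cost[y_te] * (pred == y_te)).sum() / cost[y_te].sum()
        print(f"{name:5s} accuracy = {acc:.3f}  cost-weighted accuracy = {cost_acc:.3f}")
    ```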

  14. The use of statistical procedures for comparison of the production of eggs considering different treatments and lines of quails

    Directory of Open Access Journals (Sweden)

    Robson Marcelo Rossi

    2012-04-01

    Full Text Available To make inferences concerning a population, parametric methods are traditionally used for sample data analysis, assuming that the data come from normal populations, which is not always the case. Most of the time such an assumption leads to inadequate conclusions and decisions, especially when the data are distributed asymmetrically. The aim of this work was to fit plausible models for this type of data in order to compare egg production in quails of different lines through frequentist and Bayesian methods. In the analyzed data it could be verified that the parametric and non-parametric frequentist procedures were, in general, equally conclusive. The Bayesian procedure was the only one which detected a difference between the yellow and red lines for the recommended energy diet, and between the blue and yellow lines for the second eclosion group. By the frequentist methods, differences between the blue and red lines were not found; however, given a Poisson distribution for egg production, multiple comparisons between the posterior means under the Bayesian approach found differences between all lines. Being more sensitive as well as more flexible, this procedure detected differences not observed by the traditional methods: between the yellow and red lines in the higher-energy diet, between the blue and red lines in the second eclosion group, and in general. It was concluded that quails of the yellow line showed greater egg production, regardless of eclosion group and diet. Such results raise the question of the appropriate choice of methods and procedures for statistical data analysis.

  15. Comparison of base flows to selected streamflow statistics representative of 1930-2002 in West Virginia

    Science.gov (United States)

    Wiley, Jeffrey B.

    2012-01-01

    Base flows were compared with published streamflow statistics to assess climate variability and to determine the published statistics that can be substituted for annual and seasonal base flows of unregulated streams in West Virginia. The comparison study was done by the U.S. Geological Survey, in cooperation with the West Virginia Department of Environmental Protection, Division of Water and Waste Management. The seasons were defined as winter (January 1-March 31), spring (April 1-June 30), summer (July 1-September 30), and fall (October 1-December 31). Differences in mean annual base flows for five record sub-periods (1930-42, 1943-62, 1963-69, 1970-79, and 1980-2002) range from -14.9 to 14.6 percent when compared to the values for the period 1930-2002. Differences between mean seasonal base flows and values for the period 1930-2002 are less variable for winter and spring, -11.2 to 11.0 percent, than for summer and fall, -47.0 to 43.6 percent. Mean summer base flows (July-September) and mean monthly base flows for July, August, September, and October are approximately equal, within 7.4 percentage points of mean annual base flow. The mean of each of annual, spring, summer, fall, and winter base flows are approximately equal to the annual 50-percent (standard error of 10.3 percent), 45-percent (error of 14.6 percent), 75-percent (error of 11.8 percent), 55-percent (error of 11.2 percent), and 35-percent duration flows (error of 11.1 percent), respectively. The mean seasonal base flows for spring, summer, fall, and winter are approximately equal to the spring 50- to 55-percent (standard error of 6.8 percent), summer 45- to 50-percent (error of 6.7 percent), fall 45-percent (error of 15.2 percent), and winter 60-percent duration flows (error of 8.5 percent), respectively. Annual and seasonal base flows representative of the period 1930-2002 at unregulated streamflow-gaging stations and ungaged locations in West Virginia can be estimated using previously published

  16. Improvement of vertical velocity statistics measured by a Doppler lidar through comparison with sonic anemometer observations

    Science.gov (United States)

    Bonin, Timothy A.; Newman, Jennifer F.; Klein, Petra M.; Chilson, Phillip B.; Wharton, Sonia

    2016-12-01

    Since turbulence measurements from Doppler lidars are being increasingly used within wind energy and boundary-layer meteorology, it is important to assess and improve the accuracy of these observations. While turbulent quantities are measured by Doppler lidars in several different ways, the simplest and most frequently used statistic is the vertical velocity variance (w′²) from zenith stares. However, the competing effects of signal noise and resolution volume limitations, which respectively increase and decrease w′², reduce the accuracy of these measurements. Herein, an established method that utilises the autocovariance of the signal to remove noise is evaluated, and its skill in correcting for volume-averaging effects in the calculation of w′² is also assessed. Additionally, this autocovariance technique is further refined by defining the amount of lag time to use for the most accurate estimates of w′². Through comparison of observations from two Doppler lidars and sonic anemometers on a 300 m tower, the autocovariance technique is shown to generally improve estimates of w′². After the autocovariance technique is applied, values of w′² from the Doppler lidars are generally in close agreement (R² ≈ 0.95–0.98) with those calculated from sonic anemometer measurements.
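
    The idea behind the autocovariance technique evaluated above can be sketched in a few lines: uncorrelated instrument noise inflates only the lag-0 autocovariance, so extrapolating the autocovariance from small non-zero lags back to lag 0 approximates the noise-free variance. The snippet uses a synthetic series and a naive linear extrapolation over the first three lags; the paper's refinement concerns exactly how many lags to use.

    ```python
    # Conceptual sketch of autocovariance-based noise removal on a synthetic series:
    # white noise contributes only at lag 0, so the lag-0 intercept estimated from
    # nonzero lags approximates the noise-free variance of the underlying signal.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 6000
    true_w = np.convolve(rng.normal(0, 1, n), np.ones(20) / 20, mode="same")  # correlated signal
    noisy_w = true_w + rng.normal(0, 0.15, n)                                  # add white noise

    def autocov(x, lag):
        x = x - x.mean()
        return np.mean(x[:len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

    raw_var = autocov(noisy_w, 0)
    lags = np.array([1, 2, 3])
    fit = np.polyfit(lags, [autocov(noisy_w, l) for l in lags], 1)  # linear fit over lags 1..3
    corrected_var = np.polyval(fit, 0)                               # extrapolate back to lag 0
    print(f"raw variance = {raw_var:.4f}, corrected = {corrected_var:.4f}, "
          f"true = {true_w.var():.4f}")
    ```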

  17. Scaling images using their background ratio. An application in statistical comparisons of images

    International Nuclear Information System (INIS)

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-01-01

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances, in quantitative applications may contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases
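
    A minimal sketch of the scaling procedure described above: form the voxel-wise ratio of the two images, histogram it, and take the histogram peak (assumed to come from the shared background) as the scaling factor. The images and bin settings below are synthetic and arbitrary.

    ```python
    # Minimal sketch of background-ratio scaling on synthetic images: the mode of the
    # ratio histogram, assumed to come from the background, gives the scaling factor.
    import numpy as np

    rng = np.random.default_rng(8)
    img_a = rng.poisson(20, size=(128, 128)).astype(float)    # reference image
    img_b = 1.8 * img_a + rng.normal(0, 2, size=(128, 128))   # rescaled, noisy copy

    mask = img_a > 0
    ratio = img_b[mask] / img_a[mask]
    hist, edges = np.histogram(ratio, bins=200, range=(0, 5))
    peak = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])  # histogram mode
    print(f"estimated scaling factor = {peak:.2f}")            # close to the true 1.8
    scaled_b = img_b / peak                                     # bring image B onto A's scale
    ```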

  18. Estimates of statistical significance for comparison of individual positions in multiple sequence alignments

    Directory of Open Access Journals (Sweden)

    Sadreyev Ruslan I

    2004-08-01

    Full Text Available Abstract Background Profile-based analysis of multiple sequence alignments (MSA) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results For problems (1) and (2), we propose analytical estimates of the P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by the ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion The proposed computational method is of significant potential value for the analysis of protein families.

  19. Scaling images using their background ratio. An application in statistical comparisons of images.

    Science.gov (United States)

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-06-07

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances, in quantitative applications may contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases.

  20. Make

    CERN Document Server

    Frauenfelder, Mark

    2012-01-01

    The first magazine devoted entirely to do-it-yourself technology projects presents its 29th quarterly edition for people who like to tweak, disassemble, recreate, and invent cool new uses for technology. MAKE Volume 29 takes bio-hacking to a new level. Get introduced to DIY tracking devices before they hit the consumer electronics marketplace. Learn how to build an EKG machine to study your heartbeat, and put together a DIY bio lab to study athletic motion using consumer grade hardware.

  1. Renormalization group theory outperforms other approaches in statistical comparison between upscaling techniques for porous media

    Science.gov (United States)

    Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.

    2017-09-01

    Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. MG places a low computational demand, in the manner in which we have implemented it, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or are beyond the percolation threshold.

  2. Comparison of different statistical modelling approaches for deriving spatial air temperature patterns in an urban environment

    Science.gov (United States)

    Straub, Annette; Beck, Christoph; Breitner, Susanne; Cyrys, Josef; Geruschkat, Uta; Jacobeit, Jucundus; Kühlbach, Benjamin; Kusch, Thomas; Richter, Katja; Schneider, Alexandra; Umminger, Robin; Wolf, Kathrin

    2017-04-01

    Frequently spatial variations of air temperature of considerable magnitude occur within urban areas. They correspond to varying land use/land cover characteristics and vary with season, time of day and synoptic conditions. These temperature differences have an impact on human health and comfort directly by inducing thermal stress as well as indirectly by means of affecting air quality. Therefore, knowledge of the spatial patterns of air temperature in cities and the factors causing them is of great importance, e.g. for urban planners. A multitude of studies have shown statistical modelling to be a suitable tool for generating spatial air temperature patterns. This contribution presents a comparison of different statistical modelling approaches for deriving spatial air temperature patterns in the urban environment of Augsburg, Southern Germany. In Augsburg there exists a measurement network for air temperature and humidity currently comprising 48 stations in the city and its rural surroundings (corporately operated by the Institute of Epidemiology II, Helmholtz Zentrum München, German Research Center for Environmental Health and the Institute of Geography, University of Augsburg). Using different datasets for land surface characteristics (Open Street Map, Urban Atlas) area percentages of different types of land cover were calculated for quadratic buffer zones of different size (25, 50, 100, 250, 500 m) around the stations as well for source regions of advective air flow and used as predictors together with additional variables such as sky view factor, ground level and distance from the city centre. Multiple Linear Regression and Random Forest models for different situations taking into account season, time of day and weather condition were applied utilizing selected subsets of these predictors in order to model spatial distributions of mean hourly and daily air temperature deviations from a rural reference station. Furthermore, the different model setups were

  3. Five year prediction of Sea Surface Temperature in the Tropical Atlantic: a comparison of simple statistical methods

    OpenAIRE

    Laepple, Thomas; Jewson, Stephen; Meagher, Jonathan; O'Shay, Adam; Penzer, Jeremy

    2007-01-01

    We are developing schemes that predict future hurricane numbers by first predicting future sea surface temperatures (SSTs), and then apply the observed statistical relationship between SST and hurricane numbers. As part of this overall goal, in this study we compare the historical performance of three simple statistical methods for making five-year SST forecasts. We also present SST forecasts for 2006-2010 using these methods and compare them to forecasts made from two structural time series ...

  4. Risk patterns and correlated brain activities. Multidimensional statistical analysis of FMRI data in economic decision making study.

    Science.gov (United States)

    van Bömmel, Alena; Song, Song; Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K

    2014-07-01

    Decision making usually involves uncertainty and risk. Understanding which parts of the human brain are activated during decisions under risk and which neural processes underlie (risky) investment decisions are important goals in neuroeconomics. Here, we analyze functional magnetic resonance imaging (fMRI) data on 17 subjects who were exposed to an investment decision task from Mohr, Biele, Krugel, Li, and Heekeren (in NeuroImage 49, 2556-2563, 2010b). We obtain a time series of three-dimensional images of the blood-oxygen-level dependent (BOLD) fMRI signals. We apply a panel version of the dynamic semiparametric factor model (DSFM) presented in Park, Mammen, Wolfgang, and Borak (in Journal of the American Statistical Association 104(485), 284-298, 2009) and identify task-related activations in space and dynamics in time. With the panel DSFM (PDSFM) we can capture the dynamic behavior of the specific brain regions common for all subjects and represent the high-dimensional time-series data in easily interpretable low-dimensional dynamic factors without large loss of variability. Further, we classify the risk attitudes of all subjects based on the estimated low-dimensional time series. Our classification analysis successfully confirms the estimated risk attitudes derived directly from subjects' decision behavior.

  5. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
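
    As a small worked example of the sampling-and-inference ideas outlined above, the snippet below draws a sample from a normal population and computes a t-based 95% confidence interval for the mean; the values are invented.

    ```python
    # Worked example: a sample from a normal population and a 95% confidence interval
    # for the population mean, using the t distribution.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    sample = rng.normal(loc=120, scale=15, size=30)   # e.g. 30 simulated lab values

    mean = sample.mean()
    sem = stats.sem(sample)                           # standard error of the mean
    ci = stats.t.interval(0.95, len(sample) - 1, loc=mean, scale=sem)
    print(f"sample mean = {mean:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
    ```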

  6. A comparison of two methods of logMAR visual acuity data scoring for statistical analysis

    Directory of Open Access Journals (Sweden)

    O. A. Oduntan

    2009-12-01

    Full Text Available The purpose of this study was to compare two methods of logMAR visual acuity (VA) scoring. The two methods are referred to as letter scoring (method 1) and line scoring (method 2). The two methods were applied to VA data obtained from one hundred and forty (N=140) children with oculocutaneous albinism. Descriptive, correlation and regression statistics were then used to analyze the data. Also, where applicable, the Bland and Altman analysis was used to compare sets of data from the two methods. The right and left eye data were included in the study, but because the findings were similar in both eyes, only the results for the right eyes are presented in this paper. For method 1, the mean unaided VA (mean UAOD1) = 0.39 ± 0.15 logMAR. The mean aided VA (mean ADOD1) = 0.50 ± 0.16 logMAR. For method 2, the mean unaided VA (mean UAOD2) = 0.71 ± 0.15 logMAR, while the mean aided VA (mean ADOD2) = 0.60 ± 0.16 logMAR. The range and mean values of the improvement in VA for both methods were the same. The unaided VAs (UAOD1, UAOD2) and aided VAs (ADOD1, ADOD2) for methods 1 and 2 correlated negatively (unaided: r = –1, p < 0.05; aided: r = –1, p < 0.05). The improvements in VA (differences between the unaided and aided VA values, DOD1 and DOD2) were positively correlated (r = +1, p < 0.05). The Bland and Altman analyses showed that the VA improvements (unaided – aided VA values, DOD1 and DOD2) were similar for the two methods. Findings indicated that only the improvement in VA could be compared when different scoring methods are used. Therefore the scoring method used in any VA research project should be stated in the publication so that appropriate comparisons could be made by other researchers.
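
    A plot-free sketch of the Bland and Altman style of agreement analysis used above is given below, reporting the mean difference (bias) and 95% limits of agreement between two simulated scoring methods; the values are not the study's logMAR data.

    ```python
    # Hedged sketch of a Bland-Altman style comparison of two scoring methods:
    # mean difference (bias) and 95% limits of agreement on simulated values.
    import numpy as np

    rng = np.random.default_rng(10)
    method1 = rng.normal(0.5, 0.15, size=140)               # e.g. letter-scored VA
    method2 = method1 + rng.normal(0.0, 0.03, size=140)     # e.g. line-scored VA

    diff = method1 - method2
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    print(f"bias = {bias:.3f}, 95% limits of agreement = "
          f"({bias - loa:.3f}, {bias + loa:.3f})")
    ```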

  7. Traditional Lecture Versus an Activity Approach for Teaching Statistics: A Comparison of Outcomes

    OpenAIRE

    Loveland, Jennifer L.

    2014-01-01

    Many educational researchers have proposed teaching statistics with less lecture and more active learning methods. However, there are only a few comparative studies that have taught one section of statistics with lectures and one section with activity-based methods; of those studies, the results are contradictory. To address the need for more research on the actual effectiveness of active learning methods in introductory statistics, this research study was undertaken. An introductory, univ...

  8. A weighted generalized score statistic for comparison of predictive values of diagnostic tests.

    Science.gov (United States)

    Kosinski, Andrzej S

    2013-03-15

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we presented, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.

  9. OECD eXplorer: Making Regional Statistics Come Alive through a Geo-Visual Web-Tool

    Directory of Open Access Journals (Sweden)

    Monica Brezzi

    2011-06-01

    Full Text Available Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing highly interactive Geovisual Analytics applications for the Internet. An emerging and challenging application domain is geovisualization of regional (sub-national) statistics. Higher integration driven by institutional processes and economic globalisation is eroding national borders and creating competition along regional lines in the world market. Sound information at sub-national level and benchmark of regions across borders have gained importance in the policy agenda of many countries. In this paper, we introduce “OECD eXplorer” — an interactive tool for analyzing and communicating gained insights and discoveries about spatial-temporal and multivariate OECD regional data. This database is a potential treasure chest for policy-makers, researchers and citizens to gain a better understanding of a region’s structure and performance and to carry out analysis of territorial trends and disparities based on sound information comparable across countries. Many approaches and tools have been developed in spatial-related knowledge discovery but generally they do not scale well with dynamic visualization of larger spatial data on the Internet. In this context, we introduce a web-compliant Geovisual Analytics toolkit that supports a broad collection of functional components for analysis, hypothesis generation and validation. The same tool enables the communication of results on the basis of a snapshot mechanism that captures, re-uses and shares task-related explorative findings. Further developments underway are in the creation of a generic highly interactive web “eXplorer” platform that can be the foundation for easy customization of similar web applications using different geographical boundaries and indicators. Given this global dimension, a “generic eXplorer” will be a powerful tool to explore different territorial dimensions

  10. A comparison of test statistics for the recovery of rapid growth-based enumeration tests

    NARCIS (Netherlands)

    van den Heuvel, Edwin R.; IJzerman-Boon, Pieta C.

    This paper considers five test statistics for comparing the recovery of a rapid growth-based enumeration test with respect to the compendial microbiological method using a specific nonserial dilution experiment. The finite sample distributions of these test statistics are unknown, because they are

  11. Comparison of multiple-criteria decision-making methods - results of simulation study

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2016-12-01

    Full Text Available Background: Today, both researchers and practitioners have many methods for supporting the decision-making process. Due to the conditions in which supply chains function, the most interesting are multi-criteria methods. The use of sophisticated methods for supporting decisions requires the parameterization and execution of calculations that are often complex. So is it efficient to use sophisticated methods? Methods: The authors of the publication compared two popular multi-criteria decision-making methods: the Weighted Sum Model (WSM) and the Analytic Hierarchy Process (AHP). A simulation study reflects these two decision-making methods. Input data for this study was a set of criteria weights and the value of each alternative in terms of each criterion. Results: The iGrafx Process for Six Sigma simulation software recreated how both multiple-criteria decision-making methods (WSM and AHP) function. The result of the simulation was a numerical value defining the preference of each of the alternatives according to the WSM and AHP methods. The alternative producing a result of higher numerical value was considered preferred, according to the selected method. In the analysis of the results, the relationship between the values of the parameters and the difference in the results presented by both methods was investigated. Statistical methods, including hypothesis testing, were used for this purpose. Conclusions: The simulation study findings prove that the results obtained with the use of the two multiple-criteria decision-making methods are very similar. Differences occurred more frequently in lower-value parameters from the "value of each alternative" group and higher-value parameters from the "weight of criteria" group.
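
    As a rough illustration of the two aggregation rules being compared, the hedged Python sketch below contrasts a weighted-sum score with a simplified AHP-style score (decision matrix normalized by column sums before weighting); the weights and alternative values are invented, and the sketch does not reproduce the authors' iGrafx simulation.

    # Hedged sketch: Weighted Sum Model (WSM) vs. a simplified AHP-style
    # aggregation on an invented decision matrix (rows = alternatives,
    # columns = criteria, all criteria treated as "benefit" criteria).
    import numpy as np

    weights = np.array([0.5, 0.3, 0.2])              # hypothetical criteria weights (sum to 1)
    matrix = np.array([[7.0, 4.0, 9.0],              # alternative A
                       [6.0, 8.0, 5.0]])             # alternative B

    # WSM: weighted sum of the raw criterion values.
    wsm_scores = matrix @ weights

    # Simplified AHP: normalize each criterion column to sum to 1, then weight.
    ahp_scores = (matrix / matrix.sum(axis=0)) @ weights

    for label, scores in (("WSM", wsm_scores), ("AHP (simplified)", ahp_scores)):
        best = "A" if scores[0] > scores[1] else "B"
        print(f"{label}: A = {scores[0]:.3f}, B = {scores[1]:.3f} -> preferred: {best}")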

  12. Markov model plus k-word distributions: a synergy that produces novel statistical measures for sequence comparison.

    Science.gov (United States)

    Dai, Qi; Yang, Yanchun; Wang, Tianming

    2008-10-15

    Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use the information on the k-word distributions, the Markov model or both. Motivated by adding k-word distributions to the Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences and phylogenetic analysis. This offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our achievements with those based on alignment or alignment-free methods. We grouped our experiments into two sets. The first one, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences from a database and discriminate functionally related regulatory sequences from unrelated sequences. The second one aims at assessing how well our statistical measures can be used for phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into the Markov model, are more efficient.
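
    For readers unfamiliar with alignment-free comparison, the hedged sketch below computes plain k-word (k-mer) frequency vectors for two short sequences and a cosine distance between them; it illustrates only the generic k-word idea, not the wre.k.r or S2.k.r measures proposed in the paper.

    # Hedged sketch: alignment-free comparison of two DNA sequences via
    # k-word (k-mer) frequency vectors and a cosine distance. This is only
    # a generic k-word measure, not the wre.k.r or S2.k.r statistics.
    from collections import Counter
    from itertools import product
    import math

    def kword_freqs(seq, k):
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        words = ["".join(w) for w in product("ACGT", repeat=k)]
        total = sum(counts.values()) or 1
        return [counts[w] / total for w in words]

    def cosine_distance(p, q):
        dot = sum(a * b for a, b in zip(p, q))
        norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
        return 1.0 - dot / norm if norm else 1.0

    seq1 = "ACGTACGTAGCTAGCTAGGATCCGATCG"   # hypothetical sequences
    seq2 = "ACGTTGCAAGCTAGCTAGGTTCCGATGG"
    print(cosine_distance(kword_freqs(seq1, 3), kword_freqs(seq2, 3)))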

  13. Comparison between statistical and optimization methods in accessing unmixing of spectrally similar materials

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2010-11-01

    Full Text Available This paper reports on the results from ordinary least squares and ridge regression as statistical methods, and is compared to numerical optimization methods such as the stochastic method for global optimization, simulated annealing, particle swarm...

  14. An audit of the statistics and the comparison with the parameter in the population

    Science.gov (United States)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed to closely estimate the statistics for particular parameters has long been an issue. Although the sample size may have been calculated with reference to the objective of the study, it is difficult to confirm whether the resulting statistics are close to the parameters for a particular population. Meanwhile, the guideline of using a p-value of less than 0.05 is widely relied on as inferential evidence. Therefore, this study audited results analyzed from various subsamples and statistical analyses and compared the results with the parameters in three different populations. Eight types of statistical analysis and eight subsamples for each statistical analysis were examined. The results showed that the statistics were consistent and close to the parameters when the study sample covered at least 15% to 35% of the population. Larger sample sizes are needed to estimate parameters that involve categorical variables compared with numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters for a medium-sized population.

  15. A randomized comparison between league tables and funnel plots to inform health care decision-making.

    Science.gov (United States)

    Anell, Anders; Hagberg, Oskar; Liedberg, Fredrik; Ryden, Stefan

    2016-12-01

    Comparison of provider performance is commonly used to inform health care decision-making. Little attention has been paid to how data presentations influence decisions. This study analyzes differences in suggested actions by decision-makers informed by league tables or funnel plots. Decision-makers were invited to a survey and randomized to compare hospital performance using either league tables or funnel plots for four different measures within the area of cancer care. For each measure, decision-makers were asked to suggest actions towards 12-16 hospitals (no action, ask for more information, intervene) and provide feedback related to whether the information provided had been useful. Swedish health care. Two hundred and twenty-one decision-makers at administrative and clinical levels. Data presentations in the form of league tables or funnel plots. Number of actions suggested by participants. Proportion of appropriate actions. For all four measures, decision-makers tended to suggest more actions based on the information provided in league tables compared to funnel plots (44% vs. 21%, P decision-makers more often missed to react even when appropriate. The form of data presentation had an influence on decision-making. With league tables, decision-makers tended to suggest more actions compared to funnel plots. A difference in sensitivity and specificity conditioned by the form of presentation could also be identified, with different implications depending on the purpose of comparisons. Explanations and visualization aids are needed to support appropriate actions. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
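
    For orientation, a funnel plot typically draws control limits around a target proportion that narrow as hospital volume grows; the hedged Python sketch below computes such binomial limits for a few invented hospitals and flags those falling outside them (it is not the instrument used in the study).

    # Hedged sketch: funnel-plot style control limits for a proportion
    # indicator. Hospital volumes and event counts are invented.
    import math

    target = 0.20                      # overall (target) proportion
    z = 1.96                           # ~95% control limits
    hospitals = {"H1": (120, 31), "H2": (45, 6), "H3": (300, 49)}  # name: (n, events)

    for name, (n, events) in hospitals.items():
        p = events / n
        half_width = z * math.sqrt(target * (1 - target) / n)
        lower, upper = target - half_width, target + half_width
        status = "within limits" if lower <= p <= upper else "outside limits"
        print(f"{name}: p = {p:.3f}, limits = [{lower:.3f}, {upper:.3f}] -> {status}")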

  16. Comparison of small n statistical tests of differential expression applied to microarrays

    Directory of Open Access Journals (Sweden)

    Lee Anna Y

    2009-02-01

    Full Text Available Abstract Background DNA microarrays provide data for genome wide patterns of expression between observation classes. Microarray studies often have small samples sizes, however, due to cost constraints or specimen availability. This can lead to poor random error estimates and inaccurate statistical tests of differential expression. We compare the performance of the standard t-test, fold change, and four small n statistical test methods designed to circumvent these problems. We report results of various normalization methods for empirical microarray data and of various random error models for simulated data. Results Three Empirical Bayes methods (CyberT, BRB, and limma t-statistics were the most effective statistical tests across simulated and both 2-colour cDNA and Affymetrix experimental data. The CyberT regularized t-statistic in particular was able to maintain expected false positive rates with simulated data showing high variances at low gene intensities, although at the cost of low true positive rates. The Local Pooled Error (LPE test introduced a bias that lowered false positive rates below theoretically expected values and had lower power relative to the top performers. The standard two-sample t-test and fold change were also found to be sub-optimal for detecting differentially expressed genes. The generalized log transformation was shown to be beneficial in improving results with certain data sets, in particular high variance cDNA data. Conclusion Pre-processing of data influences performance and the proper combination of pre-processing and statistical testing is necessary for obtaining the best results. All three Empirical Bayes methods assessed in our study are good choices for statistical tests for small n microarray studies for both Affymetrix and cDNA data. Choice of method for a particular study will depend on software and normalization preferences.
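
    As a minimal illustration of variance moderation, the hedged sketch below computes a regularized two-sample t-statistic that shrinks each gene's pooled variance toward the genome-wide average; it is only in the spirit of the Empirical Bayes methods above and is not the CyberT, BRB or limma implementation.

    # Hedged sketch: a simple regularized two-sample t-statistic per gene,
    # shrinking each gene's pooled variance toward the average variance
    # across genes. Data are simulated; prior_df is an assumed tuning value.
    import numpy as np

    rng = np.random.default_rng(0)
    group_a = rng.normal(0.0, 1.0, size=(500, 4))   # 500 genes x 4 arrays (simulated)
    group_b = rng.normal(0.2, 1.0, size=(500, 4))

    n_a, n_b = group_a.shape[1], group_b.shape[1]
    var_a = group_a.var(axis=1, ddof=1)
    var_b = group_b.var(axis=1, ddof=1)
    pooled = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)

    prior_df = 4                                     # strength of shrinkage (assumed)
    shrunk = (prior_df * pooled.mean() + (n_a + n_b - 2) * pooled) / (prior_df + n_a + n_b - 2)

    t_mod = (group_a.mean(axis=1) - group_b.mean(axis=1)) / np.sqrt(shrunk * (1 / n_a + 1 / n_b))
    print("largest |t| values:", np.sort(np.abs(t_mod))[-5:])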

  17. Comparison of pure and 'Latinized' centroidal Voronoi tessellation against various other statistical sampling methods

    International Nuclear Information System (INIS)

    Romero, Vicente J.; Burkardt, John V.; Gunzburger, Max D.; Peterson, Janet S.

    2006-01-01

    A recently developed centroidal Voronoi tessellation (CVT) sampling method is investigated here to assess its suitability for use in statistical sampling applications. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. On several 2-D test problems CVT has recently been found to provide exceedingly effective and efficient point distributions for response surface generation. Additionally, for statistical function integration and estimation of response statistics associated with uniformly distributed random-variable inputs (uncorrelated), CVT has been found in initial investigations to provide superior points sets when compared against latin-hypercube and simple-random Monte Carlo methods and Halton and Hammersley quasi-random sequence methods. In this paper, the performance of all these sampling methods and a new variant ('Latinized' CVT) are further compared for non-uniform input distributions. Specifically, given uncorrelated normal inputs in a 2-D test problem, statistical sampling efficiencies are compared for resolving various statistics of response: mean, variance, and exceedence probabilities
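
    To show how such sampling comparisons can be set up, the hedged sketch below estimates the mean of a simple test function on the unit square with simple random Monte Carlo, Latin hypercube and Halton sampling via scipy.stats.qmc; CVT sampling itself is not implemented here.

    # Hedged sketch: compare simple random Monte Carlo, Latin hypercube and
    # Halton sampling for estimating the mean of a test function on [0,1]^2.
    # (Centroidal Voronoi tessellation sampling is not implemented here.)
    import numpy as np
    from scipy.stats import qmc

    def f(x):                       # simple smooth test response
        return np.exp(-x[:, 0]) * np.sin(3 * x[:, 1])

    n, d, seed = 256, 2, 42
    samples = {
        "simple random MC": np.random.default_rng(seed).random((n, d)),
        "Latin hypercube": qmc.LatinHypercube(d=d, seed=seed).random(n),
        "Halton sequence": qmc.Halton(d=d, seed=seed).random(n),
    }
    for name, pts in samples.items():
        print(f"{name:17s}: estimated mean of f = {f(pts).mean():.5f}")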

  18. Comprehensive analysis of tornado statistics in comparison to earthquakes: intensity and temporal behaviour

    Directory of Open Access Journals (Sweden)

    L. Schielicke

    2013-01-01

    Full Text Available Tornadoes and earthquakes are characterised by a high variability in their properties concerning intensity, geometric properties and temporal behaviour. Earthquakes are known for power-law behaviour in their intensity (Gutenberg–Richter law and temporal statistics (e.g. Omori law and interevent waiting times. The observed similarity of high variability of these two phenomena motivated us to compare the statistical behaviour of tornadoes using seismological methods and quest for power-law behaviour. In general, the statistics of tornadoes show power-law behaviour partly coextensive with characteristic scales when the temporal resolution is high (10 to 60 min. These characteristic scales match with the typical diurnal behaviour of tornadoes, which is characterised by a maximum of tornado occurrences in the late afternoon hours. Furthermore, the distributions support the observation that tornadoes cluster in time. Finally, we shortly discuss a possible similar underlying structure composed of heterogeneous, coupled, interactive threshold oscillators that possibly explains the observed behaviour.

  19. Comparison of different insulin pump makes under routine care conditions in adults with Type 1 diabetes.

    Science.gov (United States)

    Leelarathna, L; Roberts, S A; Hindle, A; Markakis, K; Alam, T; Chapman, A; Morris, J; Urwin, A; Jinadev, P; Rutter, M K

    2017-10-01

    To compare long-term HbA 1c changes associated with different insulin pumps during routine care in a large cohort of adults with Type 1 diabetes representative of other clinic populations. Observational, retrospective study of 508 individuals starting pump therapy between 1999 and 2014 (mean age, 40 years; 55% women; diabetes duration, 20 years; 94% Type 1 diabetes; median follow-up, 3.7 years). Mixed linear models compared covariate-adjusted HbA 1c changes associated with different pump makes. The pumps compared were: 50% Medtronic, 24% Omnipod, 14% Roche and 12% Animas. Overall HbA 1c levels improved and improvements were maintained during a follow-up extending to 10 years (HbA 1c : pre-continuous subcutaneous insulin infusion (pre-CSII) vs. 12 months post CSII, 71 (61, 82) vs. 66 (56, 74) mmol/mol; 8.7 (7.7, 9.6) vs. 8.2 (7.3, 8.9)%; P < 0.0001). The percentage of individuals with HbA 1c ≥ 64 mmol/mol (8.0%) reduced from a pre-CSII level of 68% to 55%. After adjusting for baseline confounders, there were no between-pump differences in HbA 1c lowering (P = 0.44), including a comparison of patch pumps with traditional catheter pumps (P = 0.63). There were no significant (P < 0.05) between-pump differences in HbA 1c lowering in pre-specified subgroups stratified by pre-pump HbA 1c , age or diabetes duration. HbA 1c lowering was positively related to baseline HbA 1c (P < 0.001) and diabetes duration (P = 0.017), and negatively related to the number of years of CSII use (P = 0.024). Under routine care conditions, there were no covariate-adjusted differences in HbA 1c lowering when comparing different pump makes, including a comparison of patch pumps vs. traditional catheter pumps. Therefore, the choice of CSII make should not be influenced by the desired degree of HbA 1c lowering. © 2017 Diabetes UK.

  20. Business Statistics: A Comparison of Student Performance in Three Learning Modes

    Science.gov (United States)

    Simmons, Gerald R.

    2014-01-01

    The purpose of this study was to compare the performance of three teaching modes and age groups of business statistics sections in terms of course exam scores. The research questions were formulated to determine the performance of the students within each teaching mode, to compare each mode in terms of exam scores, and to compare exam scores by…

  1. Network-based statistical comparison of citation topology of bibliographic databases

    Science.gov (United States)

    Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko

    2014-01-01

    Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or a scientific evaluation guideline for governments and research agencies. PMID:25263231

  2. Comparison of precipitation nowcasting by extrapolation and statistical-advection methods

    Czech Academy of Sciences Publication Activity Database

    Sokol, Zbyněk; Kitzmiller, D.; Pešice, Petr; Mejsnar, Jan

    2013-01-01

    Roč. 123, 1 April (2013), s. 17-30 ISSN 0169-8095 R&D Projects: GA MŠk ME09033 Institutional support: RVO:68378289 Keywords : Precipitation forecast * Statistical models * Regression * Quantitative precipitation forecast * Extrapolation forecast Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 2.421, year: 2013 http://www.sciencedirect.com/science/article/pii/S0169809512003390

  3. Trends in violent crime: a comparison between police statistics and victimization surveys

    NARCIS (Netherlands)

    Wittebrood, Karin; Junger, Marianne

    2002-01-01

    Usually, two measures are used to describe trends in violent crime: police statistics and victimization surveys. Both are available in the Netherlands. In this contribution, we will first provide a description of the trends in violent crime. It appears that both types of statistics reflect a different

  4. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    Science.gov (United States)

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANNs and AIMS methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  5. A Comparison of Several Statistical Tests of Reciprocity of Self-Disclosure.

    Science.gov (United States)

    Dindia, Kathryn

    1988-01-01

    Reports the results of a study that used several statistical tests of reciprocity of self-disclosure. Finds little evidence for reciprocity of self-disclosure, and concludes that either reciprocity is an illusion, or that different or more sophisticated methods are needed to detect it. (MS)

  6. Safe drug use in long QT syndrome and Brugada syndrome: comparison of website statistics

    NARCIS (Netherlands)

    Postema, Pieter G.; Neville, Jon; de Jong, Jonas S. S. G.; Romero, Klaus; Wilde, Arthur A. M.; Woosley, Raymond L.

    2013-01-01

    We sought to obtain insights into the efficacy of two websites, www.QTdrugs.org and www.BrugadaDrugs.org, that have the intention to prevent fatal arrhythmias due to unsafe drug use in Long QT syndrome and Brugada syndrome. Prospective web-use statistical analysis combined with online surveys were

  7. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  8. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    Science.gov (United States)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards proving the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  9. Metabolome Comparison of Transgenic and Non-transgenic Rice by Statistical Analysis of FTIR and NMR Spectra

    Directory of Open Access Journals (Sweden)

    Keykhosrow Keymanesh

    2009-06-01

    Full Text Available Modern biotechnology, based on recombinant DNA techniques, has made it possible to introduce new traits with great potential for crop improvement. However, concerns about unintended effects of gene transformation that could threaten the environment or consumer health have persuaded scientists to set up pre-release tests on genetically modified organisms. Assessment of the ‘substantial equivalence’ concept, established by comparing a genetically modified organism with a comparator that has a history of safe use, could be the first step of a comprehensive risk assessment. The metabolite level most richly reflects changes that stem from genetic or environmental factors. Since detailed assessment of all metabolites is very costly and practically impossible, statistical evaluation of processed spectroscopic data from grain could be a time- and cost-effective substitute for complex chemical analysis. To investigate the ability of multivariate statistical techniques to compare metabolomes, as well as to test a method for such comparisons with available tools, a transgenic rice and its traditionally bred parent were used as test material. Discriminant analysis was applied as the supervised method and principal component analysis as the unsupervised classification method on processed data extracted from Fourier transform infrared spectroscopy and nuclear magnetic resonance spectra of powdered rice, rice extracts and barley grain samples, the last of which was used as a control. The results confirmed the capability of statistics, even with initial data processing applications, in metabolome studies. Meanwhile, this study confirms that the supervised method results in more distinctive results.
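
    As a toy version of the unsupervised step described above, the hedged sketch below applies principal component analysis to simulated "spectra" from two groups; the data are synthetic and stand in for the study's FTIR and NMR measurements.

    # Hedged sketch: principal component analysis of simulated "spectra"
    # from two groups, illustrating the unsupervised comparison step
    # (the real study used FTIR and NMR spectra of rice and barley).
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_channels = 200                                   # number of spectral channels
    base = rng.normal(0, 1, n_channels)
    group1 = base + rng.normal(0, 0.2, size=(15, n_channels))        # e.g. non-transgenic
    group2 = base + 0.5 + rng.normal(0, 0.2, size=(15, n_channels))  # e.g. transgenic (shifted)

    spectra = np.vstack([group1, group2])
    scores = PCA(n_components=2).fit_transform(spectra)
    print("PC1 group means:", scores[:15, 0].mean(), scores[15:, 0].mean())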

  10. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    Science.gov (United States)

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method and there was low overlap in the variable sets. Model performance was good for both approaches (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. The difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable

  11. COMPARISON OF STATISTICALLY CONTROLLED MACHINING SOLUTIONS OF TITANIUM ALLOYS USING USM

    Directory of Open Access Journals (Sweden)

    R. Singh

    2010-06-01

    Full Text Available The purpose of the present investigation is to compare the statistically controlled machining solution of titanium alloys using ultrasonic machining (USM. In this study, the previously developed Taguchi model for USM of titanium and its alloys has been investigated and compared. Relationships between the material removal rate, tool wear rate, surface roughness and other controllable machining parameters (power rating, tool type, slurry concentration, slurry type, slurry temperature and slurry size have been deduced. The results of this study suggest that at the best settings of controllable machining parameters for titanium alloys (based upon the Taguchi design, the machining solution with USM is statistically controlled, which is not observed for other settings of input parameters on USM.

  12. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    DEFF Research Database (Denmark)

    Tataru, Paula Cristina; Hobolth, Asger

    2011-01-01

    BACKGROUND: Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications ... past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. RESULTS: We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned ... of the algorithms is available at www.birc.au.dk/~paula/. CONCLUSIONS: We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore, we find that UNI is usually ...

  13. Comparison of statistical tests for association between rare variants and binary traits.

    OpenAIRE

    Bacanu, SA; Nelson, MR; Whittaker, JC

    2012-01-01

    : Genome-wide association studies have found thousands of common genetic variants associated with a wide variety of diseases and other complex traits. However, a large portion of the predicted genetic contribution to many traits remains unknown. One plausible explanation is that some of the missing variation is due to the effects of rare variants. Nonetheless, the statistical analysis of rare variants is challenging. A commonly used method is to contrast, within the same region (gene), the fr...

  14. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Ren, Qingguo; Dewan, Sheilesh Kumar; Li, Ming; Li, Jianying; Mao, Dingbiao; Wang, Zhenglei; Hua, Yanqing

    2012-01-01

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques in different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scan at different tube current–time products (300 and 200 mAs) in 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (please note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not contain cerebral tumors but as a discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDI vol ) and dose-length product (DLP) were recorded. All the data were analyzed by using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image qualities at 200 mAs with 50% ASIR blending technique and 300 mAs with FBP technique (p > .05). While between the image qualities at 200 mAs with FBP and 300 mAs with FBP technique a statistically significant difference (p < .05) was found. Conclusion: ASIR provided same image quality and diagnostic ability in brain imaging with greater than 30% dose reduction compared with FBP reconstruction technique

  15. Performance comparison of multi-detector detection statistics in targeted compact binary coalescence GW search

    OpenAIRE

    Haris, K; Pai, Archana

    2016-01-01

    Global network of advanced Interferometric gravitational wave (GW) detectors are expected to be on-line soon. Coherent observation of GW from a distant compact binary coalescence (CBC) with a network of interferometers located in different continents give crucial information about the source such as source location and polarization information. In this paper we compare different multi-detector network detection statistics for CBC search. In maximum likelihood ratio (MLR) based detection appro...

  16. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques in different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scan at different tube current–time products (300 and 200 mAs) in 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (please note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not contain cerebral tumors but as a discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDI{sub vol}) and dose-length product (DLP) were recorded. All the data were analyzed by using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image qualities at 200 mAs with 50% ASIR blending technique and 300 mAs with FBP technique (p > .05). While between the image qualities at 200 mAs with FBP and 300 mAs with FBP technique a statistically significant difference (p < .05) was found. Conclusion: ASIR provided same image quality and diagnostic ability in brain imaging with greater than 30% dose reduction compared with FBP reconstruction technique.

  17. A comparison of statistical methods for identifying out-of-date systematic reviews.

    Directory of Open Access Journals (Sweden)

    Porjai Pattanittum

    Full Text Available BACKGROUND: Systematic reviews (SRs can provide accurate and reliable evidence, typically about the effectiveness of health interventions. Evidence is dynamic, and if SRs are out-of-date this information may not be useful; it may even be harmful. This study aimed to compare five statistical methods to identify out-of-date SRs. METHODS: A retrospective cohort of SRs registered in the Cochrane Pregnancy and Childbirth Group (CPCG, published between 2008 and 2010, were considered for inclusion. For each eligible CPCG review, data were extracted and "3-years previous" meta-analyses were assessed for the need to update, given the data from the most recent 3 years. Each of the five statistical methods was used, with random effects analyses throughout the study. RESULTS: Eighty reviews were included in this study; most were in the area of induction of labour. The numbers of reviews identified as being out-of-date using the Ottawa, recursive cumulative meta-analysis (CMA, and Barrowman methods were 34, 7, and 7 respectively. No reviews were identified as being out-of-date using the simulation-based power method, or the CMA for sufficiency and stability method. The overall agreement among the three discriminating statistical methods was slight (Kappa = 0.14; 95% CI 0.05 to 0.23. The recursive cumulative meta-analysis, Ottawa, and Barrowman methods were practical according to the study criteria. CONCLUSION: Our study shows that three practical statistical methods could be applied to examine the need to update SRs.

  18. A global approach to estimate irrigated areas - a comparison between different data and statistics

    Science.gov (United States)

    Meier, Jonas; Zabel, Florian; Mauser, Wolfram

    2018-02-01

    Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security. Spatial information on irrigation is highly important for policy and decision makers, who are facing the transition towards more efficient sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data - both at a spatial resolution of 30 arcsec - incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia and particularly in China and India. The additional areas are mainly identified within already known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to size and distribution of irrigated areas caused by spatial resolution, the considered time period and the input data and assumptions made.

  19. Performance in College Chemistry: a Statistical Comparison Using Gender and Jungian Personality Type

    Science.gov (United States)

    Greene, Susan V.; Wheeler, Henry R.; Riley, Wayne D.

    This study sorted college introductory chemistry students by gender and Jungian personality type. It recognized differences from the general population distribution and statistically compared the students' grades with their Jungian personality types. Data from 577 female students indicated that ESFP (extroverted, sensory, feeling, perceiving) and ENFP (extroverted, intuitive, feeling, perceiving) profiles performed poorly at statistically significant levels when compared with the distribution of females enrolled in introductory chemistry. The comparable analysis using data from 422 male students indicated that the poorly performing male profiles were ISTP (introverted, sensory, thinking, perceiving) and ESTP (extroverted, sensory, thinking, perceiving). ESTJ (extroverted, sensory, thinking, judging) female students withdrew from the course at a statistically significant level. For both genders, INTJ (introverted, intuitive, thinking, judging) students were the best performers. By examining the documented characteristics of Jungian profiles that correspond with poorly performing students in chemistry, one may more effectively assist the learning process and the retention of these individuals in the fields of natural science, engineering, and technology.

  20. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. The phenomena include pollution transport prediction from continuous area sources, or one time releases of toxic materials and wind energy prospecting in areas of topographic enhancement of the wind. Quality control techniques that can be applied to these data to determine if the instruments are operating within their prescribed tolerances were investigated. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two dimensional probability density. At present the calculating and plotting routines for these statistical techniques do not reside in a single code so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system which includes simple and fast running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF) a proper balance will be attained. These techniques are described in detail and preliminary results are presented
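
    A minimal sketch of the independent statistics mentioned above (mean, standard deviation, skewness, kurtosis and a power spectrum) applied to a simulated wind-speed series is given below; the series and the tolerance check are invented for illustration.

    # Hedged sketch: simple single-instrument quality-control statistics for
    # a simulated anemometer wind-speed series (mean, standard deviation,
    # skewness, kurtosis, power spectrum). Tolerance values are invented.
    import numpy as np
    from scipy import stats, signal

    rng = np.random.default_rng(7)
    wind = 5.0 + 1.5 * rng.standard_normal(4096)      # hypothetical 1-Hz wind speeds (m/s)

    summary = {
        "mean": wind.mean(),
        "std": wind.std(ddof=1),
        "skewness": stats.skew(wind),
        "kurtosis": stats.kurtosis(wind),
    }
    freqs, psd = signal.welch(wind, fs=1.0)           # power spectral density

    print(summary)
    print("dominant frequency (Hz):", freqs[np.argmax(psd)])
    if not (0.0 < summary["mean"] < 30.0):            # crude, invented plausibility check
        print("mean wind speed outside prescribed tolerance")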

  1. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    Science.gov (United States)

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is for justifying a biowaiver for post-approval changes which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to or in many cases superior to the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
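
    For reference, the conventional f2 similarity factor that the proposed F2 parameter extends can be computed as follows; the dissolution profiles are invented and the Bayesian F2 procedure itself is not reproduced.

    # Hedged sketch: the conventional f2 similarity factor for two
    # dissolution profiles (the Bayesian F2 extension in the paper is not
    # reproduced here). Profile values are invented percent-dissolved data.
    import math

    reference = [18, 39, 57, 72, 84, 91]   # % dissolved at common time points
    test      = [21, 42, 60, 74, 85, 92]

    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    f2 = 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

    print(f"f2 = {f2:.1f}  ->  {'similar' if f2 >= 50 else 'not similar'} "
          "(50 is the conventional cut-off)")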

  2. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    Science.gov (United States)

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  3. Tract-oriented statistical group comparison of diffusion in sheet-like white matter

    DEFF Research Database (Denmark)

    Lyksborg, Mark; Dyrby, T. B.; Sorensen, P. S.

    2013-01-01

    tube-like shapes, not always suitable for modelling the white matter tracts of the brain. The tract-oriented technique, aimed at group studies, integrates the use of multivariate features and outputs a single value of significance indicating tract-specific differences. This is in contrast to voxel-based analysis techniques, which output a significance on a per-voxel basis and require multiple comparison correction. We demonstrate our technique by comparing a group of controls with a group of Multiple Sclerosis subjects, obtaining significant differences on 11 different fascicle structures....

  4. A comparison of dynamical and statistical downscaling methods for regional wave climate projections along French coastlines.

    Science.gov (United States)

    Laugel, Amélie; Menendez, Melisa; Benoit, Michel; Mattarolo, Giovanni; Mendez, Fernando

    2013-04-01

    Wave climate forecasting is a major issue for numerous marine and coastal related activities, such as offshore industries, flooding risks assessment and wave energy resource evaluation, among others. Generally, there are two main ways to predict the impacts of the climate change on the wave climate at regional scale: the dynamical and the statistical downscaling of GCM (Global Climate Model). In this study, both methods have been applied on the French coast (Atlantic , English Channel and North Sea shoreline) under three climate change scenarios (A1B, A2, B1) simulated with the GCM ARPEGE-CLIMAT, from Météo-France (AR4, IPCC). The aim of the work is to characterise the wave climatology of the 21st century and compare the statistical and dynamical methods pointing out advantages and disadvantages of each approach. The statistical downscaling method proposed by the Environmental Hydraulics Institute of Cantabria (Spain) has been applied (Menendez et al., 2011). At a particular location, the sea-state climate (Predictand Y) is defined as a function, Y=f(X), of several atmospheric circulation patterns (Predictor X). Assuming these climate associations between predictor and predictand are stationary, the statistical approach has been used to project the future wave conditions with reference to the GCM. The statistical relations between predictor and predictand have been established over 31 years, from 1979 to 2009. The predictor is built as the 3-days-averaged squared sea level pressure gradient from the hourly CFSR database (Climate Forecast System Reanalysis, http://cfs.ncep.noaa.gov/cfsr/). The predictand has been extracted from the 31-years hindcast sea-state database ANEMOC-2 performed with the 3G spectral wave model TOMAWAC (Benoit et al., 1996), developed at EDF R&D LNHE and Saint-Venant Laboratory for Hydraulics and forced by the CFSR 10m wind field. Significant wave height, peak period and mean wave direction have been extracted with an hourly-resolution at

  5. STATISTICAL ANALYSIS OF FLARING LOOPS OBSERVED BY NOBEYAMA RADIOHELIOGRAPH. I. COMPARISON OF LOOPTOP AND FOOTPOINTS

    International Nuclear Information System (INIS)

    Huang Guangli; Nakajima, Hiroshi

    2009-01-01

    Twenty-four events with looplike structures at 17 and 34 GHz are selected from the flare list of Nobeyama Radioheliograph. We obtained the brightness temperatures at 17 and 34 GHz, the polarization degrees at 17 GHz, and the power-law spectral indices at the radio peak time for one looptop (LT) and two footpoints (FPs) of each event. We also calculated the magnetic field strengths and the column depths of nonthermal electrons in the LT and FPs of each event, using the equations modified from the gyrosynchrotron equations by Dulk. The main statistical results from those data are summarized as follows. (1) The spectral indices, the brightness temperatures at 17 and 34 GHz, the polarization degrees at 17 GHz, the calculated magnetic field strengths, and the calculated column densities of nonthermal electrons are always positively correlated between the LT and the two FPs of the selected events. (2) About one-half of the events have the brightest LT at 17 and 34 GHz. (3) The spectral indices in the two FPs are larger (softer) than those in the corresponding LT in most events. (4) The calculated magnetic field strengths in the two FPs are always larger than those in the corresponding LT. (5) Most of the events have the same positive or negative polarization sense in the LT and the two FPs. (6) The brightness temperatures at 17 and 34 GHz in each of the LT and the two FPs statistically decrease with their spectral indices and the calculated magnetic field strengths, but increase with their calculated column densities of nonthermal electrons. Moreover, we try to discuss the possible causes of the present statistical results.

  6. Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting

    Science.gov (United States)

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2018-02-01

    In this study, the statistical post-processing methods that include bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: mean bias-corrected forecast, mean and variance bias-corrected forecast, decaying averaging forecast, mean absolute bias-corrected forecast, and the alternative implementations of ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, which are EMOS and BMA exchangeable models by assuming exchangeable ensemble members and simplified version of EMOS and BMA models. Observations for wind speed were obtained from the 26 stations in PyeongChang and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) that were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, reliability analysis was conducted by using rank histograms to identify the statistical consistency of ensemble forecast and corresponding observations. Based on the results of our study, we found that the prediction skills of probabilistic forecasts of EMOS and BMA models were superior to the biascorrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction, BMA models showed better prediction skill than EMOS. Even though the simplified version of BMA model exhibited best prediction skill among the mentioned six methods, the results showed that the differences of prediction skills between the versions of EMOS and BMA were negligible.
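
    To make the simplest of the listed post-processing ideas concrete, the hedged sketch below applies a mean bias correction and a decaying-average bias estimate to invented forecast-observation pairs; EMOS and BMA are not reproduced here.

    # Hedged sketch: two of the simplest post-processing ideas mentioned
    # above, a mean bias correction and a decaying-average bias estimate
    # (EMOS and BMA are not reproduced here). Wind speeds are invented.
    import numpy as np

    obs = np.array([3.1, 4.0, 5.2, 2.8, 6.1, 4.4])        # observed wind speed (m/s)
    fcst = np.array([3.8, 4.9, 5.6, 3.5, 6.9, 5.0])       # raw ensemble-mean forecast

    mean_bias = (fcst - obs).mean()
    corrected_mean_bias = fcst - mean_bias                 # mean bias-corrected forecast

    w = 0.1                                                # decaying-average weight (assumed)
    bias = 0.0
    decaying_bias = []
    for f, o in zip(fcst, obs):
        # bias updated after each forecast-observation pair; an operational
        # system would apply the previous cycle's estimate to the new forecast
        bias = (1 - w) * bias + w * (f - o)
        decaying_bias.append(bias)
    corrected_decaying = fcst - np.array(decaying_bias)

    print("mean bias:", round(mean_bias, 2))
    print("mean-bias-corrected:", np.round(corrected_mean_bias, 2))
    print("decaying-average-corrected:", np.round(corrected_decaying, 2))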

  7. Comparison of classical statistical methods and artificial neural network in traffic noise prediction

    International Nuclear Information System (INIS)

    Nedic, Vladimir; Despotovic, Danijela; Cvetanovic, Slobodan; Despotovic, Milan; Babic, Sasa

    2014-01-01

    Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. Therefore it is very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the proposed structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level in the given time period L eq . Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise using the originally developed user friendly software package. It is shown that the artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate equivalent noise level by means of classical methods, and comparative analysis is given. The results clearly show that ANN approach is superior in traffic noise level prediction to any other statistical method. - Highlights: • We proposed an ANN model for prediction of traffic noise. • We developed originally designed user friendly software package. • The results are compared with classical statistical methods. • The results are much better predictive capabilities of ANN model

  8. Computational and analytical comparison of flux discretizations for the semiconductor device equations beyond Boltzmann statistics

    International Nuclear Information System (INIS)

    Farrell, Patricio; Koprucki, Thomas; Fuhrmann, Jürgen

    2017-01-01

    We compare three thermodynamically consistent numerical fluxes known in the literature, appearing in a Voronoï finite volume discretization of the van Roosbroeck system with general charge carrier statistics. Our discussion includes an extension of the Scharfetter–Gummel scheme to non-Boltzmann (e.g. Fermi–Dirac) statistics. It is based on the analytical solution of a two-point boundary value problem obtained by projecting the continuous differential equation onto the interval between neighboring collocation points. Hence, it serves as a reference flux. The exact solution of the boundary value problem can be approximated by computationally cheaper fluxes which modify certain physical quantities. One alternative scheme averages the nonlinear diffusion (caused by the non-Boltzmann nature of the problem), another one modifies the effective density of states. To study the differences between these three schemes, we analyze the Taylor expansions, derive an error estimate, visualize the flux error and show how the schemes perform for a carefully designed p-i-n benchmark simulation. We present strong evidence that the flux discretization based on averaging the nonlinear diffusion has an edge over the scheme based on modifying the effective density of states.
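
    As background, the classical Scharfetter–Gummel flux under Boltzmann statistics, to which the generalized schemes discussed above reduce, can be written with the Bernoulli function B(x) = x/(exp(x) − 1); the sketch below assumes a particular sign convention and unit scaling and is only an illustration, not the paper's generalized (e.g. Fermi–Dirac) schemes.

    # Hedged sketch: classical Scharfetter-Gummel flux between two nodes
    # under Boltzmann statistics, using the Bernoulli function
    # B(x) = x / (exp(x) - 1). Sign convention and scaling are assumed.
    import math

    def bernoulli(x, eps=1e-10):
        return 1.0 - 0.5 * x if abs(x) < eps else x / (math.exp(x) - 1.0)

    def sg_electron_flux(n1, n2, psi1, psi2, h, mu=1.0, u_t=0.0259):
        """Flux from node 1 to node 2 for a linear potential on the edge."""
        delta = (psi2 - psi1) / u_t
        return mu * u_t / h * (bernoulli(delta) * n2 - bernoulli(-delta) * n1)

    # Zero potential difference reduces to pure diffusion, (n2 - n1) * D / h.
    print(sg_electron_flux(n1=1.0e16, n2=2.0e16, psi1=0.0, psi2=0.0, h=1e-5))
    print(sg_electron_flux(n1=1.0e16, n2=2.0e16, psi1=0.0, psi2=0.05, h=1e-5))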

  9. Computational and analytical comparison of flux discretizations for the semiconductor device equations beyond Boltzmann statistics

    Science.gov (United States)

    Farrell, Patricio; Koprucki, Thomas; Fuhrmann, Jürgen

    2017-10-01

    We compare three thermodynamically consistent numerical fluxes known in the literature, appearing in a Voronoï finite volume discretization of the van Roosbroeck system with general charge carrier statistics. Our discussion includes an extension of the Scharfetter-Gummel scheme to non-Boltzmann (e.g. Fermi-Dirac) statistics. It is based on the analytical solution of a two-point boundary value problem obtained by projecting the continuous differential equation onto the interval between neighboring collocation points. Hence, it serves as a reference flux. The exact solution of the boundary value problem can be approximated by computationally cheaper fluxes which modify certain physical quantities. One alternative scheme averages the nonlinear diffusion (caused by the non-Boltzmann nature of the problem), another one modifies the effective density of states. To study the differences between these three schemes, we analyze the Taylor expansions, derive an error estimate, visualize the flux error and show how the schemes perform for a carefully designed p-i-n benchmark simulation. We present strong evidence that the flux discretization based on averaging the nonlinear diffusion has an edge over the scheme based on modifying the effective density of states.

  10. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    Science.gov (United States)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using results of three NDI methods obtained by inspecting actual aircraft engine compressor disks which contained service induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size as well as the 90/95 percent crack length vary depending on the statistical method used and the type of data. The distribution function as well as the parameter estimation procedure used for determining POD and the confidence bound must be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that is not from the inspection data. The maximum likelihood estimators (MLE) method does not require such information and the POD results are more reasonable. The log-logistic function appears to model POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented on a common spreadsheet program.
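
    A hedged sketch of the MLE-based hit/miss approach mentioned here: fitting a log-logistic POD curve by maximum likelihood is equivalent to a logistic regression of hit/miss outcomes on log crack length. The data below are synthetic, and the a90 value is the standard point estimate without the 95% confidence bound discussed in the record.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
crack = rng.uniform(0.5, 5.0, 200)                    # crack lengths in mm (synthetic)
true_pod = 1.0 / (1.0 + np.exp(-3.0 * (np.log(crack) - np.log(1.5))))
hit = rng.binomial(1, true_pod)                       # 1 = detected, 0 = missed

X = sm.add_constant(np.log(crack))                    # log-logistic POD model
fit = sm.Logit(hit, X).fit(disp=0)                    # maximum likelihood estimation
b0, b1 = fit.params

# Crack length with 90% probability of detection (point estimate, no 95% bound)
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
print(fit.params, "a90 [mm]:", round(float(a90), 2))
```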

  11. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Science.gov (United States)

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T² statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
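
    As a minimal illustration of the monitoring statistic involved, the sketch below computes Hotelling's T² from latent-variable scores (for example the predictive and orthogonal scores of an OPLS or PLS model) together with an F-distribution-based control limit. It is not the authors' implementation and the score matrix is synthetic.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2(scores):
    """Hotelling's T2 for each observation, from mean-centred latent scores."""
    t = scores - scores.mean(axis=0)
    cov_inv = np.linalg.inv(np.atleast_2d(np.cov(t, rowvar=False)))
    return np.einsum('ij,jk,ik->i', t, cov_inv, t)

def t2_limit(n_obs, n_comp, alpha=0.05):
    """Upper control limit for T2, assuming approximately normal scores."""
    a, n = n_comp, n_obs
    return a * (n - 1) * (n + 1) / (n * (n - a)) * f.ppf(1 - alpha, a, n - a)

# Synthetic score matrix, e.g. one predictive and one orthogonal component
rng = np.random.default_rng(3)
scores = rng.normal(size=(50, 2))
print(hotelling_t2(scores)[:5])
print("95% control limit:", round(t2_limit(50, 2), 2))
```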

  12. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains.

    Science.gov (United States)

    Tataru, Paula; Hobolth, Asger

    2011-12-05

    Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually faster than EVD.
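
    The EXPM idea can be sketched compactly: the integral of expm(Q*s) E expm(Q*(t-s)) over [0, t] needed for the conditional expectations is the upper-right block of the exponential of an auxiliary block matrix (Van Loan's construction). The sketch below illustrates this for the expected time spent in a state; the rate matrix is hypothetical and the original implementation is in R, not Python.

```python
import numpy as np
from scipy.linalg import expm

def expected_time_in_state(Q, t, state):
    """Conditional expected time spent in `state` over [0, t] (EXPM approach).

    Van Loan's block construction: the upper-right block of
    expm([[Q, E], [0, Q]] * t) equals the integral
    int_0^t expm(Q*s) @ E @ expm(Q*(t-s)) ds, with E = e_state e_state^T.
    Dividing element-wise by the transition probabilities expm(Q*t)
    gives E[time in state | X(0)=a, X(t)=b] for every (a, b) pair.
    """
    n = Q.shape[0]
    E = np.zeros((n, n))
    E[state, state] = 1.0
    A = np.block([[Q, E], [np.zeros((n, n)), Q]])
    integral = expm(A * t)[:n, n:]
    return integral / expm(Q * t)

# Hypothetical 2-state rate matrix (rows sum to zero)
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])
print(expected_time_in_state(Q, t=1.0, state=0))
```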

  13. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    Directory of Open Access Journals (Sweden)

    Tataru Paula

    2011-12-01

    Full Text Available Abstract Background Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. Results We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. Conclusions We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually faster than EVD.

  14. Comparison of classical statistical methods and artificial neural network in traffic noise prediction

    Energy Technology Data Exchange (ETDEWEB)

    Nedic, Vladimir, E-mail: vnedic@kg.ac.rs [Faculty of Philology and Arts, University of Kragujevac, Jovana Cvijića bb, 34000 Kragujevac (Serbia); Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs [Faculty of Economics, University of Kragujevac, Djure Pucara Starog 3, 34000 Kragujevac (Serbia); Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs [Faculty of Economics, University of Niš, Trg kralja Aleksandra Ujedinitelja, 18000 Niš (Serbia); Despotovic, Milan, E-mail: mdespotovic@kg.ac.rs [Faculty of Engineering, University of Kragujevac, Sestre Janjic 6, 34000 Kragujevac (Serbia); Babic, Sasa, E-mail: babicsf@yahoo.com [College of Applied Mechanical Engineering, Trstenik (Serbia)

    2014-11-15

    Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. Therefore it is very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the proposed structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level Leq in the given time period. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise, using an originally developed user-friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior to other statistical methods in traffic noise level prediction. - Highlights: • We proposed an ANN model for prediction of traffic noise. • We developed an originally designed user-friendly software package. • The results are compared with classical statistical methods. • The ANN model shows much better predictive capabilities than the classical methods.

  15. A comparison of linear and nonlinear statistical techniques in performance attribution.

    Science.gov (United States)

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.

  16. Mapping patent classifications: portfolio and statistical analysis, and the comparison of strengths and weaknesses.

    Science.gov (United States)

    Leydesdorff, Loet; Kogler, Dieter Franz; Yan, Bowen

    2017-01-01

    The Cooperative Patent Classifications (CPC) recently developed cooperatively by the European and US Patent Offices provide a new basis for mapping patents and portfolio analysis. CPC replaces the International Patent Classifications (IPC) of the World Intellectual Property Organization. In this study, we update our routines previously based on IPC for CPC and use the occasion for rethinking various parameter choices. The new maps are significantly different from the previous ones, although this may not always be obvious on visual inspection. We provide nested maps online and a routine for generating portfolio overlays on the maps; a new tool is provided for "difference maps" between patent portfolios of organizations or firms. This is illustrated by comparing the portfolios of patents granted to two competing firms, Novartis and MSD, in 2016. Furthermore, the data is organized for the purpose of statistical analysis.

  17. Inter-comparison of statistical downscaling methods for projection of extreme flow indices across Europe

    DEFF Research Database (Denmark)

    Hundecha, Yeshewatesfa; Sunyer Pinya, Maria Antonia; Lawrence, Deborah

    2016-01-01

    The effect of methods of statistical downscaling of daily precipitation on changes in extreme flow indices under a plausible future climate change scenario was investigated in 11 catchments selected from 9 countries in different parts of Europe. The catchments vary from 67 to 6171km2 in size...... catchments to simulate daily runoff. A set of flood indices were derived from daily flows and their changes have been evaluated by comparing their values derived from simulations corresponding to the current and future climate. Most of the implemented downscaling methods project an increase in the extreme...... flow indices in most of the catchments. The catchments where the extremes are expected to increase have a rainfall-dominated flood regime. In these catchments, the downscaling methods also project an increase in the extreme precipitation in the seasons when the extreme flows occur. In catchments where...

  18. Statistical Analysis and Comparison of Harmonics Measured in Offshore Wind Farms

    DEFF Research Database (Denmark)

    Kocewiak, Lukasz Hubert; Hjerrild, Jesper; Bak, Claus Leth

    2011-01-01

    The paper shows statistical analysis of harmonic components measured in different offshore wind farms. Harmonic analysis is a complex task and requires many aspects, such as measurements, data processing, modeling, validation, to be taken into consideration. The paper describes measurement process...... and shows sophisticated analysis on representative harmonic measurements from Avedøre Holme, Gunfleet Sands and Burbo Bank wind farms. The nature of generation and behavior of harmonic components in offshore wind farms are clearly presented and explained based on a probabilistic approach. Some issues regarding...... commonly applied standards are also put forward in the discussion. Based on measurements and data analysis it is shown that a general overview of wind farm harmonic behaviour cannot be fully observed based only on single-value measurements as suggested in the standards but using more descriptive...

  19. Seasonal forecasting of hydrological drought in the Limpopo Basin: a comparison of statistical methods

    Science.gov (United States)

    Seibert, Mathias; Merz, Bruno; Apel, Heiko

    2017-03-01

    The Limpopo Basin in southern Africa is prone to droughts which affect the livelihood of millions of people in South Africa, Botswana, Zimbabwe and Mozambique. Seasonal drought early warning is thus vital for the whole region. In this study, the predictability of hydrological droughts during the main runoff period from December to May is assessed using statistical approaches. Three methods (multiple linear models, artificial neural networks, random forest regression trees) are compared in terms of their ability to forecast streamflow with up to 12 months of lead time. The following four main findings result from the study. 1. There are stations in the basin at which standardised streamflow is predictable with lead times up to 12 months. The results show high inter-station differences of forecast skill but reach a coefficient of determination as high as 0.73 (cross validated). 2. A large range of potential predictors is considered in this study, comprising well-established climate indices, customised teleconnection indices derived from sea surface temperatures and antecedent streamflow as a proxy of catchment conditions. El Niño and customised indices, representing sea surface temperature in the Atlantic and Indian oceans, prove to be important teleconnection predictors for the region. Antecedent streamflow is a strong predictor in small catchments (with median 42 % explained variance), whereas teleconnections exert a stronger influence in large catchments. 3. Multiple linear models show the best forecast skill in this study and the greatest robustness compared to artificial neural networks and random forest regression trees, despite their capabilities to represent nonlinear relationships. 4. Employed in early warning, the models can be used to forecast a specific drought level. Even if the coefficient of determination is low, the forecast models have a skill better than a climatological forecast, which is shown by analysis of receiver operating characteristics

  20. Statistical inference methods for two crossing survival curves: a comparison of methods.

    Science.gov (United States)

    Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng

    2015-01-01

    A common problem that is encountered in medical applications is the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which was an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Renyi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.

  1. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline.

    Science.gov (United States)

    Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C

    2013-12-21

    As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. We applied 12 microarray meta-analysis methods to the combination of multiple simulated expression profiles; such methods can be categorized by their hypothesis setting purposes: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE genes with non-zero effect sizes in the "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated the hypothesis settings behind the methods and further applied multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. All source files for simulation and real data are available on the author's publication website.

  2. How Many Subjects are Needed for a Visual Field Normative Database? A Comparison of Ground Truth and Bootstrapped Statistics.

    Science.gov (United States)

    Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K

    2018-03-01

    The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different to the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
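
    A minimal sketch of the resampling step described here: for a given set size x, draw bootstrap samples from the pooled sensitivities and average the resulting mean, SD and 5th/95th percentile limits. The sensitivities below are synthetic stand-ins for the ground-truth cohort.

```python
import numpy as np

def bootstrap_limits(samples, set_size, n_resamples=1000, rng=None):
    """Bootstrap mean, SD and 5th/95th percentile limits for a given set size."""
    if rng is None:
        rng = np.random.default_rng()
    stats = np.empty((n_resamples, 4))
    for i in range(n_resamples):
        subset = rng.choice(samples, size=set_size, replace=True)
        stats[i] = (subset.mean(), subset.std(ddof=1),
                    np.percentile(subset, 5), np.percentile(subset, 95))
    return stats.mean(axis=0)            # average statistics over resamples

rng = np.random.default_rng(4)
sensitivities = rng.normal(30.0, 2.5, 500)   # synthetic dB sensitivities ("ground truth")
for x in (30, 60, 150):
    mean, sd, p5, p95 = bootstrap_limits(sensitivities, x, rng=rng)
    print(f"x={x}: mean={mean:.2f}  sd={sd:.2f}  5th={p5:.2f}  95th={p95:.2f}")
```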

  3. Detecting changes in ultrasound backscattered statistics by using Nakagami parameters: Comparisons of moment-based and maximum likelihood estimators.

    Science.gov (United States)

    Lin, Jen-Jen; Cheng, Jung-Yu; Huang, Li-Fei; Lin, Ying-Hsiu; Wan, Yung-Liang; Tsui, Po-Hsiang

    2017-05-01

    The Nakagami distribution is an approximation useful to the statistics of ultrasound backscattered signals for tissue characterization. Various estimators may affect the Nakagami parameter in the detection of changes in backscattered statistics. In particular, the moment-based estimator (MBE) and maximum likelihood estimator (MLE) are two primary methods used to estimate the Nakagami parameters of ultrasound signals. This study explored the effects of the MBE and different MLE approximations on Nakagami parameter estimations. Ultrasound backscattered signals of different scatterer number densities were generated using a simulation model, and phantom experiments and measurements of human liver tissues were also conducted to acquire real backscattered echoes. Envelope signals were employed to estimate the Nakagami parameters by using the MBE, first- and second-order approximations of the MLE (MLE1 and MLE2, respectively), and the Greenwood approximation (MLEgw) for comparisons. The simulation results demonstrated that, compared with the MBE and MLE1, the MLE2 and MLEgw enabled more stable parameter estimations with small sample sizes. Notably, the required data length of the envelope signal was 3.6 times the pulse length. The phantom and tissue measurement results also showed that the Nakagami parameters estimated using the MLE2 and MLEgw could simultaneously differentiate various scatterer concentrations with lower standard deviations and reliably reflect physical meanings associated with the backscattered statistics. Therefore, the MLE2 and MLEgw are suggested as estimators for the development of Nakagami-based methodologies for ultrasound tissue characterization. Copyright © 2017 Elsevier B.V. All rights reserved.
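
    For reference, the moment-based estimator has a simple closed form: the Nakagami shape parameter m is the squared mean of the intensity divided by its variance, and the scale Omega is the mean intensity. The sketch below applies it to a synthetic Rayleigh envelope (for which m should be close to 1); the MLE approximations discussed in the record are not reproduced here.

```python
import numpy as np

def nakagami_mbe(envelope):
    """Moment-based estimates of the Nakagami shape (m) and scale (Omega)."""
    intensity = np.asarray(envelope, dtype=float) ** 2
    omega = intensity.mean()
    m = omega ** 2 / intensity.var()     # inverse normalised variance of intensity
    return m, omega

# Synthetic Rayleigh envelope: fully developed speckle corresponds to m close to 1
rng = np.random.default_rng(5)
env = rng.rayleigh(scale=1.0, size=10000)
m_hat, omega_hat = nakagami_mbe(env)
print(round(m_hat, 3), round(omega_hat, 3))
```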

  4. Judging statistical models of individual decision making under risk using in- and out-of-sample criteria.

    Science.gov (United States)

    Drichoutis, Andreas C; Lusk, Jayson L

    2014-01-01

    Despite the fact that conceptual models of individual decision making under risk are deterministic, attempts to econometrically estimate risk preferences require some assumption about the stochastic nature of choice. Unfortunately, the consequences of making different assumptions are, at present, unclear. In this paper, we compare three popular error specifications (Fechner, contextual utility, and Luce error) for three different preference functionals (expected utility, rank-dependent utility, and a mixture of those two) using in- and out-of-sample selection criteria. We find drastically different inferences about structural risk preferences across the competing functionals and error specifications. Expected utility theory is least affected by the selection of the error specification. A mixture model combining the two conceptual models assuming contextual utility provides the best fit of the data both in- and out-of-sample.

  5. Judging statistical models of individual decision making under risk using in- and out-of-sample criteria.

    Directory of Open Access Journals (Sweden)

    Andreas C Drichoutis

    Full Text Available Despite the fact that conceptual models of individual decision making under risk are deterministic, attempts to econometrically estimate risk preferences require some assumption about the stochastic nature of choice. Unfortunately, the consequences of making different assumptions are, at present, unclear. In this paper, we compare three popular error specifications (Fechner, contextual utility, and Luce error) for three different preference functionals (expected utility, rank-dependent utility, and a mixture of those two) using in- and out-of-sample selection criteria. We find drastically different inferences about structural risk preferences across the competing functionals and error specifications. Expected utility theory is least affected by the selection of the error specification. A mixture model combining the two conceptual models assuming contextual utility provides the best fit of the data both in- and out-of-sample.

  6. Making Statistical Data More Easily Accessible on the Web Results of the StatSearch Case Study

    CERN Document Server

    Rajman, M; Boynton, I M; Fridlund, B; Fyhrlund, A; Sundgren, B; Lundquist, P; Thelander, H; Wänerskär, M

    2005-01-01

    In this paper we present the results of the StatSearch case study that aimed at providing an enhanced access to statistical data available on the Web. In the scope of this case study we developed a prototype of an information access tool combining a query-based search engine with semi-automated navigation techniques exploiting the hierarchical structuring of the available data. This tool enables a better control of the information retrieval, improving the quality and ease of the access to statistical information. The central part of the presented StatSearch tool consists in the design of an algorithm for automated navigation through a tree-like hierarchical document structure. The algorithm relies on the computation of query related relevance score distributions over the available database to identify the most relevant clusters in the data structure. These most relevant clusters are then proposed to the user for navigation, or, alternatively, are the support for the automated navigation process. Several appro...

  7. An empirical comparison of key statistical attributes among potential ICU quality indicators.

    Science.gov (United States)

    Brown, Sydney E S; Ratcliffe, Sarah J; Halpern, Scott D

    2014-08-01

    Good quality indicators should have face validity, relevance to patients, and be able to be measured reliably. Beyond these general requirements, good quality indicators should also have certain statistical properties, including sufficient variability to identify poor performers, relative insensitivity to severity adjustment, and the ability to capture what providers do rather than patients' characteristics. We assessed the performance of candidate indicators of ICU quality on these criteria. Indicators included ICU readmission, mortality, several length of stay outcomes, and the processes of venous-thromboembolism and stress ulcer prophylaxis provision. Retrospective cohort study. One hundred thirty-eight U.S. ICUs from 2001-2008 in the Project IMPACT database. Two hundred sixty-eight thousand eight hundred twenty-four patients discharged from U.S. ICUs. None. We assessed indicators' (1) variability across ICU-years; (2) degree of influence by patient vs. ICU and hospital characteristics using the Omega statistic; (3) sensitivity to severity adjustment by comparing the area under the receiver operating characteristic curve (AUC) between models including vs. excluding patient variables, and (4) correlation between risk adjusted quality indicators using a Spearman correlation. Large ranges of among-ICU variability were noted for all quality indicators, particularly for prolonged length of stay (4.7-71.3%) and the proportion of patients discharged home (30.6-82.0%), and ICU and hospital characteristics outweighed patient characteristics for stress ulcer prophylaxis (ω, 0.43; 95% CI, 0.34-0.54), venous thromboembolism prophylaxis (ω, 0.57; 95% CI, 0.53-0.61), and ICU readmissions (ω, 0.69; 95% CI, 0.52-0.90). Mortality measures were the most sensitive to severity adjustment (area under the receiver operating characteristic curve % difference, 29.6%); process measures were the least sensitive (area under the receiver operating characteristic curve % differences

  8. Audio Classification in Speech and Music: A Comparison between a Statistical and a Neural Approach

    Directory of Open Access Journals (Sweden)

    Alessandro Bugatti

    2002-04-01

    Full Text Available We focus attention on the problem of audio classification into speech and music for multimedia applications. In particular, we present a comparison between two different techniques for speech/music discrimination. The first method is based on the zero-crossing rate and Bayesian classification. It is very simple from a computational point of view, and gives good results in the case of pure music or speech. The simulation results show that some performance degradation arises when the music segment also contains some speech superimposed on music, or strong rhythmic components. To overcome these problems, we propose a second method that uses more features and is based on neural networks (specifically a multi-layer Perceptron). In this case we obtain better performance, at the expense of a limited growth in the computational complexity. In practice, the proposed neural network is simple to implement if a suitable polynomial is used as the activation function, and a real-time implementation is possible even if low-cost embedded systems are used.
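
    A small sketch of the first feature mentioned, the zero-crossing rate, computed per frame; the frame length and hop size are arbitrary illustrative choices, and the tone/noise example only shows the qualitative contrast that such a classifier exploits.

```python
import numpy as np

def zero_crossing_rate(signal, frame_len=1024, hop=512):
    """Per-frame zero-crossing rate, a simple speech/music discrimination feature."""
    signal = np.asarray(signal, dtype=float)
    rates = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        signs = np.signbit(frame).astype(np.int8)
        rates.append(np.count_nonzero(np.diff(signs)) / frame_len)
    return np.array(rates)

# Synthetic example: a pure tone has a low, regular ZCR; white noise a high one
t = np.linspace(0.0, 1.0, 16000, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
noise = np.random.default_rng(6).normal(size=16000)
print(zero_crossing_rate(tone).mean(), zero_crossing_rate(noise).mean())
```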

  9. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo; Lee, Jae Sung

    2002-01-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen on visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited a psychiatry clinic to evaluate ADHD. Their brain SPECT revealed normal rCBF patterns on visual analysis and they were diagnosed as clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala gyrus, brain stem, both cerebella, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri revealed significant hyperperfusion in the children. These results demonstrate that SPM can reveal more precise anatomical area differences not seen on visual analysis.

  10. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo [Ajou University School of Medicine, Suwon (Korea, Republic of); Lee, Jae Sung [Seoul national University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen on visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited a psychiatry clinic to evaluate ADHD. Their brain SPECT revealed normal rCBF patterns on visual analysis and they were diagnosed as clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala gyrus, brain stem, both cerebella, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri revealed significant hyperperfusion in the children. These results demonstrate that SPM can reveal more precise anatomical area differences not seen on visual analysis.

  11. Statistical utility theory for comparison of nuclear versus fossil power plant alternatives

    International Nuclear Information System (INIS)

    Garribba, S.; Ovi, A.

    1977-01-01

    A statistical formulation of utility theory is developed for decision problems concerned with the choice among alternative strategies in electric energy production. Four alternatives are considered: nuclear power, fossil power, solar energy, and conservation policy. Attention is focused on a public electric utility thought of as a rational decision-maker. A framework for decisions is then suggested where the admissible strategies and their possible consequences represent the information available to the decision-maker. Once the objectives of the decision process are assessed, consequences can be quantified in terms of measures of effectiveness. Maximum expected utility is the criterion of choice among alternatives. Steps toward expected values are the evaluation of the multidimensional utility function and the assessment of subjective probabilities for consequences. In this respect, the multiplicative form of the utility function seems less restrictive than the additive form and almost as manageable to implement. Probabilities are expressed through subjective marginal probability density functions given at a discrete number of points. The final stage of the decision model is to establish the value of each strategy. To this scope, expected utilities are computed and scaled. The result is that nuclear power offers the best alternative. 8 figures, 9 tables, 32 references

  12. Statistical inference and comparison of stochastic models for the hydraulic conductivity at the Finnsjoen-site

    International Nuclear Information System (INIS)

    Norman, S.

    1992-04-01

    The aim of this study was to find a good, or even the best, stochastic model for the hydraulic conductivity field at the Finnsjoen site. The conductivity fields in question are regularized, that is, upscaled. The reason for performing regularization of measurement data is primarily the need for long correlation scales, which are required in order to model reasonably large domains that can be used to describe regional groundwater flow accurately. A theory of regularization is discussed in this report. In order to find the best model, jackknifing is employed to compare different stochastic models. The theory for this method is described. In doing so we also take a look at linear predictor theory, so-called kriging, and include a general discussion of stochastic functions and intrinsic random functions. The statistical inference methods for finding the models are also described, in particular regression, iterative generalized regression (IGLSE) and non-parametric variogram estimators. A large set of results is presented for a regularization scale of 36 metres. (30 refs.) (au)

  13. Right adrenal vein: comparison between adaptive statistical iterative reconstruction and model-based iterative reconstruction.

    Science.gov (United States)

    Noda, Y; Goshima, S; Nagata, S; Miyoshi, T; Kawada, H; Kawai, N; Tanahashi, Y; Matsuo, M

    2018-06-01

    To compare right adrenal vein (RAV) visualisation and contrast enhancement degree on adrenal venous phase images reconstructed using adaptive statistical iterative reconstruction (ASiR) and model-based iterative reconstruction (MBIR) techniques. This prospective study was approved by the institutional review board, and written informed consent was waived. Fifty-seven consecutive patients who underwent adrenal venous phase imaging were enrolled. The same raw data were reconstructed using ASiR 40% and MBIR. An expert and a beginner independently reviewed the computed tomography (CT) images. RAV visualisation rates, background noise, and CT attenuation of the RAV, right adrenal gland, inferior vena cava (IVC), hepatic vein, and bilateral renal veins were compared between the two reconstruction techniques. RAV visualisation rates were higher with MBIR than with ASiR (95% versus 88%, p=0.13 for the expert and 93% versus 75%, p=0.002 for the beginner, respectively). RAV visualisation confidence ratings were significantly greater with MBIR than with ASiR, and background noise was significantly lower with MBIR than with ASiR (p=0.0013 and 0.02). Reconstruction of adrenal venous phase images using MBIR significantly reduces background noise, leading to an improvement in RAV visualisation compared with ASiR. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  14. Comparison of PIV and Hot-Wire statistics of turbulent boundary layer

    International Nuclear Information System (INIS)

    Dróżdż, A; Uruba, V

    2014-01-01

    The paper presents a cross-check of turbulent boundary layer measurements using large-field-of-view PIV and hot-wire anemometry techniques. The time-resolved PIV method was used for the experiments. The measuring plane was oriented perpendicular to the wall and parallel to the mean flow. The hot-wire measurements were performed using a special probe with a perpendicular hot wire, at the same location as the PIV experiments. The hot-wire probe has a wire length of l+ < 20 in the considered range of Reynolds numbers. Various evaluation methods were applied to the PIV data. Profiles of statistical characteristics of the streamwise velocity component were evaluated from the data. Mean values and standard deviations, as well as skewness and kurtosis coefficients, were compared for a few values of Reθ. The Reynolds number ranges from 1000 to 5500. The results show that with increasing Reynolds number the fluctuation maximum is attenuated in the PIV measurements with respect to the hot-wire measurements; nevertheless, the representation of velocity fluctuations by the PIV method is satisfactory. The influence of the wall-normal fluctuation component on the hot-wire near-wall peak was also investigated.
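
    The profile statistics being cross-checked (mean, standard deviation, skewness and kurtosis of the streamwise velocity) can be computed as sketched below; the two synthetic samples merely stand in for a hot-wire time series and a sparser PIV sample.

```python
import numpy as np
from scipy import stats

def velocity_statistics(u):
    """Mean, standard deviation, skewness and flatness of a velocity sample."""
    u = np.asarray(u, dtype=float)
    return {"mean": u.mean(),
            "std": u.std(ddof=1),
            "skewness": stats.skew(u),
            "kurtosis": stats.kurtosis(u, fisher=False)}   # Gaussian value is 3

rng = np.random.default_rng(7)
u_hw = 10.0 + 1.2 * rng.normal(size=20000)    # stand-in for a hot-wire time series
u_piv = 10.0 + 1.1 * rng.normal(size=2000)    # stand-in for a sparser PIV sample
print(velocity_statistics(u_hw))
print(velocity_statistics(u_piv))
```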

  15. Comparison of Statistical Algorithms for the Detection of Infectious Disease Outbreaks in Large Multiple Surveillance Systems

    Science.gov (United States)

    Farrington, C. Paddy; Noufaily, Angela; Andrews, Nick J.; Charlett, Andre

    2016-01-01

    A large-scale multiple surveillance system for infectious disease outbreaks has been in operation in England and Wales since the early 1990s. Changes to the statistical algorithm at the heart of the system were proposed and the purpose of this paper is to compare two new algorithms with the original algorithm. Test data to evaluate performance are created from weekly counts of the number of cases of each of more than 2000 diseases over a twenty-year period. The time series of each disease is separated into one series giving the baseline (background) disease incidence and a second series giving disease outbreaks. One series is shifted forward by twelve months and the two are then recombined, giving a realistic series in which it is known where outbreaks have been added. The metrics used to evaluate performance include a scoring rule that appropriately balances sensitivity against specificity and is sensitive to variation in probabilities near 1. In the context of disease surveillance, a scoring rule can be adapted to reflect the size of outbreaks and this was done. Results indicate that the two new algorithms are comparable to each other and better than the algorithm they were designed to replace. PMID:27513749

  16. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    Science.gov (United States)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.

  17. Statistical comparisons of gravity wave features derived from OH airglow and SABER data

    Science.gov (United States)

    Gelinas, L. J.; Hecht, J. H.; Walterscheid, R. L.

    2017-12-01

    The Aerospace Corporation's near-IR camera (ANI), deployed at Andes Lidar Observatory (ALO), Cerro Pachon Chile (30S,70W) since 2010, images the bright OH Meinel (4,2) airglow band. The imager provides detailed observations of gravity waves and instability dynamics, as described by Hecht et al. (2014). The camera employs a wide-angle lens that views a 73 by 73 degree region of the sky, approximately 120 km x 120 km at 85 km altitude. Image cadence of 30s allows for detailed spectral analysis of the horizontal components of wave features, including the evolution and decay of instability features. The SABER instrument on NASA's TIMED spacecraft provides remote soundings of kinetic temperature profiles from the lower stratosphere to the lower thermosphere. Horizontal and vertical filtering techniques allow SABER temperatures to be analyzed for gravity wave variances [Walterscheid and Christensen, 2016]. Here we compare the statistical characteristics of horizontal wave spectra, derived from airglow imagery, with vertical wave variances derived from SABER temperature profiles. The analysis is performed for a period of strong mountain wave activity over the Andes spanning the period between June and September 2012. Hecht, J. H., et al. (2014), The life cycle of instability features measured from the Andes Lidar Observatory over Cerro Pachon on March 24, 2012, J. Geophys. Res. Atmos., 119, 8872-8898, doi:10.1002/2014JD021726. Walterscheid, R. L., and A. B. Christensen (2016), Low-latitude gravity wave variances in the mesosphere and lower thermosphere derived from SABER temperature observation and compared with model simulation of waves generated by deep tropical convection, J. Geophys. Res. Atmos., 121, 11,900-11,912, doi:10.1002/2016JD024843.

  18. Comparison of statistical and clinical predictions of functional outcome after ischemic stroke.

    Directory of Open Access Journals (Sweden)

    Douglas D Thompson

    Full Text Available To determine whether the predictions of functional outcome after ischemic stroke made at the bedside using a doctor's clinical experience were more or less accurate than the predictions made by clinical prediction models (CPMs.A prospective cohort study of nine hundred and thirty one ischemic stroke patients recruited consecutively at the outpatient, inpatient and emergency departments of the Western General Hospital, Edinburgh between 2002 and 2005. Doctors made informal predictions of six month functional outcome on the Oxford Handicap Scale (OHS. Patients were followed up at six months with a validated postal questionnaire. For each patient we calculated the absolute predicted risk of death or dependence (OHS≥3 using five previously described CPMs. The specificity of a doctor's informal predictions of OHS≥3 at six months was good 0.96 (95% CI: 0.94 to 0.97 and similar to CPMs (range 0.94 to 0.96; however the sensitivity of both informal clinical predictions 0.44 (95% CI: 0.39 to 0.49 and clinical prediction models (range 0.38 to 0.45 was poor. The prediction of the level of disability after stroke was similar for informal clinical predictions (ordinal c-statistic 0.74 with 95% CI 0.72 to 0.76 and CPMs (range 0.69 to 0.75. No patient or clinician characteristic affected the accuracy of informal predictions, though predictions were more accurate in outpatients.CPMs are at least as good as informal clinical predictions in discriminating between good and bad functional outcome after ischemic stroke. The place of these models in clinical practice has yet to be determined.

  19. A Comparison of Selected Statistical Techniques to Model Soil Cation Exchange Capacity

    Science.gov (United States)

    Khaledian, Yones; Brevik, Eric C.; Pereira, Paulo; Cerdà, Artemi; Fattah, Mohammed A.; Tazikeh, Hossein

    2017-04-01

    Cation exchange capacity (CEC) measures the soil's ability to hold positively charged ions and is an important indicator of soil quality (Khaledian et al., 2016). However, other soil properties are more commonly determined and reported, such as texture, pH, organic matter and biology. We attempted to predict CEC using different advanced statistical methods including monotone analysis of variance (MONANOVA), artificial neural networks (ANNs), principal components regression (PCR), and particle swarm optimization (PSO) in order to compare the utility of these approaches and identify the best predictor. We analyzed 170 soil samples from four different nations (USA, Spain, Iran and Iraq) under three land uses (agriculture, pasture, and forest). Seventy percent of the samples (120 samples) were selected as the calibration set and the remaining 50 samples (30%) were used as the prediction set. The results indicated that MONANOVA (R² = 0.82, root mean squared error (RMSE) = 6.32) and ANNs (R² = 0.82, RMSE = 5.53) were the best models to estimate CEC; PSO (R² = 0.80, RMSE = 5.54) and PCR (R² = 0.70, RMSE = 6.48) also worked well, and the overall results were very similar to each other. Clay (positively correlated) and sand (negatively correlated) were the most influential variables for predicting CEC for the entire data set, while the most influential variables for the various countries and land uses were different and CEC was affected by different variables in different situations. Although MONANOVA and ANNs provided good predictions of the entire dataset, PSO gives a formula to estimate soil CEC using commonly tested soil properties. Therefore, PSO shows promise as a technique to estimate soil CEC. Establishing effective pedotransfer functions to predict CEC would be productive where there are limitations of time and money, and other commonly analyzed soil properties are available. References Khaledian, Y., Kiani, F., Ebrahimi, S., Brevik, E.C., Aitkenhead
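
    As an illustration of one of the compared approaches, the sketch below builds a principal components regression (standardization, PCA, then linear regression) with a 70/30 split; the soil variables and the relation generating CEC are synthetic assumptions, not the study's data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(8)
n = 170
clay = rng.uniform(5, 60, n)                        # % clay (synthetic)
sand = 100 - clay - rng.uniform(10, 30, n)          # % sand (synthetic)
om = rng.uniform(0.5, 6.0, n)                       # % organic matter (synthetic)
ph = rng.uniform(4.5, 8.5, n)
X = np.column_stack([clay, sand, om, ph])
cec = 0.5 * clay - 0.1 * sand + 2.0 * om + rng.normal(0, 3, n)   # synthetic CEC

X_tr, X_te, y_tr, y_te = train_test_split(X, cec, test_size=0.3, random_state=0)
pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
pcr.fit(X_tr, y_tr)
pred = pcr.predict(X_te)
print("R2:", round(r2_score(y_te, pred), 2),
      "RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2))
```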

  20. A comparison of statistical methods for genomic selection in a mice population

    Directory of Open Access Journals (Sweden)

    Neves Haroldo HR

    2012-11-01

    Full Text Available Abstract Background The availability of high-density panels of SNP markers has opened new perspectives for marker-assisted selection strategies, such that genotypes for these markers are used to predict the genetic merit of selection candidates. Because the number of markers is often much larger than the number of phenotypes, marker effect estimation is not a trivial task. The objective of this research was to compare the predictive performance of ten different statistical methods employed in genomic selection, by analyzing data from a heterogeneous stock mice population. Results For the five traits analyzed (W6W: weight at six weeks, WGS: growth slope, BL: body length, %CD8+: percentage of CD8+ cells, CD4+/CD8+: ratio between CD4+ and CD8+ cells), within-family predictions were more accurate than across-family predictions, although this superiority in accuracy varied markedly across traits. For within-family prediction, two kernel methods, Reproducing Kernel Hilbert Spaces Regression (RKHS) and Support Vector Regression (SVR), were the most accurate for W6W, while a polygenic model also had comparable performance. A form of ridge regression assuming that all markers contribute to the additive variance (RR_GBLUP) figured among the most accurate for WGS and BL, while two variable selection methods (LASSO and Random Forest, RF) had the greatest predictive abilities for %CD8+ and CD4+/CD8+. RF, RKHS, SVR and RR_GBLUP outperformed the remaining methods in terms of bias and inflation of predictions. Conclusions Methods with large conceptual differences reached very similar predictive abilities and a clear re-ranking of methods was observed in function of the trait analyzed. Variable selection methods were more accurate than the remainder in the case of %CD8+ and CD4+/CD8+ and these traits are likely to be influenced by a smaller number of QTL than the remainder. Judged by their overall performance across traits and computational requirements, RR
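
    A hedged sketch of the RR_GBLUP-style approach referred to here: ridge regression of phenotypes on SNP genotypes, with a single shrinkage parameter shared by all markers. The genotypes, QTL effects and ridge penalty are synthetic illustrative choices, not the authors' data or pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
n_ind, n_snp = 300, 2000
geno = rng.binomial(2, 0.3, size=(n_ind, n_snp)).astype(float)   # 0/1/2 genotypes
effects = np.zeros(n_snp)
qtl = rng.choice(n_snp, 50, replace=False)                       # 50 causal markers
effects[qtl] = rng.normal(0.0, 0.3, 50)
pheno = geno @ effects + rng.normal(0.0, 1.0, n_ind)

# Single shrinkage parameter shared by all markers; alpha would normally be tuned
model = Ridge(alpha=0.5 * n_snp)
acc = cross_val_score(model, geno, pheno, cv=5, scoring="r2")
print("cross-validated R2:", acc.mean().round(3))
```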

  1. Comparison of safflower oil extraction kinetics under two characteristic moisture conditions: statistical analysis of non-linear model parameters

    Directory of Open Access Journals (Sweden)

    E. Baümler

    2014-06-01

    Full Text Available In this study the kinetics of oil extraction from partially dehulled safflower seeds under two moisture conditions (7 and 9% dry basis) was investigated. The extraction assays were performed using a stirred batch system, thermostated at 50 ºC, using n-hexane as solvent. The data obtained were fitted to a modified diffusion model in order to represent the extraction kinetics. The model took into account a washing and a diffusive step. Fitting parameters were compared statistically for both moisture conditions. The oil yield increased with the extraction time in both cases, although the oil was released at different rates. A comparison of the parameters showed that both the portion extracted in the washing phase and the effective diffusion coefficient were moisture-dependent. The effective diffusivities were 2.81 × 10⁻¹² and 8.06 × 10⁻¹³ m² s⁻¹ for moisture contents of 7% and 9%, respectively.
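
    The two-stage (washing plus diffusion) idea can be illustrated by fitting a simple kinetic form with non-linear least squares, as sketched below. The functional form, data points and initial guesses are hypothetical and are not the modified diffusion model actually fitted in the paper; the point is only how fitted parameters and their standard errors can be compared between moisture conditions.

```python
import numpy as np
from scipy.optimize import curve_fit

def washing_plus_diffusion(t, y_w, y_inf, k):
    """Illustrative two-stage kinetics: an instantaneous washing fraction y_w plus
    a first-order diffusive approach towards the equilibrium yield y_inf."""
    return y_w + (y_inf - y_w) * (1.0 - np.exp(-k * t))

# Hypothetical yield-versus-time data (fraction of extractable oil, time in minutes)
t = np.array([2, 5, 10, 20, 40, 60, 90, 120], dtype=float)
y = np.array([0.35, 0.45, 0.55, 0.68, 0.80, 0.86, 0.91, 0.93])

popt, pcov = curve_fit(washing_plus_diffusion, t, y, p0=[0.3, 0.95, 0.05])
perr = np.sqrt(np.diag(pcov))      # standard errors used to compare parameter sets
print(dict(zip(["y_w", "y_inf", "k"], popt.round(3))), perr.round(3))
```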

  2. Multivariate statistical assessments of greenhouse-gas-induced climatic change and comparison with results from general circulation models

    International Nuclear Information System (INIS)

    Schoenwiese, C.D.

    1990-01-01

    Based on univariate correlation and coherence analyses, including techniques moving in time, and taking account of the physical basis of the relationships, a simple multivariate concept is presented which correlates observational climatic time series simultaneously with solar, volcanic, ENSO (El Niño/Southern Oscillation) and anthropogenic greenhouse-gas forcing. The climatic elements considered are air temperature (near the ground and in the stratosphere), sea surface temperature, sea level and precipitation, and cover at least the period 1881-1980 (stratospheric temperature only since 1960). The climate signal assessments which may be hypothetically attributed to the observed CO₂ or equivalent CO₂ (implying additional greenhouse gases) increase are compared with those resulting from GCM experiments. In the case of the Northern hemisphere air temperature these comparisons are performed not only with respect to hemispheric and global means, but also with respect to the regional and seasonal patterns. Autocorrelations and phase shifts of the climate response to natural and anthropogenic forcing complicate the statistical assessments.

  3. Does social comparison make a difference? Optimism as a moderator of the relation between comparison level and academic performance

    NARCIS (Netherlands)

    Gibbons, FX; Blanton, H; Gerrard, M; Buunk, B; Eggleston, T

    Previous research has demonstrated that poor academic performance is associated with a downward shift in preferred level of academic comparison level (ACL). The current study assessed the long-term impact of this downward shift on the academic performance of college students and also examined the

  4. An evaluation of scanpath-comparison and machine-learning classification algorithms used to study the dynamics of analogy making.

    Science.gov (United States)

    French, Robert M; Glady, Yannick; Thibaut, Jean-Pierre

    2017-08-01

    In recent years, eyetracking has begun to be used to study the dynamics of analogy making. Numerous scanpath-comparison algorithms and machine-learning techniques are available that can be applied to the raw eyetracking data. We show how scanpath-comparison algorithms, combined with multidimensional scaling and a classification algorithm, can be used to resolve an outstanding question in analogy making-namely, whether or not children's and adults' strategies in solving analogy problems are different. (They are.) We show which of these scanpath-comparison algorithms is best suited to the kinds of analogy problems that have formed the basis of much analogy-making research over the years. Furthermore, we use machine-learning classification algorithms to examine the item-to-item saccade vectors making up these scanpaths. We show which of these algorithms best predicts, from very early on in a trial, on the basis of the frequency of various item-to-item saccades, whether a child or an adult is doing the problem. This type of analysis can also be used to predict, on the basis of the item-to-item saccade dynamics in the first third of a trial, whether or not a problem will be solved correctly.

  5. Statistical study of clone survival curves after irradiation in one or two stages. Comparison and generalization of different models

    International Nuclear Information System (INIS)

    Lachet, Bernard.

    1975-01-01

    A statistical study was carried out on 208 survival curves for chlorella subjected to γ or particle radiations. The computing programmes used were written in Fortran. The different experimental causes contributing to the variance of a survival rate are analyzed, so that experiments can be planned accordingly. Each curve was fitted to four models by the weighted least squares method applied to non-linear functions. The validity of the fits obtained can be checked by the F test. It was possible to define the confidence and prediction zones around an adjusted curve by weighting of the residual variance, in spite of error on the doses delivered; the confidence limits can then be fixed for a dose estimated from an exact or measured survival. The four models adopted were compared for the precision of their fit (by a non-parametric simultaneous comparison test) and the scattering of their adjusted parameters: Wideroe's model gives a very good fit with the experimental points in return for a scattering of its parameters, which robs them of their presumed meaning. The principal component analysis showed the statistical equivalence of the 1 and 2 hit target models. Division of the irradiation into two doses, the first fixed by the investigator, leads to families of curves for which the equation was established from that of any basic model expressing the dose-survival relationship in one-stage irradiation [fr

  6. Statistical variability comparison in MODIS and AERONET derived aerosol optical depth over Indo-Gangetic Plains using time series modeling.

    Science.gov (United States)

    Soni, Kirti; Parmar, Kulwinder Singh; Kapoor, Sangeeta; Kumar, Nishant

    2016-05-15

    Many studies of Aerosol Optical Depth (AOD) have been carried out using Moderate Resolution Imaging Spectroradiometer (MODIS) derived data, but the accuracy of satellite data in comparison to ground data derived from the AErosol RObotic NETwork (AERONET) has always been questionable. To address this, a comparative study of comprehensive ground-based and satellite data for the period 2001-2012 was modeled. The time series model is used for the accurate prediction of AOD, and statistical variability is compared to assess the performance of the model in both cases. Root mean square error (RMSE), mean absolute percentage error (MAPE), stationary R-squared, R-squared, maximum absolute percentage error, normalized Bayesian information criterion (NBIC) and Ljung-Box methods are used to check the applicability and validity of the developed ARIMA models, revealing significant precision in the model performance. It was found that it is possible to predict the AOD by statistical modeling using time series obtained from past data of MODIS and AERONET as input data. Moreover, the results show that MODIS data can be formed from AERONET data by adding 0.251627 ± 0.133589 and vice-versa by subtracting. From the forecast of AODs for the next four years (2013-2017) obtained with the developed ARIMA model, it is concluded that the forecasted ground AOD has an increasing trend. Copyright © 2016 Elsevier B.V. All rights reserved.
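
    The record above predicts AOD with ARIMA models fitted to MODIS and AERONET series. A minimal sketch of that workflow with statsmodels is shown below; the synthetic monthly series and the (1,1,1) order are assumptions for illustration, not the model actually selected in the study.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        # Illustrative monthly AOD-like series (2001-2012); the real study used MODIS/AERONET data
        rng = np.random.default_rng(1)
        idx = pd.date_range("2001-01", periods=144, freq="MS")
        trend = np.linspace(0.35, 0.55, 144)
        season = 0.08 * np.sin(2 * np.pi * np.arange(144) / 12)
        aod = pd.Series(trend + season + rng.normal(0, 0.03, 144), index=idx)

        model = ARIMA(aod, order=(1, 1, 1))          # order is an assumed example, not the fitted one
        fit = model.fit()

        forecast = fit.forecast(steps=48)            # four-year-ahead forecast from the fitted model
        rmse = np.sqrt(np.mean(fit.resid.iloc[1:] ** 2))  # in-sample RMSE, one of the reported criteria
        print(f"in-sample RMSE: {rmse:.3f}")
        print(forecast.tail())

    In practice the order would be chosen with criteria such as NBIC and checked with the Ljung-Box test, as listed in the abstract.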

  7. Personality assessment and model comparison with behavioral data: A statistical framework and empirical demonstration with bonobos (Pan paniscus).

    Science.gov (United States)

    Martin, Jordan S; Suarez, Scott A

    2017-08-01

    Interest in quantifying consistent among-individual variation in primate behavior, also known as personality, has grown rapidly in recent decades. Although behavioral coding is the most frequently utilized method for assessing primate personality, limitations in current statistical practice prevent researchers from utilizing the full potential of their coding datasets. These limitations include the use of extensive data aggregation, not modeling biologically relevant sources of individual variance during repeatability estimation, not partitioning between-individual (co)variance prior to modeling personality structure, the misuse of principal component analysis, and an over-reliance upon exploratory statistical techniques to compare personality models across populations, species, and data collection methods. In this paper, we propose a statistical framework for primate personality research designed to address these limitations. Our framework synthesizes recently developed mixed-effects modeling approaches for quantifying behavioral variation with an information-theoretic model selection paradigm for confirmatory personality research. After detailing a multi-step analytic procedure for personality assessment and model comparison, we employ this framework to evaluate seven models of personality structure in zoo-housed bonobos (Pan paniscus). We find that differences between sexes, ages, zoos, time of observation, and social group composition contributed to significant behavioral variance. Independently of these factors, however, personality nonetheless accounted for a moderate to high proportion of variance in average behavior across observational periods. A personality structure derived from past rating research receives the strongest support relative to our model set. This model suggests that personality variation across the measured behavioral traits is best described by two correlated but distinct dimensions reflecting individual differences in affiliation and

  8. Dissolution comparisons using a Multivariate Statistical Distance (MSD) test and a comparison of various approaches for calculating the measurements of dissolution profile comparison.

    Science.gov (United States)

    Cardot, J-M; Roudier, B; Schütz, H

    2017-07-01

    The f2 test is generally used for comparing dissolution profiles. In cases of high variability, the f2 test is not applicable, and the Multivariate Statistical Distance (MSD) test is frequently proposed as an alternative by the FDA and EMA. The guidelines provide only general recommendations. MSD tests can be performed either on raw data with or without time as a variable or on parameters of models. In addition, data can be limited (as in the case of the f2 test) to dissolutions of up to 85% or to all available data. In the context of the present paper, the recommended calculation included all raw dissolution data up to the first point greater than 85% as a variable, without the various times as parameters. The proposed MSD overcomes several drawbacks found in other methods.
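
    For readers unfamiliar with the f2 test mentioned above, the sketch below computes the standard f2 similarity factor from mean dissolution profiles; the two example profiles are invented for illustration, and the MSD test itself (a multivariate distance on the full profiles) is not reproduced here.

        import numpy as np

        def f2_factor(reference, test):
            """Similarity factor f2 from mean % dissolved at matched time points.
            f2 >= 50 is conventionally read as similar profiles."""
            reference = np.asarray(reference, dtype=float)
            test = np.asarray(test, dtype=float)
            msd = np.mean((reference - test) ** 2)
            return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

        # Illustrative mean dissolution profiles (% dissolved) at common sampling times
        ref = [18, 35, 52, 68, 80, 88]
        tst = [15, 30, 48, 65, 78, 87]
        print(f"f2 = {f2_factor(ref, tst):.1f}")

    With highly variable units the record above instead recommends an MSD-type comparison on the raw data up to the first point above 85% dissolution.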

  9. Decision making, cognitive distortions and emotional distress: A comparison between pathological gamblers and healthy controls.

    Science.gov (United States)

    Ciccarelli, Maria; Griffiths, Mark D; Nigro, Giovanna; Cosenza, Marina

    2017-03-01

    The etiology of problem gambling is multifaceted and complex. Among other factors, poor decision making, cognitive distortions (i.e., irrational beliefs about gambling), and emotional factors (e.g., negative mood states) appear to be among the most important factors in the development and maintenance of problem gambling. Although empirical evidence has suggested that cognitive distortions facilitate gambling and negative emotions are associated with gambling, the interplay between cognitive distortions, emotional states, and decision making in gambling remains unexplored. Pathological gamblers (N = 54) and healthy controls (N = 54) completed the South Oaks Gambling Screen (SOGS), the Iowa Gambling Task (IGT), the Gambling Related Cognitions Scale (GRCS), and the Depression Anxiety Stress Scale (DASS-21). Compared to healthy controls, pathological gamblers showed poorer decision making and reported higher scores on measures assessing cognitive distortions and emotional distress. All measures were positively associated with gambling severity. A significant negative correlation between decision making and cognitive distortions was also observed. No associations were found between poor decision making and emotional distress. Logistic regression analysis indicated that cognitive distortions, emotional distress, and poor decision making were significant predictors of problem gambling. The use of self-report measures and the absence of female participants limit the generalizability of the reported findings. The present study is the first to demonstrate the mutual influence between irrational beliefs and poor decision making, as well as the role of cognitive bias, emotional distress, and poor decision making in gambling disorder. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Geographical and Statistical Analysis on the Relationship between Land-Use Mixture and Home-Based Trip Making and More: Case of Richmond, Virginia

    Directory of Open Access Journals (Sweden)

    Yin-Shan MA

    2013-06-01

    Full Text Available Richmond, Virginia has implemented numerous mixed land-use policies to encourage non-private-vehicle commuting for decades, based on the best practices of other cities and the assumption that land-use mixture would positively lead to trip reduction. This paper uses both Geographical Information Systems (GIS) and statistical tools to empirically test this hypothesis. With local land use and trip making data as inputs, it first calculates two common indices of land-use mixture, the entropy and dissimilarity indices, using GIS tools supplemented by Microsoft Excel. Afterwards, it uses the Statistical Package for Social Sciences (SPSS) to calculate the correlation matrices among land-use mixture indices, socioeconomic variables, and home-based work/other trip rates, followed by a series of regression model runs on these variables. Through this study, it has been found that land-use mixture has some but weak effects on home-based work trip rate, and virtually no effects on home-based other trip rate. In contrast, socioeconomic variables, especially auto ownership, have larger effects on home-based trip making.
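
    As an illustration of the entropy index of land-use mixture computed in the study above, the sketch below evaluates the standard formula E = -Σ p_j ln(p_j) / ln(k) for one analysis zone; the land-use shares are made-up values, and the dissimilarity index (which depends on neighbouring cells) is not reproduced.

        import numpy as np

        def entropy_index(shares):
            """Land-use entropy index in [0, 1]; 0 = single use, 1 = perfectly even mix."""
            p = np.asarray(shares, dtype=float)
            p = p / p.sum()                         # normalise shares to proportions
            k = len(p)                              # number of land-use categories considered
            nonzero = p[p > 0]                      # 0 * ln(0) is treated as 0
            return float(-(nonzero * np.log(nonzero)).sum() / np.log(k))

        # Illustrative shares of residential, commercial, office, and institutional land in a zone
        zone_shares = [0.55, 0.25, 0.15, 0.05]
        print(f"entropy index = {entropy_index(zone_shares):.2f}")

    Such zone-level indices would then be correlated against trip rates and socioeconomic variables, as in the study.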

  11. Using machine learning, neural networks and statistics to predict bankruptcy

    NARCIS (Netherlands)

    Pompe, P.P.M.; Feelders, A.J.; Feelders, A.J.

    1997-01-01

    Recent literature strongly suggests that machine learning approaches to classification outperform "classical" statistical methods. We make a comparison between the performance of linear discriminant analysis, classification trees, and neural networks in predicting corporate bankruptcy. Linear

  12. A Comparison of Adults' Responses to Collage versus Drawing in an Initial Art-Making Session

    Science.gov (United States)

    Raffaelli, Teresa; Hartzell, Elizabeth

    2016-01-01

    The goal of this systematic comparison of collage and drawing was to contribute to the sparse body of literature on the way individuals might respond to two materials commonly used in art therapy. Eight graduate and undergraduate university students who identified as non-artists completed two tasks, one using drawing materials and one using…

  13. A PERFORMANCE COMPARISON BETWEEN ARTIFICIAL NEURAL NETWORKS AND MULTIVARIATE STATISTICAL METHODS IN FORECASTING FINANCIAL STRENGTH RATING IN TURKISH BANKING SECTOR

    Directory of Open Access Journals (Sweden)

    MELEK ACAR BOYACIOĞLU

    2013-06-01

    Full Text Available Financial strength rating indicates the fundamental financial strength of a bank. The aim of financial strength rating is to measure a bank's fundamental financial strength excluding the external factors. External factors can stem from the working environment or can be linked with the outside protective support mechanisms. With the evaluation, the rating of a bank free from outside supportive factors is being sought. Also the financial fundamentals, franchise value, the variety of assets and working environment of a bank are being evaluated in this context. In this study, a model has been developed in order to predict the financial strength rating of Turkish banks. The methodology of this study is as follows: selecting variables to be used in the model, creating a data set, choosing the techniques to be used and evaluating the classification success of the techniques. It is concluded that the artificial neural network shows a better performance in terms of classification of financial strength rating in comparison to multivariate statistical methods in the training set. On the other hand, no meaningful difference could be found in the validation set in which the prediction performances of the employed techniques are tested.

  14. A comparison of online versus face-to-face teaching delivery in statistics instruction for undergraduate health science students.

    Science.gov (United States)

    Lu, Fletcher; Lemonde, Manon

    2013-12-01

    The objective of this study was to assess if online teaching delivery produces comparable student test performance as the traditional face-to-face approach irrespective of academic aptitude. This study involves a quasi-experimental comparison of student performance in an undergraduate health science statistics course partitioned in two ways. The first partition involves one group of students taught with a traditional face-to-face classroom approach and the other through a completely online instructional approach. The second partition of the subjects categorized the academic aptitude of the students into groups of higher and lower academically performing based on their assignment grades during the course. Controls that were placed on the study to reduce the possibility of confounding variables were: the same instructor taught both groups covering the same subject information, using the same assessment methods and delivered over the same period of time. The results of this study indicate that online teaching delivery is as effective as a traditional face-to-face approach in terms of producing comparable student test performance but only if the student is academically higher performing. For academically lower performing students, the online delivery method produced significantly poorer student test results compared to those lower performing students taught in a traditional face-to-face environment.

  15. Statistical comparison of models for estimating the monthly average daily diffuse radiation at a subtropical African site

    International Nuclear Information System (INIS)

    Bashahu, M.

    2003-01-01

    Nine correlations have been developed in this paper to estimate the monthly average diffuse radiation for Dakar, Senegal. A 16-year period of data on the global (H) and diffuse (Hd) radiation, together with data on the bright sunshine hours (N), the fractional cloud cover (Ne/8), the water vapour pressure in the air (e) and the ambient temperature (T), have been used for that purpose. A model inter-comparison based on the MBE, RMSE and t statistical tests has shown that estimates from any of the obtained correlations are not significantly different from their measured counterparts, thus all nine models are recommended for the aforesaid location. Three of them should be particularly selected for their simplicity, universal applicability and high accuracy. Those are simple linear correlations between Kd and N/Nd, Ne/8 or Kt. Even presenting adequate performance, the remaining correlations are either simple but less accurate, or multiple or nonlinear regressions needing one or two input variables. (author)
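
    The model inter-comparison above relies on the MBE, RMSE and t statistics commonly used for solar-radiation correlations. The sketch below computes these three quantities for one set of estimated versus measured values; the data are invented, and the t formula shown, t = sqrt((n-1)·MBE² / (RMSE² − MBE²)), is the widely used variant that is assumed here to match the one applied in the paper.

        import numpy as np

        def mbe_rmse_t(estimated, measured):
            """Mean bias error, root mean square error, and the t statistic often used
            to judge whether a radiation model's estimates differ from measurements."""
            e = np.asarray(estimated, dtype=float)
            m = np.asarray(measured, dtype=float)
            diff = e - m
            n = diff.size
            mbe = diff.mean()
            rmse = np.sqrt(np.mean(diff ** 2))
            t = np.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
            return mbe, rmse, t

        # Illustrative monthly average daily diffuse radiation (MJ m^-2 day^-1): model vs measured
        est = [6.1, 7.0, 8.2, 9.0, 9.5, 9.1, 8.8, 8.5, 8.0, 7.2, 6.4, 5.9]
        obs = [6.0, 7.2, 8.0, 9.3, 9.4, 9.0, 9.0, 8.3, 8.1, 7.0, 6.5, 6.0]
        mbe, rmse, t = mbe_rmse_t(est, obs)
        print(f"MBE = {mbe:+.3f}, RMSE = {rmse:.3f}, t = {t:.2f}")  # compare t with critical t(n-1)

    A model whose computed t is below the critical t value at the chosen significance level is judged not significantly different from the measurements, which is the criterion the abstract invokes.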

  16. Statistical comparison of models for estimating the monthly average daily diffuse radiation at a subtropical African site

    Energy Technology Data Exchange (ETDEWEB)

    Bashahu, M. [University of Burundi, Bujumbura (Burundi). Institute of Applied Pedagogy, Department of Physics and Technology

    2003-07-01

    Nine correlations have been developed in this paper to estimate the monthly average diffuse radiation for Dakar, Senegal. A 16-year period of data on the global (H) and diffuse (H{sub d}) radiation, together with data on the bright sunshine hours (N), the fractional cloud cover (Ne/8), the water vapour pressure in the air (e) and the ambient temperature (T), have been used for that purpose. A model inter-comparison based on the MBE, RMSE and t statistical tests has shown that estimates from any of the obtained correlations are not significantly different from their measured counterparts, thus all nine models are recommended for the aforesaid location. Three of them should be particularly selected for their simplicity, universal applicability and high accuracy. Those are simple linear correlations between K{sub d} and N/N{sub d}, Ne/8 or K{sub t}. Even presenting adequate performance, the remaining correlations are either simple but less accurate, or multiple or nonlinear regressions needing one or two input variables. (author)

  17. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
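
    To make the equivalence described above concrete, the following sketch takes a two-sample t value and the group sizes and derives the p value and Cohen's d (with an approximate confidence interval) from that same information; the numbers are arbitrary, and the JZS Bayes factor, which requires a prior-scaled integral, is omitted for brevity.

        import numpy as np
        from scipy import stats

        def summaries_from_t(t, n1, n2):
            """Derive the p value and Cohen's d (with an approximate 95% CI) from a two-sample t."""
            df = n1 + n2 - 2
            p = 2 * stats.t.sf(abs(t), df)                      # two-sided p value
            d = t * np.sqrt(1 / n1 + 1 / n2)                    # Cohen's d from t and group sizes
            se_d = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))  # common approximation
            ci = (d - 1.96 * se_d, d + 1.96 * se_d)
            return p, d, ci

        # Arbitrary example: t = 2.30 with 25 participants per group
        p, d, (lo, hi) = summaries_from_t(2.30, 25, 25)
        print(f"p = {p:.3f}, d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")

    Because all three summaries are functions of the same t and sample sizes, they carry the same information about the data; they differ only in the inference they invite, which is the paper's point.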

  18. An international comparison of legal frameworks for supported and substitute decision-making in mental health services.

    Science.gov (United States)

    Davidson, Gavin; Brophy, Lisa; Campbell, Jim; Farrell, Susan J; Gooding, Piers; O'Brien, Ann-Marie

    2016-01-01

    There have been important recent developments in law, research, policy and practice relating to supporting people with decision-making impairments, in particular when a person's wishes and preferences are unclear or inaccessible. A driver in this respect is the United Nations Convention on the Rights of Persons with Disabilities (CRPD); the implications of the CRPD for policy and professional practices are currently debated. This article reviews and compares four legal frameworks for supported and substitute decision-making for people whose decision-making ability is impaired. In particular, it explores how these frameworks may apply to people with mental health problems. The four jurisdictions are: Ontario, Canada; Victoria, Australia; England and Wales, United Kingdom (UK); and Northern Ireland, UK. Comparisons and contrasts are made in the key areas of: the legal framework for supported and substitute decision-making; the criteria for intervention; the assessment process; the safeguards; and issues in practice. Thus Ontario has developed a relatively comprehensive, progressive and influential legal framework over the past 30 years but there remain concerns about the standardisation of decision-making ability assessments and how the laws work together. In Australia, the Victorian Law Reform Commission (2012) has recommended that the six different types of substitute decision-making under the three laws in that jurisdiction, need to be simplified, and integrated into a spectrum that includes supported decision-making. In England and Wales the Mental Capacity Act 2005 has a complex interface with mental health law. In Northern Ireland it is proposed to introduce a new Mental Capacity (Health, Welfare and Finance) Bill that will provide a unified structure for all substitute decision-making. The discussion will consider the key strengths and limitations of the approaches in each jurisdiction and identify possible ways that further progress can be made in law, policy

  19. Decision making process: a comparison between the public and private sector

    OpenAIRE

    Igreja, Arthur Schuler da

    2014-01-01

    The problem of decision making, its mechanisms and consequences, is the very core of management; it is virtually impossible to separate the act of managing from this knowledge area. Herbert Simon defined "decision making" as though it were synonymous with "managing". A decision is a selection made by an individual regarding a choice of a conclusion about a situation. This represents a course of behavior pertaining to what must be done or what must not be done. A decision is the point at ...

  20. How do american students measure up? Making sense of international comparisons.

    Science.gov (United States)

    Koretz, Daniel

    2009-01-01

    In response to frequent news media reports about how poorly American students fare compared with their peers abroad, Daniel Koretz takes a close look at what these comparisons say, and do not say, about the achievement of U.S. high school students. He stresses that the comparisons do not provide what many observers of education would like: unambiguous information about the effectiveness of American high schools compared with those in other nations. Koretz begins by describing the two principal international student comparisons, the Trends in International Mathematics and Science Study (TIMSS) and the Program for International Student Assessment (PISA). Both assessments, he stresses, reflect the performance of students several years before they complete high school. PISA, which targets fifteen-year-old students, measures students' abilities to apply what they have learned in school to real-world problems. By contrast, TIMSS tests fourth and eighth graders. Unlike PISA, TIMSS follows the school curriculum closely. Because the findings of the two tests are sometimes inconsistent, Koretz stresses the importance of considering data from both sources. He cautions against comparing U.S. students with an "international average," which varies widely from survey to survey depending on which countries participate, and recommends instead comparing them with students in other nations that are similar to the United States or that are particularly high-achieving. Many observers, says Koretz, speculate that the lackluster average performance of American students in international comparisons arises because many, especially minority and low-income U.S. students, attend low-performing schools. But both TIMSS and PISA, he says, show that the performance of American students on the exams is not much more variable than that of students in countries that are socially more homogeneous or that have more equitable educational systems. Koretz emphasizes that the international comparisons

  1. CT of the chest with model-based, fully iterative reconstruction: comparison with adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Ichikawa, Yasutaka; Kitagawa, Kakuya; Nagasawa, Naoki; Murashima, Shuichi; Sakuma, Hajime

    2013-08-09

    The recently developed model-based iterative reconstruction (MBIR) enables significant reduction of image noise and artifacts, compared with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP). The purpose of this study was to evaluate lesion detectability of low-dose chest computed tomography (CT) with MBIR in comparison with ASIR and FBP. Chest CT was acquired with 64-slice CT (Discovery CT750HD) under standard-dose (5.7 ± 2.3 mSv) and low-dose (1.6 ± 0.8 mSv) conditions in 55 patients (aged 72 ± 7 years) who were suspected of lung disease on chest radiograms. Low-dose CT images were reconstructed with MBIR, ASIR 50% and FBP, and standard-dose CT images were reconstructed with FBP, using a reconstructed slice thickness of 0.625 mm. Two observers evaluated the image quality of abnormal lung and mediastinal structures on a 5-point scale (score 5 = excellent and score 1 = non-diagnostic). The objective image noise was also measured as the standard deviation of CT intensity in the descending aorta. The image quality score of enlarged mediastinal lymph nodes on low-dose MBIR CT (4.7 ± 0.5) was significantly improved in comparison with low-dose FBP and ASIR CT (3.0 ± 0.5, p = 0.004; 4.0 ± 0.5, p = 0.02, respectively), and was nearly identical to the score of the standard-dose FBP image (4.8 ± 0.4, p = 0.66). Concerning decreased lung attenuation (bulla, emphysema, or cyst), the image quality score on low-dose MBIR CT (4.9 ± 0.2) was slightly better compared to low-dose FBP and ASIR CT (4.5 ± 0.6, p = 0.01; 4.6 ± 0.5, p = 0.01, respectively). There were no significant differences in image quality scores of visualization of consolidation or mass, ground-glass attenuation, or reticular opacity among low- and standard-dose CT series. Image noise with low-dose MBIR CT (11.6 ± 1.0 Hounsfield units (HU)) was significantly lower than with low-dose ASIR (21.1 ± 2.6 HU) and standard-dose FBP CT (16.6 ± 2.3 HU). Even with a radiation dose reduction of more than 70%, MBIR can provide

  2. IT Strategy and Decision-Making: A Comparison of Four Universities

    Science.gov (United States)

    Wilmore, Andrew

    2014-01-01

    Universities are increasingly dependent on information technology (IT) to support delivery of their objectives. It is crucial, therefore, that the IT investments made lead to successful outcomes. This study analyses the governance structures and decision-making processes used to approve and prioritise IT projects. Factors influencing an…

  3. Fundamental and empirical rheological behaviour of wheat flour doughs and comparison with bread making performance

    NARCIS (Netherlands)

    Janssen, A. M.; vanVliet, T; Vereijken, JM

    The rheological characteristics of wheat flour doughs from the cultivars Obelisk and Katepwa and of biscuit flour doughs, and also of biscuit flour doughs containing glutens isolated from cv. Obelisk and cv. Katepwa flour, were compared and discussed in relation to bread making performance. Four

  4. Data-based decision making in the Netherlands and England : A comparison

    NARCIS (Netherlands)

    Ebbeler, Johanna; Schildkamp, Kim; Downey, Christopher

    2012-01-01

    Data-based decision making is receiving increased attention in education, not only in the USA but also in Europe. This paper describes findings from an international comparative study that examined data use by school staff in secondary education in the Netherlands and in England. Eighty six

  5. Students' Reasoning and Decision Making about a Socioscientific Issue: A Cross-Context Comparison

    Science.gov (United States)

    Lee, Yeung Chung; Grace, Marcus

    2012-01-01

    It has been argued that decision making about socioscientific issue (SSIs) necessitates informal reasoning, which involves multiperspective thinking and moral judgment. This study extends the scope of the literature concerning students' reasoning on SSIs to a cross-contextual study by comparing decisions made on avian flu by 12-13-year-old Chinese…

  6. Pitfalls in the statistical examination and interpretation of the correspondence between physician and patient satisfaction ratings and their relevance for shared decision making research

    Science.gov (United States)

    2011-01-01

    Background The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results Statistical measures signal large differences between patient and physician satisfaction ratings with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement augmented by bar charts of differences was able to indicate this. Trial registration ISRCTN: ISRCT71348772 PMID:21592337
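
    The study above found that only the Bland-Altman method exposed the disagreement hidden by skewed satisfaction ratings. The sketch below computes the Bland-Altman bias and limits of agreement for paired physician and patient scores; the paired ratings are invented and the rating scale is assumed, so this illustrates the method rather than the trial's data.

        import numpy as np

        def bland_altman(ratings_a, ratings_b):
            """Bias (mean difference) and 95% limits of agreement for paired ratings."""
            a = np.asarray(ratings_a, dtype=float)
            b = np.asarray(ratings_b, dtype=float)
            diff = a - b
            bias = diff.mean()
            loa = 1.96 * diff.std(ddof=1)
            return bias, bias - loa, bias + loa

        # Invented paired satisfaction ratings (higher = more satisfied) after the same consultations
        patients   = [5, 5, 4, 5, 5, 4, 5, 3, 5, 4]
        physicians = [4, 5, 3, 4, 5, 4, 4, 3, 4, 3]
        bias, lower, upper = bland_altman(patients, physicians)
        print(f"bias = {bias:+.2f}, limits of agreement [{lower:.2f}, {upper:.2f}]")
        # A bar chart of the individual differences, as used in the paper, shows where disagreement sits.

    Unlike a correlation or kappa coefficient, the bias and limits of agreement remain informative even when most ratings pile up at the top of the scale.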

  7. [Risk stratification of patients with diabetes mellitus undergoing coronary artery bypass grafting--a comparison of statistical methods].

    Science.gov (United States)

    Arnrich, B; Albert, A; Walter, J

    2006-01-01

    Among the coronary bypass patients from our Datamart database, we found a prevalence of 29.6% of diagnosed diabetics. 5.2% of the patients without a diagnosis of diabetes mellitus and with a fasting plasma glucose level > 125 mg/dl were defined as undiagnosed diabetics. The objective of this paper was to compare univariate methods and techniques for risk stratification to determine whether undiagnosed diabetes is per se a risk factor for increased ventilation time and length of ICU stay, and for increased prevalence of resuscitation, reintubation and 30-d mortality for diabetics in heart surgery. Univariate comparisons reveal that undiagnosed diabetics needed resuscitation significantly more often and had an increased ventilation time, while the length of ICU stay was significantly reduced. The significantly different distribution between the diabetic groups of 11 of the 32 attributes examined demands the use of methods for risk stratification. Both risk-adjusted methods, regression and matching, confirm that undiagnosed diabetics had an increased ventilation time and an increased prevalence of resuscitation, while the length of ICU stay was not significantly reduced. A homogeneous distribution of the patient characteristics in the two diabetic groups could be achieved through a statistical matching method using the propensity score. In contrast to the regression analysis, a significantly increased prevalence of reintubation in undiagnosed diabetics was found. Based on the example of undiagnosed diabetics in heart surgery, the present study reveals the necessity and the possibilities of techniques for risk stratification in retrospective analysis and shows how the potential of data collected from daily clinical practice can be used in an effective way.
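
    The risk stratification above adjusts for confounding by matching on the propensity score. A minimal sketch of that idea is given below, estimating the score with a logistic regression and pairing each "treated" patient with the nearest "control" on the score; the simulated covariates and grouping are placeholders, not the Datamart data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 500
        # Invented patient covariates (e.g. age, BMI, ejection fraction); not the clinical data
        X = rng.normal(size=(n, 3))
        group = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))).astype(int)

        # Step 1: propensity score = P(group membership | covariates)
        ps = LogisticRegression(max_iter=1000).fit(X, group).predict_proba(X)[:, 1]

        # Step 2: nearest-neighbour matching on the propensity score (1:1, with replacement)
        treated = np.where(group == 1)[0]
        control = np.where(group == 0)[0]
        matches = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]

        # Step 3: outcomes (e.g. ventilation time, reintubation) would be compared within matched pairs
        print(f"matched {treated.size} patients to {np.unique(matches).size} distinct controls")

    Matching in this way balances the measured covariates between the groups before the outcome comparison, which is what allowed the paper to re-examine the univariate findings.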

  8. Statistical comparison of spectral and biochemical measurements on an example of Norway spruce stands in the Ore Mountains, Czech Republic

    Directory of Open Access Journals (Sweden)

    Markéta Potůčková

    2016-07-01

    Full Text Available The physiological status of vegetation and changes thereto can be monitored by means of biochemical analysis of collected samples as well as by means of spectroscopic measurements, either at the leaf level, using field (or laboratory) spectroradiometers, or at the canopy level, applying hyperspectral airborne or spaceborne image data. The presented study focuses on the statistical comparison and ascertainment of relations between three datasets collected from selected Norway spruce forest stands in the Ore Mountains, Czechia. The datasets comprise (i) photosynthetic pigments (chlorophylls, carotenoids) and water content of 495 samples collected from 55 trees from three different vertical levels and the first three needle age classes, (ii) the spectral reflectance of the same samples measured with an ASD FieldSpec 4 Wide-Res spectroradiometer equipped with a plant contact probe, and (iii) an airborne hyperspectral image acquired with an APEX sensor. The datasets cover two localities in the Ore Mountains that were affected differently by acid deposition in the 1970s and 1980s. A one-way analysis of variance (ANOVA), Tukey's honest significance test, hot spot analysis and linear regression were applied either to the original measurements (the content of leaf compounds and reflectance spectra) or to derived values, i.e., selected spectral indices. The results revealed a generally low correlation between the photosynthetic pigments, water content and spectral measurements. The results of the ANOVA showed significant differences between sites (model areas) only in the case of the leaf compound dataset. Differences between the stands at various levels of significance exist in all three datasets and are explained in detail. The study also proved that the vertical gradient of the biochemical and biophysical parameters in the canopy plays a role when the optical properties of the forest stands are modelled.

  9. Head-to-head comparison of adaptive statistical and model-based iterative reconstruction algorithms for submillisievert coronary CT angiography.

    Science.gov (United States)

    Benz, Dominik C; Fuchs, Tobias A; Gräni, Christoph; Studer Bruengger, Annina A; Clerc, Olivier F; Mikulicic, Fran; Messerli, Michael; Stehli, Julia; Possner, Mathias; Pazhenkottil, Aju P; Gaemperli, Oliver; Kaufmann, Philipp A; Buechel, Ronny R

    2018-02-01

    Iterative reconstruction (IR) algorithms allow for a significant reduction in radiation dose of coronary computed tomography angiography (CCTA). We performed a head-to-head comparison of adaptive statistical IR (ASiR) and model-based IR (MBIR) algorithms to assess their impact on quantitative image parameters and diagnostic accuracy for submillisievert CCTA. CCTA datasets of 91 patients were reconstructed using filtered back projection (FBP), increasing contributions of ASiR (20, 40, 60, 80, and 100%), and MBIR. Signal and noise were measured in the aortic root to calculate signal-to-noise ratio (SNR). In a subgroup of 36 patients, diagnostic accuracy of ASiR 40%, ASiR 100%, and MBIR for diagnosis of coronary artery disease (CAD) was compared with invasive coronary angiography. Median radiation dose was 0.21 mSv for CCTA. While increasing levels of ASiR gradually reduced image noise compared with FBP (up to -48%), MBIR yielded a substantially greater noise reduction (-59% compared with ASiR 100%). ASiR 40% and ASiR 100% resulted in substantially lower diagnostic accuracy to detect CAD as diagnosed by invasive coronary angiography compared with MBIR: sensitivity and specificity were 100 and 37%, 100 and 57%, and 100 and 74% for ASiR 40%, ASiR 100%, and MBIR, respectively. MBIR offers substantial noise reduction with increased SNR, paving the way for implementation of submillisievert CCTA protocols in clinical routine. In contrast, inferior noise reduction by ASiR negatively affects diagnostic accuracy of submillisievert CCTA for CAD detection. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  10. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  11. Decision making, cognitive distortions and emotional distress: a comparison between pathological gamblers and healthy controls

    OpenAIRE

    Ciccarelli, M; Griffiths, MD; Nigro, G; Cosenza, M

    2017-01-01

    Background and objectives: The etiology of problem gambling is multifaceted and complex. Among others factors, poor decision making, cognitive distortions (i.e., irrational beliefs about gambling), and emotional factors (e.g., negative mood states) appear to be among the most important factors in the development and maintenance of problem gambling. Although empirical evidence has suggested that cognitive distortions facilitate gambling and negative emotions are associated with gambling, the i...

  12. Primates' socio-cognitive abilities: What kind of comparisons makes sense?

    DEFF Research Database (Denmark)

    Byrnit, Jill

    2015-01-01

    Referential gestures are of pivotal importance to the human species. We effortlessly make use of each other's referential gestures to attend to the same things, and our ability to use these gestures shows itself from very early in life. Almost 20 years ago, James Anderson and colleagues...... are testing animals on experimental paradigms that do not take into account the challenges of their natural habitat....

  13. Cooling tower make-up water processing for nuclear power plants: a comparison

    Energy Technology Data Exchange (ETDEWEB)

    Andres, O; Flunkert, F; Hampel, G; Schiffers, A [Rheinisch-Westfaelisches Elektrizitaetswerk A.G., Essen (Germany, F.R.)

    1977-01-01

    In water-cooled nuclear power plants, 1 to 2% of the total investment costs go to cooling tower make-up water processing. The crude water taken from rivers or stationary waters for cooling must be sufficiently purified with regard to its content of solids, carbonate hardness and corrosive components so as to guarantee operation free of disturbances. At the same time, the processing methods must be selected for operational-economic reasons in such a manner that waste water and waste problems are kept small with regard to environmental protection. The various parameters described have a decisive influence on the processing of the crude water; individual processes (filtration, sedimentation, decarbonization) are described, circuit possibilities for cooling water systems are compared, and the various processes are analyzed and compared with regard to profitability and environmental compatibility.

  14. New adaptive statistical iterative reconstruction ASiR-V: Assessment of noise performance in comparison to ASiR.

    Science.gov (United States)

    De Marco, Paolo; Origgi, Daniela

    2018-03-01

    To assess the noise characteristics of the new adaptive statistical iterative reconstruction (ASiR-V) in comparison to ASiR. A water phantom was acquired with common clinical scanning parameters at five different levels of CTDIvol. Images were reconstructed with different kernels (STD, SOFT, and BONE), different IR levels (40%, 60%, and 100%) and different slice thicknesses (ST) (0.625 and 2.5 mm), both for ASiR-V and ASiR. Noise properties were investigated and the noise power spectrum (NPS) was evaluated. ASiR-V significantly reduced noise relative to FBP: noise reduction was in the range 23%-60% for the 0.625 mm ST and 12%-64% for the 2.5 mm ST. Above 2 mGy, noise reduction for ASiR-V had no dependence on dose. Noise reduction for ASiR-V depends on ST, being greater for the STD and SOFT kernels at 2.5 mm. For the STD kernel, ASiR-V has greater noise reduction than ASiR for both ST. For the SOFT kernel, results vary according to dose and ST, while for the BONE kernel ASiR-V shows less noise reduction. The NPS for the CT Revolution has dose-dependent behavior at lower doses. The NPS for ASiR-V and ASiR is similar, showing a shift toward lower frequencies as the IR level increases for the STD and SOFT kernels. The NPS differs between ASiR-V and ASiR with the BONE kernel. The NPS for ASiR-V appears to be ST dependent, having a shift toward lower frequencies for the 2.5 mm ST. ASiR-V showed greater noise reduction than ASiR for the STD and SOFT kernels, while keeping the same NPS. For the BONE kernel, ASiR-V presents a completely different behavior, with less noise reduction and a modified NPS. Noise properties of ASiR-V are dependent on the reconstruction slice thickness. The noise properties of ASiR-V suggest the need for further measurements and efforts to establish new CT protocols to optimize clinical imaging. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
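
    For the noise power spectrum (NPS) evaluation mentioned above, the sketch below shows the usual estimate from uniform-phantom ROIs: detrend each ROI, take the squared magnitude of its 2D FFT, scale by the pixel area, and average over ROIs. The ROI contents and pixel spacing are simulated placeholders rather than scanner data.

        import numpy as np

        def nps_2d(rois, pixel_spacing_mm):
            """2D noise power spectrum estimate from a stack of uniform-phantom ROIs (n, N, N)."""
            rois = np.asarray(rois, dtype=float)
            n, ny, nx = rois.shape
            dx = dy = pixel_spacing_mm
            detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # remove each ROI's mean
            spectra = np.abs(np.fft.fft2(detrended)) ** 2
            return (dx * dy / (nx * ny)) * spectra.mean(axis=0)         # units of HU^2 mm^2

        # Simulated stack of 64x64 noise-only ROIs standing in for water-phantom images
        rng = np.random.default_rng(3)
        rois = rng.normal(0.0, 10.0, size=(32, 64, 64))                  # ~10 HU white noise
        nps = nps_2d(rois, pixel_spacing_mm=0.7)
        freqs = np.fft.fftfreq(64, d=0.7)                                # spatial frequencies (mm^-1)
        print(f"integral of NPS ~ noise variance: {nps.sum() * (freqs[1] - freqs[0]) ** 2:.1f} HU^2")

    A shift of the NPS toward lower spatial frequencies, as reported for increasing IR levels, changes the noise texture even when the noise magnitude is similar.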

  15. Arboreal biomass estimation: a comparison between neural networks and statistical methods; Estimativa de biomassa arborea: uma comparacao entre metodos estatisticos e redes neurais

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Arthur C.; Barros, Paulo L.C.; Monteiro, Jose H.A.; Rocha, Brigida R.P. [Universidade Federal do Para (DEEC/UFPA), Belem, PA (Brazil). Dept. de Engenharia Eletrica e Computacao. Grupo de Pesquisa ENERBIO], e-mails: arthur@ufpa.br, jhumberto01@yahoo.com.br, brigida@ufpa.br, paulo.contente@ufra.edu.br

    2006-07-01

    The methodologies currently used in forest inventories for calculating the volume of biomass and the consequent energy potential are based primarily on statistical methods. More recent techniques, based on the nonlinear mapping ability of artificial neural networks, have been used successfully in several areas of technology, with superior performance. This work presents a comparison between a statistical model for estimating tree volume and a model based on neural networks, which can be used with advantage in biomass energy planning.

  16. HOMA-IR and QUICKI: decide on a general standard instead of making further comparisons.

    Science.gov (United States)

    Rössner, Sophia M; Neovius, Martin; Mattsson, Anna; Marcus, Claude; Norgren, Svante

    2010-11-01

    To limit further comparisons between the two fasting indices Homeostasis Model Assessment for Insulin Resistance (HOMA-IR) and Quantitative Insulin Sensitivity Check Index (QUICKI), and to examine their robustness in assessing insulin sensitivity. A total of 191 obese children and adolescents (age 13.9 ± 2.9 years, BMI SDS 6.1 ± 1.6), who had undergone a Frequently Sampled Intravenous Glucose Tolerance Test (FSIVGTT), were included. Receiver operating characteristic curve (ROC) analysis was used to compare indices in detecting insulin resistance and Bland-Altman plots to investigate agreement between three consecutive fasting samples when compared to using single samples. ROC analysis showed that the diagnostic accuracy was identical for QUICKI and HOMA-IR [area under the curve (AUC) boys 0.80, 95%CI 0.70-0.89; girls 0.80, 0.71-0.88], while insulin had a nonsignificantly lower AUC (boys 0.76, 0.66-0.87; girls 0.75, 0.66-0.84). Glucose did not perform better than chance as a diagnostic test (boys 0.47, 0.34-0.60; girls 0.57, 0.46-0.68). Indices varied with consecutive sampling, mainly attributable to fasting insulin variations (mean maximum difference in HOMA-IR -0.8; -0.9 to -0.7). Using both HOMA-IR and QUICKI in further studies is superfluous as these indices function equally well as predictors of the FSIVGTT sensitivity index. Focus should be on establishing a general standard for research and clinical purposes. © 2010 The Author(s)/Journal Compilation © 2010 Foundation Acta Paediatrica.
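
    For reference, the two fasting indices compared above are simple transformations of the same fasting glucose and insulin values, which is why they perform identically as predictors. The sketch below computes both with their usual formulas; the sample values and the unit conventions (insulin in µU/mL, glucose in mmol/L for HOMA-IR and mg/dL for QUICKI) are stated assumptions.

        import math

        def homa_ir(glucose_mmol_l, insulin_uU_ml):
            """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
            return glucose_mmol_l * insulin_uU_ml / 22.5

        def quicki(glucose_mg_dl, insulin_uU_ml):
            """QUICKI = 1 / (log10 insulin (uU/mL) + log10 glucose (mg/dL))."""
            return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

        # Illustrative fasting sample: glucose 5.2 mmol/L (~94 mg/dL), insulin 18 uU/mL
        glucose_mmol, insulin = 5.2, 18.0
        glucose_mgdl = glucose_mmol * 18.0        # approximate mmol/L -> mg/dL conversion for glucose
        print(f"HOMA-IR = {homa_ir(glucose_mmol, insulin):.2f}")
        print(f"QUICKI  = {quicki(glucose_mgdl, insulin):.3f}")
        # Both indices are monotone transforms of the same glucose-insulin product,
        # consistent with the identical ROC performance reported above.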

  17. Practical statistics in pain research.

    Science.gov (United States)

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues are related to the level of the scales that are often used in pain research, the choice between parametric and nonparametric statistical methods, and problems which arise from repeated measurements. In the field of pain research, parametric statistics used to be applied in an erroneous way. This is closely related to the scales of the data and to repeated measurements. The levels of scales include nominal, ordinal, interval, and ratio scales. The level of scale affects the choice between parametric and non-parametric methods. In the field of pain research, the most frequently used pain assessment scale is the ordinal scale, which would include the visual analogue scale (VAS). There used to be another view, however, which considered the VAS to be an interval or ratio scale, so that the use of parametric statistics would be accepted in some practical cases. Repeated measurements on the same subjects always complicate statistics. Such measurements inevitably have correlations with each other, which precludes the application of one-way ANOVA, in which independence between the measurements is necessary. Repeated measures ANOVA (RMANOVA), however, permits the comparison between the correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.

  18. Making predictions of mangrove deforestation: a comparison of two methods in Kenya.

    Science.gov (United States)

    Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A

    2013-11-01

    Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk; a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.

  19. The adaptive statistical iterative reconstruction-V technique for radiation dose reduction in abdominal CT: comparison with the adaptive statistical iterative reconstruction technique.

    Science.gov (United States)

    Kwon, Heejin; Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun

    2015-10-01

    To investigate whether reduced radiation dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically competent features when compared with the currently used routine radiation dose CT images reconstructed with ASIR. 27 consecutive patients (mean body mass index: 23.55 kg m⁻²) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at a noise index level of 21.45 with automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at a noise index level of 30. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. At the follow-up study, the mean dose reduction relative to the currently used common radiation dose was 35.37% (range: 19-49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced radiation dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered to be worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than for 40% ASIR and FBP, respectively. Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. This study represents the first clinical research experiment to use ASIR-V, the newest version of

  20. Decision making, central coherence and set-shifting: a comparison between Binge Eating Disorder, Anorexia Nervosa and Healthy Controls.

    Science.gov (United States)

    Aloi, Matteo; Rania, Marianna; Caroleo, Mariarita; Bruni, Antonella; Palmieri, Antonella; Cauteruccio, Maria Antonella; De Fazio, Pasquale; Segura-García, Cristina

    2015-01-24

    Several studies have investigated the cognitive profile of patients with Anorexia Nervosa (AN) and Bulimia Nervosa (BN); by contrast, few studies have evaluated it in patients with Binge Eating Disorder (BED). The purpose of this study was to compare decision making, central coherence and set-shifting between BED and AN patients. A battery of neuropsychological tests including the Iowa Gambling Task (IGT), the Rey-Osterrieth Complex Figure Test (RCFT), the Wisconsin Card Sorting Test (WCST), the Trail Making Test (TMT) and the Hayling Sentence Completion Task (HSCT) was administered to a sample of 135 women (45 AN, 45 BED, 45 Healthy Controls [HC]). Furthermore, the Beck Depression Inventory (BDI) was administered to evaluate depressive symptoms. Years of education, age, Body Mass Index (BMI) and depression severity were considered as covariates in the statistical analyses. BED and AN patients showed high rates of cognitive impairment compared to HC on the domains investigated; furthermore, the cognitive profile of BED patients was characterised by poorer decision making and cognitive flexibility compared to patients with AN. Cognitive performance was strongly associated with depressive symptoms. In the present sample, two different neurocognitive profiles emerged: strong cognitive rigidity and a detail-based central coherence were predominant in patients with AN, while a lack of attention and difficulty in adapting to changes in a new situation seemed to better describe patients with BED. Knowledge of the different cognitive profiles of ED patients may be important for planning their psychotherapeutic interventions.

  1. Cross-cultural comparison of concrete recycling decision-making and implementation in construction industry.

    Science.gov (United States)

    Tam, Vivian W Y; Tam, Leona; Le, Khoa N

    2010-02-01

    Waste management is a pressing issue, with alarming signals in the construction industry. Concrete waste constitutes a major proportion, 81%, of construction and demolition waste in Australia. To minimize concrete waste generated from construction activities, recycling concrete waste is one of the best methods to conserve the environment. This paper investigates concrete recycling implementation in construction. Japan is a leading country in recycling concrete waste, having achieved 98% recycling and using recycled concrete for structural applications. Hong Kong is developing concrete recycling programs for high-grade applications. Australia is making relatively slow progress in implementing concrete recycling in construction. Therefore, empirical studies in Australia, Hong Kong, and Japan were selected for this paper. A questionnaire survey and structured interviews were conducted. Power spectrum analysis was used for the analysis. It was found that "increasing overall business competitiveness and strategic business opportunities" was considered the major benefit of concrete recycling by Hong Kong and Japanese respondents, while "rising concrete recycling awareness such as selecting suitable resources, techniques and training and compliance with regulations" was considered the major benefit by Australian respondents. However, "lack of clients' support", "increase in management cost" and "increase in documentation workload, such as working documents, procedures and tools" were the major difficulties encountered by Australian, Hong Kong, and Japanese respondents, respectively. To improve the existing implementation, "inclusion of concrete recycling evaluation in tender appraisal" and "defining clear legal evaluation of concrete recycling" were the major recommendations for Australian and Hong Kong respondents, and Japanese respondents, respectively.

  2. The comparison of the energy performance of hotel buildings using PROMETHEE decision-making method

    Directory of Open Access Journals (Sweden)

    Vujosevic Milica L.

    2016-01-01

    Full Text Available Annual energy performance of atrium-type hotel buildings under Belgrade climate conditions is analysed in this paper. The objective is to examine the impact of the atrium on the hotel building's energy needs for space heating and cooling, thus establishing the best design among four proposed alternatives of hotels with an atrium. The energy performance results are obtained using the EnergyPlus simulation engine, taking into account Belgrade climate data and thermal comfort parameters. The selected results are compared and the hotels are ranked according to certain criteria. The decision-making process that resulted in the ranking of the proposed alternatives is conducted using the PROMETHEE method and the Borda model. The methodological approach in this research includes the creation of a hypothetical model of an atrium-type hotel building, numerical simulation of the energy performance of four design alternatives of the hotel building with an atrium, comparative analysis of the obtained results and ranking of the proposed alternatives from the building's energy performance perspective. The main task of the analysis is to examine the influence of the atrium, with both its shape and position, on the energy performance of the hotel building. Based on the results of the research, it is possible to determine the most energy efficient model of the hotel building with an atrium for Belgrade climate conditions. [Project of the Ministry of Science of the Republic of Serbia: Spatial, Environmental, Energy and Social aspects of the Developing Settlements and Climate Change - Mutual Impacts]
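
    To illustrate the PROMETHEE ranking step used above, the sketch below computes PROMETHEE II net outranking flows for four hypothetical hotel design alternatives scored on heating and cooling demand; the criteria values, weights and the simple linear preference function are assumptions for demonstration, not the study's simulation results.

        import numpy as np

        # Hypothetical annual energy needs (kWh/m^2) of four atrium-hotel alternatives:
        # columns = heating demand, cooling demand; both criteria are to be minimised.
        scores = np.array([[55.0, 40.0],
                           [60.0, 32.0],
                           [50.0, 45.0],
                           [58.0, 35.0]])
        weights = np.array([0.5, 0.5])             # assumed equal importance of heating and cooling
        pref_threshold = np.array([10.0, 10.0])    # linear preference function thresholds (assumed)

        n_alt, n_crit = scores.shape
        pi = np.zeros((n_alt, n_alt))              # aggregated preference of alternative a over b
        for a in range(n_alt):
            for b in range(n_alt):
                d = scores[b] - scores[a]          # positive when a needs less energy than b
                pref = np.clip(d / pref_threshold, 0.0, 1.0)
                pi[a, b] = np.dot(weights, pref)

        phi_plus = pi.sum(axis=1) / (n_alt - 1)    # how strongly each alternative outranks the rest
        phi_minus = pi.sum(axis=0) / (n_alt - 1)   # how strongly it is outranked
        phi_net = phi_plus - phi_minus             # PROMETHEE II net flow; higher is better
        print("net flows:", np.round(phi_net, 3), "-> best alternative:", int(phi_net.argmax()) + 1)

    The resulting ranking by net flow can then be cross-checked against a Borda count, as done in the study.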

  3. Cross-cultural comparison of concrete recycling decision-making and implementation in construction industry

    International Nuclear Information System (INIS)

    Tam, Vivian W.Y.; Tam, Leona; Le, Khoa N.

    2010-01-01

    Waste management in the construction industry is under severe pressure, with alarming signals. Concrete waste constitutes a major proportion of construction and demolition waste, about 81% in Australia. To minimize the concrete waste generated from construction activities, recycling concrete waste is one of the best methods to conserve the environment. This paper investigates the implementation of concrete recycling in construction. Japan is a leading country in recycling concrete waste, recycling about 98% of it and using it for structural concrete applications. Hong Kong is developing concrete recycling programs for high-grade applications. Australia is making relatively slow progress in implementing concrete recycling in construction. Therefore, empirical studies in Australia, Hong Kong, and Japan were selected for this paper. A questionnaire survey and structured interviews were conducted, and power spectrum analysis was used to examine the results. It was found that 'increasing overall business competitiveness and strategic business opportunities' was considered the major benefit of concrete recycling by Hong Kong and Japanese respondents, while 'rising concrete recycling awareness such as selecting suitable resources, techniques and training and compliance with regulations' was considered the major benefit by Australian respondents. However, 'lack of clients' support', 'increase in management cost' and 'increase in documentation workload, such as working documents, procedures and tools' were the major difficulties reported by Australian, Hong Kong, and Japanese respondents, respectively. To improve existing implementation, 'inclusion of concrete recycling evaluation in tender appraisal' and 'defining clear legal evaluation of concrete recycling' were the major recommendations from Australian and Hong Kong respondents, and from Japanese respondents, respectively.

  4. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
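
    The monograph's central contrast, classical versus permutation tests compared via probability values, can be illustrated with a minimal two-sample sketch; the data are simulated and nothing here is taken from the book's own examples.

```python
# A two-sample comparison run both ways: classical Student t-test p-value versus a
# permutation p-value built from the data at hand. The samples are simulated and are
# not taken from the monograph.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 20)          # group 1
y = rng.normal(0.5, 1.0, 20)          # group 2, shifted mean

# Classical test: relies on normality and homogeneity-of-variance assumptions.
t_stat, p_classical = stats.ttest_ind(x, y)

# Permutation test: reference distribution obtained by relabelling the pooled data.
observed = x.mean() - y.mean()
pooled = np.concatenate([x, y])
n_perm = 10_000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[:20].mean() - perm[20:].mean()
    if abs(diff) >= abs(observed):
        count += 1
p_permutation = count / n_perm

print(f"classical p = {p_classical:.4f}, permutation p = {p_permutation:.4f}")
```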

  5. Predicting Acquisition of Learning Outcomes: A Comparison of Traditional and Activity-Based Instruction in an Introductory Statistics Course.

    Science.gov (United States)

    Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.

    The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…

  6. Small Sample Statistics for Incomplete Nonnormal Data: Extensions of Complete Data Formulae and a Monte Carlo Comparison

    Science.gov (United States)

    Savalei, Victoria

    2010-01-01

    Incomplete nonnormal data are common occurrences in applied research. Although these two problems are often dealt with separately by methodologists, they often co-occur. Very little has been written about statistics appropriate for evaluating models with such data. This article extends several existing statistics for complete nonnormal data to…

  7. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  8. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...

  9. Primates' Socio-Cognitive Abilities: What Kind of Comparisons Makes Sense?

    Science.gov (United States)

    Byrnit, Jill T

    2015-09-01

    Referential gestures are of pivotal importance to the human species. We effortlessly make use of each other's referential gestures to attend to the same things, and our ability to use these gestures shows itself from very early in life. Almost 20 years ago, James Anderson and colleagues presented an experimental paradigm with which to examine the use of referential gestures in non-human primates: the object-choice task. Since then, numerous object-choice studies have been made, not only with primates but also with a range of other animal taxa. Surprisingly, several non-primate species appear to perform better in the object-choice task than primates do. Different hypotheses have been offered to explain the results. Some of these have employed generalizations about primates or subsets of primate taxa that do not take into account the unparalleled diversity that exists between species within the primate order on parameters relevant to the requirements of the object-choice task, such as social structure, feeding ecology, and general morphology. To examine whether these broad primate generalizations offer a fruitful organizing framework within which to interpret the results, a review was made of all published primate results on the use of gazing and glancing cues, with species ordered along the primate phylogenetic tree. It was concluded that differences between species may be larger than differences between ancestry taxa, and it is suggested that we need to start rethinking why we are testing animals on experimental paradigms that do not take into account the challenges of their natural habitat.

  10. What happens if we compare chopsticks with forks? The impact of making inappropriate comparisons in cross-cultural research.

    Science.gov (United States)

    Chen, Fang Fang

    2008-11-01

    It is a common practice to export instruments developed in one culture to another. Little is known about the consequences of making inappropriate comparisons in cross-cultural research. Several studies were conducted to fill in this gap. Study 1 examined the impact of lacking factor loading invariance on regression slope comparisons. When factor loadings of a predictor are higher in the reference group (e.g., United States), for which the scale was developed, than in the focal group (e.g., China), into which the scale was imported, the predictive relationship (e.g., self-esteem predicting life satisfaction) is artificially stronger in the reference group but weaker in the focal group, creating a bogus interaction effect of predictor by group (e.g., self-esteem by culture); the opposite pattern is found when the reference group has higher loadings in an outcome variable. Studies 2 and 3 examined the impact of lacking loading and intercept (i.e., point of origin) invariance on factor means, respectively. When the reference group has higher loadings or intercepts, the mean is overestimated in that group but underestimated in the focal group, resulting in a pseudo group difference. (c) 2008 APA, all rights reserved.
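
    Study 1's mechanism, unequal factor loadings producing a spurious slope difference, can be sketched with a toy simulation. All loadings, slopes, and group labels below are illustrative assumptions, not values from the article.

```python
# Toy version of the Study 1 mechanism: the latent slope is identical in two groups,
# but a lower factor loading on the predictor in the focal group attenuates the
# observed-score slope, creating a spurious predictor-by-group interaction.
# Loadings, slopes and group labels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
true_slope = 0.6                                          # same latent relationship in both groups

for group, loading in [("reference (loading 0.9)", 0.9), ("focal (loading 0.5)", 0.5)]:
    latent_pred = rng.normal(size=n)                      # e.g., true self-esteem
    latent_out = true_slope * latent_pred + rng.normal(scale=0.8, size=n)  # e.g., life satisfaction
    # Observed predictor: unit total variance, so reliability = loading**2.
    err_sd = np.sqrt(1.0 - loading ** 2)
    observed_pred = loading * latent_pred + rng.normal(scale=err_sd, size=n)
    observed_out = latent_out + rng.normal(scale=0.3, size=n)   # outcome measured identically in both groups
    slope = np.polyfit(observed_pred, observed_out, 1)[0]
    print(group, "observed-score slope:", round(slope, 3))
# Expected slopes are roughly 0.9*0.6 = 0.54 versus 0.5*0.6 = 0.30, even though the
# latent slope is 0.6 in both groups.
```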

  11. Statistical Multiplicity in Systematic Reviews of Anaesthesia Interventions: A Quantification and Comparison between Cochrane and Non-Cochrane Reviews

    DEFF Research Database (Denmark)

    Imberger, Georgina; Vejlby, Alexandra Hedvig Damgaard; Hansen, Sara Bohnstedt

    2011-01-01

    Systematic reviews with meta-analyses often contain many statistical tests. This multiplicity may increase the risk of type I error. Few attempts have been made to address the problem of statistical multiplicity in systematic reviews. Before the implications are properly considered, the size of the issue deserves clarification. Because of the emphasis on bias evaluation and because of the editorial processes involved, Cochrane reviews may contain more multiplicity than their non-Cochrane counterparts. This study measured the quantity of statistical multiplicity present in a population of systematic reviews and aimed to assess whether this quantity is different in Cochrane and non-Cochrane reviews.

  12. Characterizing the Joint Effect of Diverse Test-Statistic Correlation Structures and Effect Size on False Discovery Rates in a Multiple-Comparison Study of Many Outcome Measures

    Science.gov (United States)

    Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James

    2011-01-01

    In their 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report the results of a simulation assessing the robustness of their adaptive step-down procedure (GBS) for controlling the false discovery rate (FDR) when normally distributed test statistics are serially correlated. In this study we extend the investigation to the case of multiple comparisons involving correlated non-central t-statistics, in particular when several treatments or time periods are being compared to a control in a repeated-measures design with many dependent outcome measures. In addition, we consider several dependence structures other than serial correlation and illustrate how the FDR depends on the interaction between effect size and the type of correlation structure as indexed by Foerstner's distance metric from an identity matrix. The relationship between the correlation matrix R of the original dependent variables and the correlation matrix of the associated t-statistics is also studied. In general, the latter depends not only on R, but also on sample size and the signed effect sizes for the multiple comparisons.
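
    A simplified version of the kind of simulation described, correlated t-statistics for many outcome measures followed by an FDR-controlling procedure, is sketched below. The GBS step-down procedure itself is not implemented; the Benjamini-Hochberg step-up method is used as a stand-in, and the correlation structure and effect sizes are invented.

```python
# Correlated outcome measures, two groups, per-outcome t-tests, then an FDR-controlling
# step-up procedure. Benjamini-Hochberg is used here as a stand-in for the GBS adaptive
# step-down procedure; the exchangeable correlation and effect sizes are invented.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
m, n = 50, 20                       # 50 outcome measures, 20 subjects per group
rho = 0.5                           # exchangeable correlation among outcomes
cov = rho * np.ones((m, m)) + (1 - rho) * np.eye(m)
effect = np.zeros(m)
effect[:10] = 0.8                   # first 10 outcomes truly shifted (standardized effect size)

treat = rng.multivariate_normal(effect, cov, size=n)
control = rng.multivariate_normal(np.zeros(m), cov, size=n)

_, pvals = stats.ttest_ind(treat, control, axis=0)       # one t-test per outcome
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

print("rejections:", int(reject.sum()),
      "false discoveries (among the 40 true nulls):", int(reject[10:].sum()))
```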

  13. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  14. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  15. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    Science.gov (United States)

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

    We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and more integrated with other data sources compared to coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate. The high degree of agreement between vital statistics and coroner classification of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.

  16. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM

  17. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    Science.gov (United States)

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing logodds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression

  18. Statistical methods for conducting agreement (comparison of clinical tests) and precision (repeatability or reproducibility) studies in optometry and ophthalmology.

    Science.gov (United States)

    McAlinden, Colm; Khadka, Jyoti; Pesudovs, Konrad

    2011-07-01

    The ever-expanding choice of ocular metrology and imaging equipment has driven research into the validity of their measurements. Consequently, studies of the agreement between two instruments or clinical tests have proliferated in the ophthalmic literature. It is important that researchers apply the appropriate statistical tests in agreement studies. Correlation coefficients are hazardous and should be avoided. The 'limits of agreement' method originally proposed by Altman and Bland in 1983 is the statistical procedure of choice. Its step-by-step use and practical considerations in relation to optometry and ophthalmology are detailed in addition to sample size considerations and statistical approaches to precision (repeatability or reproducibility) estimates. Ophthalmic & Physiological Optics © 2011 The College of Optometrists.
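
    A minimal sketch of the 'limits of agreement' calculation recommended in the abstract is shown below; the paired instrument readings are hypothetical placeholders.

```python
# Limits-of-agreement calculation for two instruments measuring the same quantity;
# the paired readings below are hypothetical placeholders.
import numpy as np

instrument_a = np.array([43.1, 44.0, 42.5, 45.2, 43.8, 44.6, 42.9, 44.3])
instrument_b = np.array([43.4, 43.7, 42.9, 45.0, 44.1, 44.2, 43.3, 44.0])

diff = instrument_a - instrument_b
bias = diff.mean()                                   # mean difference between methods
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)           # 95% limits of agreement

print(f"bias = {bias:.2f}, 95% limits of agreement = [{loa[0]:.2f}, {loa[1]:.2f}]")
# A Bland-Altman plot would show diff against the pairwise means with these three
# horizontal reference lines.
```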

  19. Risk Factors for Visual Field Progression in the Groningen Longitudinal Glaucoma Study : A Comparison of Different Statistical Approaches

    NARCIS (Netherlands)

    Wesselink, Christiaan; Marcus, Michael W.; Jansonius, Nomdo M.

    2012-01-01

    Purpose: To identify risk factors for visual field progression in glaucoma and to compare different statistical approaches with this risk factor analysis. Patients and Methods: We included 221 eyes of 221 patients. Progression was analyzed using Nonparametric Progression Analysis applied to Humphrey

  20. Value Added Productivity Indicators: A Statistical Comparison of the Pre-Test/Post-Test Model and Gain Model.

    Science.gov (United States)

    Weerasinghe, Dash; Orsak, Timothy; Mendro, Robert

    In an age of student accountability, public school systems must find procedures for identifying effective schools, classrooms, and teachers that help students continue to learn academically. As a result, researchers have been modeling schools and classrooms to calculate productivity indicators that will withstand not only statistical review but…

  1. Effect of higher order nonlinearity, directionality and finite water depth on wave statistics: Comparison of field data and numerical simulations

    Science.gov (United States)

    Fernández, Leandro; Monbaliu, Jaak; Onorato, Miguel; Toffoli, Alessandro

    2014-05-01

    This research is focused on the study of the nonlinear evolution of irregular wave fields in water of arbitrary depth by comparing field measurements and numerical simulations. It is now well accepted that modulational instability, known as one of the main mechanisms for the formation of rogue waves, induces strong departures from Gaussian statistics. However, whereas non-Gaussian properties are remarkable when wave fields follow one direction of propagation over infinite water depth, wave statistics only weakly deviate from Gaussianity when waves spread over a range of different directions. Over finite water depth, furthermore, wave instability attenuates overall and eventually vanishes for relative water depths as low as kh = 1.36 (where k is the wavenumber of the dominant waves and h the water depth). Recent experimental results, nonetheless, seem to indicate that oblique perturbations are capable of triggering and sustaining modulational instability even if kh < 1.36. The aim of this research is therefore to understand whether the combined effect of directionality and finite water depth has a significant effect on wave statistics and particularly on the occurrence of extremes. For this purpose, numerical experiments have been performed solving the Euler equations of motion with the Higher Order Spectral Method (HOSM) and compared with data from short-crested wave fields for different sea states observed at Lake George (Australia). A comparative analysis of the statistical properties (i.e. the density function of the surface elevation and its statistical moments, skewness and kurtosis) between simulations and in-situ data provides a direct comparison between the numerical developments and real observations in field conditions.
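
    The statistical moments named in the abstract (skewness and kurtosis of the surface elevation) can be computed directly from a record; the sketch below uses a synthetic elevation series rather than the Lake George data.

```python
# Skewness and excess kurtosis of a surface-elevation record, the moments used to
# compare simulated and measured wave statistics. The elevation series is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
t = np.linspace(0, 3600, 14400)                       # one-hour record sampled at 4 Hz (illustrative)
eta = sum(a * np.cos(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
          for a, f in [(0.5, 0.10), (0.3, 0.12), (0.2, 0.15)])
eta = eta + 0.05 * eta ** 2                           # weak second-order nonlinearity skews the record

print("skewness:", round(stats.skew(eta), 3))         # positive for second-order nonlinear waves
print("excess kurtosis:", round(stats.kurtosis(eta), 3))  # values above 0 would indicate heavy tails / extremes
```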

  2. Comparison of untreated adolescent idiopathic scoliosis with normal controls: a review and statistical analysis of the literature.

    Science.gov (United States)

    Rushton, Paul R P; Grevitt, Michael P

    2013-04-20

    Review and statistical analysis of studies evaluating health-related quality of life (HRQOL) in adolescents with untreated adolescent idiopathic scoliosis (AIS) using Scoliosis Research Society (SRS) outcomes. To apply normative values and minimum clinical important differences for the SRS-22r to the literature. Identify whether the HRQOL of adolescents with untreated AIS differs from unaffected peers and whether any differences are clinically relevant. The effect of untreated AIS on adolescent HRQOL is uncertain. The lack of published normative values and minimum clinical important difference for the SRS-22r has so far hindered our interpretation of previous studies. The publication of this background data allows these studies to be re-examined. Using suitable inclusion criteria, a literature search identified studies examining HRQOL in untreated adolescents with AIS. Each cohort was analyzed individually. Statistically significant differences were identified by using 95% confidence intervals for the difference in SRS-22r domain mean scores between the cohorts with AIS and the published data for unaffected adolescents. If the lower bound of the confidence interval was greater than the minimum clinical important difference, the difference was considered clinically significant. Of the 21 included patient cohorts, 81% reported statistically worse pain than those unaffected. Yet in only 5% of cohorts was this difference clinically important. Of the 11 cohorts included examining patient self-image, 91% reported statistically worse scores than those unaffected. In 73% of cohorts this difference was clinically significant. Affected cohorts tended to score well in function/activity and mental health domains and differences from those unaffected rarely reached clinically significant values. Pain and self-image tend to be statistically lower among cohorts with AIS than those unaffected. The literature to date suggests that it is only self-image which consistently differs
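
    The decision rule described, a 95% confidence interval for the difference in domain means whose lower bound is compared with the minimum clinical important difference, can be sketched as follows; all summary statistics and the MCID value are made up for illustration.

```python
# 95% CI for a difference in domain means from summary statistics, compared against
# a minimum clinical important difference (MCID). All numbers are invented.
import math

def ci_diff(mean1, sd1, n1, mean2, sd2, n2, z=1.96):
    """Approximate 95% CI for (mean1 - mean2) from two independent summary samples."""
    diff = mean1 - mean2
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return diff - z * se, diff + z * se

# Hypothetical SRS-22r self-image domain: unaffected norms versus an untreated AIS cohort.
lower, upper = ci_diff(mean1=4.3, sd1=0.5, n1=400,    # unaffected adolescents (norms)
                       mean2=3.8, sd2=0.6, n2=120)    # AIS cohort
mcid = 0.20                                           # illustrative MCID value

print(f"difference CI = ({lower:.2f}, {upper:.2f})")
print("statistically significant:", lower > 0)
print("clinically important (lower bound > MCID):", lower > mcid)
```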

  3. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
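
    A condensed sketch of the graded sequence of scatterplot tests described (linear correlation, rank correlation, and a Kruskal-Wallis comparison of central tendency across classes of the input) is given below with simulated data; the split into five classes is an illustrative choice.

```python
# Three of the increasingly general pattern tests described: Pearson correlation,
# Spearman rank correlation, and a Kruskal-Wallis test for trends in central tendency
# across classes of the input variable. Data are simulated; five classes is an
# arbitrary illustrative choice.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 300)                            # sampled model input
y = np.sin(3 * x) + rng.normal(0.0, 0.3, 300)         # nonmonotonic effect on the model output

pearson_r, p_linear = stats.pearsonr(x, y)            # (i) linear relationship
spearman_r, p_monotonic = stats.spearmanr(x, y)       # (ii) monotonic relationship

# (iii) trend in central tendency: split x into five classes and compare y across them.
edges = np.quantile(x, [0.2, 0.4, 0.6, 0.8])
classes = np.digitize(x, edges)
groups = [y[classes == k] for k in range(5)]
h_stat, p_kw = stats.kruskal(*groups)

print(f"Pearson p = {p_linear:.3g}, Spearman p = {p_monotonic:.3g}, Kruskal-Wallis p = {p_kw:.3g}")
```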

  4. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    Science.gov (United States)

    Festing, Michael F W

    2014-01-01

    The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data poses problems due to the large number of statistical tests which are involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The author's conclusions appear to be reached somewhat subjectively by the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs) a range of graphical methods and an over-all assessment of the mean absolute response can be made. The approach is an extension, not a replacement of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the author's. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
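
    The core quantities, a standardised effect size per biomarker and a resampling test of the mean absolute effect across groups, can be sketched as below. The resampling here is a simple permutation-style stand-in for the paper's bootstrap test, and the biomarker data are simulated.

```python
# Standardised effect sizes per biomarker plus a resampling test of the mean absolute
# effect across groups. The resampling is a simple permutation-style stand-in for the
# paper's bootstrap test; the biomarker data are simulated.
import numpy as np

rng = np.random.default_rng(4)

def ses(treated, control):
    """Standardised effect size: difference in means divided by the pooled SD."""
    n1, n2 = len(treated), len(control)
    pooled_var = ((n1 - 1) * treated.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)
    return (treated.mean() - control.mean()) / np.sqrt(pooled_var)

# Ten biomarkers, control plus one dose group of 10 animals each (illustrative).
control = rng.normal(0.0, 1.0, size=(10, 10))
dose = rng.normal(0.3, 1.0, size=(10, 10))            # modest shift on every biomarker

observed = np.mean([abs(ses(dose[:, j], control[:, j])) for j in range(10)])

# Null distribution: animals reshuffled between the two groups.
pooled = np.vstack([control, dose])
null = []
for _ in range(2000):
    idx = rng.permutation(len(pooled))
    c, d = pooled[idx[:10]], pooled[idx[10:]]
    null.append(np.mean([abs(ses(d[:, j], c[:, j])) for j in range(10)]))

p_value = np.mean(np.array(null) >= observed)
print(f"mean |SES| = {observed:.2f}, resampling p = {p_value:.3f}")
```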

  5. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    Directory of Open Access Journals (Sweden)

    Michael F W Festing

    Full Text Available The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data poses problems due to the large number of statistical tests which are involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The author's conclusions appear to be reached somewhat subjectively by the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs) a range of graphical methods and an over-all assessment of the mean absolute response can be made. The approach is an extension, not a replacement of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the author's. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.

  6. Uterine Cancer Statistics

    Science.gov (United States)

    ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool: the Data Visualizations tool makes ...

  7. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    Science.gov (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This infers that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.

  8. Comparison of single distance phase retrieval algorithms by considering different object composition and the effect of statistical and structural noise.

    Science.gov (United States)

    Chen, R C; Rigon, L; Longo, R

    2013-03-25

    Phase retrieval is a technique for extracting quantitative phase information from X-ray propagation-based phase-contrast tomography (PPCT). In this paper, the performance of different single distance phase retrieval algorithms will be investigated. The algorithms are herein called phase-attenuation duality Born Algorithm (PAD-BA), phase-attenuation duality Rytov Algorithm (PAD-RA), phase-attenuation duality Modified Bronnikov Algorithm (PAD-MBA), phase-attenuation duality Paganin algorithm (PAD-PA) and phase-attenuation duality Wu Algorithm (PAD-WA), respectively. They are all based on the phase-attenuation duality property and on weak absorption of the sample, and they employ only single-distance PPCT data. In this paper, they are investigated via simulated noise-free PPCT data considering the fulfillment of the PAD property and weakly absorbing conditions, and with experimental PPCT data of a mixture sample containing absorbing and weakly absorbing materials, and of a polymer sample considering different degrees of statistical and structural noise. The simulation shows all algorithms can quantitatively reconstruct the 3D refractive index of a quasi-homogeneous weakly absorbing object from noise-free PPCT data. When the weakly absorbing condition is violated, PAD-RA and PAD-PA/WA obtain better results than PAD-BA and PAD-MBA, as shown in both the simulation and the mixture sample results. When considering the statistical noise, the contrast-to-noise ratio values decrease as the photon number is reduced. The structural noise study shows that the result is progressively corrupted by ring-like artifacts with the increase of structural noise (i.e. phantom thickness). PAD-RA and PAD-PA/WA gain better density resolution than PAD-BA and PAD-MBA in both the statistical and structural noise studies.

  9. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    Science.gov (United States)

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
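
    One of the approaches mentioned, hierarchical cluster analysis of symptom correlations, is sketched below. The symptom names follow the four clusters reported in the abstract, but the patient-level data are simulated with an assumed block structure.

```python
# Hierarchical cluster analysis on symptom correlations, one of the methods compared.
# Symptom names follow the four clusters reported in the abstract, but the patient-level
# scores are simulated with an assumed block structure.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

symptoms = ["tense", "worry", "irritable", "depressed",
            "fatigue", "pain", "nausea", "vomiting", "concentration", "memory"]
blocks = [symptoms[:4], symptoms[4:6], symptoms[6:8], symptoms[8:]]

rng = np.random.default_rng(5)
data = np.empty((500, len(symptoms)))
col = 0
for block in blocks:
    shared = rng.normal(size=(500, 1))                     # common factor within the block
    data[:, col:col + len(block)] = shared + 0.7 * rng.normal(size=(500, len(block)))
    col += len(block)

# Distance = 1 - correlation, then average-linkage hierarchical clustering into 4 clusters.
corr = np.corrcoef(data, rowvar=False)
condensed = squareform(1.0 - corr, checks=False)
tree = linkage(condensed, method="average")
labels = fcluster(tree, t=4, criterion="maxclust")
print(dict(zip(symptoms, labels.tolist())))
```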

  10. Everything That’s Wrong with E-Book Statistics: A Comparison of E-Book Packages

    OpenAIRE

    Byström, Karin

    2013-01-01

    This poster presentation highlights the problems that exist in defining “a download” for e-books. Even though there is a COUNTER code of practice, a download can still be defined as either a page, chapter or title use. Many e-book publishers don’t follow COUNTER at all, and then the differences are even bigger. Libraries face many problems because of this, and this poster aims to raise awareness on the problems concerning analyzing e-book usage statistics.

  11. Statistical Studies of Solar White-light Flares and Comparisons with Superflares on Solar-type Stars

    Science.gov (United States)

    Namekata, Kosuke; Sakaue, Takahito; Watanabe, Kyoko; Asai, Ayumi; Maehara, Hiroyuki; Notsu, Yuta; Notsu, Shota; Honda, Satoshi; Ishii, Takako T.; Ikuta, Kai; Nogami, Daisaku; Shibata, Kazunari

    2017-12-01

    Recently, many superflares on solar-type stars have been discovered as white-light flares (WLFs). The statistical study found a correlation between their energies (E) and durations (τ): τ ∝ E^0.39, similar to those of solar hard/soft X-ray flares, τ ∝ E^(0.2–0.33). This indicates a universal mechanism of energy release on solar and stellar flares, i.e., magnetic reconnection. We here carried out statistical research on 50 solar WLFs observed with Solar Dynamics Observatory/HMI and examined the correlation between the energies and durations. As a result, the E–τ relation on solar WLFs (τ ∝ E^0.38) is quite similar to that on stellar superflares (τ ∝ E^0.39). However, the durations of stellar superflares are one order of magnitude shorter than those expected from solar WLFs. We present the following two interpretations for the discrepancy: (1) in solar flares, the cooling timescale of WLFs may be longer than the reconnection one, and the decay time of solar WLFs can be elongated by the cooling effect; (2) the distribution can be understood by applying a scaling law (τ ∝ E^(1/3) B^(−5/3)) derived from the magnetic reconnection theory. In the latter case, the observed superflares are expected to have 2–4 times stronger magnetic field strength than solar flares.
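
    The E–τ relation is a power law, so the exponent is simply the slope of a straight-line fit in log-log space. The sketch below recovers such an exponent from synthetic flare energies and durations generated to follow roughly τ ∝ E^0.38 with scatter; none of the numbers are observed values.

```python
# Recovering a power-law exponent as the slope of a log-log straight-line fit.
# Energies and durations are synthetic, generated to follow roughly tau ~ E^0.38
# with scatter; they are not observed flare values.
import numpy as np

rng = np.random.default_rng(6)
energy = 10 ** rng.uniform(29, 32, 50)                                    # erg (synthetic)
duration = 10 ** (0.38 * np.log10(energy) - 10.5 + rng.normal(0, 0.15, 50))  # minutes (synthetic)

slope, intercept = np.polyfit(np.log10(energy), np.log10(duration), 1)
print(f"fitted exponent: tau proportional to E^{slope:.2f}")              # recovers ~0.38
```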

  12. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil sampling study conducted around the Rocky Flats, Colorado, plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compared with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to the differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction

  13. Paired preference data with a no-preference option – Statistical tests for comparison with placebo data

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Ennis, John M.; Ennis, Daniel M.

    2014-01-01

    /preference responses or ties in choice experiments. Food Quality and Preference, 23, 13–17) noted that this proportion can depend on the product category, have proposed that the expected proportion of preference responses within a given category be called an identicality norm, and have argued that knowledge...... of such norms is valuable for more complete interpretation of 2-Alternative Choice (2-AC) data. For instance, these norms can be used to indicate consumer segmentation even with non-replicated data. In this paper, we show that the statistical test suggested by Ennis and Ennis (2012a) behaves poorly and has too...... when ingredient changes are considered for cost-reduction or health initiative purposes....

  14. Classification of bladder cancer cell lines using Raman spectroscopy: a comparison of excitation wavelength, sample substrate and statistical algorithms

    Science.gov (United States)

    Kerr, Laura T.; Adams, Aine; O'Dea, Shirley; Domijan, Katarina; Cullen, Ivor; Hennelly, Bryan M.

    2014-05-01

    Raman microspectroscopy can be applied to the urinary bladder for highly accurate classification and diagnosis of bladder cancer. This technique can be applied in vitro to bladder epithelial cells obtained from urine cytology or in vivo as an "optical biopsy" to provide results in real-time with higher sensitivity and specificity than current clinical methods. However, there exists a high degree of variability across experimental parameters which need to be standardised before this technique can be utilized in an everyday clinical environment. In this study, we investigate different laser wavelengths (473 nm and 532 nm), sample substrates (glass, fused silica and calcium fluoride) and multivariate statistical methods in order to gain insight into how these various experimental parameters impact on the sensitivity and specificity of Raman cytology.

  15. R for statistics

    CERN Document Server

    Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent

    2012-01-01

    An Overview of R; Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises; Preparing Data; Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises; R Graphics; Conventional Graphical Functions; Graphical Functions with lattice; Exercises; Making Programs with R; Control Flows; Predefined Functions; Creating a Function; Exercises; Statistical Methods; Introduction to the Statistical Methods; A Quick Start with R; Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs; Statistical Analysis; Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test; Regression; Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression; Analysis of Variance and Covariance; One-Way Analysis of Variance; Multi-Way Analysis of Varian...

  16. Statistical Study in the mid-altitude cusp region: wave and particle data comparison using a normalized cusp crossing duration

    Science.gov (United States)

    Grison, B.; Escoubet, C. P.; Pitout, F.; Cornilleau-Wehrlin, N.; Dandouras, I.; Lucek, E.

    2009-04-01

    In the mid-altitude cusp region the DC magnetic field presents a diamagnetic cavity due to the intense earthward ion flux coming from the magnetosheath. Strong ultra-low-frequency (ULF) magnetic activity is also commonly observed in this region. Most mid-altitude cusp statistical studies have focused on the location of the cusp and its dependence on, and response to, solar wind, interplanetary magnetic field and dipole tilt angle parameters. In our study we use the database built by Pitout et al. (2006) in order to study the link between wave power in the ULF range (0.35-10 Hz) measured by the STAFF SC instrument and the ion plasma properties measured by the CIS (and CODIF) instrument, as well as the diamagnetic cavity in the mid-altitude cusp region observed with FGM data. To compare the different crossings we do not use the cusp position and dynamics; instead we use a normalized cusp crossing duration, which makes it easy to average the properties over a large number of crossings. As usual in the cusp, it is particularly relevant to sort the crossings by the corresponding interplanetary magnetic field (IMF) orientation in order to analyse the results. In particular, we try to find out which parameter is most relevant to link the strong wave activity with. The global statistics confirm previous single-case observations that noted a simultaneity between ion injections and wave activity enhancements. We will also present results concerning other ion parameters and the diamagnetic cavity observed in the mid-altitude cusp region.

  17. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    Science.gov (United States)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
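
    A minimal MOS-style sketch in the spirit of the comparison described: fit a gradient-boosted tree model and a linear-regression baseline to map raw NWP predictors to observed generation, then compare their errors. The predictor names, the synthetic "observations", and the error structure are all invented for illustration.

```python
# MOS-style statistical correction of raw NWP output: a gradient-boosted tree model
# versus a linear-regression baseline mapping NWP predictors to observed generation.
# Predictor names, the synthetic 'observations' and the error structure are invented.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5000
nwp_speed = rng.gamma(2.0, 4.0, n)                 # raw NWP hub-height wind speed (m/s)
nwp_dir = rng.uniform(0, 360, n)                   # raw NWP wind direction (deg)
hour = rng.integers(0, 24, n)                      # forecast valid hour

# Synthetic "observed" generation: nonlinear power-curve response plus direction- and
# hour-dependent systematic NWP error and random noise.
capacity_factor = np.clip((nwp_speed - 3.0) / 9.0, 0.0, 1.0) ** 3
obs = (capacity_factor * 100.0
       + 5.0 * np.sin(np.radians(nwp_dir))
       + 2.0 * np.cos(2 * np.pi * hour / 24)
       + rng.normal(0, 3, n))

X = np.column_stack([nwp_speed, nwp_dir, hour])
X_tr, X_te, y_tr, y_te = train_test_split(X, obs, test_size=0.3, random_state=0)

baseline = LinearRegression().fit(X_tr, y_tr)
gbm = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

print("linear MOS MAE:", round(mean_absolute_error(y_te, baseline.predict(X_te)), 2))
print("gradient-boosted MOS MAE:", round(mean_absolute_error(y_te, gbm.predict(X_te)), 2))
```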

  18. The statistical evaluation and comparison of ADMS-Urban model for the prediction of nitrogen dioxide with air quality monitoring network.

    Science.gov (United States)

    Dėdelė, Audrius; Miškinytė, Auksė

    2015-09-01

    In many countries, road traffic is one of the main sources of air pollution associated with adverse effects on human health and the environment. Nitrogen dioxide (NO2) is considered to be a measure of traffic-related air pollution, with concentrations tending to be higher near highways, along busy roads, and in city centers, and exceedances are mainly observed at measurement stations located close to traffic. In order to assess air quality in the city and the impact of air pollution on public health, air quality models are used. However, before a model can be used for these purposes, it is important to evaluate the accuracy of dispersion modelling, as one of the most widely used methods. Monitoring and dispersion modelling are two components of an air quality monitoring system (AQMS), between which a statistical comparison was made in this research. The evaluation of the Atmospheric Dispersion Modelling System (ADMS-Urban) was made by comparing monthly modelled NO2 concentrations with the data of continuous air quality monitoring stations in Kaunas city. The statistical measures of model performance were calculated for annual and monthly concentrations of NO2 for each monitoring station site. The spatial analysis was made using geographic information systems (GIS). The calculation of statistical parameters indicated a good ADMS-Urban model performance for the prediction of NO2. The results of this study showed that the agreement of modelled values and observations was better for traffic monitoring stations compared to the background and residential stations.
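
    Typical statistical measures of model performance for paired modelled/observed concentrations can be computed as below; the monthly NO2 values are placeholders, and the exact set of measures used in the study is not assumed.

```python
# Common performance measures for paired modelled/observed concentrations; the monthly
# NO2 values below are placeholders, not data from the study.
import numpy as np

observed = np.array([24.1, 22.5, 20.3, 18.7, 17.2, 16.8, 17.5, 18.9, 20.4, 22.8, 24.6, 25.3])  # ug/m3
modelled = np.array([22.8, 21.9, 21.0, 17.9, 16.5, 17.4, 18.1, 18.2, 21.1, 23.5, 23.7, 26.0])  # ug/m3

bias = np.mean(modelled - observed)                                   # mean bias
rmse = np.sqrt(np.mean((modelled - observed) ** 2))                   # root mean square error
nmse = np.mean((modelled - observed) ** 2) / (observed.mean() * modelled.mean())  # normalised MSE
r = np.corrcoef(modelled, observed)[0, 1]                             # correlation coefficient
ratio = modelled / observed
fac2 = np.mean((ratio > 0.5) & (ratio < 2.0))                         # fraction within a factor of two

print(f"bias={bias:.2f}, RMSE={rmse:.2f}, NMSE={nmse:.3f}, r={r:.2f}, FAC2={fac2:.2f}")
```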

  19. Selenium determination in biological materials by neutron activation analysis - statistical comparison between the use of 77mSe (t1/2 = 17.45 s) and 75Se (t1/2 = 119.8 d)

    International Nuclear Information System (INIS)

    Catharino, Marilia G.M.; Vasconcelos, Marina B.A.

    2002-01-01

    Selenium is nowadays considered to be an essential trace element in the human diet. The most extensively studied biochemical role of this element is related to its participation in the composition of glutathione peroxidase. This enzyme acts as an antioxidant for the free radicals formed in the human body. In the present work, selenium was determined by INAA in reference materials ('Human hair' IAEA-085, 'Human hair' IAEA-086, 'Dogfish Liver' DOLT-1 and 'Dogfish Muscle' DORM-1) and in toenails and a vitamin supplement, using the short-lived radioisotope 77mSe. The usual method, which utilizes long-lived 75Se, was also employed in order to make a comparative study. A statistical test was applied for this comparison. It was verified that the average concentrations of selenium, in the reference materials and in the samples analyzed, do not differ statistically at a significance level of 0.05, which indicates the applicability of the short-lived 77mSe for INAA of the matrices studied. (author)

  20. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    Science.gov (United States)

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research methods and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  1. Low-frequency variability of the atmospheric circulation: a comparison of statistical properties in both hemispheres and extreme seasons

    International Nuclear Information System (INIS)

    Buzzi, A.; Tosi, E.

    1988-01-01

    A statistical investigation is presented of the main variables characterizing the tropospheric general circulation in both hemispheres and extreme seasons, winter and summer. This gives us the opportunity of comparing four distinct realizations of the planetary circulation, as a function of different orographic and thermal forcing conditions. Our approach is made possible by the availability of 6 years of global daily analyses prepared by ECMWF (European Centre for Medium-Range Weather Forecasts). The variables taken into account are the zonal geostrophic wind, the zonal thermal wind and various large-scale wave components, averaged over the tropospheric depth between 1000 and 200 hPa. The mean properties of the analysed quantities in each hemisphere and season are compared and their principal characteristics are discussed. The probability density estimates for the same variables, filtered in order to eliminate the seasonal cycle and the high-frequency 'noise', are then presented. The distributions are examined, in particular, with respect to their unimodal or multimodal nature and with reference to the recent discussion in the literature on the bimodality which has been found for some indicators of planetary wave activity in the Northern Hemisphere winter. Our results indicate the presence of non-unimodally distributed wave and zonal flow components in both hemispheres and extreme seasons. The most frequent occurrence of non-unimodal behaviour is found for those wave components which exhibit an almost vanishing zonal phase speed and a larger 'response' to orographic forcing
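
    The density diagnostics described (estimating probability densities of filtered indices and checking for multimodality) can be sketched with a kernel density estimate and a crude mode count; the index series below is a simulated two-regime mixture, not ECMWF data.

```python
# Kernel density estimate of a low-frequency index with a crude count of local maxima
# as a screen for non-unimodal behaviour. The index is a simulated two-regime mixture,
# not an ECMWF-derived series.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)
index = np.concatenate([rng.normal(-1.0, 0.6, 700), rng.normal(1.2, 0.7, 300)])

kde = gaussian_kde(index)
grid = np.linspace(index.min(), index.max(), 400)
density = kde(grid)

# Local maxima of the estimated density: more than one suggests multimodality.
is_peak = (density[1:-1] > density[:-2]) & (density[1:-1] > density[2:])
print("estimated number of modes:", int(is_peak.sum()))
```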

  2. Statistical evaluation of essential/toxic metal levels in the blood of valvular heart disease patients in comparison with controls.

    Science.gov (United States)

    Ilyas, Asim; Shah, Munir H

    2017-05-12

    The present study was designed to investigate the role of selected essential and toxic metals in the onset/prognosis of valvular heart disease (VHD). A nitric acid-perchloric acid based wet digestion procedure was used for the quantification of the metals by flame atomic absorption spectrophotometry. Comparative appraisal of the data revealed that average levels of Cd, Co, Cr, Fe, K, Li, Mn and Zn were significantly higher in the blood of VHD patients, while the average concentration of Ca was found at an elevated level in controls (P < 0.05). However, Cu, Mg, Na, Sr and Pb showed almost comparable levels in the blood of both donor groups. The correlation study revealed significantly different mutual associations among the metals in the blood of VHD patients compared with the controls. Multivariate statistical methods showed substantially divergent grouping of the metals for the patients and controls. Some significant differences in the metal concentrations were also observed with gender, abode, dietary/smoking habits and occupations of both donor groups. Overall, the study demonstrated that disproportions in the concentrations of essential/toxic metals in the blood are involved in the pathogenesis of the disease.

  3. Implant-assisted magnetic drug targeting in permeable microvessels: Comparison of two-fluid statistical transport model with experiment

    Energy Technology Data Exchange (ETDEWEB)

    ChiBin, Zhang; XiaoHui, Lin, E-mail: lxh60@seu.edu.cn; ZhaoMin, Wang; ChangBao, Wang

    2017-03-15

    In experiments and theoretical analyses, this study examines the capture efficiency (CE) of magnetic drug carrier particles (MDCPs) for implant-assisted magnetic drug targeting (IA-MDT) in microvessels. It also proposes a three-dimensional statistical transport model of MDCPs for IA-MDT in permeable microvessels, which describes blood flow by the two-fluid (Casson and Newtonian) model. The model accounts for the permeable effect of the microvessel wall and the coupling effect between the blood flow and tissue fluid flow. The MDCPs move randomly through the microvessel, and their transport state is described by the Boltzmann equation. The regulated changes and factors affecting the CE of the MDCPs in the assisted magnetic targeting were obtained by solving the theoretical model and by experimental testing. The CE was negatively correlated with the blood flow velocity, and positively correlated with the external magnetic field intensity and microvessel permeability. The predicted CEs of the MDCPs were consistent with the experimental results. Additionally, under the same external magnetic field, the predicted CE was 5–8% higher in the IA-MDT model than in the model ignoring the permeability effect of the microvessel wall. - Highlights: • A model of MDCPs for IA-MDT in permeable microvessels was established. • An experimental device was established, the CE of MDCPs was measured. • The predicted CE of MDCPs was 5–8% higher in the IA-MDT model.

  4. Model-based iterative reconstruction for reduction of radiation dose in abdominopelvic CT: comparison to adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2013-12-01

    To evaluate dose reduction and image quality of abdominopelvic computed tomography (CT) reconstructed with model-based iterative reconstruction (MBIR) compared to adaptive statistical iterative reconstruction (ASIR). In this prospective study, 85 patients underwent referential-, low-, and ultralow-dose unenhanced abdominopelvic CT. Images were reconstructed with ASIR for low-dose (L-ASIR) and ultralow-dose CT (UL-ASIR), and with MBIR for ultralow-dose CT (UL-MBIR). Image noise was measured in the abdominal aorta and iliopsoas muscle. Subjective image analyses and a lesion detection study (adrenal nodules) were conducted by two blinded radiologists. A reference standard was established by a consensus panel of two different radiologists using referential-dose CT reconstructed with filtered back projection. Compared to low-dose CT, there was a 63% decrease in dose-length product with ultralow-dose CT. UL-MBIR had significantly lower image noise and fewer streak artifacts than L-ASIR and UL-ASIR, and did not differ significantly from L-ASIR in diagnostic acceptability (p>0.65) or diagnostic performance for adrenal nodules (p>0.87). MBIR significantly improves image noise and streak artifacts compared to ASIR, and can achieve radiation dose reduction without severely compromising image quality.

  5. Statistical Comparison of Cloud and Aerosol Vertical Properties between Two Eastern China Regions Based on CloudSat/CALIPSO Data

    Directory of Open Access Journals (Sweden)

    Yujun Qiu

    2017-01-01

    Full Text Available The relationship between cloud and aerosol properties was investigated over two 4° × 4° adjacent regions in the south (R1) and in the north (R2) of eastern China. The CloudSat/CALIPSO data were used to extract the cloud and aerosol profile properties. The mean value of cloud occurrence probability (COP) was the highest in the mixed cloud layer (−40°C to 0°C) and the lowest in the warm cloud layer (>0°C). The atmospheric humidity was more statistically relevant to COP in the warm cloud layer than the aerosol conditions were. The differences in COP between the two regions in the mixed cloud layer and ice cloud layer (<−40°C) had good correlations with those in the aerosol extinction coefficient. A radar reflectivity factor greater than −10 dBZ occurred mainly in warm cloud layers and mixed cloud layers. A high-COP zone appeared in the above-0°C layer with cloud thicknesses of 2-3 km in both regions and in all four seasons, but the distribution of the zonal layer in R2 was more continuous than that in R1, which was consistent with the higher aerosol optical thickness in R2 than in R1 in the above-0°C layer, indicating a positive correlation between aerosol and cloud probability.

  6. A statistical evaluation of the radionuclide inventory in the 222-S Laboratory and a comparison with the current inventory estimates

    International Nuclear Information System (INIS)

    Painter, C.L.; Daly, D.S.; Welsh, T.L.

    1996-09-01

    Department of Energy contractors responsible for the safe operation of non-reactor nuclear facilities need to establish maximum inventory limits on both radioactive and hazardous materials stored, used, or processed at their facilities. These contractors need to ensure that established maximum limits are not exceeded. This necessity is derived from the requirement to demonstrate that a non-reactor facility can be safely operated during normal and abnormal conditions. The 222-S Laboratory has established that a maximum of 800 core equivalent samples (CES) may be stored at the Laboratory. Using the CES definition, one can determine the maximum number of curies permitted per radionuclide. These estimates were made using a variation on a statistical technique called smoothed bootstrapping. Smoothed bootstrapping is a method analogous to Monte Carlo simulation using a smoothed empirical probability distribution. This report discusses the application of the smoothed bootstrap technique to predicting the curie inventory retained in the numerous waste tank samples for the radionuclides of concern and the uncertainty associated with such a prediction.
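
    The smoothed bootstrap named above can be sketched as ordinary bootstrap resampling with a small amount of kernel noise added to each resampled value. The sketch below estimates the total curie inventory of a hypothetical set of 800 stored samples and an upper percentile reflecting its uncertainty; the noise bandwidth and the synthetic per-sample activities are illustrative assumptions, not the report's data.

```python
# Sketch: smoothed bootstrap (bootstrap resampling plus Gaussian kernel noise)
# for the total activity of a hypothetical set of stored samples.
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical measured activities (Ci) for samples of a single radionuclide.
measured = rng.lognormal(mean=-2.0, sigma=1.0, size=250)

def smoothed_bootstrap_total(values, n_samples=800, n_boot=5000, rng=rng):
    """Distribution of the total activity of `n_samples` stored samples,
    obtained by resampling `values` with replacement and adding kernel noise."""
    bandwidth = 1.06 * values.std(ddof=1) * len(values) ** (-1 / 5)  # Silverman-type rule
    totals = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(values, size=n_samples, replace=True)
        smoothed = resample + rng.normal(0.0, bandwidth, size=n_samples)
        totals[b] = np.clip(smoothed, 0.0, None).sum()  # activities cannot be negative
    return totals

totals = smoothed_bootstrap_total(measured)
print("mean total (Ci):", totals.mean())
print("95th percentile (Ci):", np.percentile(totals, 95))
```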

  7. Comparison of Statistically Modeled Contaminated Soil Volume Estimates and Actual Excavation Volumes at the Maywood FUSRAP Site - 13555

    Energy Technology Data Exchange (ETDEWEB)

    Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)

    2013-07-01

    As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best-guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best-guess results. Where contamination was uncharacterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)

  8. Decision making in pathological gambling: A comparison between pathological gamblers, alcohol dependents, persons with Tourette syndrome, and normal controls

    NARCIS (Netherlands)

    Goudriaan, Anna E.; Oosterlaan, Jaap; de Beurs, Edwin; van den Brink, Wim

    2005-01-01

    Decision making deficits play an important role in the definition of pathological gambling (PG). However, only a few empirical studies are available regarding decision making processes in PG. This study therefore compares decision making processes in PG and normal controls in detail using three

  9. Vascular disease in women: comparison of diagnoses in hospital episode statistics and general practice records in England

    Directory of Open Access Journals (Sweden)

    Wright F

    2012-10-01

    Full Text Available Background: Electronic linkage to routine administrative datasets, such as the Hospital Episode Statistics (HES) in England, is increasingly used in medical research. Relatively little is known about the reliability of HES diagnostic information for epidemiological studies. In the United Kingdom (UK), general practitioners hold comprehensive records for individuals relating to their primary, secondary and tertiary care. For a random sample of participants in a large UK cohort, we compared vascular disease diagnoses in HES and general practice records to assess agreement between the two sources. Methods: Million Women Study participants with a HES record of hospital admission with vascular disease (ischaemic heart disease [ICD-10 codes I20-I25], cerebrovascular disease [G45, I60-I69] or venous thromboembolism [I26, I80-I82]) between April 1st 1997 and March 31st 2005 were identified. In each broad diagnostic group and in women with no such HES diagnoses, a random sample of about a thousand women was selected for study. We asked each woman's general practitioner to provide information on her history of vascular disease and this information was compared with the HES diagnosis record. Results: Over 90% of study forms sent to general practitioners were returned and 88% of these contained analysable data. For the vast majority of study participants for whom information was available, diagnostic information from general practice and HES records was consistent. Overall, for 93% of women with a HES diagnosis of vascular disease, general practice records agreed with the HES diagnosis; and for 97% of women with no HES diagnosis of vascular disease, the general practitioner had no record of a diagnosis of vascular disease. For severe vascular disease, including myocardial infarction (I21-22), stroke, both overall (I60-64) and by subtype, and pulmonary embolism (I26), HES records appeared to be both reliable and complete. Conclusion: Hospital admission data
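
    For the kind of source-to-source agreement reported above, percent agreement and Cohen's kappa are natural summaries. The sketch below computes both from paired yes/no diagnosis indicators; the data are synthetic, and the use of kappa is an added illustration since the abstract reports simple percentage agreement.

```python
# Sketch: percent agreement and Cohen's kappa between two binary diagnosis
# sources (e.g. hospital records vs. general practice records). Synthetic data.
import numpy as np

rng = np.random.default_rng(7)
truth = rng.random(2000) < 0.5                             # half the sample has the diagnosis
hes = np.where(rng.random(2000) < 0.93, truth, ~truth)     # ~93% agreement with truth
gp = np.where(rng.random(2000) < 0.95, truth, ~truth)      # ~95% agreement with truth

def agreement_and_kappa(a, b):
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    observed = np.mean(a == b)
    # Chance agreement from the marginal prevalence in each source.
    p_a, p_b = a.mean(), b.mean()
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

obs, kappa = agreement_and_kappa(hes, gp)
print(f"percent agreement: {100 * obs:.1f}%, Cohen's kappa: {kappa:.2f}")
```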

  10. Columnar Aerosol Properties from Sun-and-star Photometry: Statistical Comparisons and Day-to-night Dynamic

    Science.gov (United States)

    Ramirez, Daniel Perez; Lyamani, H.; Olmo, F. J.; Whiteman, D. N.; Alados-Arboledas, L.

    2012-01-01

    This work presents the first analysis of long-term correlative day-to-night columnar aerosol optical properties. The aim is to better understand columnar aerosol dynamics from ground-based observations, which have been poorly studied until now. To this end we have used a combination of sun-and-star photometry measurements acquired in the city of Granada (37.16 N, 3.60 W, 680 m a.s.l.; south-east of Spain) from 2007 to 2010. For the whole study period, the mean aerosol optical depth (AOD) around 440 nm (±standard deviation) is 0.18 ± 0.10 and 0.19 ± 0.11 for daytime and nighttime, respectively, while the mean Ångström exponent (α) is 1.0 ± 0.4 and 0.9 ± 0.4 for daytime and nighttime. The ANOVA statistical tests reveal that there are no significant differences between AOD and α obtained at daytime and those at nighttime. Additionally, the mean daytime values of AOD and α obtained during this study period are coherent with the values obtained in the surrounding AERONET stations. On the other hand, AOD around 440 nm presents evident seasonal patterns characterised by large values in summer (mean value of 0.20 ± 0.10 both at daytime and nighttime) and low values in winter (mean value of 0.15 ± 0.09 at daytime and 0.17 ± 0.10 at nighttime). The Ångström exponents also present seasonal patterns, but with low values in summer (mean values of 0.8 ± 0.4 and 0.9 ± 0.4 at day- and night-time) and relatively large values in winter (mean values of 1.2 ± 0.4 and 1.0 ± 0.3 at daytime and nighttime). These seasonal patterns are explained by the differences in the meteorological conditions and by the differences in the strength of the aerosol sources. To gain more insight into the changes in aerosol particles between day and night, the spectral differences of the Ångström exponent as a function of the Ångström exponent are also studied. These analyses reveal increases of the fine mode radius and of the fine mode contribution to AOD during nighttime, being more
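
    The day-night comparison described above rests on an ANOVA of daytime versus nighttime AOD samples. A minimal one-way ANOVA sketch with scipy follows; the synthetic AOD values are stand-ins, not the Granada measurements.

```python
# Sketch: one-way ANOVA comparing daytime and nighttime aerosol optical depth.
# Synthetic AOD samples stand in for the sun- and star-photometer records.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
aod_day = rng.gamma(shape=3.2, scale=0.056, size=900)     # mean ~0.18
aod_night = rng.gamma(shape=3.0, scale=0.063, size=700)   # mean ~0.19

f_stat, p_value = f_oneway(aod_day, aod_night)
print(f"day mean = {aod_day.mean():.3f}, night mean = {aod_night.mean():.3f}")
print(f"F = {f_stat:.2f}, P = {p_value:.3f}")
# A large P value here would be consistent with no significant day-night difference.
```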

  11. Computer-aided assessment of breast density: comparison of supervised deep learning and feature-based statistical learning

    Science.gov (United States)

    Li, Songfeng; Wei, Jun; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2018-01-01

    Breast density is one of the most significant factors associated with cancer risk. In this study, our purpose was to develop a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammograms (DMs). The input 'for processing' DMs were first log-transformed, enhanced by a multi-resolution preprocessing scheme, and subsampled to a pixel size of 800 µm × 800 µm from 100 µm × 100 µm. A deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD) by using a domain adaptation resampling method. The PD was estimated as the ratio of the dense area to the breast area based on the PMD. The DCNN approach was compared to a feature-based statistical learning approach. Gray level, texture and morphological features were extracted and a least absolute shrinkage and selection operator was used to combine the features into a feature-based PMD. With approval of the Institutional Review Board, we retrospectively collected a training set of 478 DMs and an independent test set of 183 DMs from patient files in our institution. Two experienced Mammography Quality Standards Act radiologists interactively segmented PD as the reference standard. Ten-fold cross-validation was used for model selection and evaluation with the training set. With cross-validation, DCNN obtained a Dice's coefficient (DC) of 0.79 ± 0.13 and Pearson's correlation (r) of 0.97, whereas feature-based learning obtained DC = 0.72 ± 0.18 and r = 0.85. For the independent test set, DCNN achieved DC = 0.76 ± 0.09 and r = 0.94, while feature-based learning achieved DC = 0.62 ± 0.21 and r = 0.75. Our DCNN approach was significantly better and more robust than the feature-based learning approach for automated PD estimation on DMs, demonstrating its potential use for automated density reporting as
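
    The evaluation in this record relies on Dice's coefficient between segmented dense regions and Pearson's correlation between percentage-density values. A small sketch of both metrics is given below; the arrays are synthetic placeholders for predicted and reference density maps.

```python
# Sketch: Dice's coefficient between two binary dense-tissue masks and
# Pearson's correlation between per-image percentage-density (PD) values.
import numpy as np
from scipy.stats import pearsonr

def dice_coefficient(mask_a, mask_b):
    """Dice = 2|A∩B| / (|A| + |B|) for two boolean masks."""
    mask_a = np.asarray(mask_a, dtype=bool)
    mask_b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * intersection / (mask_a.sum() + mask_b.sum())

rng = np.random.default_rng(11)
reference = rng.random((256, 256)) < 0.3             # reference dense-tissue mask
predicted = np.where(rng.random((256, 256)) < 0.9,   # mostly agrees with reference
                     reference, ~reference)
print("Dice:", round(dice_coefficient(reference, predicted), 3))

pd_reference = rng.uniform(5, 60, size=183)           # reference PD per mammogram (%)
pd_predicted = pd_reference + rng.normal(0, 5, 183)   # predicted PD with some error
r, p = pearsonr(pd_reference, pd_predicted)
print(f"Pearson r = {r:.2f} (P = {p:.2g})")
```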

  12. Computer-aided assessment of breast density: comparison of supervised deep learning and feature-based statistical learning.

    Science.gov (United States)

    Li, Songfeng; Wei, Jun; Chan, Heang-Ping; Helvie, Mark A; Roubidoux, Marilyn A; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir M; Samala, Ravi K

    2018-01-09

    Breast density is one of the most significant factors associated with cancer risk. In this study, our purpose was to develop a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammograms (DMs). The input 'for processing' DMs were first log-transformed, enhanced by a multi-resolution preprocessing scheme, and subsampled to a pixel size of 800 µm × 800 µm from 100 µm × 100 µm. A deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD) by using a domain adaptation resampling method. The PD was estimated as the ratio of the dense area to the breast area based on the PMD. The DCNN approach was compared to a feature-based statistical learning approach. Gray level, texture and morphological features were extracted and a least absolute shrinkage and selection operator was used to combine the features into a feature-based PMD. With approval of the Institutional Review Board, we retrospectively collected a training set of 478 DMs and an independent test set of 183 DMs from patient files in our institution. Two experienced Mammography Quality Standards Act radiologists interactively segmented PD as the reference standard. Ten-fold cross-validation was used for model selection and evaluation with the training set. With cross-validation, DCNN obtained a Dice's coefficient (DC) of 0.79 ± 0.13 and Pearson's correlation (r) of 0.97, whereas feature-based learning obtained DC = 0.72 ± 0.18 and r = 0.85. For the independent test set, DCNN achieved DC = 0.76 ± 0.09 and r = 0.94, while feature-based learning achieved DC = 0.62 ± 0.21 and r = 0.75. Our DCNN approach was significantly better and more robust than the feature-based learning approach for automated PD estimation on DMs, demonstrating its potential use for automated density reporting as well as

  13. Distinction between externally vs. internally guided decision-making: Operational differences, meta-analytical comparisons and their theoretical implications

    Directory of Open Access Journals (Sweden)

    Takashi eNakao

    2012-03-01

    Full Text Available Most experimental studies of decision-making have specifically examined situations in which a single less-predictable correct answer exists (externally guided decision-making under uncertainty). Along with such externally guided decision-making, there are instances of decision making in which no correct answer based on external circumstances is available for the subject (internally guided decision-making). Such decisions are usually made in the context of moral decision making as well as in preference judgment, where the answer depends on the subject's own, i.e. internal, preferences rather than on external, i.e. circumstantial, criteria. The neuronal and psychological mechanisms that allow guidance of decisions based on more internally oriented criteria in the absence of external ones remain unclear. This study was undertaken to compare decision making of these two kinds empirically and theoretically. First, we reviewed studies of decision making to clarify experimental–operational differences between externally guided and internally guided decision-making. Second, using MKDA, a whole-brain-based quantitative meta-analysis of neuroimaging studies was performed. Our meta-analysis revealed that the neural network used predominantly for internally guided decision-making differs from that for externally guided decision-making under uncertainty. This result suggests that studying only externally guided decision-making under uncertainty is insufficient to account for decision-making processes in the brain. Finally, based on the review and results of the meta-analysis, we discuss the differences and relations between decision making of these two types in terms of their operational, neuronal, and theoretical characteristics.

  14. A statistical comparison of SuperDARN spectral width boundaries and DMSP particle precipitation boundaries in the morning sector ionosphere

    Directory of Open Access Journals (Sweden)

    G. Chisham

    2005-03-01

    Full Text Available Determining reliable proxies for the ionospheric signature of the open-closed field line boundary (OCB) is crucial for making accurate ionospheric measurements of many magnetospheric processes (e.g. magnetic reconnection). This study compares the latitudes of Spectral Width Boundaries (SWBs), identified in the morning sector ionosphere using the Super Dual Auroral Radar Network (SuperDARN), with Particle Precipitation Boundaries (PPBs) determined using the low-altitude Defense Meteorological Satellite Program (DMSP) spacecraft, in order to determine whether the SWB represents a good proxy for the ionospheric projection of the OCB. The latitudes of SWBs and PPBs were identified using automated algorithms applied to 5 years (1997-2001) of data measured in the 00:00-12:00 Magnetic Local Time (MLT) range. A latitudinal difference was measured between each PPB and the nearest SWB within a ±10 min Universal Time (UT) window and within a ±1 h MLT window. The results show that the SWB represents a good proxy for the OCB close to midnight (~00:00-02:00 MLT) and noon (~08:00-12:00 MLT), but is located some distance (~2°-4°) equatorward of the OCB across much of the morning sector ionosphere (~02:00-08:00 MLT). On the basis of this and other studies we deduce that the SWB is correlated with the poleward boundary of auroral emissions in the Lyman-Birge-Hopfield "Long" (LBHL) UV emission range and hence, that spectral width is inversely correlated with the energy flux of precipitating electrons. We further conclude that the combination of two factors may explain the spatial distribution of spectral width values in the polar ionospheres. The small-scale structure of the convection electric field leads to an enhancement in spectral width in regions close to the OCB, whereas increases in ionospheric conductivity (relating to the level of incident electron energy flux) lead to a reduction in spectral width in regions just equatorward of the OCB.
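
    The matching procedure described above (pair each PPB with the nearest SWB inside a ±10 min UT and ±1 h MLT window, then take the latitude difference) can be sketched as follows. The field layout and the synthetic boundary lists are assumptions for illustration.

```python
# Sketch: pair particle-precipitation boundaries (PPBs) with the nearest
# spectral-width boundary (SWB) within +/-10 min UT and +/-1 h MLT, and
# collect the latitudinal differences. Synthetic boundary records.
import numpy as np

rng = np.random.default_rng(14)
# Each boundary: (UT in minutes of day, MLT in hours, magnetic latitude in deg)
swb = np.column_stack([rng.uniform(0, 1440, 3000),
                       rng.uniform(0, 12, 3000),
                       rng.uniform(68, 80, 3000)])
ppb = np.column_stack([rng.uniform(0, 1440, 400),
                       rng.uniform(0, 12, 400),
                       rng.uniform(70, 80, 400)])

differences = []
for ut, mlt, lat in ppb:
    in_window = (np.abs(swb[:, 0] - ut) <= 10) & (np.abs(swb[:, 1] - mlt) <= 1)
    candidates = swb[in_window]
    if candidates.size == 0:
        continue
    # Nearest SWB in time within the window.
    nearest = candidates[np.argmin(np.abs(candidates[:, 0] - ut))]
    differences.append(nearest[2] - lat)   # SWB latitude minus PPB latitude

differences = np.array(differences)
print(f"{differences.size} pairs, mean offset = {differences.mean():.2f} deg, "
      f"std = {differences.std(ddof=1):.2f} deg")
```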

  15. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  16. Validation of Refractivity Profiles Retrieved from FORMOSAT-3/COSMIC Radio Occultation Soundings: Preliminary Results of Statistical Comparisons Utilizing Balloon-Borne Observations

    Directory of Open Access Journals (Sweden)

    Hiroo Hayashi

    2009-01-01

    Full Text Available The GPS radio occultation (RO) soundings by the FORMOSAT-3/COSMIC (Taiwan's Formosa Satellite Mission #3/Constellation Observing System for Meteorology, Ionosphere and Climate) satellites launched in mid-April 2006 are compared with high-resolution balloon-borne (radiosonde and ozonesonde) observations. This paper presents preliminary results of validation of the COSMIC RO measurements in terms of refractivity through the troposphere and lower stratosphere. With the use of COSMIC RO soundings within 2 hours and 300 km of sonde profiles, statistical comparisons between the collocated refractivity profiles are performed for some tropical regions (Malaysia and Western Pacific islands) where moisture-rich air is expected in the lower troposphere and for both northern and southern polar areas with a very dry troposphere. The results of the comparisons show good agreement between COSMIC RO and sonde refractivity profiles throughout the troposphere (1-1.5% difference at most), with a positive bias generally becoming larger at progressively higher altitudes in the lower stratosphere (1-2% difference around 25 km), and a very small standard deviation (about 0.5% or less) for a few kilometers below the tropopause level. A large standard deviation of fractional differences in the lowermost troposphere, which reaches up to as much as 3.5-5% at 3 km, is seen in the tropics while a much smaller standard deviation (1-2% at most) is evident throughout the polar troposphere.
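
    The comparison above reduces to computing fractional refractivity differences between collocated RO and sonde profiles and summarising their mean (bias) and standard deviation by altitude. A minimal sketch is below; the profile arrays stand in for already-collocated soundings and are not COSMIC data.

```python
# Sketch: mean and standard deviation of fractional refractivity differences
# (RO minus sonde, relative to sonde) in altitude bins. Synthetic profiles.
import numpy as np

rng = np.random.default_rng(16)
n_pairs, n_levels = 300, 30
altitude = np.linspace(1, 30, n_levels)                     # km
sonde = 300 * np.exp(-altitude / 7) + rng.normal(0, 0.5, (n_pairs, n_levels))
ro = sonde * (1 + rng.normal(0.002, 0.01, (n_pairs, n_levels)))  # ~0.2% bias, ~1% spread

fractional = 100 * (ro - sonde) / sonde                     # percent difference
bias = fractional.mean(axis=0)
spread = fractional.std(axis=0, ddof=1)

for z, b, s in zip(altitude[::5], bias[::5], spread[::5]):
    print(f"{z:5.1f} km: bias = {b:+.2f}%, std = {s:.2f}%")
```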

  17. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  18. ArrayVigil: a methodology for statistical comparison of gene signatures using segregated-one-tailed (SOT) Wilcoxon's signed-rank test.

    Science.gov (United States)

    Khan, Haseeb Ahmad

    2005-01-28

    Owing to their versatile diagnostic and prognostic fidelity, molecular signatures or fingerprints are anticipated as the most powerful tools for cancer management in the near future. Notwithstanding the experimental advancements in microarray technology, methods for analyzing either whole arrays or gene signatures have not been firmly established. Recently, an algorithm, ArraySolver, has been reported by Khan for two-group comparison of microarray gene expression data using the two-tailed Wilcoxon signed-rank test. Most of the molecular signatures are composed of two sets of genes (hybrid signatures) wherein up-regulation of one set and down-regulation of the other set collectively define the purpose of a gene signature. Since the direction of a selected gene's expression (positive or negative) with respect to a particular disease condition is known, application of one-tailed statistics could be a more relevant choice. A novel method, ArrayVigil, is described for comparing hybrid signatures using the segregated-one-tailed (SOT) Wilcoxon signed-rank test, and the results are compared with integrated-two-tailed (ITT) procedures (SPSS and ArraySolver). ArrayVigil resulted in lower P values than those obtained from ITT statistics when comparing real data from four signatures.
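
    A minimal sketch of the segregated-one-tailed idea is given below: the hybrid signature is split into its up-regulated and down-regulated gene sets, and a one-tailed Wilcoxon signed-rank test is applied to each in its expected direction. How ArrayVigil combines the two segment P values is not described in the abstract, so reporting them separately is an assumption here, as are the synthetic expression values.

```python
# Sketch: segregated-one-tailed (SOT) Wilcoxon signed-rank tests on a hybrid
# gene signature. Paired expression values (disease vs. reference) are split
# into the up-regulated and down-regulated gene sets, and each set is tested
# one-tailed in its expected direction. Synthetic data; how the two P values
# are combined is an assumption.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(18)
n_up, n_down = 25, 20
# Paired log-expression values: disease sample vs. reference sample per gene.
up_disease = rng.normal(1.0, 0.8, n_up)
up_reference = rng.normal(0.0, 0.8, n_up)
down_disease = rng.normal(-0.8, 0.8, n_down)
down_reference = rng.normal(0.0, 0.8, n_down)

# Up-regulated set: test whether disease values exceed the reference values.
_, p_up = wilcoxon(up_disease, up_reference, alternative="greater")
# Down-regulated set: test whether disease values fall below the reference values.
_, p_down = wilcoxon(down_disease, down_reference, alternative="less")

print(f"up-regulated segment:   one-tailed P = {p_up:.4f}")
print(f"down-regulated segment: one-tailed P = {p_down:.4f}")
```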

  19. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters covering the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory and liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics.

  20. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  1. Multi-criteria decision making with linguistic labels: a comparison of two methodologies applied to energy planning

    OpenAIRE

    Afsordegan, Arayeh; Sánchez Soler, Monica; Agell Jané, Núria; Cremades Oliver, Lázaro Vicente; Zahedi, Siamak

    2014-01-01

    This paper compares two multi-criteria decision making (MCDM) approaches based on linguistic label assessment. The first approach consists of a modified fuzzy TOPSIS methodology introduced by Kaya and Kahraman in 2011. The second approach, introduced by Agell et al. in 2012, is based on qualitative reasoning techniques for ranking multi-attribute alternatives in group decision-making with linguistic labels. Both approaches are applied to a case of assessment and selection of the most suita...

  2. A comparison of 99mTc-HMPAO SPET changes in dementia with Lewy bodies and Alzheimer's disease using statistical parametric mapping

    International Nuclear Information System (INIS)

    Colloby, Sean J.; Paling, Sean M.; Lobotesis, Kyriakos; Ballard, Clive; McKeith, Ian; O'Brien, John T.; Fenwick, John D.; Williams, David E.

    2002-01-01

    Differences in regional cerebral blood flow (rCBF) between subjects with Alzheimer's disease (AD), dementia with Lewy bodies (DLB) and healthy volunteers were investigated using statistical parametric mapping (SPM99). Forty-eight AD, 23 DLB and 20 age-matched control subjects participated. Technetium-99m hexamethylpropylene amine oxime (HMPAO) brain single-photon emission tomography (SPET) scans were acquired for each subject using a single-headed rotating gamma camera (IGE CamStar XR/T). The SPET images were spatially normalised and group comparison was performed by SPM99. In addition, covariate analysis was undertaken on the standardised images taking the Mini Mental State Examination (MMSE) scores as a variable. Applying a height threshold of P≤0.001 uncorrected, significant perfusion deficits in the parietal and frontal regions of the brain were observed in both AD and DLB groups compared with the control subjects. In addition, significant temporoparietal perfusion deficits were identified in the AD subjects, whereas the DLB patients had deficits in the occipital region. Comparison of dementia groups (height threshold of P≤0.01 uncorrected) yielded hypoperfusion in both the parietal [Brodmann area (BA) 7] and occipital (BA 17, 18) regions of the brain in DLB compared with AD. Abnormalities in these areas, which included visual cortex and several areas involved in higher visual processing and visuospatial function, may be important in understanding the visual hallucinations and visuospatial deficits which are characteristic of DLB. Covariate analysis indicated group differences between AD and DLB in terms of a positive correlation between cognitive test score and temporoparietal blood flow. In conclusion, we found evidence of frontal and parietal hypoperfusion in both AD and DLB, while temporal perfusion deficits were observed exclusively in AD and parieto-occipital deficits in DLB. (orig.)

  3. Modeling polymer-induced interactions between two grafted surfaces: comparison between interfacial statistical associating fluid theory and self-consistent field theory.

    Science.gov (United States)

    Jain, Shekhar; Ginzburg, Valeriy V; Jog, Prasanna; Weinhold, Jeffrey; Srivastava, Rakesh; Chapman, Walter G

    2009-07-28

    The interaction between two polymer grafted surfaces is important in many applications, such as nanocomposites, colloid stabilization, and polymer alloys. In our previous work [Jain et al., J. Chem. Phys. 128, 154910 (2008)], we showed that interfacial statistical associating fluid density theory (iSAFT) successfully calculates the structure of grafted polymer chains in the absence/presence of a free polymer. In the current work, we have applied this density functional theory to calculate the force of interaction between two such grafted monolayers in implicit good solvent conditions. In particular, we have considered the case where the segment sizes of the free (σ_f) and grafted (σ_g) polymers are different. The interactions between the two monolayers in the absence of the free polymer are always repulsive. However, in the presence of the free polymer, the force either can be purely repulsive or can have an attractive minimum depending upon the relative chain lengths of the free (N_f) and grafted (N_g) polymers. The attractive minimum is observed only when the ratio α = N_f/N_g is greater than a critical value. We find that these critical values of α satisfy the following scaling relation: ρ_g √N_g β³ ∝ α^(−λ), where β = σ_f/σ_g and λ is the scaling exponent. For β = 1, i.e. the same segment sizes of the free and grafted polymers, this scaling relation is in agreement with those from previous theoretical studies using self-consistent field theory (SCFT). Detailed comparisons between iSAFT and SCFT are made for the structures of the monolayers and their forces of interaction. These comparisons lead to interesting implications for the modeling of nanocomposite thermodynamics.
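
    As an illustration of how a scaling exponent such as λ in the relation above can be extracted, the sketch below fits a power law to hypothetical (grafting density, critical α) pairs by linear regression in log-log space. The numerical values are invented for illustration and are not taken from the paper.

```python
# Sketch: estimate the scaling exponent lambda in  rho_g * sqrt(N_g) * beta**3
# ~ alpha_c**(-lambda)  by a straight-line fit in log-log coordinates.
# Hypothetical data; not the values from the paper.
import numpy as np

N_g, beta = 100, 1.0
rho_g = np.array([0.05, 0.10, 0.20, 0.40, 0.80])          # grafting densities
true_lambda = 0.7
# Construct critical alphas consistent with the scaling law, plus a little noise.
lhs = rho_g * np.sqrt(N_g) * beta**3
alpha_c = lhs ** (-1 / true_lambda) * np.exp(np.random.default_rng(5).normal(0, 0.02, 5))

# log(lhs) = -lambda * log(alpha_c) + const  =>  the slope of the fit gives -lambda.
slope, intercept = np.polyfit(np.log(alpha_c), np.log(lhs), deg=1)
print(f"fitted lambda = {-slope:.2f} (constructed with lambda = {true_lambda})")
```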

  4. A comparison of 99mTc-HMPAO SPET changes in dementia with Lewy bodies and Alzheimer's disease using statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Colloby, Sean J.; Paling, Sean M.; Lobotesis, Kyriakos; Ballard, Clive; McKeith, Ian; O'Brien, John T. [Wolfson Research Centre, Institute for Ageing and Health, Newcastle upon Tyne (United Kingdom); Fenwick, John D. [Regional Medical Physics Department, Newcastle General Hospital, Newcastle upon Tyne (United Kingdom); Williams, David E. [Regional Medical Physics Department, Sunderland Royal Hospital (United Kingdom)

    2002-05-01

    Differences in regional cerebral blood flow (rCBF) between subjects with Alzheimer's disease (AD), dementia with Lewy bodies (DLB) and healthy volunteers were investigated using statistical parametric mapping (SPM99). Forty-eight AD, 23 DLB and 20 age-matched control subjects participated. Technetium-99m hexamethylpropylene amine oxime (HMPAO) brain single-photon emission tomography (SPET) scans were acquired for each subject using a single-headed rotating gamma camera (IGE CamStar XR/T). The SPET images were spatially normalised and group comparison was performed by SPM99. In addition, covariate analysis was undertaken on the standardised images taking the Mini Mental State Examination (MMSE) scores as a variable. Applying a height threshold of P≤0.001 uncorrected, significant perfusion deficits in the parietal and frontal regions of the brain were observed in both AD and DLB groups compared with the control subjects. In addition, significant temporoparietal perfusion deficits were identified in the AD subjects, whereas the DLB patients had deficits in the occipital region. Comparison of dementia groups (height threshold of P≤0.01 uncorrected) yielded hypoperfusion in both the parietal [Brodmann area (BA) 7] and occipital (BA 17, 18) regions of the brain in DLB compared with AD. Abnormalities in these areas, which included visual cortex and several areas involved in higher visual processing and visuospatial function, may be important in understanding the visual hallucinations and visuospatial deficits which are characteristic of DLB. Covariate analysis indicated group differences between AD and DLB in terms of a positive correlation between cognitive test score and temporoparietal blood flow. In conclusion, we found evidence of frontal and parietal hypoperfusion in both AD and DLB, while temporal perfusion deficits were observed exclusively in AD and parieto-occipital deficits in DLB. (orig.)

  5. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    Directory of Open Access Journals (Sweden)

    Michael Robert Cunningham

    2016-10-01

    Full Text Available The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect – contrary to their title.
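
    One of the procedures named above, the funnel plot asymmetry test, is commonly implemented as Egger's regression of the standardized effect on precision, with a non-zero intercept suggesting asymmetry. The sketch below shows that regression on synthetic study data; it is a generic illustration, not a re-analysis of the depletion literature.

```python
# Sketch: Egger's regression test for funnel-plot asymmetry. Each study
# contributes an effect size d and a standard error se; the standardized
# effect (d / se) is regressed on precision (1 / se), and a non-zero
# intercept indicates asymmetry. Synthetic studies only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2015)
n_studies = 60
se = rng.uniform(0.08, 0.45, n_studies)              # per-study standard errors
true_effect = 0.3
d = true_effect + rng.normal(0, se) + 0.5 * se        # the 0.5*se term injects asymmetry

precision = 1.0 / se
standardized = d / se
model = sm.OLS(standardized, sm.add_constant(precision)).fit()
intercept, intercept_p = model.params[0], model.pvalues[0]
print(f"Egger intercept = {intercept:.2f}, P = {intercept_p:.3f}")
# A small P value here would suggest funnel-plot asymmetry (possible small-study effects).
```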

  6. Radiation effects on cancer mortality among A-bomb survivors, 1950-72. Comparison of some statistical models and analysis based on the additive logit model

    Energy Technology Data Exchange (ETDEWEB)

    Otake, M [Hiroshima Univ. (Japan). Faculty of Science

    1976-12-01

    Various statistical models designed to determine the effects of radiation dose on mortality of atomic bomb survivors in Hiroshima and Nagasaki from specific cancers were evaluated on the basis of a basic k (age) × c (dose) × 2 contingency table. From the aspects of application and fits of the different models, analysis based on the additive logit model was applied to the mortality experience of this population during the 22-year period from 1 Oct. 1950 to 31 Dec. 1972. The advantages and disadvantages of the additive logit model were demonstrated. Leukemia mortality showed a sharp rise with an increase in dose. The dose response relationship suggests a possible curvature or a log linear model, particularly if the doses estimated to be more than 600 rad were set arbitrarily at 600 rad, since the average dose in the 200+ rad group would then change from 434 to 350 rad. In the 22-year period from 1950 to 1972, a high mortality risk due to radiation was observed in survivors with doses of 200 rad and over for all cancers except leukemia. On the other hand, during the latest period from 1965 to 1972 a significant risk was noted also for stomach and breast cancers. Survivors who were 9 years old or younger at the time of the bomb and who were exposed to high doses of 200+ rad appeared to show a high mortality risk for all cancers except leukemia, although the number of observed deaths is still small. A number of interesting areas are discussed from the statistical and epidemiological standpoints, i.e., the numerical comparison of risks in various models, the general evaluation of cancer mortality by the additive logit model, the dose response relationship, the relative risk in the high dose group, the time period of radiation-induced cancer mortality, the difference of dose response between Hiroshima and Nagasaki, and the relative biological effectiveness of neutrons.

  7. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  8. Conducting tests for statistically significant differences using forest inventory data

    Science.gov (United States)

    James A. Westfall; Scott A. Pugh; John W. Coulston

    2013-01-01

    Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...
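
    When two design-based inventory estimates come with their own sampling errors, a common way to test for a difference is a two-sample z-type test built from those standard errors. The sketch below illustrates the calculation; the estimates, standard errors, and the normal approximation are illustrative assumptions, not values or prescriptions from the paper.

```python
# Sketch: test whether two design-based inventory estimates differ, using a
# z statistic built from their standard errors. Illustrative numbers only.
import math
from scipy.stats import norm

# Estimated cubic wood volume (m^3/ha) and standard errors for two inventories.
estimate_1, se_1 = 145.2, 6.8
estimate_2, se_2 = 131.7, 5.9

z = (estimate_1 - estimate_2) / math.sqrt(se_1**2 + se_2**2)
p_value = 2 * norm.sf(abs(z))           # two-sided P value
print(f"difference = {estimate_1 - estimate_2:.1f} m^3/ha, z = {z:.2f}, P = {p_value:.3f}")
```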

  9. The dynamics of the H(+) + D(2) reaction: a comparison of quantum mechanical wavepacket, quasi-classical and statistical-quasi-classical results.

    Science.gov (United States)

    Jambrina, P G; Aoiz, F J; Bulut, N; Smith, Sean C; Balint-Kurti, G G; Hankel, M

    2010-02-07

    A detailed study of the proton exchange reaction H(+) + D(2)(v = 0, j = 0) --> HD + D(+) on its ground 1(1)A' potential energy surface has been carried out using 'exact' close-coupled quantum mechanical wavepacket (WP-EQM), quasi-classical trajectory (QCT), and statistical quasi-classical trajectory (SQCT) calculations for a range of collision energies starting from the reaction threshold to 1.3 eV. The WP-EQM calculations include all total angular momenta up to J(max) = 50, and therefore the various dynamical observables are converged up to 0.6 eV. It has been found that it is necessary to include all Coriolis couplings to obtain reliable converged results. Reaction probabilities obtained using the different methods are thoroughly compared as a function of the total energy for a series of J values. Comparisons are also made of total reaction cross sections as a function of the collision energy, and rate constants. In addition, opacity functions, integral cross sections (ICS) and differential cross sections (DCS) are presented at 102 meV, 201.3 meV and 524.6 meV collision energy. The agreement between the three sets of results is only qualitative. The QCT calculations fail to describe the overall reactivity and most of the dynamical observables correctly. At low collision energies, the QCT method is plagued by the lack of conservation of zero point energy, whilst at higher collision energies and/or total angular momenta, the appearance of an effective repulsive potential associated with the centrifugal motion "over" the well causes a substantial decrease of the reactivity. In turn, the statistical models overestimate the reactivity over the whole range of collision energies as compared with the WP-EQM method. Specifically, at sufficiently high collision energies the reaction cannot be deemed to be statistical and important dynamical effects seem to be present. In general the WP-EQM results lie in between those obtained using the QCT and SQCT methods. One of the main

  10. A statistical comparison of cirrus particle size distributions measured using the 2-D stereo probe during the TC4, SPARTICUS, and MACPEX flight campaigns with historical cirrus datasets

    Directory of Open Access Journals (Sweden)

    M. C. Schwartz

    2017-08-01

    Full Text Available This paper addresses two straightforward questions. First, how similar are the statistics of cirrus particle size distribution (PSD) datasets collected using the Two-Dimensional Stereo (2D-S) probe to cirrus PSD datasets collected using older Particle Measuring Systems (PMS) 2-D Cloud (2DC) and 2-D Precipitation (2DP) probes? Second, how similar are the datasets when shatter-correcting post-processing is applied to the 2DC datasets? To answer these questions, a database of measured and parameterized cirrus PSDs – constructed from measurements taken during the Small Particles in Cirrus (SPARTICUS); Mid-latitude Airborne Cirrus Properties Experiment (MACPEX); and Tropical Composition, Cloud, and Climate Coupling (TC4) flight campaigns – is used. Bulk cloud quantities are computed from the 2D-S database in three ways: first, directly from the 2D-S data; second, by applying the 2D-S data to ice PSD parameterizations developed using sets of cirrus measurements collected using the older PMS probes; and third, by applying the 2D-S data to a similar parameterization developed using the 2D-S data themselves. This is done so that measurements of the same cloud volumes by parameterized versions of the 2DC and 2D-S can be compared with one another. It is thereby seen – given the same cloud field and given the same assumptions concerning ice crystal cross-sectional area, density, and radar cross section – that the parameterized 2D-S and the parameterized 2DC predict similar distributions of inferred shortwave extinction coefficient, ice water content, and 94 GHz radar reflectivity. However, the parameterization of the 2DC based on uncorrected data predicts a statistically significantly higher number of total ice crystals and a larger ratio of small ice crystals to large ice crystals than does the parameterized 2D-S. The 2DC parameterization based on shatter-corrected data also predicts statistically different numbers of ice crystals than does the

  11. A statistical comparison of cirrus particle size distributions measured using the 2-D stereo probe during the TC4, SPARTICUS, and MACPEX flight campaigns with historical cirrus datasets

    Science.gov (United States)

    Schwartz, M. Christian

    2017-08-01

    This paper addresses two straightforward questions. First, how similar are the statistics of cirrus particle size distribution (PSD) datasets collected using the Two-Dimensional Stereo (2D-S) probe to cirrus PSD datasets collected using older Particle Measuring Systems (PMS) 2-D Cloud (2DC) and 2-D Precipitation (2DP) probes? Second, how similar are the datasets when shatter-correcting post-processing is applied to the 2DC datasets? To answer these questions, a database of measured and parameterized cirrus PSDs - constructed from measurements taken during the Small Particles in Cirrus (SPARTICUS); Mid-latitude Airborne Cirrus Properties Experiment (MACPEX); and Tropical Composition, Cloud, and Climate Coupling (TC4) flight campaigns - is used. Bulk cloud quantities are computed from the 2D-S database in three ways: first, directly from the 2D-S data; second, by applying the 2D-S data to ice PSD parameterizations developed using sets of cirrus measurements collected using the older PMS probes; and third, by applying the 2D-S data to a similar parameterization developed using the 2D-S data themselves. This is done so that measurements of the same cloud volumes by parameterized versions of the 2DC and 2D-S can be compared with one another. It is thereby seen - given the same cloud field and given the same assumptions concerning ice crystal cross-sectional area, density, and radar cross section - that the parameterized 2D-S and the parameterized 2DC predict similar distributions of inferred shortwave extinction coefficient, ice water content, and 94 GHz radar reflectivity. However, the parameterization of the 2DC based on uncorrected data predicts a statistically significantly higher number of total ice crystals and a larger ratio of small ice crystals to large ice crystals than does the parameterized 2D-S. The 2DC parameterization based on shatter-corrected data also predicts statistically different numbers of ice crystals than does the parameterized 2D-S, but the

  12. Statistically-Based Comparison of the Removal Efficiencies and Resilience Capacities between Conventional and Natural Wastewater Treatment Systems: A Peak Load Scenario

    Directory of Open Access Journals (Sweden)

    Long Ho

    2018-03-01

    Full Text Available Emerging global threats, such as climate change, urbanization and water depletion, are driving forces for finding a feasible substitute for conventional activated sludge (AS) technology, given its low cost-effectiveness. On the other hand, given their low cost and easy operation, nature-based systems such as constructed wetlands (CWs) and waste stabilization ponds (WSPs) appear to be viable options. To examine these systems, a 210-day experiment with a 31-day peak-load scenario was performed. In particular, we followed a deliberate strategy of experimentation, which included a preliminary study, preliminary models, hypothesis tests and power analysis, to compare the removal efficiencies and resilience capacities of the systems. In contrast to comparably high removal efficiencies for organic matter—around 90%—both natural systems showed moderate nutrient removal efficiencies, which implied the need for further treatment to ensure their compliance with environmental standards. During the peak period, the pond treatment systems appeared to be the most robust, as they showed a greater capacity to withstand the organic matter and nitrogen shock loads and were able to recover within a short period. However, the high demand for land—2.5 times larger than that of AS—is a major concern regarding the applicability of WSPs despite their lower operation and maintenance (O&M) costs. It is also worth noting that initial efforts on systematic experimentation appeared to have an essential impact on ensuring statistically and practically meaningful results in this comparison study.
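
    The power analysis mentioned above can be sketched with statsmodels: given an expected effect size and significance level, solve for the number of replicates needed to reach a target power, or for the power achieved with a given sample size. The effect size and targets below are illustrative assumptions, not values from the study.

```python
# Sketch: a priori power analysis for a two-group comparison of removal
# efficiencies. Effect size, alpha, and target power are illustrative.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Cohen's d expected between two treatment systems, significance level,
# and the power we want to reach.
n_per_group = analysis.solve_power(effect_size=0.8, alpha=0.05, power=0.8,
                                   alternative="two-sided")
print(f"required observations per group: {n_per_group:.1f}")

# Conversely: the power achieved with, say, 20 daily measurements per system.
achieved = analysis.solve_power(effect_size=0.8, alpha=0.05, nobs1=20,
                                alternative="two-sided")
print(f"power with 20 observations per group: {achieved:.2f}")
```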

  13. Negotiating end-of-life decision making: a comparison of Japanese and U.S. residents' approaches.

    Science.gov (United States)

    Gabbay, Baback B; Matsumura, Shinji; Etzioni, Shiri; Asch, Steven M; Rosenfeld, Kenneth E; Shiojiri, Toshiaki; Balingit, Peter P; Lorenz, Karl A

    2005-07-01

    To compare Japanese and U.S. resident physicians' attitudes, clinical experiences, and emotional responses regarding making disclosures to patients facing incurable illnesses. From September 2003 to June 2004, the authors used a ten-item self-administered anonymous questionnaire in a cross-sectional survey of 103 internal medicine residents at two U.S. sites in Los Angeles, California, and 244 general medical practice residents at five Japanese sites in Central Honshu, Kyushu, and Okinawa, Japan. The Japanese residents were more likely to favor including the family in disclosing the diagnosis (95% versus 45%, P < 0.05), and some residents reported guilt about these behaviors. The residents' approaches to end-of-life decision making reflect known cultural preferences related to the role of patients and their families. Although Japanese trainees were more likely to endorse the role of the family, they expressed greater uncertainty about their approach. Difficulty and uncertainty in end-of-life decision making were common among both the Japanese and U.S. residents. Both groups would benefit from ethical training to negotiate diverse, changing norms regarding end-of-life decision making.

  14. Decision-making in follow-up after endovascular aneurysm repair based on diameter and volume measurements : a blinded comparison

    NARCIS (Netherlands)

    Prinssen, M; Verhoeven, ELG; Verhagen, HJM; Blankensteijn, JD

    Objective: to assess whether volume, in addition to diameter, measurements facilitate decision-making after endovascular aneurysm repair (EVAR). Material/Methods: patients (n = 82) with an immediately post-EVAR, and at least one follow-up (3-60 months), computed tomographic angiogram (CTA) were

  15. Cancer treatment decision-making among young adults with lung and colorectal cancer: a comparison with adults in middle age.

    Science.gov (United States)

    Mack, Jennifer W; Cronin, Angel; Fasciano, Karen; Block, Susan D; Keating, Nancy L

    2016-09-01

    Our aim is to understand experiences with treatment decision-making among young adults with cancer. We studied patients with lung cancer or colorectal cancer in the Cancer Care Outcomes Research and Surveillance Consortium, a prospective cohort study. We identified 148 young adult patients aged 21-40 years who completed baseline interview questions about cancer treatment decision-making; each was propensity score matched to three middle-aged adult patients aged 41-60 years, for a cohort of 592 patients. Patients were asked about decision-making preferences, family involvement in decision-making, and worries about treatment. An ordinal logistic regression model evaluated factors associated with more treatment worries. Young and middle-aged adults reported similar decision-making preferences (p = 0.80) and roles relative to physicians (p = 0.36). Although family involvement was similar in the age groups (p = 0.21), young adults were more likely to have dependent children in the home (60% versus 28% of middle-aged adults, P < 0.05). Young adults reported more worries about time away from family (p = 0.002), and, in unadjusted analyses, more cancer treatment-related worries (mean number of responses of 'somewhat' or 'very' worried: 2.5 for younger versus 2.2 for middle-aged adults, p = 0.02). However, in adjusted analyses, worries were associated with the presence of dependent children in the home (odds ratio [OR] 1.55, 95% CI = 1.07-2.24, p = 0.02), rather than age. Young adults involve doctors and family members in decisions at rates similar to middle-aged adults but experience more worries about time away from family. Patients with dependent children are especially likely to experience worries. Treatment decision-making strategies should be based on individual preferences and needs rather than age alone. Copyright © 2015 John Wiley & Sons, Ltd.

  16. Impaired decision-making and impulse control in Internet gaming addicts: evidence from the comparison with recreational Internet game users.

    Science.gov (United States)

    Wang, Yifan; Wu, Lingdan; Wang, Lingxiao; Zhang, Yifen; Du, Xiaoxia; Dong, Guangheng

    2017-11-01

    Although Internet games have been proven to be addictive, only a few game players develop online gaming addiction. A large number of players play online games recreationally without being addicted to them. These individuals are defined as recreational Internet gaming users (RGU). So far, no research has investigated decision-making and impulse control in RGU. In the current study, we used a delay discounting (DD) task and a probabilistic discounting (PD) task to examine decision-making and impulse control in 20 healthy controls, 20 subjects with Internet gaming disorder (IGD) and 23 RGU during fMRI scanning. At the behavioral level, RGU showed a lower DD rate and a higher PD rate than subjects with IGD, and there was no significant difference between RGU and healthy controls on the DD and PD rates. At the neural level, RGU showed increased neural responses in the parahippocampal gyrus, the anterior cingulate cortex, the medial frontal gyrus and the inferior frontal gyrus as compared with subjects with IGD. These brain regions may play an important role in preventing RGU from developing addiction. The results suggest that the RGU are capable of inhibiting impulses through additional cognitive effort and that the subjects with IGD have deficits in decision-making and impulse control, which are associated with brain dysfunction. © 2016 Society for the Study of Addiction.

  17. Characterization of microbial associations with methanotrophic archaea and sulfate-reducing bacteria through statistical comparison of nested Magneto-FISH enrichments

    Directory of Open Access Journals (Sweden)

    Elizabeth Trembath-Reichert

    2016-04-01

    Full Text Available Methane seep systems along continental margins host diverse and dynamic microbial assemblages, sustained in large part through the microbially mediated process of sulfate-coupled Anaerobic Oxidation of Methane (AOM). This methanotrophic metabolism has been linked to consortia of anaerobic methane-oxidizing archaea (ANME) and sulfate-reducing bacteria (SRB). These two groups are the focus of numerous studies; however, less is known about the wide diversity of other seep associated microorganisms. We selected a hierarchical set of FISH probes targeting a range of Deltaproteobacteria diversity. Using the Magneto-FISH enrichment technique, we then magnetically captured CARD-FISH hybridized cells and their physically associated microorganisms from a methane seep sediment incubation. DNA from nested Magneto-FISH experiments was analyzed using Illumina tag 16S rRNA gene sequencing (iTag). Enrichment success and potential bias with iTag was evaluated in the context of full-length 16S rRNA gene clone libraries, CARD-FISH, functional gene clone libraries, and iTag mock communities. We determined commonly used Earth Microbiome Project (EMP) iTAG primers introduced bias in some common methane seep microbial taxa that reduced the ability to directly compare OTU relative abundances within a sample, but comparison of relative abundances between samples (in nearly all cases) and whole community-based analyses were robust. The iTag dataset was subjected to statistical co-occurrence measures of the most abundant OTUs to determine which taxa in this dataset were most correlated across all samples. Many non-canonical microbial partnerships were statistically significant in our co-occurrence network analysis, most of which were not recovered with conventional clone library sequencing, demonstrating the utility of combining Magneto-FISH and iTag sequencing methods for hypothesis generation of associations within complex microbial communities. Network analysis pointed to

  18. Characterization of microbial associations with methanotrophic archaea and sulfate-reducing bacteria through statistical comparison of nested Magneto-FISH enrichments.

    Science.gov (United States)

    Trembath-Reichert, Elizabeth; Case, David H; Orphan, Victoria J

    2016-01-01

    Methane seep systems along continental margins host diverse and dynamic microbial assemblages, sustained in large part through the microbially mediated process of sulfate-coupled Anaerobic Oxidation of Methane (AOM). This methanotrophic metabolism has been linked to consortia of anaerobic methane-oxidizing archaea (ANME) and sulfate-reducing bacteria (SRB). These two groups are the focus of numerous studies; however, less is known about the wide diversity of other seep associated microorganisms. We selected a hierarchical set of FISH probes targeting a range of Deltaproteobacteria diversity. Using the Magneto-FISH enrichment technique, we then magnetically captured CARD-FISH hybridized cells and their physically associated microorganisms from a methane seep sediment incubation. DNA from nested Magneto-FISH experiments was analyzed using Illumina tag 16S rRNA gene sequencing (iTag). Enrichment success and potential bias with iTag was evaluated in the context of full-length 16S rRNA gene clone libraries, CARD-FISH, functional gene clone libraries, and iTag mock communities. We determined commonly used Earth Microbiome Project (EMP) iTAG primers introduced bias in some common methane seep microbial taxa that reduced the ability to directly compare OTU relative abundances within a sample, but comparison of relative abundances between samples (in nearly all cases) and whole community-based analyses were robust. The iTag dataset was subjected to statistical co-occurrence measures of the most abundant OTUs to determine which taxa in this dataset were most correlated across all samples. Many non-canonical microbial partnerships were statistically significant in our co-occurrence network analysis, most of which were not recovered with conventional clone library sequencing, demonstrating the utility of combining Magneto-FISH and iTag sequencing methods for hypothesis generation of associations within complex microbial communities. Network analysis pointed to many co
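
    The co-occurrence analysis described in this record can be illustrated, in simplified form, as pairwise correlation of OTU abundance profiles across samples, keeping strong and significant pairs as network edges. The OTU table, the Spearman measure and the cutoffs below are illustrative assumptions, not the study's actual pipeline.

```python
# Sketch of a simple OTU co-occurrence analysis: pairwise Spearman correlations
# across samples, keeping only strong, significant pairs as network "edges".
# The OTU table and thresholds below are invented for illustration.
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_samples, n_otus = 12, 6
otu_table = rng.poisson(lam=20, size=(n_samples, n_otus))  # rows = samples, cols = OTUs
otu_names = [f"OTU_{i}" for i in range(n_otus)]

edges = []
for i, j in combinations(range(n_otus), 2):
    rho, p = spearmanr(otu_table[:, i], otu_table[:, j])
    if abs(rho) >= 0.6 and p < 0.05:                       # arbitrary cutoffs for the example
        edges.append((otu_names[i], otu_names[j], round(float(rho), 2)))

print("Co-occurrence edges (OTU, OTU, rho):", edges)
```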

  19. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  20. Usage Statistics

    Science.gov (United States)

  1. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  2. Frog Statistics

    Science.gov (United States)

    Web server usage statistics (wwwstats output) for the Whole Frog Project and Virtual Frog Dissection pages; the reported figures exclude duplicate or extraneous accesses and under-represent the total bytes requested.

  3. Making a difference? A comparison between multi-sensory and regular storytelling for persons with profound intellectual and multiple disabilities.

    Science.gov (United States)

    Ten Brug, A; Van der Putten, A A J; Penne, A; Maes, B; Vlaskamp, C

    2016-11-01

    Multi-sensory storytelling (MSST) was developed to include persons with profound intellectual and multiple disabilities in storytelling culture. In order to increase the listeners' attention, MSST stories are individualised and use multiple sensory stimuli to support the verbal text. In order to determine the value of MSST, this study compared listeners' attention under two conditions: (1) being read MSST books and (2) being read regular stories. A non-randomised control study was executed in which the intervention group read MSST books (n = 45) and a comparison group (n = 31) read regular books. Books were read 10 times during a 5-week period. The 1st, 5th and 10th storytelling sessions were recorded on video in both groups, and the percentage of attention directed to the book and/or stimuli and to the storyteller was scored by a trained and independent rater. Two repeated-measures analyses (with the storytelling condition as a between-subjects factor and the three recorded sessions as the repeated factor) were performed to determine the difference between the groups in terms of attention directed to the book/stimuli (first analysis) and storyteller (second analysis). A further analysis established whether the level of attention changed between the reading sessions and whether there was an interaction effect between the repetition of the book and the storytelling condition. The attention directed to the book and/or the stimuli was significantly higher in the MSST group than in the comparison group. No significant difference between the two groups was found in the attention directed to the storyteller. For MSST stories, most attention was observed during the fifth reading session, while for regular stories, the fifth session attracted the least attention from the listeners. The persons with profound intellectual and multiple disabilities paid more attention to the book and/or stimuli in the MSST condition compared with the regular storytelling group. Being more attentive towards

  4. Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end.

    Science.gov (United States)

    Shu, Lisa L; Mazar, Nina; Gino, Francesca; Ariely, Dan; Bazerman, Max H

    2012-09-18

    Many written forms required by businesses and governments rely on honest reporting. Proof of honest intent is typically provided through signature at the end of, e.g., tax returns or insurance policy forms. Still, people sometimes cheat to advance their financial self-interests-at great costs to society. We test an easy-to-implement method to discourage dishonesty: signing at the beginning rather than at the end of a self-report, thereby reversing the order of the current practice. Using laboratory and field experiments, we find that signing before-rather than after-the opportunity to cheat makes ethics salient when they are needed most and significantly reduces dishonesty.

  5. Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end

    Science.gov (United States)

    Shu, Lisa L.; Mazar, Nina; Gino, Francesca; Ariely, Dan; Bazerman, Max H.

    2012-01-01

    Many written forms required by businesses and governments rely on honest reporting. Proof of honest intent is typically provided through signature at the end of, e.g., tax returns or insurance policy forms. Still, people sometimes cheat to advance their financial self-interests—at great costs to society. We test an easy-to-implement method to discourage dishonesty: signing at the beginning rather than at the end of a self-report, thereby reversing the order of the current practice. Using laboratory and field experiments, we find that signing before—rather than after—the opportunity to cheat makes ethics salient when they are needed most and significantly reduces dishonesty. PMID:22927408

  6. Comparison of deck- and trial-based approaches to advantageous decision making on the Iowa Gambling Task.

    Science.gov (United States)

    Visagan, Ravindran; Xiang, Ally; Lamar, Melissa

    2012-06-01

    We compared the original deck-based model of advantageous decision making assessed with the Iowa Gambling Task (IGT) with a trial-based approach across behavioral and physiological outcomes in 33 younger adults (15 men, 18 women; 22.2 ± 3.7 years of age). One administration of the IGT with simultaneous measurement of skin conductance responses (SCRs) was performed and the two methods applied: (a) the original approach of subtracting disadvantageous picks of Decks A and B from advantageous picks of Decks C and D and (b) a trial-based approach focused on the financial outcome for each deck leading up to the trial in question. When directly compared, the deck-based approach resulted in a more advantageous behavioral profile than did the trial-based approach. Analysis of SCR data revealed no significant differences between methods for physiological measurements of SCR fluctuations or anticipatory responses to disadvantageous picks. Post hoc investigation of the trial-based method revealed Deck B contributed to both advantageous and disadvantageous decision making for the majority of participants. When divided by blocks of 20, the number of advantageous to disadvantageous choices reversed as the task progressed despite the total number of picks from Deck B remaining high. SCR fluctuations for Deck B, although not significantly different from the other decks, did show a sharp decline after the first block of 20 and remained below levels for Decks C and D toward the end of the task, suggesting that participants may have gained knowledge of the frequency of loss for this deck. (c) 2012 APA, all rights reserved
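
    The deck-based score referred to above is simply the count of advantageous picks (Decks C and D) minus disadvantageous picks (Decks A and B), often reported per block of 20 trials. A minimal sketch of that calculation, using an invented choice sequence, follows.

```python
# Sketch: deck-based IGT scoring as described in the abstract --
# (picks from Decks C + D) minus (picks from Decks A + B), per block of 20 trials.
# The choice sequence below is invented for illustration.
import random

random.seed(1)
choices = [random.choice("ABCD") for _ in range(100)]   # 100 IGT selections

def net_scores(choices, block_size=20):
    scores = []
    for start in range(0, len(choices), block_size):
        block = choices[start:start + block_size]
        advantageous = sum(c in ("C", "D") for c in block)
        disadvantageous = sum(c in ("A", "B") for c in block)
        scores.append(advantageous - disadvantageous)
    return scores

print("Net score per block of 20:", net_scores(choices))
print("Overall net score:", sum(net_scores(choices)))
```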

  7. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  8. Risk-informed decision making in the nuclear industry: Application and effectiveness comparison of different genetic algorithm techniques

    International Nuclear Information System (INIS)

    Gjorgiev, Blaže; Kančev, Duško; Čepin, Marko

    2012-01-01

    Highlights: ► Multi-objective optimization of STI based on risk-informed decision making. ► Four different genetic algorithm (GA) techniques are used as the optimization tool. ► Advantages and disadvantages of the four different GAs applied are emphasized. - Abstract: The risk-informed decision making (RIDM) process, where insights gained from the probabilistic safety assessment are contemplated together with other engineering insights, is gaining ever-increasing attention in the process industries. Increasing safety system availability by applying RIDM is one of the prime goals for authorities operating nuclear power plants. Additionally, equipment ageing is gradually becoming a major concern in the process industries and especially in the nuclear industry, since more and more safety-related components are approaching or are already in their wear-out phase. A significant difficulty regarding the consideration of ageing effects on equipment (un)availability is the immense uncertainty associated with the available equipment ageing data. This paper presents an approach for safety system unavailability reduction by optimizing the related test and maintenance schedule suggested by the technical specifications in the nuclear industry. Given the RIDM philosophy, two additional insights, i.e. ageing data uncertainty and test and maintenance costs, are considered along with unavailability insights gained from the probabilistic safety assessment for a selected standard safety system. In that sense, an approach for multi-objective optimization of the equipment surveillance test interval is proposed herein. Three objective functions, one for each of the three insights discussed above, comprise the multi-objective nature of the optimization process. The genetic algorithm technique is utilized as the optimization tool. Four different types of genetic algorithms are utilized, and a comparative analysis is conducted given the
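
    As a rough illustration of the kind of optimization described here, the sketch below runs a very small genetic algorithm over a surveillance test interval, minimizing a weighted sum of three placeholder objectives (unavailability, cost, ageing-uncertainty penalty). The objective functions, weights and bounds are invented for the example and are not the models or the four GA variants used in the paper.

```python
# Minimal genetic-algorithm sketch for tuning a surveillance test interval (STI).
# The three objective functions below are placeholder shapes chosen only to make
# the example run; they are NOT the models used in the paper, and the weighted-sum
# scalarization is just one simple way to combine multiple objectives.
import random

random.seed(42)

def unavailability(t):   return 0.5e-3 * t + 8.0 / t      # penalizes both very long and very short intervals
def cost(t):             return 2000.0 / t                 # more frequent testing costs more
def ageing_penalty(t):   return 1e-4 * t ** 1.5            # longer intervals worsen ageing uncertainty

def fitness(t, w=(1.0, 1e-3, 1.0)):
    return w[0] * unavailability(t) + w[1] * cost(t) + w[2] * ageing_penalty(t)

def genetic_algorithm(pop_size=30, generations=60, lo=24.0, hi=8760.0):
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]       # candidate intervals in hours
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0                                 # arithmetic crossover
            child += random.gauss(0.0, 0.05 * (hi - lo))          # Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_algorithm()
print(f"Best test interval ~ {best:.0f} h, fitness = {fitness(best):.4f}")
```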

  9. Pressure ulcer risk assessment and prevention: what difference does a risk scale make? A comparison between Norway and Ireland.

    Science.gov (United States)

    Johansen, E; Moore, Z; van Etten, M; Strapp, H

    2014-07-01

    To explore similarities and differences in nurses' views on risk assessment practices and preventive care activities in a context where patients' risk of developing pressure ulcers is assessed using clinical judgment (Norway) and a context where patients' risk of developing pressure ulcers is assessed using a formal structured risk assessment combined with clinical judgment (Ireland). A descriptive, qualitative design was employed across two different care settings with a total of 14 health care workers, nine from Norway and five from Ireland. Regardless of whether risk assessment was undertaken using clinical judgment or formal structured risk assessment, the identified risk factors, at-risk patients and appropriate preventive initiatives discussed by participants were similar across care settings. Furthermore, risk assessment did not necessarily result in the planning and implementation of appropriate pressure ulcer prevention initiatives. Thus, in this instance, use of a formal risk assessment tool does not seem to make any difference to the planning, initiation and evaluation of pressure ulcer prevention strategies. Regardless of the method of risk assessment, patients at risk of developing pressure ulcers are detected, suggesting that the practice of risk assessment should be re-evaluated. Moreover, appropriate preventive interventions were described. However, the missing link between risk assessment and documented care planning is of concern, and barriers to appropriate pressure ulcer documentation should be explored further. This work is partly funded by a research grant from the Norwegian Nurses Organisation (NNO) (Norsk Sykepleierforbund NSF) in 2012. The authors have no conflict of interest to declare.

  10. Engaging Diverse Students in Statistical Inquiry: A Comparison of Learning Experiences and Outcomes of Under-Represented and Non-Underrepresented Students Enrolled in a Multidisciplinary Project-Based Statistics Course

    Science.gov (United States)

    Dierker, Lisa; Alexander, Jalen; Cooper, Jennifer L.; Selya, Arielle; Rose, Jennifer; Dasgupta, Nilanjana

    2016-01-01

    Introductory statistics needs innovative, evidence-based teaching practices that support and engage diverse students. To evaluate the success of a multidisciplinary, project-based course, we compared experiences of under-represented (URM) and non-underrepresented students in 4 years of the course. While URM students considered the material more…

  11. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  12. Statistical energy as a tool for binning-free, multivariate goodness-of-fit tests, two-sample comparison and unfolding

    International Nuclear Information System (INIS)

    Aslan, B.; Zech, G.

    2005-01-01

    We introduce the novel concept of statistical energy as a statistical tool. We define statistical energy of statistical distributions in a similar way as for electric charge distributions. Charges of opposite sign are in a state of minimum energy if they are equally distributed. This property is used to check whether two samples belong to the same parent distribution, to define goodness-of-fit tests and to unfold distributions distorted by measurement. The approach is binning-free and especially powerful in multidimensional applications
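
    The idea can be sketched as follows: compute a distance-based energy statistic between the two samples and judge its significance with a permutation test. The example below uses a Euclidean-distance kernel for simplicity, whereas the authors work with related distance functions (e.g. logarithmic), so it illustrates the concept rather than reproducing their exact statistic.

```python
# Minimal sketch of a binning-free two-sample "energy" test: an energy statistic
# computed from pairwise distances, assessed with a permutation test.
import numpy as np
from scipy.spatial.distance import cdist

def energy_statistic(x, y):
    """Szekely-style energy statistic: larger values suggest different parent distributions."""
    a = cdist(x, y).mean()   # mean cross-sample distance
    b = cdist(x, x).mean()   # mean within-sample distance (x)
    c = cdist(y, y).mean()   # mean within-sample distance (y)
    return 2 * a - b - c

def permutation_pvalue(x, y, n_perm=199, seed=0):
    rng = np.random.default_rng(seed)
    observed = energy_statistic(x, y)
    pooled = np.vstack([x, y])
    n = len(x)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # shuffle rows, i.e. relabel the pooled sample
        if energy_statistic(pooled[:n], pooled[n:]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=(80, 2))
y = rng.normal(0.5, 1.0, size=(80, 2))            # shifted sample
print("permutation p-value:", permutation_pvalue(x, y))
```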

  13. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  14. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  15. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  16. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  17. Histoplasmosis Statistics

    Science.gov (United States)

  18. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  19. Evaluation of the Trail Making Test and interval timing as measures of cognition in healthy adults: comparisons by age, education, and gender.

    Science.gov (United States)

    Płotek, Włodzimierz; Łyskawa, Wojciech; Kluzik, Anna; Grześkowiak, Małgorzata; Podlewski, Roland; Żaba, Zbigniew; Drobnik, Leon

    2014-02-03

    Human cognitive functioning can be assessed using different methods of testing. Age, level of education, and gender may influence the results of cognitive tests. The well-known Trail Making Test (TMT), which is often used to measure frontal lobe function, and an experimental test of Interval Timing (IT) were compared. The methods used in IT included reproduction of auditory and visual stimuli, with the subsequent production of time intervals of 1-, 2-, 5-, and 7-second durations with no pattern. Subjects included 64 healthy adult volunteers aged 18-63 (33 women, 31 men). Comparisons were made based on age, education, and gender. The TMT was performed quickly and was influenced by age, education, and gender. All reproduced visual and produced intervals were shortened, and the reproduction of auditory stimuli was more complex. Age, education, and gender had a more pronounced impact on the cognitive test than on the interval timing test. The reproduction of short auditory stimuli was more accurate in comparison to the other modalities used in the IT test. Interval timing, when compared with the TMT, offers an interesting testing possibility. Further studies are necessary to confirm the initial observations.

  20. A comparison of processing load during non-verbal decision-making in two individuals with aphasia

    Directory of Open Access Journals (Sweden)

    Salima Suleman

    2015-05-01

    Full Text Available INTRODUCTION A growing body of evidence suggests people with aphasia (PWA) can have impairments to cognitive functions such as attention, working memory and executive functions.(1-5) Such cognitive impairments have been shown to negatively affect the decision-making (DM) abilities of adults with neurological damage.(6,7) However, little is known about the DM abilities of PWA.(8) Pupillometry is “the measurement of changes in pupil diameter”.(9; p.1) Researchers have reported a positive relationship between processing load and phasic pupil size (i.e., as processing load increases, pupil size increases).(10) Thus pupillometry has the potential to be a useful tool for investigating processing load during DM in PWA. AIMS The primary aim of this study was to establish the feasibility of using pupillometry during a non-verbal DM task with PWA. The secondary aim was to explore non-verbal DM performance in PWA and determine the relationship between DM performance and processing load using pupillometry. METHOD DESIGN. A single-subject case-study design with two participants was used in this study. PARTICIPANTS. Two adult males with anomic aphasia participated in this study. Participants were matched for age and education. Both participants were independent, able to drive, and had legal autonomy. MEASURES. PERFORMANCE ON A DM TASK. We used a computerized risk-taking card game called the Iowa Gambling Task (IGT) as our non-verbal DM task.(11) In the IGT, participants made 100 selections (via eye gaze) from four decks of cards presented on the computer screen with the goal of maximizing their overall hypothetical monetary gain. PROCESSING LOAD. The EyeLink 1000+ eye tracking system was used to collect pupil size measures while participants deliberated before each deck selection during the IGT. For this analysis, we calculated change in pupil size as a measure of processing load. RESULTS P1. P1 made increasingly advantageous decisions as the task progressed (Fig. 1). When

  1. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  2. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  3. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  4. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical anal- ysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  5. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
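
    A minimal illustration of the descriptive measures mentioned above (location, spread and a transformation), computed on made-up skewed data:

```python
# Small sketch of common descriptive measures: location, spread, and a
# variance-stabilizing log transformation, applied to invented skewed data.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
x = rng.lognormal(mean=2.0, sigma=0.75, size=200)    # skewed, positive data

print("mean   :", round(float(x.mean()), 2))
print("median :", round(float(np.median(x)), 2))
print("std    :", round(float(x.std(ddof=1)), 2))    # sample standard deviation
print("IQR    :", round(float(np.percentile(x, 75) - np.percentile(x, 25)), 2))
print("skewness before/after log transform:",
      round(float(skew(x)), 2), round(float(skew(np.log(x))), 2))
```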

  6. Understanding Statistics and Statistics Education: A Chinese Perspective

    Science.gov (United States)

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  7. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a """"quasi-equilibrium"""" co

  8. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  9. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  10. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  11. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  12. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  13. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  14. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to solving central problems of multivariate statistics which until now have remained unsolved. Traditional statistical methods, based on the idea of infinite sampling, often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  15. Gonorrhea Statistics

    Science.gov (United States)

  16. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  17. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  18. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  19. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  20. Isotopic safeguards statistics

    International Nuclear Information System (INIS)

    Timmerman, C.L.; Stewart, K.B.

    1978-06-01

    The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are the paired comparison and the regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories. Paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model [y − y₀ = β(x − x₀)]. The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models. These methods are: (1) the usual least squares fitting approach where x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and considers that x and y are both measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The ratio of Pu/U versus ²³⁵D (in which ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent) using actual numbers is the isotopic function illustrated. Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach by simple comparison of the given regression variances with the random variance from the paired comparison results.
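
    The two fitting methods described above can be sketched for the intercept model y = α + βx as follows. The data and the error-variance ratio δ are invented for illustration and are not the Yankee Rowe measurements.

```python
# Sketch contrasting the two fitting approaches named in the abstract for the
# intercept model y = alpha + beta * x: ordinary least squares (x error-free)
# and Deming regression (errors in both x and y).  Data and the error-variance
# ratio `delta` are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
true_x = np.linspace(1.0, 10.0, 25)
x = true_x + rng.normal(0, 0.3, true_x.size)          # x measured with error
y = 0.5 + 0.8 * true_x + rng.normal(0, 0.3, true_x.size)

# Ordinary least squares: assumes x is exact.
beta_ols, alpha_ols = np.polyfit(x, y, 1)

# Deming regression: delta = var(y-error) / var(x-error), assumed known here.
delta = 1.0
sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]
beta_dem = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
alpha_dem = y.mean() - beta_dem * x.mean()

print(f"OLS:    y = {alpha_ols:.3f} + {beta_ols:.3f} x")
print(f"Deming: y = {alpha_dem:.3f} + {beta_dem:.3f} x")
```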

  1. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  2. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  3. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is presented in a self-sufficient, lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their ranges of application and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar

  4. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  5. Statistical learning and prejudice.

    Science.gov (United States)

    Madison, Guy; Ullén, Fredrik

    2012-12-01

    Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.

  6. Mapping cell populations in flow cytometry data for cross‐sample comparison using the Friedman–Rafsky test statistic as a distance measure

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu

    2015-01-01

    Abstract Flow cytometry (FCM) is a fluorescence‐based single‐cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap‐FR, a novel method for cell population mapping across FCM samples. FlowMap‐FR is based on the Friedman–Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap‐FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap‐FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap‐FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap‐FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap‐FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback–Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL‐distance in distinguishing

  7. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell
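
    The FR statistic at the core of FlowMap-FR can be sketched as: pool the two samples, build a minimal spanning tree on their pairwise distances, and count the edges that join points from different samples (few cross-sample edges suggest different parent distributions). The example below is a bare-bones illustration with simulated data, not the FlowMap-FR implementation.

```python
# Sketch of the Friedman-Rafsky two-sample statistic: build a Euclidean minimal
# spanning tree over the pooled samples and count edges joining points from
# different samples.  Populations and sizes below are simulated for illustration.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def fr_cross_edges(x, y):
    pooled = np.vstack([x, y])
    labels = np.array([0] * len(x) + [1] * len(y))
    dist = squareform(pdist(pooled))                  # dense pairwise-distance matrix
    mst = minimum_spanning_tree(dist).tocoo()         # sparse MST of the pooled sample
    cross = sum(labels[i] != labels[j] for i, j in zip(mst.row, mst.col))
    return cross, mst.nnz                             # cross-sample edges, total edges

rng = np.random.default_rng(7)
pop_a = rng.normal(0.0, 1.0, size=(80, 3))            # e.g. one gated cell population
pop_b = rng.normal(1.5, 1.0, size=(80, 3))            # a shifted population
cross, total = fr_cross_edges(pop_a, pop_b)
print(f"{cross} of {total} MST edges connect the two samples")
```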

  8. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  9. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  10. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  11. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  12. Cross-site comparison of land-use decision-making and its consequences across land systems with a generalized agent-based model.

    Science.gov (United States)

    Magliocca, Nicholas R; Brown, Daniel G; Ellis, Erle C

    2014-01-01

    Local changes in land use result from the decisions and actions of land-users within land systems, which are structured by local and global environmental, economic, political, and cultural contexts. Such cross-scale causation presents a major challenge for developing a general understanding of how local decision-making shapes land-use changes at the global scale. This paper implements a generalized agent-based model (ABM) as a virtual laboratory to explore how global and local processes influence the land-use and livelihood decisions of local land-users, operationalized as settlement-level agents, across the landscapes of six real-world test sites. Test sites were chosen in USA, Laos, and China to capture globally-significant variation in population density, market influence, and environmental conditions, with land systems ranging from swidden to commercial agriculture. Publicly available global data were integrated into the ABM to model cross-scale effects of economic globalization on local land-use decisions. A suite of statistics was developed to assess the accuracy of model-predicted land-use outcomes relative to observed and random (i.e. null model) landscapes. At four of six sites, where environmental and demographic forces were important constraints on land-use choices, modeled land-use outcomes were more similar to those observed across sites than the null model. At the two sites in which market forces significantly influenced land-use and livelihood decisions, the model was a poorer predictor of land-use outcomes than the null model. Model successes and failures in simulating real-world land-use patterns enabled the testing of hypotheses on land-use decision-making and yielded insights on the importance of missing mechanisms. The virtual laboratory approach provides a practical framework for systematic improvement of both theory and predictive skill in land change science based on a continual process of experimentation and model enhancement.

  13. Cross-site comparison of land-use decision-making and its consequences across land systems with a generalized agent-based model.

    Directory of Open Access Journals (Sweden)

    Nicholas R Magliocca

    Full Text Available Local changes in land use result from the decisions and actions of land-users within land systems, which are structured by local and global environmental, economic, political, and cultural contexts. Such cross-scale causation presents a major challenge for developing a general understanding of how local decision-making shapes land-use changes at the global scale. This paper implements a generalized agent-based model (ABM) as a virtual laboratory to explore how global and local processes influence the land-use and livelihood decisions of local land-users, operationalized as settlement-level agents, across the landscapes of six real-world test sites. Test sites were chosen in USA, Laos, and China to capture globally-significant variation in population density, market influence, and environmental conditions, with land systems ranging from swidden to commercial agriculture. Publicly available global data were integrated into the ABM to model cross-scale effects of economic globalization on local land-use decisions. A suite of statistics was developed to assess the accuracy of model-predicted land-use outcomes relative to observed and random (i.e. null model) landscapes. At four of six sites, where environmental and demographic forces were important constraints on land-use choices, modeled land-use outcomes were more similar to those observed across sites than the null model. At the two sites in which market forces significantly influenced land-use and livelihood decisions, the model was a poorer predictor of land-use outcomes than the null model. Model successes and failures in simulating real-world land-use patterns enabled the testing of hypotheses on land-use decision-making and yielded insights on the importance of missing mechanisms. The virtual laboratory approach provides a practical framework for systematic improvement of both theory and predictive skill in land change science based on a continual process of experimentation and model

  14. Optical coherence tomography-based decision making in exudative age-related macular degeneration: comparison of time- vs spectral-domain devices.

    Science.gov (United States)

    Cukras, C; Wang, Y D; Meyerle, C B; Forooghian, F; Chew, E Y; Wong, W T

    2010-05-01

    To determine whether optical coherence tomography (OCT) device type influences clinical grading of OCT imaging in the context of exudative age-related macular degeneration (AMD). Ninety-six paired OCT scans from 49 patients with active exudative AMD were obtained on both the time-domain Stratus OCT system and the spectral-domain Cirrus OCT system at the same visit. Three independent graders judged each scan for the presence of intraretinal fluid (IRF) or subretinal fluid (SRF). The degree of grader consensus was evaluated and the ability of the systems to detect the presence of disease activity was analysed. Cirrus OCT generated a higher degree of inter-grader consensus than Stratus OCT, with higher intraclass correlation coefficients for all parameters analysed. A pair-wise comparison of Cirrus OCT with Stratus OCT systems revealed that Cirrus-based gradings more frequently reported the presence of SRF and IRF and detected overall neovascular activity at a higher rate (P<0.05) compared with Stratus-based gradings. The choice of time-domain (Stratus) vs spectral-domain (Cirrus) OCT systems has a measurable impact on clinical decision making in exudative AMD. Spectral-domain OCT systems may be able to generate more consensus in clinical interpretation and, in particular cases, detect disease activity not detected by time-domain systems. Clinical trials using OCT-based clinical evaluations of exudative AMD may need to account for these inter-system differences in planning and analysis.

  15. The value of time-to-onset in statistical signal detection of adverse drug reactions : a comparison with disproportionality analysis in spontaneous reports from the Netherlands

    NARCIS (Netherlands)

    Scholl, Joep H G; van Puijenbroek, Eugène P

    2016-01-01

    PURPOSE: In pharmacovigilance, the commonly used disproportionality analysis (DPA) in statistical signal detection is known to have its limitations. The aim of this study was to investigate the value of the time to onset (TTO) of ADRs in addition to DPA. METHODS: We performed a pilot study using

  16. Big-fish-little-pond social comparison and local dominance effects : Integrating new statistical models, methodology, design, theory and substantive implications

    NARCIS (Netherlands)

    Marsh, Herbert W.; Kuyper, Hans; Morin, Alexandre J. S.; Philip, D. Parker; Seaton, Marjorie

    2014-01-01

    We offer new theoretical, substantive, statistical, design, and methodological insights into the seemingly paradoxical negative effects of school- and class-average achievement (ACH) on academic self-concept (ASC) the big-fish-little-pond-effect (BFLPE; 15,356 Dutch 9th grade students from 651

  17. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  18. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  19. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  20. Statistics Using Just One Formula

    Science.gov (United States)

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…

  1. PI-3 correlations and statistical evaluation results

    International Nuclear Information System (INIS)

    Pernica, R.; Cizek, J.

    1992-01-01

    Empirical Critical Heat Flux (CHF) correlations PI-3, which have the widest range of validity for flow conditions in both hexagonal and square rod bundle geometries, are presented and compared with published CHF correlations. They are valid for vertical water upflow through rod bundles with relatively wide and very tight rod lattices, and include axial and radial non-uniform heating. The correlations were developed with the use of more than 6000 data obtained from 119 electrically heated rod bundles. Comprehensive results of statistical evaluations of the new correlations are presented for various data bases. Also presented is a comparison of statistical evaluations of several well-known CHF correlations in the experimental data base used. A procedure which makes it possible to directly determine the probability that CHF does not occur is described for the purpose of nuclear safety assessment. (author) 8 tabs., 32 figs., 11 refs

  2. A note on the statistical analysis of point judgment matrices

    Directory of Open Access Journals (Sweden)

    MG Kabera

    2013-06-01

    Full Text Available The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment matrix. In the present paper some statistical approaches to extracting the weights of objects from a judgment matrix are reviewed and new ideas which are rooted in the traditional method of paired comparisons are introduced.
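
    For readers unfamiliar with the weight-extraction step mentioned above, the sketch below shows two classical estimators, the principal eigenvector and the row geometric mean, applied to a small hypothetical Saaty-style judgment matrix; it is an illustration, not the specific statistical approaches proposed in the paper.

```python
# Minimal sketch: extracting priority weights from a Saaty-style pairwise
# judgment matrix. The 3x3 matrix below is hypothetical.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Principal-eigenvector method
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w_eig = np.abs(eigvecs[:, k].real)
w_eig /= w_eig.sum()

# Geometric-mean (logarithmic least squares) method
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
w_gm = gm / gm.sum()

# Saaty consistency ratio, using the random index RI = 0.58 for n = 3
lambda_max = eigvals.real[k]
ci = (lambda_max - 3) / (3 - 1)
print("eigenvector weights:", np.round(w_eig, 3))
print("geometric-mean weights:", np.round(w_gm, 3))
print("consistency ratio:", round(ci / 0.58, 3))
```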

  3. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  4. Comparison of ESWL outcome between Wolf Piezolith 3000 vs. Storz Modulith SLK. It is the man who makes efficient the machine.

    Science.gov (United States)

    Albino, Giuseppe; Marucco, Ettore Cirillo

    2012-12-01

    The choice of an extracorporeal lithotripter for extracorporeal shock wave lithotripsy (ESWL) of urinary stones must be made on efficacy criteria. It is easy to demonstrate the advantages of third-generation lithotripters over previous generations. The purpose of this study was to evaluate whether differences in effectiveness can be established between two third-generation lithotripters. We report on the last 100 ESWL treatments carried out with the Wolf Piezolith 3000 and the last 100 with the Storz Modulith SLK, all performed by the same single operator. Stones were stratified by site and size. The comparison considered the number of shock waves per session, the number of sessions, and the outcomes. The results showed no statistically significant differences: the cumulative stone-free rate was 93% for the Wolf Piezolith and 91% for the Storz Modulith. The technical differences between lithotripters concern the energy delivered, the shape of the acoustic focus, the depth of focus, the coupling surface, the mobility of the head, the alignment mode and the simultaneous use of ultrasonographic and radiological pointing. These differences are not apparent from published series. Furthermore, data on patients and stones that were not treated because of difficulty in stone targeting due to the depth of the acoustic focus (high BMI), the limited inclination of the head, or patient intolerance to shock waves are not usually reported. The results obtained by the same operator are comparable even when obtained with different machines of the same generation. The real differences could emerge if the patients excluded from evaluation were also taken into account because, within the same generation of lithotripters, the results depend on the operator, while a patient's eligibility for treatment depends on the characteristics of the machine.

  5. Comparison of U-spatial statistics and C-A fractal models for delineating anomaly patterns of porphyry-type Cu geochemical signatures in the Varzaghan district, NW Iran

    Science.gov (United States)

    Ghezelbash, Reza; Maghsoudi, Abbas

    2018-05-01

    The delineation of populations of stream sediment geochemical data is a crucial task in regional exploration surveys. In this contribution, uni-element stream sediment geochemical data of Cu, Au, Mo, and Bi have been subjected to two reliable anomaly-background separation methods, namely, the concentration-area (C-A) fractal and the U-spatial statistics methods to separate geochemical anomalies related to porphyry-type Cu mineralization in northwest Iran. The quantitative comparison of the delineated geochemical populations using the modified success-rate curves revealed the superiority of the U-spatial statistics method over the fractal model. Moreover, geochemical maps of investigated elements revealed strongly positive correlations between strong anomalies and Oligocene-Miocene intrusions in the study area. Therefore, follow-up exploration programs should focus on these areas.

  6. Analysing the spatial patterns of livestock anthrax in Kazakhstan in relation to environmental factors: a comparison of local (Gi*) and morphology cluster statistics

    Directory of Open Access Journals (Sweden)

    Ian T. Kracalik

    2012-11-01

    Full Text Available We compared a local clustering and a cluster morphology statistic using anthrax outbreaks in large (cattle) and small (sheep and goats) domestic ruminants across Kazakhstan. The Getis-Ord (Gi*) statistic and a multidirectional optimal ecotope algorithm (AMOEBA) were compared using 1st, 2nd and 3rd order Rook contiguity matrices. Multivariate statistical tests were used to evaluate the environmental signatures between clusters and non-clusters from the AMOEBA and Gi* tests. A logistic regression was used to define a risk surface for anthrax outbreaks and to compare agreement between clustering methodologies. Tests revealed differences in the spatial distribution of clusters as well as the total number of clusters in large ruminants for AMOEBA (n = 149) and for small ruminants (n = 9). In contrast, Gi* revealed fewer large ruminant clusters (n = 122) and more small ruminant clusters (n = 61). Significant environmental differences were found between groups using the Kruskal-Wallis and Mann-Whitney U tests. Logistic regression was used to model the presence/absence of anthrax outbreaks and define a risk surface for large ruminants to compare with cluster analyses. The model predicted 32.2% of the landscape as high risk. Approximately 75% of AMOEBA clusters corresponded to predicted high risk, compared with ~64% of Gi* clusters. In general, AMOEBA predicted more irregularly shaped clusters of outbreaks in both livestock groups, while Gi* tended to predict larger, circular clusters. Here we provide an evaluation of both tests and a discussion of the use of each to detect environmental conditions associated with anthrax outbreak clusters in domestic livestock. These findings illustrate important differences in spatial statistical methods for defining local clusters and highlight the importance of selecting appropriate levels of data aggregation.
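
    The Getis-Ord Gi* statistic referred to above can be computed directly from a spatial weights matrix; the sketch below does so for a toy one-dimensional series of outbreak counts with made-up values, standing in for Rook-contiguity weights on real administrative units.

```python
# Minimal sketch of the Getis-Ord Gi* local clustering statistic on a toy
# 1-D arrangement of outbreak counts; data and neighbour structure are
# hypothetical (real analyses would use contiguity weights on polygons).
import numpy as np

x = np.array([2.0, 3.0, 10.0, 12.0, 11.0, 1.0, 0.0, 2.0])
n = len(x)

# Binary weights: each unit is a neighbour of itself and of the units
# immediately before/after it (a stand-in for 1st-order contiguity).
W = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 1), min(n, i + 2)):
        W[i, j] = 1.0

xbar = x.mean()
s = np.sqrt((x ** 2).mean() - xbar ** 2)

gi_star = np.empty(n)
for i in range(n):
    wi = W[i]
    sw, sw2 = wi.sum(), (wi ** 2).sum()
    num = wi @ x - xbar * sw
    den = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
    gi_star[i] = num / den  # approximately standard normal under the null

print(np.round(gi_star, 2))  # large positive values flag local hot spots
```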

  7. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  8. Drug Adverse Event Detection in Health Plan Data Using the Gamma Poisson Shrinker and Comparison to the Tree-based Scan Statistic

    Directory of Open Access Journals (Sweden)

    David Smith

    2013-03-01

    Full Text Available Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied in spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method, the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely-related signals of overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key method implementation decisions relate to defining exposures and outcomes and informed choice of signaling thresholds.
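
    As context for the disproportionality idea underlying the GPS, the sketch below computes a simple proportional reporting ratio (PRR) with a 95% confidence interval from a hypothetical drug-event 2x2 table; the full GPS additionally applies empirical-Bayes shrinkage to the observed/expected ratio, which is not implemented here.

```python
# Illustrative sketch: proportional reporting ratio (PRR) from a hypothetical
# 2x2 table. This is a simpler disproportionality measure than the GPS, which
# shrinks the observed/expected ratio with an empirical-Bayes prior.
import numpy as np

a = 25      # reports with the drug of interest AND the event of interest
b = 975     # reports with the drug, other events
c = 500     # reports with other drugs, the event of interest
d = 98500   # reports with other drugs, other events

prr = (a / (a + b)) / (c / (c + d))
se_log = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
ci_low, ci_high = np.exp(np.log(prr) + np.array([-1.96, 1.96]) * se_log)

print(f"PRR = {prr:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```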

  9. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  10. A comparison of statistically optimized near field acoustic holography using single layer pressure velocity measurements and using double layer pressure measurements

    DEFF Research Database (Denmark)

    Jacobsen, Finn; Chen, Xinyi; Jaud, Virginie

    2008-01-01

    recently been suggested. An alternative method uses a double layer array of pressure transducers. Both methods make it possible to distinguish between sources on the two sides of the array and thus suppress the influence of extraneous noise and reflections coming from the “wrong” side. This letter compares...

  11. Caregiving Statistics

    Science.gov (United States)

    ... have had to make some adjustments to their work life, from reporting late to work to giving up ... it increases morale and 39% believe it increases productivity. Job-based Health Insurance in the Balance: Employer Views of Coverage in the Workplace. Collins, S. ...

  12. Optical Coherence Tomography-Based Decision Making in Exudative Age-related Macular Degeneration: Comparison of Time- versus Spectral-Domain Devices

    Science.gov (United States)

    Cukras, Catherine; Wang, Yunqing D.; Meyerle, Catherine B.; Forooghian, Farzin; Chew, Emily Y.; Wong, Wai T.

    2010-01-01

    Purpose: To determine if optical coherence tomography (OCT) device-type influences clinical grading of OCT imaging in the context of exudative age-related macular degeneration (AMD). Methods: Ninety-six paired OCT scans from 49 patients with active exudative AMD were obtained on both the time-domain Stratus™ OCT system and the spectral-domain Cirrus™ OCT system at the same visit. Three independent graders judged each scan for the presence of intraretinal fluid (IRF) or subretinal fluid (SRF). The degree of grader consensus was evaluated and the ability of the systems to detect the presence of disease activity was analyzed. Results: Cirrus™ OCT generated a higher degree of inter-grader consensus than Stratus OCT with higher intraclass correlation coefficients (ICC) for all parameters analyzed. A pair-wise comparison of Cirrus™ OCT to Stratus™ OCT systems revealed that Cirrus™-based gradings more frequently reported the presence of SRF and IRF and detected overall neovascular activity at a higher rate (p<0.05) compared to Stratus™-based gradings. Conclusions: The choice of time-domain (Stratus™) versus spectral-domain (Cirrus™) OCT systems has a measurable impact on clinical decision making in exudative AMD. Spectral-domain OCT systems may be able to generate more consensus in clinical interpretation and, in particular cases, detect disease activity not detected by time-domain systems. Clinical trials employing OCT-based clinical evaluations of exudative AMD may need to account for these inter-system differences in planning and analysis. PMID:19696804

  13. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  14. Personal birth preferences and actual mode of delivery outcomes of obstetricians and gynaecologists in South West England; with comparison to regional and national birth statistics.

    Science.gov (United States)

    Lightly, Katie; Shaw, Elisabeth; Dailami, Narges; Bisson, Dina

    2014-10-01

    To determine the personal birth preferences of obstetricians in various clinical scenarios, in particular elective caesarean section for maternal request. To determine actual rates of modes of delivery amongst the same group. To compare the obstetricians' mode of delivery rates to the general population. Following ethical approval, a piloted online survey link was sent via email to 242 current obstetricians and gynaecologists (consultants and trainees) in South West England. Mode of delivery results were compared to regional and national population data using Hospital Episode Statistics and subjected to statistical analysis. The response rate was 68%. 90% would hypothetically plan a vaginal delivery; 10% would consider a caesarean section in an otherwise uncomplicated primiparous pregnancy. Of the 94/165 (60%) respondents with children (201 children), mode of delivery for the first-born child was: normal vaginal delivery 48%, caesarean section 26.5% (elective 8.5%, emergency 18%), instrumental 24.5% and vaginal breech 1%. Only one chose an elective caesarean for maternal request. During 2006-2011 obstetricians had the same overall actual modes of birth as the population (p=0.9). Ten percent of obstetricians report they would consider requesting caesarean section for themselves/their partner, which is the lowest rate reported within UK studies. However, only 1% actually had a caesarean solely for maternal choice. When compared with regional/national statistics, obstetricians currently have modes of delivery that are not significantly different from the population, which suggests that they choose non-interventional delivery if possible. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Predictive capacity of a non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: Comparison of a cutoff approach and inferential statistics.

    Science.gov (United States)

    Kim, Da-Eun; Yang, Hyeri; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Choi, Jin Kyu; Jung, Mi-Sook; Jeon, Eun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Park, Jung Eun; Sohn, Soo Jung; Kim, Tae Sung; Ahn, Il Young; Jeong, Tae-Cheon; Lim, Kyung-Min; Bae, SeungJin

    2016-01-01

    In order for a novel test method to be applied for regulatory purposes, its reliability and relevance, i.e., reproducibility and predictive capacity, must be demonstrated. Here, we examine the predictive capacity of a novel non-radioisotopic local lymph node assay, LLNA:BrdU-FCM (5-bromo-2'-deoxyuridine-flow cytometry), with a cutoff approach and inferential statistics as a prediction model. 22 reference substances in OECD TG429 were tested with a concurrent positive control, hexylcinnamaldehyde 25%(PC), and the stimulation index (SI) representing the fold increase in lymph node cells over the vehicle control was obtained. The optimal cutoff SI (2.7≤cutoff <3.5), with respect to predictive capacity, was obtained by a receiver operating characteristic curve, which produced 90.9% accuracy for the 22 substances. To address the inter-test variability in responsiveness, SI values standardized with PC were employed to obtain the optimal percentage cutoff (42.6≤cutoff <57.3% of PC), which produced 86.4% accuracy. A test substance may be diagnosed as a sensitizer if a statistically significant increase in SI is elicited. The parametric one-sided t-test and non-parametric Wilcoxon rank-sum test produced 77.3% accuracy. Similarly, a test substance could be defined as a sensitizer if the SI means of the vehicle control, and of the low, middle, and high concentrations were statistically significantly different, which was tested using ANOVA or Kruskal-Wallis, with post hoc analysis, Dunnett, or DSCF (Dwass-Steel-Critchlow-Fligner), respectively, depending on the equal variance test, producing 81.8% accuracy. The absolute SI-based cutoff approach produced the best predictive capacity, however the discordant decisions between prediction models need to be examined further. Copyright © 2015 Elsevier Inc. All rights reserved.
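
    The cutoff-selection step described above can be illustrated with a receiver operating characteristic curve; in the sketch below the stimulation-index values and sensitizer labels are invented, and the cutoff is chosen by Youden's J rather than whichever criterion the authors optimized.

```python
# Illustrative sketch: choosing a stimulation-index (SI) cutoff from a ROC
# curve. Labels and SI values are hypothetical, not the OECD TG429 substances.
import numpy as np
from sklearn.metrics import roc_curve

y_true = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0])  # 1 = sensitiser
si = np.array([1.1, 1.8, 2.0, 2.4, 3.1, 2.9, 3.6, 4.2, 5.0, 6.3, 7.1, 1.5])

fpr, tpr, thresholds = roc_curve(y_true, si)
youden = tpr - fpr                 # Youden's J at each candidate cutoff
best = np.argmax(youden)
print(f"optimal SI cutoff ~ {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```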

  16. A Simulation-Based Study on the Comparison of Statistical and Time Series Forecasting Methods for Early Detection of Infectious Disease Outbreaks.

    Science.gov (United States)

    Yang, Eunjoo; Park, Hyun Woo; Choi, Yeon Hwa; Kim, Jusim; Munkhdalai, Lkhagvadorj; Musa, Ibrahim; Ryu, Keun Ho

    2018-05-11

    Early detection of infectious disease outbreaks is one of the important and significant issues in syndromic surveillance systems. It helps to provide a rapid epidemiological response and reduce morbidity and mortality. In order to upgrade the current system at the Korea Centers for Disease Control and Prevention (KCDC), a comparative study of state-of-the-art techniques is required. We compared four different temporal outbreak detection algorithms: the CUmulative SUM (CUSUM), the Early Aberration Reporting System (EARS), the autoregressive integrated moving average (ARIMA), and the Holt-Winters algorithm. The comparison was performed based on not only 42 different time series generated taking into account trends, seasonality, and randomly occurring outbreaks, but also real-world daily and weekly data related to diarrhea infection. The algorithms were evaluated using different metrics. These were namely, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), F1 score, symmetric mean absolute percent error (sMAPE), root-mean-square error (RMSE), and mean absolute deviation (MAD). Although the comparison results showed better performance for the EARS C3 method with respect to the other algorithms, despite the characteristics of the underlying time series data, Holt-Winters showed better performance when the baseline frequency and the dispersion parameter values were both less than 1.5 and 2, respectively.
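
    Of the detectors compared above, the EARS family is the simplest to sketch; the following toy example applies a C1-style rule (flag a count above the mean plus three standard deviations of the previous seven days) to a simulated series with an injected outbreak.

```python
# Minimal sketch of an EARS C1-style temporal detector on a simulated daily
# count series; the real study also evaluated CUSUM, ARIMA and Holt-Winters.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=10, size=60).astype(float)
counts[45:48] += 15  # inject a short outbreak

alarms = []
for t in range(7, len(counts)):
    baseline = counts[t - 7:t]                 # previous 7 days
    mu, sd = baseline.mean(), baseline.std(ddof=1)
    c1 = (counts[t] - mu) / sd if sd > 0 else np.inf
    if c1 > 3.0:                               # mean + 3 SD threshold
        alarms.append(t)

print("alarm days:", alarms)
```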

  17. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  18. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    Full Text Available We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development.

  19. Statistical methods and materials characterisation

    International Nuclear Information System (INIS)

    Wallin, K.R.W.

    2010-01-01

    Statistics is a wide mathematical area, which covers a myriad of analysis and estimation options, some of which suit special cases better than others. A comprehensive coverage of the whole area of statistics would be an enormous effort and would also be outside the capabilities of this author. Therefore, this is not intended to be a textbook on statistical methods available for general data analysis and decision making. Instead, it highlights a particular statistical case applicable to mechanical materials characterization. The methods presented here do not in any way rule out other statistical methods by which to analyze mechanical property material data. (orig.)

  20. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  1. Wave Mechanics or Wave Statistical Mechanics

    International Nuclear Information System (INIS)

    Qian Shangwu; Xu Laizi

    2007-01-01

    By comparing the equations of motion of geometrical optics with those of classical statistical mechanics, this paper finds that there should be an analogy between geometrical optics and classical statistical mechanics, rather than between geometrical mechanics and classical mechanics. Furthermore, by comparing the classical limit of quantum mechanics with classical statistical mechanics, it finds that the classical limit of quantum mechanics is classical statistical mechanics, not classical mechanics; hence it demonstrates that quantum mechanics is a natural generalization of classical statistical mechanics rather than of classical mechanics. Quantum mechanics in its true appearance is therefore a wave statistical mechanics rather than a wave mechanics.

  2. Causality Statistical Perspectives and Applications

    CERN Document Server

    Berzuini, Carlo; Bernardinell, Luisa

    2012-01-01

    A state of the art volume on statistical causality Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book: Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr

  3. Comparison of wear behaviour and mechanical properties of as-cast Al6082 and Al6082-T6 using statistical analysis

    Science.gov (United States)

    Rani Rana, Sandhya; Pattnaik, A. B.; Patnaik, S. C.

    2018-03-01

    In the present work the wear behavior and mechanical properties of as-cast Al6082 and Al6082-T6 were compared and analyzed using statistical analysis. The as-cast Al6082 alloy was solutionized at 550°C, quenched and artificially aged at 170°C for 8 h. Metallographic examination and XRD analysis revealed the presence of the intermetallic compound Al6Mn. Hardness of the heat-treated Al6082 was found to be higher than that of the as-cast sample. Wear tests were carried out using a pin-on-disc wear testing machine according to a Taguchi L9 orthogonal array. Experiments were conducted under normal loads of 10-30 N, sliding speeds of 1-3 m/s, and sliding distances of 400, 800 and 1200 m, respectively. Sliding speed was found to be the dominant factor for wear in both the as-cast and aged Al6082 alloys. Wear rate increased with sliding distance up to 800 m and decreased thereafter.

  4. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    Science.gov (United States)

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate properties of our originally formulated statistical model-based iterative approach applied to the image reconstruction from projections problem which are related to its conditioning, and, in this manner, to prove a superiority of this approach over ones recently used by other authors. The reconstruction algorithm based on this conception uses a maximum likelihood estimation with an objective adjusted to the probability distribution of measured signals obtained from an X-ray computed tomography system with parallel beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology which is explored widely in the literature and exploited in various commercial implementations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Comparison of background EEG activity of different groups of patients with idiopathic epilepsy using Shannon spectral entropy and cluster-based permutation statistical testing.

    Directory of Open Access Journals (Sweden)

    Jose Antonio Urigüen

    Full Text Available Idiopathic epilepsy is characterized by generalized seizures with no apparent cause. One of its main problems is the lack of biomarkers to monitor the evolution of patients. The only tools they can use are limited to inspecting the amount of seizures during previous periods of time and assessing the existence of interictal discharges. As a result, there is a need for improving the tools to assist the diagnosis and follow up of these patients. The goal of the present study is to compare and find a way to differentiate between two groups of patients suffering from idiopathic epilepsy, one group that could be followed-up by means of specific electroencephalographic (EEG) signatures (intercritical activity present), and another one that could not due to the absence of these markers. To do that, we analyzed the background EEG activity of each in the absence of seizures and epileptic intercritical activity. We used the Shannon spectral entropy (SSE) as a metric to discriminate between the two groups and performed permutation-based statistical tests to detect the set of frequencies that show significant differences. By constraining the spectral entropy estimation to the [6.25-12.89] Hz range, we detect statistical differences (at below 0.05 alpha-level) between both types of epileptic patients at all available recording channels. Interestingly, entropy values follow a trend that is inversely related to the elapsed time from the last seizure. Indeed, this trend shows asymptotical convergence to the SSE values measured in a group of healthy subjects, which present SSE values lower than any of the two groups of patients. All these results suggest that the SSE, measured in a specific range of frequencies, could serve to follow up the evolution of patients suffering from idiopathic epilepsy. Future studies remain to be conducted in order to assess the predictive value of this approach for the anticipation of seizures.
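
    The band-limited Shannon spectral entropy used above can be estimated from a Welch power spectrum; the sketch below does this for a simulated signal, with the sampling rate and window length chosen arbitrarily.

```python
# Minimal sketch: Shannon spectral entropy (SSE) of a single channel,
# restricted to the 6.25-12.89 Hz band as in the study above. The signal is
# simulated noise standing in for a real EEG recording.
import numpy as np
from scipy.signal import welch

fs = 256.0                                  # sampling rate (Hz), assumed
rng = np.random.default_rng(1)
eeg = rng.standard_normal(int(30 * fs))     # 30 s of surrogate EEG

f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
band = (f >= 6.25) & (f <= 12.89)
p = psd[band] / psd[band].sum()             # normalise PSD to a probability mass
sse = -np.sum(p * np.log2(p)) / np.log2(p.size)   # normalised Shannon entropy

print(f"normalised SSE in 6.25-12.89 Hz: {sse:.3f}")
```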

  6. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: an intra-individual comparison.

    Science.gov (United States)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-09-01

    To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Twenty-six patients (M:F=13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P<0.001), DLP (from 307.42 to 134.51 mGy × cm, P<0.001), and effective dose (from 4.12 to 1.84 mSv, P<0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P=0.004), but was not different in the aorta (18.23 vs. 18.72, P=0.726). The subjective image quality demonstrated no difference between the two studies. A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
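
    The paired Student's t-test and Wilcoxon signed-rank test named above are available in SciPy; the sketch below applies them to simulated per-patient noise values for the two reconstructions (the numbers are not the study data).

```python
# Minimal sketch of the paired comparisons used above, on simulated
# per-patient noise measurements from two reconstructions of the same scans.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
noise_fbp = rng.normal(16.7, 3.0, size=26)
noise_asir = noise_fbp + rng.normal(4.1, 2.5, size=26)  # paired measurements

t_stat, p_t = stats.ttest_rel(noise_asir, noise_fbp)
w_stat, p_w = stats.wilcoxon(noise_asir, noise_fbp)

print(f"paired t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {p_w:.4f}")
```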

  7. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)

    2012-09-15

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.

  8. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    International Nuclear Information System (INIS)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-01-01

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality

  9. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  10. Accounting for center in the Early External Cephalic Version trials: an empirical comparison of statistical methods to adjust for center in a multicenter trial with binary outcomes.

    Science.gov (United States)

    Reitsma, Angela; Chu, Rong; Thorpe, Julia; McDonald, Sarah; Thabane, Lehana; Hutton, Eileen

    2014-09-26

    Clustering of outcomes at centers involved in multicenter trials is a type of center effect. The Consolidated Standards of Reporting Trials Statement recommends that multicenter randomized controlled trials (RCTs) should account for center effects in their analysis, however most do not. The Early External Cephalic Version (EECV) trials published in 2003 and 2011 stratified by center at randomization, but did not account for center in the analyses, and due to the nature of the intervention and number of centers, may have been prone to center effects. Using data from the EECV trials, we undertook an empirical study to compare various statistical approaches to account for center effect while estimating the impact of external cephalic version timing (early or delayed) on the outcomes of cesarean section, preterm birth, and non-cephalic presentation at the time of birth. The data from the EECV pilot trial and the EECV2 trial were merged into one dataset. Fisher's exact method was used to test the overall effect of external cephalic version timing unadjusted for center effects. Seven statistical models that accounted for center effects were applied to the data. The models included: i) the Mantel-Haenszel test, ii) logistic regression with fixed center effect and fixed treatment effect, iii) center-size weighted and iv) un-weighted logistic regression with fixed center effect and fixed treatment-by-center interaction, v) logistic regression with random center effect and fixed treatment effect, vi) logistic regression with random center effect and random treatment-by-center interaction, and vii) generalized estimating equations. For each of the three outcomes of interest, the approaches used to account for center effect did not alter the overall findings of the trial. The results were similar for the majority of the methods used to adjust for center, illustrating the robustness of the findings. Despite literature that suggests center effect can change the estimate of effect in
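
    Two of the center-adjustment approaches listed above, the Mantel-Haenszel test and a GEE logistic model with centers as clusters, are sketched below on a simulated multicenter dataset; the other models (fixed and random center effects, treatment-by-center interactions) would be fitted analogously.

```python
# Minimal sketch of two center-adjustment approaches on simulated data:
# a Mantel-Haenszel test stratified by center, and GEE logistic regression
# with exchangeable within-center correlation.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_centers, n_per = 10, 40
center = np.repeat(np.arange(n_centers), n_per)
treat = rng.integers(0, 2, size=n_centers * n_per)
center_effect = rng.normal(0, 0.5, size=n_centers)[center]
p = 1 / (1 + np.exp(-(-0.3 - 0.4 * treat + center_effect)))
y = rng.binomial(1, p)
df = pd.DataFrame({"y": y, "treat": treat, "center": center})

# Mantel-Haenszel across center-specific 2x2 tables
tables = [pd.crosstab(g["treat"], g["y"]).values for _, g in df.groupby("center")]
st = sm.stats.StratifiedTable(tables)
print("MH common odds ratio:", round(st.oddsratio_pooled, 2))

# GEE logistic regression with centers as clusters
gee = sm.GEE.from_formula("y ~ treat", groups="center", data=df,
                          family=sm.families.Binomial(),
                          cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary().tables[1])
```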

  11. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    International Nuclear Information System (INIS)

    Choo, Ji Yung; Goo, Jin Mo; Park, Chang Min; Park, Sang Joon; Lee, Chang Hyun; Shim, Mi-Suk

    2014-01-01

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area as well as average wall thickness. Accuracy of airway measurements of each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU was significantly different among the three algorithms in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR showed the most accurate value for airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm on quantitative analysis of the lung. (orig.)
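
    The emphysema index compared across algorithms above is simply the fraction of segmented lung voxels below a Hounsfield-unit threshold; the sketch below computes it for a simulated voxel array.

```python
# Minimal sketch of the emphysema index (EI): the percentage of lung voxels
# below -950 HU. The HU values are simulated, standing in for a segmented
# lung volume from a reconstructed CT series.
import numpy as np

rng = np.random.default_rng(6)
lung_hu = rng.normal(loc=-860.0, scale=60.0, size=500_000)  # surrogate voxels

ei = 100.0 * np.mean(lung_hu < -950.0)
print(f"emphysema index: {ei:.2f}% of voxels below -950 HU")
```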

  12. Statistical comparison of leaching behavior of incineration bottom ash using seawater and deionized water: Significant findings based on several leaching methods.

    Science.gov (United States)

    Yin, Ke; Dou, Xiaomin; Ren, Fei; Chan, Wei-Ping; Chang, Victor Wei-Chung

    2018-02-15

    Bottom ashes generated from municipal solid waste incineration have gained increasing popularity as alternative construction materials, however, they contains elevated heavy metals posing a challenge for its free usage. Different leaching methods are developed to quantify leaching potential of incineration bottom ashes meanwhile guide its environmentally friendly application. Yet, there are diverse IBA applications while the in situ environment is always complicated, challenging its legislation. In this study, leaching tests were conveyed using batch and column leaching methods with seawater as opposed to deionized water, to unveil the metal leaching potential of IBA subjected to salty environment, which is commonly encountered when using IBA in land reclamation yet not well understood. Statistical analysis for different leaching methods suggested disparate performance between seawater and deionized water primarily ascribed to ionic strength. Impacts of leachant are metal-specific dependent on leaching methods and have a function of intrinsic characteristics of incineration bottom ashes. Leaching performances were further compared on additional perspectives, e.g. leaching approach and liquid to solid ratio, indicating sophisticated leaching potentials dominated by combined geochemistry. It is necessary to develop application-oriented leaching methods with corresponding leaching criteria to preclude discriminations between different applications, e.g., terrestrial applications vs. land reclamation. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. A comparison of artificial neural networks with other statistical approaches for the prediction of true metabolizable energy of meat and bone meal.

    Science.gov (United States)

    Perai, A H; Nassiri Moghaddam, H; Asadpour, S; Bahrampour, J; Mansoori, Gh

    2010-07-01

    There has been considerable and continuing interest in developing equations for rapid and accurate prediction of the ME of meat and bone meal. In this study, an artificial neural network (ANN), a partial least squares (PLS) model, and a multiple linear regression (MLR) statistical method were used to predict the TME(n) of meat and bone meal based on its CP, ether extract, and ash content. The accuracy of the models was assessed by the R(2) value, MS error, mean absolute percentage error, mean absolute deviation, bias, and Theil's U. The predictive ability of the ANN was compared with a PLS and an MLR model using the same training data sets. The squared regression coefficients of prediction for the MLR, PLS, and ANN models were 0.38, 0.36, and 0.94, respectively. The results revealed that the ANN produced more accurate predictions of TME(n) than the PLS and MLR methods. Based on the results of this study, ANN could be used as a promising approach for rapid prediction of the nutritive value of meat and bone meal.
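
    The model comparison above can be mimicked with scikit-learn; in the sketch below the composition and TMEn values are simulated, and a small multilayer perceptron stands in for the study's ANN.

```python
# Illustrative sketch: multiple linear regression vs. a small neural network
# predicting an energy value from CP, ether extract and ash. Data are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
X = rng.uniform([40, 5, 20], [60, 15, 35], size=(200, 3))   # CP, EE, ash (%)
y = 1.5 + 0.04 * X[:, 0] + 0.09 * X[:, 1] - 0.05 * X[:, 2] \
    + rng.normal(0, 0.1, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

print("MLR R2:", round(r2_score(y_te, mlr.predict(X_te)), 3))
print("ANN R2:", round(r2_score(y_te, ann.predict(X_te)), 3))
```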

  14. Comparison of neutron capture cross sections obtained from two Hauser-Feshbach statistical models on a short-lived nucleus using experimentally constrained input

    Science.gov (United States)

    Lewis, Rebecca; Liddick, Sean; Spyrou, Artemis; Crider, Benjamin; Dombos, Alexander; Naqvi, Farheen; Prokop, Christopher; Quinn, Stephen; Larsen, Ann-Cecilie; Crespo Campo, Lucia; Guttormsen, Magne; Renstrom, Therese; Siem, Sunniva; Bleuel, Darren; Couture, Aaron; Mosby, Shea; Perdikakis, George

    2017-09-01

    A majority of the abundance of the elements above iron is produced by neutron capture reactions, and, in explosive stellar processes, many of these reactions take place on unstable nuclei. Direct neutron capture experiments can only be performed on stable and long-lived nuclei, requiring indirect methods for the remaining isotopes. Statistical neutron capture can be described using the nuclear level density (NLD), the γ strength function (γSF), and an optical model. The NLD and γSF can be obtained using the β-Oslo method. The NLD and γSF were recently determined for 74Zn using the β-Oslo method, and were used in both TALYS and CoH to calculate the 73Zn(n, γ)74Zn neutron capture cross section. The cross sections calculated in TALYS and CoH are expected to be identical if the inputs for both codes are the same; however, after a thorough investigation into the inputs for the 73Zn(n, γ)74Zn reaction, there is still a factor of two discrepancy between the two codes.

  15. Statistical against dynamical PLF fission as seen by the IMF-IMF correlation functions and comparisons with CoMD model

    Science.gov (United States)

    Pagano, E. V.; Acosta, L.; Auditore, L.; Cap, T.; Cardella, G.; Colonna, M.; De Filippo, E.; Geraci, E.; Gnoffo, B.; Lanzalone, G.; Maiolino, C.; Martorana, N.; Pagano, A.; Papa, M.; Piasecki, E.; Pirrone, S.; Politi, G.; Porto, F.; Quattrocchi, L.; Rizzo, F.; Russotto, P.; Trifiro’, A.; Trimarchi, M.; Siwek-Wilczynska, K.

    2018-05-01

    In nuclear reactions at Fermi energies, two- and multi-particle intensity interferometry correlation methods are powerful tools for pinning down the characteristic time scale of the emission processes. In this paper we summarize an improved application of the fragment-fragment correlation function in the specific physics case of a heavy projectile-like fragment (PLF) undergoing binary massive splitting into two intermediate mass fragments (IMF). Results are shown for the reverse kinematics reaction 124Sn + 64Ni at 35 AMeV, which has been investigated using the forward part of the CHIMERA multi-detector. The analysis was performed as a function of the charge asymmetry of the observed couples of IMF. We show a coexistence of dynamical and statistical components as a function of the charge asymmetry. Transport CoMD simulations are compared with the data in order to pin down the timescale of the fragment production and the relevant ingredients of the in-medium effective interaction used in the transport calculations.

  16. Comparison of the image qualities of filtered back-projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction for CT venography at 80 kVp

    International Nuclear Information System (INIS)

    Kim, Jin Hyeok; Choo, Ki Seok; Moon, Tae Yong; Lee, Jun Woo; Jeon, Ung Bae; Kim, Tae Un; Hwang, Jae Yeon; Yun, Myeong-Ja; Jeong, Dong Wook; Lim, Soo Jin

    2016-01-01

    To evaluate the subjective and objective qualities of computed tomography (CT) venography images at 80 kVp using model-based iterative reconstruction (MBIR) and to compare these with those of filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) using the same CT data sets. Forty-four patients (mean age: 56.1 ± 18.1) who underwent 80 kVp CT venography (CTV) for the evaluation of deep vein thrombosis (DVT) during 4 months were enrolled in this retrospective study. The same raw data were reconstructed using FBP, ASIR, and MBIR. Objective and subjective image analyses were performed at the inferior vena cava (IVC), femoral vein, and popliteal vein. The mean contrast-to-noise ratio (CNR) of MBIR was significantly greater than those of FBP and ASIR, and images reconstructed using MBIR had significantly lower objective image noise (p <.001). Subjective image quality and confidence in detecting DVT with MBIR were significantly greater than with FBP and ASIR (p <.005), and MBIR had the lowest score for subjective image noise (p <.001). CTV at 80 kVp with MBIR was superior to FBP and ASIR regarding subjective and objective image qualities. (orig.)
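
    The objective metrics referred to above (image noise and contrast-to-noise ratio) reduce to simple region-of-interest statistics; the sketch below computes them from simulated pixel values.

```python
# Minimal sketch of objective image-quality metrics: noise as the standard
# deviation inside a background ROI, and the contrast-to-noise ratio (CNR)
# between vessel and background ROIs. Pixel values are simulated HU numbers.
import numpy as np

rng = np.random.default_rng(5)
vessel_roi = rng.normal(loc=180.0, scale=12.0, size=(20, 20))   # enhanced vein
muscle_roi = rng.normal(loc=55.0, scale=14.0, size=(20, 20))    # background

noise = muscle_roi.std(ddof=1)
cnr = (vessel_roi.mean() - muscle_roi.mean()) / noise
print(f"noise = {noise:.1f} HU, CNR = {cnr:.1f}")
```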

  17. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method belonging to the differential category for determining end points from potentiometric titration curves is presented. It uses a preprocessing step in which four data points in and around the region of inflection are fitted to a non-linear function to obtain first-derivative values, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method is generally applied to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves compatible with the equivalence-point category of methods, such as Gran or Fortuin, are also compared with the new method.
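
    The derivative-plus-inverse-parabolic-interpolation idea can be illustrated on a simulated titration curve; the sketch below uses a plain numerical derivative in place of the paper's four-point non-linear fit, so it is a simplification of the actual procedure.

```python
# Illustrative sketch: locating a potentiometric end point from the first
# derivative of a simulated titration curve, refining the maximum by inverse
# parabolic interpolation through the three points around it.
import numpy as np

v = np.linspace(9.0, 11.0, 21)                       # titrant volume (mL)
E = 300 + 250 * np.tanh((v - 10.05) / 0.15)          # electrode potential (mV)

dEdV = np.gradient(E, v)                             # numerical first derivative
k = np.argmax(dEdV)

# Parabola through (v[k-1..k+1], dEdV[k-1..k+1]); its vertex is the end point.
x0, x1, x2 = v[k - 1], v[k], v[k + 1]
y0, y1, y2 = dEdV[k - 1], dEdV[k], dEdV[k + 1]
num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
end_point = x1 - 0.5 * num / den

print(f"estimated end point: {end_point:.3f} mL")
```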

  18. Tower of London test: a comparison between conventional statistic approach and modelling based on artificial neural network in differentiating fronto-temporal dementia from Alzheimer's disease.

    Science.gov (United States)

    Franceschi, Massimo; Caffarra, Paolo; Savarè, Rita; Cerutti, Renata; Grossi, Enzo

    2011-01-01

    The early differentiation of Alzheimer's disease (AD) from frontotemporal dementia (FTD) may be difficult. The Tower of London (ToL), thought to assess executive functions such as planning and visuo-spatial working memory, could help in this purpose. Twenty-two dementia centers consecutively recruited patients with early FTD or AD. ToL performances of these groups were analyzed using both conventional statistical approaches and Artificial Neural Network (ANN) modelling. Ninety-four non-aphasic FTD and 160 AD patients were recruited. The ToL Accuracy Score (AS) differed significantly (p < …) between the groups; the data were also processed with advanced ANNs developed by the Semeion Institute. The best ANNs were selected and submitted to ROC curves. The non-linear model was able to discriminate FTD from AD with an average AUC for 7 independent trials of 0.82. The use of hidden information contained in the different items of the ToL and the non-linear processing of the data through ANNs allow a high discrimination between FTD and AD in individual patients.

  19. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT examinations obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements from each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area, and average wall thickness. The accuracy of the airway measurements for each algorithm was also evaluated using an airway phantom. The EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR gave the most accurate airway measurements. The three algorithms thus produced different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR, and care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
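
    The emphysema index itself is straightforward: it is the percentage of lung voxels whose attenuation falls below the -950 HU threshold, which is why noisier reconstructions such as FBP push more voxels under the threshold and inflate the index. A minimal sketch on a hypothetical masked HU volume:

```python
import numpy as np

def emphysema_index(hu_volume, lung_mask, threshold=-950.0):
    """Percentage of lung voxels with attenuation below the threshold (in HU)."""
    lung_hu = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung_hu < threshold) / lung_hu.size

# Hypothetical data: a small HU volume and a lung mask covering all voxels.
rng = np.random.default_rng(2)
hu = rng.normal(-870.0, 60.0, size=(64, 64, 64))   # noisier reconstructions widen this spread
mask = np.ones_like(hu, dtype=bool)
print(f"EI = {emphysema_index(hu, mask):.2f} %")
```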

  20. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  1. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a Public Health Masters degree.

    Science.gov (United States)

    de Jong, N; Verstegen, D M L; Tan, F E S; O'Connor, S J

    2013-05-01

    This case-study compared traditional, face-to-face classroom-based teaching with asynchronous online learning and teaching methods in two sets of students undertaking a problem-based learning module in the multilevel and exploratory factor analysis of longitudinal data as part of a Masters degree in Public Health at Maastricht University. Students were allocated to one of the two study variants on the basis of their enrolment status as full-time or part-time students. Full-time students (n = 11) followed the classroom-based variant and part-time students (n = 12) followed the online asynchronous variant which included video recorded lectures and a series of asynchronous online group or individual SPSS activities with synchronous tutor feedback. A validated student motivation questionnaire was administered to both groups of students at the start of the study and a second questionnaire was administered at the end of the module. This elicited data about student satisfaction with the module content, teaching and learning methods, and tutor feedback. The module coordinator and problem-based learning tutor were also interviewed about their experience of delivering the experimental online variant and asked to evaluate its success in relation to student attainment of the module's learning outcomes. Student examination results were also compared between the two groups. Asynchronous online teaching and learning methods proved to be an acceptable alternative to classroom-based teaching for both students and staff. Educational outcomes were similar for both groups, but importantly, there was no evidence that the asynchronous online delivery of module content disadvantaged part-time students in comparison to their full-time counterparts.

  2. Statistical multistep direct and statistical multistep compound models for calculations of nuclear data for applications

    International Nuclear Information System (INIS)

    Seeliger, D.

    1993-01-01

    This contribution contains a brief presentation and comparison of the different Statistical Multistep Approaches, presently available for practical nuclear data calculations. (author). 46 refs, 5 figs

  3. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  4. Voxel based statistical analysis method for microPET studies to assess the cerebral glucose metabolism in cat deafness model: comparison to ROI based method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Su; Lee, Jae Sung; Park, Min Hyun; Lee, Jong Jin; Kang, Hye Jin; Lee, Hyo Jeong; Oh, Seung Ha; Kim, Chong Sun; Jung, June Key; Lee, Myung Chul; Lee, Dong Soo [Seoul National University College of Medicine, Seoul (Korea, Republic of); Lim, Sang Moo [KIRAMS, Seoul (Korea, Republic of)

    2005-07-01

    Imaging research on the brain of sensory-deprived cats using a small animal PET scanner has gained interest, since abundant information about the sensory system of this animal is available and close examination of the brain is possible owing to its larger brain size compared with mouse or rat. In this study, we established the procedures for 3D voxel-based statistical analysis (SPM) of FDG PET images of the cat brain, and confirmed them using an ROI-based method. FDG PET scans of 4 normal and 4 deaf cats were acquired for 30 minutes using a microPET R4 scanner. Only the brain cortices were extracted using a masking and threshold method to facilitate spatial normalization. After spatial normalization and smoothing, 3D voxel-wise and ROI-based t-tests were performed to identify the regions with significantly different FDG uptake between the normal and deaf cats. In the ROI analysis, 26 ROIs were drawn on both hemispheres, and the regional mean pixel value in each ROI was normalized to the global mean of the brain. Cat brains were spatially normalized well onto the target brain owing to the removal of background activity. When the cerebral glucose metabolism of the deaf cats was compared to that of the normal controls after removing the effects of the global count, the glucose metabolism in the auditory cortex, head of the caudate nucleus, and thalamus in both hemispheres of the deaf cats was significantly lower than that of the controls (P<0.01). No area showed significantly increased metabolism in the deaf cats, even at a less stringent significance level (P<0.05). The ROI analysis also showed a significant reduction of glucose metabolism in the same regions. This study established and confirmed a method for voxel-based analysis of animal PET data of the cat brain, which showed high localization accuracy and specificity and was useful for examining cerebral glucose metabolism in a cat cortical deafness model.

  5. Voxel based statistical analysis method for microPET studies to assess the cerebral glucose metabolism in cat deafness model: comparison to ROI based method

    International Nuclear Information System (INIS)

    Kim, Jin Su; Lee, Jae Sung; Park, Min Hyun; Lee, Jong Jin; Kang, Hye Jin; Lee, Hyo Jeong; Oh, Seung Ha; Kim, Chong Sun; Jung, June Key; Lee, Myung Chul; Lee, Dong Soo; Lim, Sang Moo

    2005-01-01

    Imaging research on the brain of sensory-deprived cats using a small animal PET scanner has gained interest, since abundant information about the sensory system of this animal is available and close examination of the brain is possible owing to its larger brain size compared with mouse or rat. In this study, we established the procedures for 3D voxel-based statistical analysis (SPM) of FDG PET images of the cat brain, and confirmed them using an ROI-based method. FDG PET scans of 4 normal and 4 deaf cats were acquired for 30 minutes using a microPET R4 scanner. Only the brain cortices were extracted using a masking and threshold method to facilitate spatial normalization. After spatial normalization and smoothing, 3D voxel-wise and ROI-based t-tests were performed to identify the regions with significantly different FDG uptake between the normal and deaf cats. In the ROI analysis, 26 ROIs were drawn on both hemispheres, and the regional mean pixel value in each ROI was normalized to the global mean of the brain. Cat brains were spatially normalized well onto the target brain owing to the removal of background activity. When the cerebral glucose metabolism of the deaf cats was compared to that of the normal controls after removing the effects of the global count, the glucose metabolism in the auditory cortex, head of the caudate nucleus, and thalamus in both hemispheres of the deaf cats was significantly lower than that of the controls (P<0.01). No area showed significantly increased metabolism in the deaf cats, even at a less stringent significance level (P<0.05). The ROI analysis also showed a significant reduction of glucose metabolism in the same regions. This study established and confirmed a method for voxel-based analysis of animal PET data of the cat brain, which showed high localization accuracy and specificity and was useful for examining cerebral glucose metabolism in a cat cortical deafness model

  6. The lifetime cost to English students of borrowing to invest in a medical degree: a gender comparison using data from the Office for National Statistics.

    Science.gov (United States)

    Ercolani, Marco G; Vohra, Ravinder S; Carmichael, Fiona; Mangat, Karanjit; Alderson, Derek

    2015-04-21

    To evaluate this impact on male and female English medical graduates by estimating the total time and amount repaid on loans taken out with the UK's Student Loans Company (SLC). UK. 4286 respondents with a medical degree in the Labour Force Surveys administered by the Office for National Statistics (ONS) between 1997 and 2014. Age-salary profiles were generated to estimate the repayment profiles for different levels of initial graduate debt. 2195 female and 2149 male medical graduates were interviewed by the ONS. Those working full-time (73.1% of females and 96.1% of males) were analysed in greater depth. Following standardisation to 2014 prices, average full-time male graduates earned up to 35% more than females by the age of 55. The initial graduate debt from tuition fees alone amounts to £39,945.69. Owing to interest charges on this debt, the average full-time male graduate repays £57,303 over 20 years, while the average female earns less and so repays £61,809 over 26 years. When additional SLC loans are required for maintenance, the initial graduate debt can be as high as £81,916 and, as SLC debt is written off 30 years after graduation, the average female repays £75,786 while the average male repays £110,644. Medical graduates on an average salary are unlikely to repay their SLC debt in full, a consequence of higher university fees and of the fact that SLC debt is written off 30 years after graduation. As a result, the average female graduate repays more than the average male when debt is low, but less when debt is high. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
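
    The repayment figures follow from a year-by-year simulation in which interest accrues on the outstanding balance, a fixed share of income above a repayment threshold is paid back, and any remainder is written off 30 years after graduation. A rough sketch with hypothetical salary profiles and plan parameters (the actual SLC rules and the ONS-derived age-salary profiles are more detailed):

```python
def simulate_repayment(initial_debt, salaries, interest_rate=0.055,
                       threshold=21000.0, repayment_rate=0.09, write_off_years=30):
    """Total repaid and years to clear (or write off) an income-contingent student loan."""
    balance = initial_debt
    total_repaid = 0.0
    for year, salary in enumerate(salaries[:write_off_years], start=1):
        balance *= 1.0 + interest_rate                              # interest on the outstanding balance
        repayment = min(balance, repayment_rate * max(0.0, salary - threshold))
        balance -= repayment
        total_repaid += repayment
        if balance <= 0.0:
            return total_repaid, year
    return total_repaid, write_off_years                            # remainder written off

# Hypothetical flat salary profiles, for illustration only.
male_salaries = [55000.0] * 40
female_salaries = [45000.0] * 40
print(simulate_repayment(40000.0, male_salaries))
print(simulate_repayment(40000.0, female_salaries))
```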

  7. The effect of filtrating and reconstruction method on the left ventricular ejection fraction derived from GSPET. A statistical comparison of angiography and echocardiography

    International Nuclear Information System (INIS)

    Bitarafan, A.; Rajabi, H.

    2008-01-01

    There are different reconstruction protocols in gated myocardial imaging that produce different values of the left ventricular ejection fraction (EF). We attempted to determine how the reconstruction parameters affect the calculated EF. The results were statistically compared with the values obtained from angiography and echocardiography. In this retrospective study, data from 23 patients were used. All the patients had angiographic and echocardiographic data acquired within 2 weeks before the test. Imaging was performed on a single-head gamma camera using technetium-99m methoxyisobutylisonitrile. The image data were reconstructed using 50 different combinations of the ramp, Hanning, Butterworth, Wiener, and Metz filters. The ordered subset expectation maximization (OSEM) technique was also examined using 12 combinations of iterations and subsets. The calculated EF values were analyzed and compared with the echocardiographic and angiographic results. The backprojection technique produced higher values of EF than those derived from echocardiography and angiography, whereas OSEM produced lower values. With the backprojection technique, the maximum correlation between the values derived from gated single-photon emission tomography and echocardiography (r=0.88, P<0.01) and angiography (r=0.81, P<0.01) was observed when using the Metz filter (full width at half maximum=5 mm, order=9) and the Gaussian filter (α=3), respectively. For the OSEM technique, the maximum correlation with both angiography and echocardiography was observed with iterations=2 and subsets=12. On average, the backprojection technique produces higher values, and the iterative technique produces lower estimates of the EF, when compared with angiography and echocardiography. (author)

  8. Automated Tree Crown Delineation and Biomass Estimation from Airborne LiDAR data: A Comparison of Statistical and Machine Learning Methods

    Science.gov (United States)

    Gleason, C. J.; Im, J.

    2011-12-01

    Airborne LiDAR remote sensing has been used effectively in assessing forest biomass because of its canopy-penetrating effects and its ability to accurately describe the canopy surface. Current research in assessing biomass using airborne LiDAR focuses on either the individual tree as a base unit of study or statistical representations of a small aggregation of trees (i.e., plot level), and both methods usually rely on regression against field data to model the relationship between the LiDAR-derived data (e.g., volume) and biomass. This study estimates biomass for mixed forests and coniferous plantations (Picea abies) within Heiberg Memorial Forest, Tully, NY, at both the plot and individual tree level. Plots are regularly spaced with a radius of 13 m, and field data include diameter at breast height (dbh), tree height, and tree species. Field data collection and LiDAR data acquisition were seasonally coincident and both obtained in August of 2010. The resulting point cloud density was >5 pts/m2. LiDAR data were processed to provide a canopy height surface, and a combination of watershed segmentation, active contouring, and genetic algorithm optimization was applied to delineate individual trees from the surface. This updated delineation method was shown to be more accurate than traditional watershed segmentation. Once trees had been delineated, four biomass estimation models were applied and compared: support vector regression (SVR), linear mixed effects regression (LME), random forest (RF), and Cubist regression. Candidate variables to be used in modeling were derived from the LiDAR surface, and include metrics of height, width, and volume per delineated tree footprint. Previously published allometric equations provided field estimates of biomass to inform the regressions and calculate their accuracy via leave-one-out cross validation. This study found that for forests such as found in the study area, aggregation of individual trees to form a plot-based estimate of
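
    A model comparison of this kind, scoring several regressors by leave-one-out cross-validation against allometric biomass estimates, can be sketched as below. The synthetic crown metrics are placeholders, and ordinary least squares stands in for the mixed-effects and Cubist models, which require dedicated packages.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_trees = 60

# Hypothetical LiDAR-derived predictors per delineated crown: height, crown width, volume.
X = np.column_stack([rng.uniform(5, 30, n_trees),
                     rng.uniform(1, 8, n_trees),
                     rng.uniform(2, 200, n_trees)])
# Stand-in "field" biomass from a made-up allometric relation plus noise.
y = 0.05 * X[:, 0] ** 2 + 0.3 * X[:, 2] + rng.normal(0, 5, n_trees)

models = {"SVR": SVR(C=100.0),
          "RF": RandomForestRegressor(n_estimators=200, random_state=0),
          "OLS": LinearRegression()}
for name, model in models.items():
    pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
    rmse = np.sqrt(np.mean((pred - y) ** 2))
    print(f"{name}: LOO-CV RMSE = {rmse:.2f}")
```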

  9. Comparison of past and future Mediterranean high and low extremes of precipitation and river flow projected using different statistical downscaling methods

    Directory of Open Access Journals (Sweden)

    P. Quintana-Seguí

    2011-05-01

    Full Text Available The extremes of precipitation and river flow obtained using three different statistical downscaling methods applied to the same regional climate simulation have been compared. The methods compared are the anomaly method, quantile mapping and a weather typing. The hydrological model used in the study is distributed and it is applied to the Mediterranean basins of France. The study shows that both quantile mapping and weather typing methods are able to reproduce the high and low precipitation extremes in the region of interest. The study also shows that when the hydrological model is forced with these downscaled data, there are important differences in the outputs. This shows that the model amplifies the differences and that the downscaling of other atmospheric variables might be very relevant when simulating river discharges. In terms of river flow, the method of the anomalies, which is very simple, performs better than expected. The methods produce qualitatively similar future scenarios of the extremes of river flow. However, quantitatively, there are still significant differences between them for each individual gauging station. According to these scenarios, it is expected that in the middle of the 21st century (2035–2064, the monthly low flows will have diminished almost everywhere in the region of our study by as much as 20 %. Regarding high flows, there will be important increases in the area of the Cévennes, which is already seriously affected by flash-floods. For some gauging stations in this area, the frequency of what was a 10-yr return flood at the end of the 20th century is expected to increase, with such return floods then occurring every two years in the middle of the 21st century. Similarly, the 10-yr return floods at that time are expected to carry 100 % more water than the 10-yr return floods experienced at the end of the 20th century. In the northern part of the Rhône basin, these extremes will be reduced.
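
    Of the downscaling approaches compared, quantile mapping is the easiest to state compactly: simulated values are corrected so that their empirical distribution matches that of the observations over a control period. A minimal empirical-quantile sketch with hypothetical series (operational implementations treat seasons, dry days and distribution tails far more carefully):

```python
import numpy as np

def quantile_mapping(obs_control, model_control, model_future):
    """Map model values onto the observed distribution via empirical CDF matching."""
    quantiles = np.linspace(0.01, 0.99, 99)
    model_q = np.quantile(model_control, quantiles)
    obs_q = np.quantile(obs_control, quantiles)
    # For each future value, find where it sits in the control simulation's distribution,
    # then read off the observed value at that same quantile.
    return np.interp(model_future, model_q, obs_q)

# Hypothetical daily precipitation series (mm/day).
rng = np.random.default_rng(4)
obs_control = rng.gamma(2.0, 4.0, 5000)       # observations, control period
model_control = rng.gamma(2.5, 3.0, 5000)     # biased RCM output, control period
model_future = rng.gamma(2.5, 3.3, 5000)      # RCM output, scenario period
corrected_future = quantile_mapping(obs_control, model_control, model_future)
```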

  10. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)

  11. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Bunkyo-ku, Tokyo (Japan)

    2012-08-15

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 {+-} 3.00) than low-dose ASIR (49.24 {+-} 9.11, P < 0.01) and reference-dose ASIR images (24.93 {+-} 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)

  12. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Böning, G; Schäfer, M; Grupp, U; Kaul, D; Kahn, J; Pavel, M; Maurer, M; Denecke, T; Hamm, B; Streitparth, F

    2015-08-01

    To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. ASIR 40% significantly reduced CTDIvol by 37.6% (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]; complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesions (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]). ASIR can be used to reduce radiation dose without sacrificing image quality and diagnostic confidence in staging CT of NET patients. This may be beneficial for patients with frequent follow-up and significant cumulative radiation exposure. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
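
    The matrix formulation of linear least squares referred to here, with beta = (X'X)^(-1) X'y and parameter covariance s^2 (X'X)^(-1), is short enough to write out, and a small Monte Carlo run shows the resulting parameter scatter. The data below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
true_beta = np.array([2.0, 0.5])            # intercept and slope of the "true" model
x = np.linspace(0.0, 10.0, 25)
X = np.column_stack([np.ones_like(x), x])   # design matrix
sigma = 1.0                                 # known measurement noise

def fit_ols(y):
    """Ordinary least squares via the normal equations (solve, not an explicit inverse)."""
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])        # estimated residual variance
    cov = s2 * np.linalg.inv(X.T @ X)                 # parameter covariance matrix
    return beta, cov

# Monte Carlo: repeat the experiment many times and look at the spread of the slope estimates.
betas = np.array([fit_ols(X @ true_beta + rng.normal(0, sigma, x.size))[0]
                  for _ in range(5000)])
print("empirical std of slope:  ", betas[:, 1].std(ddof=1))
print("theoretical std of slope:", sigma * np.sqrt(np.linalg.inv(X.T @ X)[1, 1]))
```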

  14. Error correction and statistical analyses for intra-host comparisons of feline immunodeficiency virus diversity from high-throughput sequencing data.

    Science.gov (United States)

    Liu, Yang; Chiaromonte, Francesca; Ross, Howard; Malhotra, Raunaq; Elleder, Daniel; Poss, Mary

    2015-06-30

    in G to A substitutions, but found no evidence for this host defense strategy. Our error correction approach for minor allele frequencies (more sensitive and computationally efficient than other algorithms) and our statistical treatment of variation (ANOVA) were critical for effective use of high-throughput sequencing data in understanding viral diversity. We found that co-infection with PLV shifts FIV diversity from bone marrow to lymph node and spleen.

  15. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Böning, G.; Schäfer, M.; Grupp, U.; Kaul, D.; Kahn, J.; Pavel, M.; Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F.

    2015-01-01

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). Applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0

  16. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)

    2015-08-15

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). Applied volume computed tomography dose index (CTDI{sub vol}) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDI{sub vol} (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0

  17. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Full Text Available A statistical survey is an effective method of statistical investigation that involves gathering quantitative data; it is often preferred in statistical reports because of the information it provides about an entire population from observation of only a part of it. Because of the information provided, surveys are used in many research areas. In economics, statistics are used in the decision making process, in choosing competitive strategies, in the analysis of economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyse the existing parking space situation in a given locality.
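
    The parking-space example reduces to estimating a population proportion (or total) from a simple random sample, together with its confidence interval. A compact sketch with invented numbers:

```python
import math

# Hypothetical simple random sample: 120 of N = 2400 parking spaces inspected, 78 occupied.
N, n, occupied = 2400, 120, 78
p_hat = occupied / n

# Standard error with finite population correction, then a 95 % normal-approximation CI.
se = math.sqrt(p_hat * (1 - p_hat) / n * (1 - n / N))
z = 1.96
print(f"occupancy estimate: {p_hat:.3f} +/- {z * se:.3f}")
print(f"estimated occupied spaces in the locality: {N * p_hat:.0f}")
```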

  18. Comparison on batch anaerobic digestion of five different livestock manures and prediction of biochemical methane potential (BMP) using different statistical models.

    Science.gov (United States)

    Kafle, Gopi Krishna; Chen, Lide

    2016-02-01

    There is a lack of literature reporting the methane potential of several livestock manures under the same anaerobic digestion conditions (same inoculum, temperature, time, and size of the digester). To the best of our knowledge, no previous study has reported biochemical methane potential (BMP) prediction models developed and evaluated solely from test results for at least five different livestock manures. The goal of this study was to evaluate the BMP of five different livestock manures (dairy manure (DM), horse manure (HM), goat manure (GM), chicken manure (CM) and swine manure (SM)) and to predict the BMP using different statistical models. Nutrients of the digested manures were also monitored. The BMP tests were conducted under mesophilic temperatures with a manure loading factor of 3.5 g volatile solids (VS)/L and a feed to inoculum ratio (F/I) of 0.5. Single-variable and multiple-variable regression models were developed using manure total carbohydrate (TC), crude protein (CP), total fat (TF), lignin (LIG) and acid detergent fiber (ADF), and measured BMP data. Three different kinetic models (first-order kinetic model, modified Gompertz model and Chen and Hashimoto model) were evaluated for BMP predictions. The BMPs of DM, HM, GM, CM and SM were measured to be 204, 155, 159, 259, and 323 mL/g VS, respectively, and the VS removals were calculated to be 58.6%, 52.9%, 46.4%, 81.4% and 81.4%, respectively. The technical digestion time (T80-90, the time required to produce 80-90% of total biogas production) for DM, HM, GM, CM and SM was calculated to be in the ranges of 19-28, 27-37, 31-44, 13-18 and 12-17 days, respectively. The effluents from the HM showed the lowest nitrogen, phosphorus and potassium concentrations. The effluents from the CM digesters showed the highest nitrogen and phosphorus concentrations, and the digested SM showed the highest potassium concentration. Based on the results of the regression analysis, the model using the variable LIG showed the best (R(2
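
    The modified Gompertz model mentioned here expresses cumulative methane yield as M(t) = P*exp(-exp(Rmax*e/P*(lag - t) + 1)), with ultimate yield P, maximum production rate Rmax and lag phase lag. A curve-fitting sketch on hypothetical cumulative-yield data (the numbers are not those of the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def modified_gompertz(t, P, Rmax, lag):
    """Cumulative methane yield (mL/g VS): P = ultimate yield, Rmax = max rate, lag = lag time (d)."""
    return P * np.exp(-np.exp(Rmax * np.e / P * (lag - t) + 1.0))

# Hypothetical cumulative methane yield measurements for one substrate.
t = np.array([0, 2, 4, 6, 8, 10, 14, 18, 22, 26, 30, 35, 40], dtype=float)
y = np.array([0, 8, 35, 80, 130, 170, 230, 265, 285, 295, 300, 303, 305], dtype=float)

popt, pcov = curve_fit(modified_gompertz, t, y, p0=[300.0, 20.0, 2.0])
P, Rmax, lag = popt
perr = np.sqrt(np.diag(pcov))
print(f"P = {P:.1f} +/- {perr[0]:.1f} mL/g VS, Rmax = {Rmax:.1f} mL/g VS/d, lag = {lag:.1f} d")
```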

  19. Patient-generated aspects in oral rehabilitation decision making I. Comparison of traditional history taking and an individual systematic interview method

    DEFF Research Database (Denmark)

    Øzhayat, Esben Boeskov; Gotfredsen, Klaus; Elverdam, Beth

    2009-01-01

    Decision making in oral rehabilitation is often based on diagnoses related to impairment of different oral functions. In making the decision when to treat, the dentist must work in cooperation with the patient. By incorporating patient-generated aspects into the decision making process, the dentist ... [the aim was to evaluate an individual systematic interview method, the SEIQoL-DW, and compare] it with a traditional history taking, in generating information to be used in decision making in oral rehabilitation. Fifty-seven participants in need of oral rehabilitation were enrolled in the study. The participants underwent a traditional history taking and were interviewed using the SEIQoL-DW method. The SEIQo... ... percentage of the participants were positive towards the use of the SEIQoL-DW method in their treatment planning. The SEIQoL-DW was considered to be a viable tool for decision making in oral rehabilitation.

  20. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops improving his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications with the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  1. Performance comparison of low-grade ORCs (organic Rankine cycles) using R245fa, pentane and their mixtures based on the thermoeconomic multi-objective optimization and decision makings

    International Nuclear Information System (INIS)

    Feng, Yongqiang; Hung, TzuChen; Zhang, Yaning; Li, Bingxi; Yang, Jinfu; Shi, Yang

    2015-01-01

    Based on thermoeconomic multi-objective optimization and decision making, considering both exergy efficiency and LEC (levelized energy cost), the performance of low-grade ORCs (organic Rankine cycles) using R245fa, pentane and their mixtures has been compared. The effects of the mass fraction of R245fa and of four key parameters on the exergy efficiency and LEC are examined. The Pareto-optimal solutions are selected from the Pareto-optimal frontier obtained by the NSGA-II algorithm using three decision makings: Shannon Entropy, LINMAP and TOPSIS. A deviation index is introduced to evaluate the different decision makings. The research demonstrates that as the mass fraction of R245fa increases, the exergy efficiency first decreases and then increases, while LEC shows the reverse trend. The optimum values from the TOPSIS decision making are selected as the preferred Pareto-optimal solution because of its lowest deviation index. The Pareto-optimal solutions for pentane, R245fa, and 0.5pentane/0.5R245fa, expressed as pairs of (exergy efficiency, LEC), are (0.5425, 0.104), (0.5502, 0.111), and (0.5212, 0.108), respectively. The mixture working fluids present lower thermodynamic performance than the pure working fluids and moderate economic performance under the Pareto optimization. - Highlights: • The thermoeconomic comparison between pure and mixture working fluids is investigated. • The Pareto-optimal solutions of the bi-objective problem are obtained using three decision makings. • The optimum values from the TOPSIS decision making are selected as the preferred Pareto-optimal solution. • The mixture yields lower thermodynamic performance and moderate economic performance.
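
    TOPSIS, the decision making preferred here, ranks the Pareto-optimal points by their relative closeness to an ideal solution, which in this setting means maximum exergy efficiency and minimum LEC. A small equal-weight sketch on a hypothetical Pareto front (the numbers are illustrative, not the study's results):

```python
import numpy as np

def topsis(points, benefit):
    """Rank alternatives by relative closeness to the ideal point.
    points: (n, m) decision matrix; benefit: length-m bools, True = larger is better."""
    norm = points / np.linalg.norm(points, axis=0)           # vector normalisation per criterion
    ideal = np.where(benefit, norm.max(axis=0), norm.min(axis=0))
    nadir = np.where(benefit, norm.min(axis=0), norm.max(axis=0))
    d_ideal = np.linalg.norm(norm - ideal, axis=1)
    d_nadir = np.linalg.norm(norm - nadir, axis=1)
    return d_nadir / (d_ideal + d_nadir)                     # higher = better

# Hypothetical Pareto-optimal (exergy efficiency, LEC in $/kWh) pairs from a bi-objective search.
front = np.array([[0.52, 0.104], [0.54, 0.108], [0.55, 0.111], [0.50, 0.100]])
closeness = topsis(front, benefit=np.array([True, False]))
print("preferred solution:", front[np.argmax(closeness)])
```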

  2. Statistical theory and inference

    CERN Document Server

    Olive, David J

    2014-01-01

    This text is for a one-semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.

  3. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from a contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and chi-square test, to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and χ2 supported the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for adsorption on mango leaf powder.
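
    The hypothesis tests named here are standard one- and two-sample procedures. A sketch of a paired t test (removal efficiency measured on the same batches at two adsorbent doses) and a chi-square comparison of observed and expected removals, with invented numbers:

```python
import numpy as np
from scipy import stats

# Hypothetical zinc removal efficiencies (%) for the same solutions at two adsorbent doses.
dose_1g = np.array([62.0, 65.5, 60.2, 63.8, 61.9, 64.1])
dose_2g = np.array([71.3, 74.0, 69.8, 72.5, 70.4, 73.2])
t, p = stats.ttest_rel(dose_2g, dose_1g)
print(f"paired t test: t = {t:.2f}, p = {p:.4g}")

# Chi-square: do observed removals at several pH values match the expected (model) removals?
observed = np.array([48.0, 55.0, 63.0, 60.0, 52.0])
expected = np.array([50.0, 56.0, 61.0, 59.0, 52.0])
expected = expected * observed.sum() / expected.sum()   # chisquare expects matching totals
chi2, p = stats.chisquare(observed, expected)
print(f"chi-square: chi2 = {chi2:.2f}, p = {p:.4g}")
```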

  4. The impact of national culture in MNC's home country on the strategy making process in their overseas subsidiaries: a comparison between Dutch and Japanese companies in Thailand

    NARCIS (Netherlands)

    Meekanon, K.

    2002-01-01

    The main purpose of this dissertation is to show that the national cultures of MNCs' home countries play an important role in determining the strategy making processes (SMPs) in their overseas subsidiaries. To attain this purpose, a comparison between Dutch and

  5. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    Science.gov (United States)

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared with data from three previous years: 1975, 1985, and 2003. The original articles in AJODO for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics stayed relatively constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  6. Statistical summary 1990-91

    International Nuclear Information System (INIS)

    1991-01-01

    This statistical summary leaflet presents, in bar charts and pie charts, Nuclear Electric's performance in 1990-91 in the areas of finance, plant and plant operations, safety, commercial operations and manpower. It is intended that the information will provide a basis for comparison in future years. The leaflet also includes a summary of Nuclear Electric's environmental policy statement. (UK)

  7. The Euclid Statistical Matrix Tool

    Directory of Open Access Journals (Sweden)

    Curtis Tilves

    2017-06-01

    Full Text Available Stataphobia, a term used to describe the fear of statistics and research methods, can result from a lack of proper training in statistical methods. Poor statistical methods training can affect health policy decision making and may play a role in the low research productivity seen in developing countries. One way to reduce Stataphobia is to intervene in the teaching of statistics in the classroom; however, such an intervention must tackle several obstacles, including student interest in the material, multiple ways of learning the material, and language barriers. We present here the Euclid Statistical Matrix, a tool for combatting Stataphobia on a global scale. This free tool is composed of popular statistical YouTube channels and web sources that teach and demonstrate statistical concepts in a variety of presentation methods. Working with international teams in Iran, Japan, Egypt, Russia, and the United States, we have also developed the Statistical Matrix in multiple languages to address language barriers to learning statistics. By utilizing already-established large networks, we are able to disseminate our tool to thousands of Farsi-speaking university faculty and students in Iran and the United States. Future dissemination of the Euclid Statistical Matrix throughout Central Asia and support from local universities may help to combat low research productivity in this region.

  8. Decision Making

    Directory of Open Access Journals (Sweden)

    Pier Luigi Baldi

    2006-06-01

    Full Text Available This article points out some conditions which exert a significant influence upon decision making, and compares decision making and problem solving as interconnected processes. Some strategies of decision making are also examined.

  9. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods take a prominent place in psychologists' educational programmes. Because these contents are known to be difficult to understand and hard to learn, students are afraid of them, and those who do not aspire to a research career at the university quickly forget the drilled material. Furthermore, because at first glance it does not seem to apply to the work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education only makes sense as a means of commanding respect from other professions, namely physicians; for their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To that end, we analysed 46 original articles of a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original articles, as the backbone of research, presumes a high degree of statistical education. To ignore statistics therefore means to ignore research and, in the end, to leave one's own professional work open to arbitrariness.

  10. Making a Positive Impression about the Mission of an Urban, Catholic University: Gender, First-Generation College, and Religious Preference Comparisons

    Science.gov (United States)

    Ferrari, Joseph R.; Mader, Megan C.; Milner, Lauren A.; Temperato, John R.

    2010-01-01

    This study investigates how research participants' desire to make a positive social impression may affect their responses to survey questions. Specifically, participants may react in socially appropriate ways to create a positive social impression for those persons reviewing their responses. This concept is termed "impression management," or more…

  11. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  12. Statistical insights from Romanian data on higher education

    Directory of Open Access Journals (Sweden)

    Andreea Ardelean

    2015-09-01

    Full Text Available This paper aims to use cluster analysis to make a comparative analysis of Romanian higher education at the regional level. The evolution of higher education in the post-communist period is also presented, using quantitative indicators. Although the focus is on university education, references to education as a whole are included for comparison. Then, to highlight the importance of higher education, the chi-square test is applied to check whether there is an association between statistical regions and the education level of the unemployed.

  13. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus...... and mind. Statistics represents a quintessential part of such investigations as they are preluded by a clinical hypothesis that must be verified based on observed data. The massive amounts of image data produced in each examination pose an important and interesting statistical challenge...... efficient algorithms which make the analysis of large data sets feasible, and gives examples of applications....

  14. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  15. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry. .

  16. Six Sigma and Introductory Statistics Education

    Science.gov (United States)

    Maleyeff, John; Kaminsky, Frank C.

    2002-01-01

    A conflict exists between the way statistics is practiced in contemporary business environments and the way statistics is taught in schools of management. While businesses are embracing programs, such as six sigma and TQM, that bring statistical methods to the forefront of management decision making, students do not graduate with the skills to…

  17. The Role of Statistics in Kosovo Enterprises

    Science.gov (United States)

    Gjonbalaj, Muje; Dema, Marjan; Miftari, Iliriana

    2009-01-01

    Considering science as the main contributor to contemporary developments has encouraged us to raise a scientific discussion regarding the role of statistics in business decision-making and economic development. Statistics, as an applicative science, is growing and being widely applied in different fields and professions. Statistical thinking is…

  18. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international over-views are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)

  20. Shot Group Statistics for Small Arms Applications

    Science.gov (United States)

    2017-06-01

    If its probability distribution is known with sufficient accuracy, then it can be used to make a sound statistical inference on the unknown population standard deviations of the x and y impact-point positions. The dispersion measures treated in this report…
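
    The report's central quantity, the population standard deviation of the impact-point coordinates, can be estimated with a classical chi-square interval when the coordinates are assumed normal. The sketch below applies that textbook interval to invented impact-point data; it illustrates the kind of inference the record refers to, not the dispersion measures developed in the report itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Synthetic impact-point coordinates (cm) for a 20-shot group; invented data.
x = rng.normal(0.0, 2.0, size=20)
y = rng.normal(0.0, 2.5, size=20)

def sigma_confidence_interval(sample, confidence=0.95):
    """Chi-square confidence interval for a normal population standard deviation."""
    n = len(sample)
    s2 = np.var(sample, ddof=1)
    alpha = 1.0 - confidence
    lower = np.sqrt((n - 1) * s2 / stats.chi2.ppf(1 - alpha / 2, n - 1))
    upper = np.sqrt((n - 1) * s2 / stats.chi2.ppf(alpha / 2, n - 1))
    return lower, upper

for label, sample in (("x", x), ("y", y)):
    lo, hi = sigma_confidence_interval(sample)
    print(f"sigma_{label}: s = {np.std(sample, ddof=1):.2f} cm, "
          f"95% CI = ({lo:.2f}, {hi:.2f}) cm")
```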

  1. Impact of climate change on Precipitation and temperature under the RCP 8.5 and A1B scenarios in an Alpine Cathment (Alto-Genil Basin,southeast Spain). A comparison of statistical downscaling methods

    Science.gov (United States)

    Pulido-Velazquez, David; Juan Collados-Lara, Antonio; Pardo-Iguzquiza, Eulogio; Jimeno-Saez, Patricia; Fernandez-Chacon, Francisca

    2016-04-01

    comparison between results for scenario A1B and RCP8.5 has been performed. The reductions obtained in mean rainfall with respect to the historical period are 24.2 % and 24.4 % respectively, and the increments in temperature are 46.3 % and 31.2 % respectively. A sensitivity analysis of the results to the statistical downscaling techniques employed has been performed. The following techniques have been explored: the perturbation or "delta-change" method; the regression method (a regression function relating the RCM output to the historical information is used to generate future climate series for the fixed period); quantile mapping (which seeks a transformation function relating the observed and modelled variables so that the statistical distribution of the corrected series matches that of the observed variable); and stochastic weather generators (SWG), which can be uni-site or multi-site (the latter accounting for the spatial correlation of the climatic series). A comparative analysis of these techniques has been performed, identifying the advantages and disadvantages of each. Acknowledgments: This research has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02, ENSEMBLES and CORDEX projects for the data provided for this study.
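
    To make two of the techniques listed above concrete, the sketch below applies a delta-change correction and a simple empirical quantile mapping to synthetic precipitation series. It is a minimal illustration under assumed data and variable names, not the workflow or data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily precipitation series (mm/day); purely illustrative.
obs_hist = rng.gamma(shape=0.8, scale=5.0, size=3650)   # observed, historical period
rcm_hist = rng.gamma(shape=0.7, scale=6.0, size=3650)   # RCM output, historical period
rcm_fut  = rng.gamma(shape=0.6, scale=6.5, size=3650)   # RCM output, future scenario

# 1) Delta-change (perturbation) method: scale the observed series by the
#    relative change simulated by the RCM between future and historical runs.
delta = rcm_fut.mean() / rcm_hist.mean()
fut_delta_change = obs_hist * delta

# 2) Empirical quantile mapping: map each future RCM value through the
#    transformation linking the RCM historical distribution to the observed one.
quantiles = np.linspace(0.01, 0.99, 99)
rcm_q = np.quantile(rcm_hist, quantiles)
obs_q = np.quantile(obs_hist, quantiles)
fut_quantile_mapped = np.interp(rcm_fut, rcm_q, obs_q)

print(f"Observed historical mean:    {obs_hist.mean():.2f} mm/day")
print(f"Delta-change future mean:    {fut_delta_change.mean():.2f} mm/day")
print(f"Quantile-mapped future mean: {fut_quantile_mapped.mean():.2f} mm/day")
```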

  2. MQSA National Statistics

    Science.gov (United States)

    ... Standards Act and Program. MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  3. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  4. The factors which affect the information system needs for decision making in the hotel industry (A comparison study between the U. K. and Egypt)

    OpenAIRE

    Hassan, Mahmoud Abdel Maksoud Mouhammed

    1986-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This report contains evidence showing that the information provided by computers in the hotel industry for management work and decision making has not changed much since 1970. In most industries, successful computer applications are found mainly in routine work (clerical jobs) rather than in management work (strategic and tactical jobs). As for the hotel industry, the u...

  5. Dopamine antagonism decreases willingness to expend physical, but not cognitive, effort: a comparison of two rodent cost/benefit decision-making tasks.

    Science.gov (United States)

    Hosking, Jay G; Floresco, Stan B; Winstanley, Catharine A

    2015-03-01

    Successful decision making often requires weighing a given option's costs against its associated benefits, an ability that appears perturbed in virtually every severe mental illness. Animal models of such cost/benefit decision making overwhelmingly implicate mesolimbic dopamine in our willingness to exert effort for a larger reward. Until recently, however, animal models have invariably manipulated the degree of physical effort, whereas human studies of effort have primarily relied on cognitive costs. Dopamine's relationship to cognitive effort has not been directly examined, nor has the relationship between individuals' willingness to expend mental versus physical effort. It is therefore unclear whether willingness to work hard in one domain corresponds to willingness in the other. Here we utilize a rat cognitive effort task (rCET), wherein animals can choose to allocate greater visuospatial attention for a greater reward, and a previously established physical effort-discounting task (EDT) to examine dopaminergic and noradrenergic contributions to effort. The dopamine antagonists eticlopride and SCH23390 each decreased willingness to exert physical effort on the EDT; these drugs had no effect on willingness to exert mental effort for the rCET. Preference for the high effort option correlated across the two tasks, although this effect was transient. These results suggest that dopamine is only minimally involved in cost/benefit decision making with cognitive effort costs. The constructs of mental and physical effort may therefore comprise overlapping, but distinct, circuitry, and therapeutic interventions that prove efficacious in one effort domain may not be beneficial in another.

  6. Model : making

    OpenAIRE

    Bottle, Neil

    2013-01-01

    The Model : making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model : making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...

  7. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
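
    For orientation, the Rényi entropy behind the statistics examined in this record can be written as follows (with Boltzmann's constant set to unity); the q → 1 limit recovers the Boltzmann-Gibbs (Shannon) form, consistent with the equivalence reported above.

```latex
% Rényi entropy of a discrete distribution {p_i}, with k_B = 1
S_q^{R} = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q}, \qquad q > 0,\; q \neq 1,
\qquad
\lim_{q \to 1} S_q^{R} = -\sum_i p_i \ln p_i .
```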

  8. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49–58 ...

  9. Context influences decision-making in boys with attention-deficit/hyperactivity disorder: A comparison of traditional and novel choice-impulsivity paradigms.

    Science.gov (United States)

    Patros, Connor H G; Alderson, R Matt; Lea, Sarah E; Tarle, Stephanie J

    2017-02-01

    Attention-deficit/hyperactivity disorder (ADHD) is characterized by an impaired ability to maintain attention and/or hyperactivity/impulsivity. Impulsivity is frequently defined as the preference for small, immediate rewards over larger, delayed rewards, and has been associated with a variety of negative outcomes such as risky behavior and academic difficulty. Extant studies have uniformly utilized the traditional paradigm of presenting two response choices, which limits the generalization of findings to scenarios in which children/adolescents are faced with dichotomous decisions. The current study is the first to examine the effect of manipulating the number of available response options on impulsive decision-making in boys with and without ADHD. A total of 39 boys (ADHD = 16, typically developing [TD] = 23) aged 8-12 years completed a traditional two-choice impulsivity task and a novel five-choice impulsivity task to examine the effect of manipulating the number of choice responses (two vs five) on impulsive decision-making. A five-choice task was utilized as it presents a more continuous array of choice options when compared to the typical two-choice task, and is comparable given its methodological similarity to the two-choice task. Results suggested that boys with ADHD were significantly more impulsive than TD boys during the two-choice task, but not during the five-choice task. Collectively, these findings suggest that ADHD-related impulsivity is not ubiquitous, but rather dependent on variation in demands and/or context. Further, these findings highlight the importance of examining ADHD-related decision-making within the context of alternative paradigms, as the exclusive utilization of two-choice tasks may promote inaccurate conceptualizations of the disorder.

  10. Comparison of two models for the X-ray dispersion produced in the Novillo Tokamak with measurements made with thermoluminescent dosemeters

    International Nuclear Information System (INIS)

    Flores O, A.; Castillo, A.; Barocio, S.R.; Melendez L, L.; Chavez A, E.; Cruz C, G.J.; Lopez, R.; Olayo, M.G.; Gonzalez M, P.; Azorin N, J.

    1999-01-01

    The results of a study of the X-ray dispersion produced in the Novillo Tokamak, measured with thermoluminescent dosemeters (TLD), are presented. The measurements were made in the equatorial plane of the Tokamak, along twelve radial directions. The observed dispersion is due to the interaction of the radiation with the walls surrounding the machine. Two types of heuristic mathematical models were proposed to describe the X-ray dispersion, and their predictions were compared with the experimental data obtained with the TLDs. The predictions of both models fit the experimental data well. (Author)

  11. Inverting an Introductory Statistics Classroom

    Science.gov (United States)

    Kraut, Gertrud L.

    2015-01-01

    The inverted classroom allows more in-class time for inquiry-based learning and for working through more advanced problem-solving activities than does the traditional lecture class. The skills acquired in this learning environment offer benefits far beyond the statistics classroom. This paper discusses four ways that can make the inverted…

  12. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  13. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression," which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminology, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inferences), and how to interpret the results. This text is most useful as supplemental material while readers take their own statistics courses, or as a reference text and self-teaching guide alongside the manual for any statistical software.
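
    As a pointer to the kind of analyses the chapter ends with, the sketch below runs a one-way ANOVA and a simple linear regression on synthetic data with SciPy; the groups, doses and responses are invented for illustration and are not examples from the text.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic measurements for three treatment groups (one-way ANOVA).
group_a = rng.normal(10.0, 2.0, size=30)
group_b = rng.normal(11.5, 2.0, size=30)
group_c = rng.normal(9.5, 2.0, size=30)
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

# Simple linear regression: response as a noisy linear function of dose.
dose = np.linspace(0, 10, 50)
response = 2.0 + 0.8 * dose + rng.normal(0, 1.0, size=dose.size)
fit = stats.linregress(dose, response)
print(f"Regression: slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, "
      f"r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4g}")
```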

  14. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1989-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'', have been applied. (author)
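
    To illustrate what an energy-based availability figure looks like in practice, the sketch below computes an energy availability factor for a single unit from invented outage data. The definition used (available energy divided by reference energy over the period) is an assumption chosen to match the general idea described above, not the exact NORDEL/UNIPEDE formula.

```python
# Minimal sketch of an energy-based availability calculation (assumed definitions).
rated_capacity_mw = 500.0          # net rated capacity of the unit
period_hours = 8760                # one calendar year

# Invented outages: (duration in hours, capacity reduction in MW).
outages = [
    (300, 500.0),   # full outage, e.g. planned overhaul
    (120, 250.0),   # partial outage, half capacity unavailable
    (48, 500.0),    # forced full outage
]

reference_energy_mwh = rated_capacity_mw * period_hours
unavailable_energy_mwh = sum(hours * mw for hours, mw in outages)
available_energy_mwh = reference_energy_mwh - unavailable_energy_mwh

energy_availability = available_energy_mwh / reference_energy_mwh
print(f"Energy availability factor: {energy_availability:.1%}")
```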

  15. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1990-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculation of availability data. For a number of years comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990 containing general statistical data is to produce basic information on existing kinds of thermal power in the countries concerned. With this information as a basis additional and more detailed information can be exchanged in direct contacts between bodies in the above mentioned countries according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)

  16. Steel making

    CERN Document Server

    Chakrabarti, A K

    2014-01-01

    "Steel Making" is designed to give students a strong grounding in the theory and state-of-the-art practice of production of steels. This book is primarily focused to meet the needs of undergraduate metallurgical students and candidates for associate membership examinations of professional bodies (AMIIM, AMIE). Besides, for all engineering professionals working in steel plants who need to understand the basic principles of steel making, the text provides a sound introduction to the subject.Beginning with a brief introduction to the historical perspective and current status of steel making together with the reasons for obsolescence of Bessemer converter and open hearth processes, the book moves on to: elaborate the physiochemical principles involved in steel making; explain the operational principles and practices of the modern processes of primary steel making (LD converter, Q-BOP process, and electric furnace process); provide a summary of the developments in secondary refining of steels; discuss principles a...

  17. Statistics in action a Canadian outlook

    CERN Document Server

    Lawless, Jerald F

    2014-01-01

    Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.The first two c

  18. Satisfaction with care and decision making among parents/caregivers in the pediatric intensive care unit: a comparison between English-speaking whites and Latinos.

    Science.gov (United States)

    Epstein, David; Unger, Jennifer B; Ornelas, Beatriz; Chang, Jennifer C; Markovitz, Barry P; Dodek, Peter M; Heyland, Daren K; Gold, Jeffrey I

    2015-04-01

    Because of previously documented health care disparities, we hypothesized that English-speaking Latino parents/caregivers would be less satisfied with care and decision making than English-speaking non-Latino white (NLW) parents/caregivers. An intensive care unit (ICU) family satisfaction survey, Family Satisfaction in the Intensive Care Unit Survey (pediatric, 24 question version), was completed by English-speaking parents/caregivers of children in a cardiothoracic ICU at a university-affiliated children's hospital in 2011. English-speaking NLW and Latino parents/caregivers of patients, younger than 18 years, admitted to the ICU were approached to participate on hospital day 3 or 4 if they were at the bedside for greater than or equal to 2 days. Analysis of variance, χ(2), and Student t tests were used. Cronbach αs were calculated. Fifty parents/caregivers completed the survey in each group. Latino parents/caregivers were younger, more often mothers born outside the United States, more likely to have government insurance or no insurance, and had less education and income. There were no differences between the groups' mean overall satisfaction scores (92.6 ± 8.3 and 93.0 ± 7.1, respectively; P = .80). The Family Satisfaction in the Intensive Care Unit Survey (pediatric, 24 question version) showed high internal consistency reliability (α = .95 and .91 for NLW and Latino groups, respectively). No disparities in ICU satisfaction with care and decision making between English-speaking NLW and Latino parents/caregivers were found. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Statistical problems in nuclear regulation: introduction and overview

    International Nuclear Information System (INIS)

    Moore, R.H.; Easterling, R.G.

    1978-01-01

    The U.S. Nuclear Regulatory Commission (NRC) was organized formally in January 1975. The Commission's responsibilities can be categorized into four broad areas involving the licensing and use of nuclear materials and facilities: protecting public health and safety; protecting the environment; safeguarding nuclear materials and facilities; and assuring conformity with antitrust laws. A large variety of statistical problems are related to these basic responsibilities. They arise from the data-based nature of many of the issues to be resolved in making regulatory decisions. Hence, they are reflected in interactions among the NRC staff and licensees, vendors, and the public. This paper identifies and outlines some of these problems, providing a spectrum for comparison with the other presentations in this session. These problems are linked by the need for clear and objective treatment of data; their articulation and solution will benefit from insights and contributions from an informed statistical community

  20. User manual for Blossom statistical package for R

    Science.gov (United States)

    Talbert, Marian; Cade, Brian S.

    2005-01-01

    Blossom is an R package with functions for making statistical comparisons with distance-function based permutation tests developed by P.W. Mielke, Jr. and colleagues at Colorado State University (Mielke and Berry, 2001) and for testing parameters estimated in linear models with permutation procedures developed by B. S. Cade and colleagues at the Fort Collins Science Center, U.S. Geological Survey. This manual is intended to provide identical documentation of the statistical methods and interpretations as the manual by Cade and Richards (2005) does for the original Fortran program, but with changes made with respect to command inputs and outputs to reflect the new implementation as a package for R (R Development Core Team, 2012). This implementation in R has allowed for numerous improvements not supported by the Cade and Richards (2005) Fortran implementation, including use of categorical predictor variables in most routines.
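
    To give a feel for permutation-based testing of the kind Blossom wraps, the sketch below runs a plain two-sample permutation test on a difference in means in NumPy. It is a generic illustration, not the Blossom package's own API or its distance-function based procedures.

```python
import numpy as np

def permutation_test(x, y, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in group means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = abs(x.mean() - y.mean())
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(pooled[: len(x)].mean() - pooled[len(x):].mean())
        if diff >= observed:
            count += 1
    # Add one to numerator and denominator so the p-value is never exactly zero.
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(42)
group_1 = rng.normal(0.0, 1.0, size=25)
group_2 = rng.normal(0.6, 1.0, size=25)
print(f"Permutation p-value: {permutation_test(group_1, group_2):.4f}")
```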

  1. Data-driven inference for the spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Duczmal Luiz H

    2011-08-01

    Full Text Available Background: Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results: A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is given, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. Conclusions: A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.

  2. Data-driven inference for the spatial scan statistic.

    Science.gov (United States)

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is given, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
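
    The sketch below illustrates the idea on a one-dimensional toy problem: it scans contiguous windows with the Poisson likelihood-ratio statistic, computes the usual Monte Carlo p-value, and also a size-conditional p-value in the spirit of the modification described above (comparing only against null replications whose most likely cluster has the same size k). The data, the window shapes and the simplifications are assumptions for illustration; the actual method operates on aggregated area maps.

```python
import numpy as np

def scan_poisson(cases, expected, max_size):
    """Return (max LLR, size of most likely window) over contiguous windows."""
    total = cases.sum()
    best_llr, best_size = 0.0, 0
    for size in range(1, max_size + 1):
        for start in range(len(cases) - size + 1):
            c = cases[start:start + size].sum()
            e = expected[start:start + size].sum()
            if e < c < total:  # only excess-risk windows are of interest
                llr = (c * np.log(c / e)
                       + (total - c) * np.log((total - c) / (total - e)))
                if llr > best_llr:
                    best_llr, best_size = llr, size
    return best_llr, best_size

rng = np.random.default_rng(7)
population = rng.integers(500, 5000, size=40).astype(float)
risk = np.full(40, 0.01)
risk[12:16] *= 1.8                       # planted cluster of 4 adjacent areas
cases = rng.poisson(population * risk).astype(float)
expected = population * (cases.sum() / population.sum())

obs_llr, obs_size = scan_poisson(cases, expected, max_size=10)

# Monte Carlo replications under the null (cases redistributed ~ population).
n_rep = 999
null_llr, null_size = np.empty(n_rep), np.empty(n_rep, dtype=int)
for b in range(n_rep):
    sim = rng.multinomial(int(cases.sum()), population / population.sum()).astype(float)
    null_llr[b], null_size[b] = scan_poisson(sim, expected, max_size=10)

p_usual = (1 + np.sum(null_llr >= obs_llr)) / (n_rep + 1)
same_k = null_llr[null_size == obs_size]
p_conditional = (1 + np.sum(same_k >= obs_llr)) / (len(same_k) + 1)

print(f"Most likely cluster size: {obs_size}, LLR = {obs_llr:.2f}")
print(f"Usual Monte Carlo p-value:             {p_usual:.3f}")
print(f"Size-conditional p-value (k={obs_size}): {p_conditional:.3f}")
```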

  3. Statistical methods for forecasting

    CERN Document Server

    Abraham, Bovas

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists."This book, it must be said, lives up to the words on its advertising cover: ''Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.'' It does just that!"-Journal of the Royal Statistical Society"A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...

  4. Statistical methods in radiation physics

    CERN Document Server

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  5. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers nonparametric statistics topics that most are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.
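
    As a small example of the kind of analysis the book advocates, the sketch below compares ordinal customer-satisfaction ratings from two branches with the Mann-Whitney U test in SciPy; the scenario and the numbers are invented for illustration.

```python
from scipy import stats

# Invented ordinal satisfaction ratings (1 = very poor ... 5 = excellent).
branch_a = [4, 5, 3, 4, 4, 5, 2, 4, 5, 3, 4, 4]
branch_b = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3, 3, 4]

# Mann-Whitney U: no normality assumption, suitable for ordinal data.
u_stat, p_value = stats.mannwhitneyu(branch_a, branch_b, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```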

  6. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

    Statistics are widely employed within archaeological research. This is becoming increasingly so as user friendly statistical packages make increasingly sophisticated analyses available to non statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of the underlying assumptions there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology and here this is illustrated with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab

  7. Depression screening in stroke: a comparison of alternative measures with the structured diagnostic interview for the diagnostic and statistical manual of mental disorders, fourth edition (major depressive episode) as criterion standard.

    Science.gov (United States)

    Turner, Alyna; Hambridge, John; White, Jennifer; Carter, Gregory; Clover, Kerrie; Nelson, Louise; Hackett, Maree

    2012-04-01

    Screening tools for depression and psychological distress commonly used in medical settings have not been well validated in stroke populations. We aimed to determine the accuracy of common screening tools for depression or distress in detecting caseness for a major depressive episode compared with a clinician-administered structured clinical interview for Diagnostic and Statistical Manual of Mental Disorders Fourth Edition as the gold standard. Seventy-two participants ≥3 weeks poststroke underwent a diagnostic interview for major depressive episode and completed the Patient Health Questionnaire-2 and -9, Hospital Anxiety and Depression Scale, Beck Depression Inventory-II, Distress Thermometer, and Kessler-10. Internal consistency, sensitivity, specificity, likelihood ratios, and posttest probabilities were calculated. Each measure was validated against the gold standard using receiver operating characteristic curves with comparison of the area under the curve for all measures. Internal consistency ranged from acceptable to excellent for all measures (Cronbach α=0.78-0.94). Areas under the curve (95% CI) for the Patient Health Questionnaire-2, Patient Health Questionnaire-9, Hospital Anxiety and Depression Scale depression and total score, Beck Depression Inventory-II, and Kessler-10 ranged from 0.80 (0.69-0.89) for the Kessler-10 to 0.89 (0.79-0.95) for the Beck Depression Inventory-II with no significant differences between measures. The Distress Thermometer had an area under the curve (95% CI) of 0.73 (0.61-0.83), significantly smaller than the Beck Depression Inventory-II (P<0.05). Apart from the Distress Thermometer, selected scales performed adequately in a stroke population with no significant difference between measures. The Patient Health Questionnaire-2 would be the most useful single screen given free availability and the shortest number of items.
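
    For readers unfamiliar with how such screening-accuracy figures are obtained, the sketch below computes sensitivity, specificity and a rank-based AUC for a screening score against a gold-standard diagnosis. The scores, cutoff and prevalence are synthetic and are not data from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic screening scores: depressed cases tend to score higher.
diagnosis = np.array([1] * 20 + [0] * 52)                 # 1 = major depressive episode
scores = np.concatenate([rng.normal(14, 4, 20), rng.normal(7, 4, 52)])

cutoff = 10
positive = scores >= cutoff
sensitivity = np.mean(positive[diagnosis == 1])
specificity = np.mean(~positive[diagnosis == 0])

# Rank-based AUC: probability a random case scores above a random non-case.
case_scores = scores[diagnosis == 1]
noncase_scores = scores[diagnosis == 0]
wins = (case_scores[:, None] > noncase_scores[None, :]).sum()
ties = (case_scores[:, None] == noncase_scores[None, :]).sum()
auc = (wins + 0.5 * ties) / (len(case_scores) * len(noncase_scores))

print(f"Sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}, AUC = {auc:.2f}")
```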

  8. Make Sense?

    DEFF Research Database (Denmark)

    Gyrd-Jones, Richard; Törmälä, Minna

    Purpose: An important part of how we sense a brand is how we make sense of a brand. Sense-making is naturally strongly connected to how we cognize about the brand. But sense-making is concerned with multiple forms of knowledge that arise from our interpretation of the brand-related stimuli: declarative, episodic, procedural and sensory. Knowledge is given meaning through mental association (Keller, 1993) and/or symbolic interaction (Blumer, 1969). These meanings are centrally related to individuals' sense of identity or "identity needs" (Wallpach & Woodside, 2009). The way individuals make sense of brands is related to who people think they are in their context, and this shapes what they enact and how they interpret the brand (Currie & Brown, 2003; Weick, Sutcliffe, & Obstfeld, 2005; Weick, 1993). Our subject of interest in this paper is how stakeholders interpret and ascribe meaning…

  9. Statistical inference: an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing.After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout. It pre

  10. Conversion factors and oil statistics

    International Nuclear Information System (INIS)

    Karbuz, Sohbet

    2004-01-01

    World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistic caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass to volume equivalencies (barrels to tonnes) and for broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers
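
    To make the point concrete, the sketch below converts the same crude-oil volume to tonnes with two plausible but different barrels-per-tonne factors and shows how the implied totals diverge. The factor values are round numbers assumed for illustration, not figures endorsed by the paper.

```python
# Converting crude-oil volumes (barrels) to mass (tonnes) depends on the
# assumed density, usually expressed as barrels per tonne.
volume_barrels = 10_000_000          # example aggregate volume

# Two plausible conversion factors (barrels per tonne); actual values vary
# with crude quality, so both are assumptions for illustration.
factor_light_crude = 7.6
factor_heavy_crude = 7.0

tonnes_light = volume_barrels / factor_light_crude
tonnes_heavy = volume_barrels / factor_heavy_crude

difference = tonnes_heavy - tonnes_light
print(f"Light-crude factor:  {tonnes_light:,.0f} tonnes")
print(f"Heavy-crude factor:  {tonnes_heavy:,.0f} tonnes")
print(f"Difference from the factor choice alone: {difference:,.0f} tonnes "
      f"({difference / tonnes_light:.1%})")
```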

  11. ["R" -- project for statistical computing]

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28

  12. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  13. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  14. Comparison is key.

    Science.gov (United States)

    Stone, Mark H; Stenner, A Jackson

    2014-01-01

    Several concepts from Georg Rasch's last papers are discussed. The key one is comparison because Rasch considered the method of comparison fundamental to science. From the role of comparison stems scientific inference made operational by a properly developed frame of reference producing specific objectivity. The exact specifications Rasch outlined for making comparisons are explicated from quotes, and the role of causality derived from making comparisons is also examined. Understanding causality has implications for what can and cannot be produced via Rasch measurement. His simple examples were instructive, but the implications are far reaching upon first establishing the key role of comparison.
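
    As background for the 'specific objectivity' mentioned here, the dichotomous Rasch model and the person comparison it permits can be written as below: in the conditional odds, the item parameter cancels, so two persons can be compared independently of which item is used. This standard formulation is added for orientation only and is not quoted from the paper.

```latex
% theta_v, theta_w: ability of persons v and w; b_i: difficulty of item i
P(X_{vi}=1 \mid \theta_v, b_i) = \frac{\exp(\theta_v - b_i)}{1 + \exp(\theta_v - b_i)},
\qquad
\frac{P(X_{vi}=1)\,P(X_{wi}=0)}{P(X_{vi}=0)\,P(X_{wi}=1)} = \exp(\theta_v - \theta_w).
```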

  15. Advantage of make-to-stock strategy based on linear mixed-effect model: a comparison with regression, autoregressive, time series, and exponential smoothing models

    Directory of Open Access Journals (Sweden)

    Yu-Pin Liao

    2017-11-01

    Full Text Available In the past few decades, demand forecasting has become relatively difficult due to rapid changes in the global environment. This research illustrates the use of the make-to-stock (MTS) production strategy in order to explain how forecasting plays an essential role in business management. The linear mixed-effect (LME) model has been extensively developed and is widely applied in various fields. However, no study has used the LME model for business forecasting. We suggest that the LME model be used as a tool for prediction and to overcome environment complexity. The data analysis is based on real data in an international display company, where the company needs accurate demand forecasting before adopting a MTS strategy. The forecasting result from the LME model is compared to the commonly used approaches, including the regression model, autoregressive model, time series model, and exponential smoothing model, with the results revealing that prediction performance provided by the LME model is more stable than using the other methods. Furthermore, product types in the data are regarded as a random effect in the LME model, hence demands of all types can be predicted simultaneously using a single LME model. However, some approaches require splitting the data into different type categories, and then predicting the type demand by establishing a model for each type. This feature also demonstrates the practicability of the LME model in real business operations.
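
    The sketch below shows the general shape of such a random-effects demand model using statsmodels' MixedLM, with product type as the grouping (random) effect. The simulated data, the formula and the column names are assumptions for illustration, not the company data or the exact specification used in the article.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Simulate monthly demand for several product types with a shared trend and
# a product-specific baseline (the random intercept the LME model captures).
products, months = [f"type_{i}" for i in range(6)], np.arange(24)
rows = []
for p in products:
    baseline = rng.normal(200, 40)
    for m in months:
        rows.append({"product": p, "month": m,
                     "demand": baseline + 3.0 * m + rng.normal(0, 15)})
df = pd.DataFrame(rows)

# Linear mixed-effect model: fixed trend in month, random intercept per product.
model = smf.mixedlm("demand ~ month", data=df, groups=df["product"])
result = model.fit()
print(result.summary())

# Population-level (fixed-effects only) forecast for the next month.
fe = result.fe_params
print("Population-level forecast for month 24:", fe["Intercept"] + fe["month"] * 24)
```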

  16. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  17. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  18. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  19. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  20. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...