WorldWideScience

Sample records for performed detailed statistical

  1. Are medical articles highlighting detailed statistics more cited?

    Directory of Open Access Journals (Sweden)

    Mike Thelwall

    2015-06-01

Full Text Available When conducting a literature review, it is natural to search for articles and read their abstracts in order to select papers to read fully. Hence, informative abstracts are important to ensure that research is read. The description of a paper's methods may help to give confidence that a study is of high quality. This article assesses whether medical articles that mention three statistical methods, each of which is arguably indicative of a more detailed statistical analysis than average, are more highly cited. The results show that medical articles mentioning Bonferroni corrections, bootstrapping and effect sizes tend to be ranked 7%, 8% and 15% more highly for citations than average, respectively. Although this is consistent with the hypothesis that mentioning more detailed statistical techniques generates more highly cited research, these techniques may also tend to be used in more highly cited areas of medicine.
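The Bonferroni correction mentioned in this record is simple to state: with m hypothesis tests, each raw p-value is compared against alpha/m rather than alpha. A minimal sketch (the p-values below are invented for illustration):

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Flag each test as significant under a Bonferroni-corrected threshold."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

# Three hypothetical tests: only the first survives alpha/3 ~= 0.0167.
print(bonferroni_significant([0.001, 0.02, 0.04]))  # [True, False, False]
```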

  2. Statistical analysis of RHIC beam position monitors performance

    Science.gov (United States)

    Calaga, R.; Tomás, R.

    2004-04-01

A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison using run 2003 data shows striking agreement between the two methods; hence they can be used to improve BPM functioning at RHIC and possibly at other accelerators.
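As a hedged illustration of how SVD can flag a faulty BPM (entirely synthetic data with invented numbers, not the RHIC analysis itself): physical beam motion is correlated across all monitors and so concentrates in the leading singular modes, while a malfunctioning BPM contributes almost nothing to them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bpms, n_turns = 20, 1000
turns = np.arange(n_turns)

# Synthetic turn-by-turn data: a common betatron-like oscillation seen by
# all BPMs with different amplitudes, plus small measurement noise.
signal = np.sin(2 * np.pi * 0.22 * turns)
data = np.outer(rng.uniform(0.5, 1.5, n_bpms), signal)
data += 0.01 * rng.standard_normal((n_bpms, n_turns))
data[7] = rng.standard_normal(n_turns)  # BPM 7 malfunctions: pure noise

# SVD of the mean-subtracted BPM-by-turn matrix: the faulty BPM has a
# near-zero amplitude in the leading spatial singular vector.
U, s, Vt = np.linalg.svd(data - data.mean(axis=1, keepdims=True),
                         full_matrices=False)
suspect = int(np.argmin(np.abs(U[:, 0])))
print(suspect)  # identifies BPM 7
```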

  3. Statistical analysis of RHIC beam position monitors performance

    Directory of Open Access Journals (Sweden)

    R. Calaga

    2004-04-01

Full Text Available A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison using run 2003 data shows striking agreement between the two methods; hence they can be used to improve BPM functioning at RHIC and possibly at other accelerators.

  4. Statistical Analysis of Detailed 3-D CFD LES Simulations with Regard to CCV Modeling

    Directory of Open Access Journals (Sweden)

    Vítek Oldřich

    2016-06-01

Full Text Available The paper deals with the statistical analysis of a large amount of detailed 3-D CFD data in terms of cycle-to-cycle variations (CCVs). These data were obtained by means of LES calculations of many consecutive cycles. Due to the non-linear nature of the Navier-Stokes equation set, there is relatively significant CCV. Hence, every cycle is slightly different; this leads to the requirement to perform a statistical analysis based on an ensemble-averaging procedure, which enables a better understanding of CCV in ICEs, including its quantification. The data obtained from the averaging procedure provide results at different levels of spatial resolution. The procedure is applied locally, i.e., in every cell of the mesh, so detailed CCV information is available at the local level; such information can be compared with RANS simulations. Next, volume/mass averaging provides information at specific locations, e.g., the gap between the electrodes of a spark plug. Finally, volume/mass averaging over the whole combustion chamber leads to global information which can be compared with experimental data or with results of system simulation tools (which are based on a 0-D/1-D approach).
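The per-cell ensemble-averaging procedure described in this record can be sketched in a few lines (synthetic data; the field values and cell volumes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n_cycles, n_cells = 50, 1000

# Hypothetical per-cell field (e.g., a velocity magnitude) for many
# consecutive engine cycles: a mean field plus cycle-to-cycle fluctuations.
mean_field = rng.uniform(5.0, 15.0, n_cells)
cycles = mean_field + 0.8 * rng.standard_normal((n_cycles, n_cells))

# Ensemble averaging over cycles, applied locally in every cell:
ensemble_mean = cycles.mean(axis=0)       # comparable to a RANS field
ccv_rms = cycles.std(axis=0, ddof=1)      # local CCV magnitude per cell

# Volume averaging then reduces the field to a single global number,
# e.g., for comparison with 0-D/1-D system simulation results.
cell_volume = np.full(n_cells, 1.0 / n_cells)
global_mean = float(np.sum(ensemble_mean * cell_volume))
```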

  5. A Statistical Model for Synthesis of Detailed Facial Geometry

    OpenAIRE

    Golovinskiy, Aleksey; Matusik, Wojciech; Pfister, Hanspeter; Rusinkiewicz, Szymon; Funkhouser, Thomas

    2006-01-01

    Detailed surface geometry contributes greatly to the visual realism of 3D face models. However, acquiring high-resolution face geometry is often tedious and expensive. Consequently, most face models used in games, virtual reality, or computer vision look unrealistically smooth. In this paper, we introduce a new statistical technique for the analysis and synthesis of small three-dimensional facial features, such as wrinkles and pores. We acquire high-resolution face geometry for people across ...

  6. Federal Funds for Research and Development. Fiscal Years 1982, 1983, and 1984. Volume XXXII. Detailed Statistical Tables. Surveys of Science Resources Series.

    Science.gov (United States)

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Detailed statistical tables on federal funds for research and development (R&D) are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency, performer,…

  7. Humans make efficient use of natural image statistics when performing spatial interpolation.

    Science.gov (United States)

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with that of various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors, and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system efficiently exploits the statistical structure of natural images.
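The contrast the record draws, between a local-mean heuristic and an observer that knows the local statistics, can be illustrated in a toy model (the patch statistics below are invented and far simpler than natural images):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for local image structure: the missing centre
# pixel depends mostly on its horizontal neighbours (columns 3 and 4),
# not equally on all eight surrounding pixels.
def make_patches(n):
    neighbours = rng.standard_normal((n, 8))
    centre = 0.5 * (neighbours[:, 3] + neighbours[:, 4])
    centre += 0.3 * rng.standard_normal(n)
    return neighbours, centre

X_tr, y_tr = make_patches(5000)
X_te, y_te = make_patches(1000)

# Simple heuristic: estimate the missing pixel by the local mean.
err_mean = float(np.mean((X_te.mean(axis=1) - y_te) ** 2))

# An observer that "knows the local statistics": the optimal linear
# estimator, fit by least squares on training patches.
w, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
err_stat = float(np.mean((X_te @ w - y_te) ** 2))
# err_stat comes out far below err_mean: knowing which neighbours
# actually matter beats averaging them indiscriminately.
```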

  8. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  9. Comparative Gender Performance in Business Statistics.

    Science.gov (United States)

    Mogull, Robert G.

    1989-01-01

    Comparative performance of male and female students in introductory and intermediate statistics classes was examined for over 16 years at a state university. Gender means from 97 classes and 1,609 males and 1,085 females revealed a probabilistic--although statistically insignificant--superior performance by female students that appeared to…

  10. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks, with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models, which highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large-buffer and many-sources asymptotics that play an important role in understanding statistical multiplexing.

  11. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  12. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  13. Statistics Anxiety, Trait Anxiety, Learning Behavior, and Academic Performance

    Science.gov (United States)

    Macher, Daniel; Paechter, Manuela; Papousek, Ilona; Ruggeri, Kai

    2012-01-01

    The present study investigated the relationship between statistics anxiety, individual characteristics (e.g., trait anxiety and learning strategies), and academic performance. Students enrolled in a statistics course in psychology (N = 147) filled in a questionnaire on statistics anxiety, trait anxiety, interest in statistics, mathematical…

  14. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  15. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    Science.gov (United States)

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore, one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail, applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis will be explained in detail for the first time. These approaches lead to contact angle data, and to different ways of accessing specific contact angles, that are independent of the skill and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill (receding motion) angles obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Due to the small covered distance for the dependent analysis (contact angle determination. They are characterised by small deviations of the computed values.
In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for the drop on inclined surfaces and detailed relations concerning the reactivity of the freshly cleaned silicon wafer surface, resulting in acceleration
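A sigmoid fit of contact angle versus plate inclination, of the general kind this record describes, can be sketched as follows. Everything here is synthetic and hedged: the exact fit function and parameter values used by the authors may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sigmoid model: contact angle as a function of inclination.
def sigmoid(alpha, theta0, dtheta, alpha0, k):
    return theta0 + dtheta / (1.0 + np.exp(-k * (alpha - alpha0)))

rng = np.random.default_rng(3)
alpha = np.linspace(0.0, 40.0, 80)               # inclination, degrees
theta = sigmoid(alpha, 60.0, 15.0, 20.0, 0.4)    # "true" angles (invented)
theta += 0.3 * rng.standard_normal(alpha.size)   # measurement noise

p0 = [55.0, 10.0, 15.0, 0.2]                     # rough initial guess
popt, _ = curve_fit(sigmoid, alpha, theta, p0=p0)
# popt should land close to the generating values (60, 15, 20, 0.4),
# giving operator-independent access to the specific contact angles.
```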

  16. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
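A toy sketch of the comparison reported in the Results, majority voting versus voting weighted by self-assessment (all rater accuracies and confidences below are invented; the study's actual fusion algorithms are more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(4)
n_raters, n_pixels = 9, 20000
truth = rng.integers(0, 2, n_pixels)             # binary ground truth

# Hypothetical raters: each has a true accuracy, and a self-assessed
# confidence that is assumed to track it (the effect studied here).
accuracy = rng.uniform(0.55, 0.95, n_raters)
correct = rng.random((n_raters, n_pixels)) < accuracy[:, None]
labels = np.where(correct, truth, 1 - truth)
self_assessed = np.clip(accuracy + 0.03 * rng.standard_normal(n_raters),
                        0.51, 0.99)

# Simple majority voting:
majority = (labels.mean(axis=0) > 0.5).astype(int)

# Voting weighted by self-assessment (log-odds weights):
w = np.log(self_assessed / (1.0 - self_assessed))
score = (w[:, None] * (2 * labels - 1)).sum(axis=0)
weighted = (score > 0).astype(int)

acc_majority = float((majority == truth).mean())
acc_weighted = float((weighted == truth).mean())
# In this toy model the confidence-weighted vote typically matches or
# beats simple majority voting, mirroring the study's finding.
```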

  17. Self-assessed performance improves statistical fusion of image labels

    International Nuclear Information System (INIS)

    Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.

    2014-01-01

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance

  18. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...
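A minimal illustration of the empirical AUC that ROC analysis is built on (the scores below are invented): the area under the empirical ROC curve equals the Mann-Whitney probability that a randomly chosen diseased score exceeds a randomly chosen healthy score.

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical diagnostic scores for healthy and diseased cases.
healthy = rng.normal(0.0, 1.0, 500)
diseased = rng.normal(1.5, 1.0, 500)

# Empirical AUC as the Mann-Whitney exceedance probability.
auc = float((diseased[:, None] > healthy[None, :]).mean())
# For two unit-variance normals separated by 1.5, the theoretical AUC
# is Phi(1.5 / sqrt(2)) ~= 0.86; the estimate should fall close to that.
```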

  19. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin


  20. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Science.gov (United States)

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in the structural

  1. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics.

    Science.gov (United States)

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in the structural

  2. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Directory of Open Access Journals (Sweden)

    Manuela Paechter

    2017-07-01

Full Text Available In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term, students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance.
Statistics anxiety led to higher procrastination in

  3. Automated leak localization performance without detailed demand distribution data

    NARCIS (Netherlands)

    Moors, Janneke; Scholten, L.; van der Hoek, J.P.; den Besten, J.

    2018-01-01

    Automatic leak localization has been suggested to reduce the time and personnel efforts needed to localize (small) leaks. Yet, the available methods require a detailed demand distribution model for successful calibration and good leak localization performance. The main aim of this work was

  4. Attitude towards statistics and performance among post-graduate students

    Science.gov (United States)

    Rosli, Mira Khalisa; Maat, Siti Mistima

    2017-05-01

Mastering statistics is a necessity for students, especially for post-graduates involved in research. The purpose of this research was to identify attitudes towards statistics among post-graduates of the Faculty of Education, UKM, Bangi, and to determine the relationship between those attitudes and the students' performance. 173 post-graduate students were chosen randomly to participate in the study. These students were registered in the Research Methodology II course offered by the faculty. A survey of attitudes towards statistics using a 5-point Likert scale was used for data collection. The instrument consists of four components: affective, cognitive competency, value and difficulty. The data were analyzed using SPSS version 22 to produce descriptive and inferential statistics. The results showed a moderate positive relationship between attitudes towards statistics and students' performance. In conclusion, educators need to assess students' attitudes towards the course in order to accomplish the learning outcomes.

  5. Detailed Performance of the Outer Tracker at LHCb

    CERN Document Server

    Tuning, N

    2014-01-01

The LHCb Outer Tracker is a gaseous detector covering an area of 5 x 6 m2 with 12 double layers of straw tubes. Based on data from the first LHC running period, from 2010 to 2012, the performance in terms of single-hit resolution and efficiency is presented. Details on the ionization length and on subtle effects regarding signal reflections and the subsequent time-walk correction are given. The efficiency to detect a hit in the central half of the straw is estimated to be 99.2%, and the position resolution is determined to be approximately 200 μm, depending on the detailed implementation of the internal alignment of individual detector modules. The Outer Tracker received a dose in the hottest region corresponding to 0.12 C/cm, and no signs of gain deterioration or other ageing effects are observed.

  6. 14 CFR 298.61 - Reporting of traffic statistics.

    Science.gov (United States)

    2010-01-01

14 CFR, Aeronautics and Space (2010-01-01), § 298.61 Reporting of traffic statistics. (a) Each commuter air carrier and small certificated air… statistics shall be compiled in terms of each flight stage as actually performed. The detail T-100 data shall…

  7. MO-DE-207A-01: Impact of Statistical Weights On Detection of Low-Contrast Details in Model-Based Iterative CT Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Noo, F; Guo, Z [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detect and localize a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this aspect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026,0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights.

  8. MO-DE-207A-01: Impact of Statistical Weights On Detection of Low-Contrast Details in Model-Based Iterative CT Reconstruction

    International Nuclear Information System (INIS)

    Noo, F; Guo, Z

    2016-01-01

    Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, as a means to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of the statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task was to detect and localize a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of the statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of the images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this respect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026, 0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized-weighted least-square reconstruction. A decrease rather than an increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights. 

  9. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up to date treatment of  theory and practical implementation in Register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  10. Detailed statistical analysis plan for the target temperature management after out-of-hospital cardiac arrest trial

    DEFF Research Database (Denmark)

    Nielsen, Niklas; Winkel, Per; Cronberg, Tobias

    2013-01-01

    Animal experimental studies and previous randomized trials suggest an improvement in mortality and neurological function with temperature regulation to hypothermia after cardiac arrest. According to a systematic review, previous trials were small, had a risk of bias, evaluated select populations......, and did not treat hyperthermia in the control groups. The optimal target temperature management (TTM) strategy is not known. To prevent outcome reporting bias, selective reporting and data-driven results, we present the a priori defined detailed statistical analysis plan as an update to the previously...

  11. A Model of Statistics Performance Based on Achievement Goal Theory.

    Science.gov (United States)

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  12. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Science.gov (United States)

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contains several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real data-sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
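    The connection between the conforming rate and the lifetime performance index described above can be made concrete for the one-parameter exponential model. The sketch below assumes an exponential lifetime with mean mu, for which Montgomery's index C_L = (mu - L)/sigma reduces to C_L = 1 - L/mu (since sigma = mu), giving the one-to-one link eta_L = exp(C_L - 1); the numbers are illustrative, not taken from the paper's data-sets.

```python
import math

def conforming_rate(mu, L):
    # eta_L = P(X >= L) for an exponential lifetime with mean mu
    return math.exp(-L / mu)

def lifetime_performance_index(mu, L):
    # Montgomery's C_L = (mu - L) / sigma; for the exponential, sigma = mu,
    # so C_L = 1 - L / mu (special case assumed in this sketch)
    return 1.0 - L / mu

mu, L = 2000.0, 500.0          # hypothetical mean lifetime and lower limit
eta = conforming_rate(mu, L)
CL = lifetime_performance_index(mu, L)

# The two quantities are linked one-to-one: eta_L = exp(C_L - 1)
assert abs(eta - math.exp(CL - 1.0)) < 1e-12
print(round(CL, 4), round(eta, 4))   # -> 0.75 0.7788
```

Because the map C_L -> eta_L is monotone, inference about one quantity translates directly into inference about the other, which is the equivalence the abstract refers to.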

  13. Performing Inferential Statistics Prior to Data Collection

    Science.gov (United States)

    Trafimow, David; MacDonald, Justin A.

    2017-01-01

    Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…

  14. THESEE-3, Orgel Reactor Performance and Statistic Hot Channel Factors

    International Nuclear Information System (INIS)

    Chambaud, B.

    1974-01-01

    1 - Nature of physical problem solved: The code applies to a heavy-water moderated organic-cooled reactor channel. Different fuel cluster models can be used (circular or hexagonal patterns). The code gives coolant temperatures and velocities and cladding temperatures throughout the channel and also channel performances, such as power, outlet temperature, boiling and burn-out safety margins (see THESEE-1). In a further step, calculations are performed with statistical values obtained by random retrieval of geometrical input data and taking into account construction tolerances, vibrations, etc. The code evaluates the mean value and standard deviation for the most important thermal and hydraulic parameters. 2 - Method of solution: First step calculations are performed for nominal values of parameters by solving iteratively the non-linear system of equations which give the pressure drops in subchannels of the current zone (see THESEE-1). Then a Gaussian probability distribution of possible statistical values of the geometrical input data is assumed. A random number generation routine determines the statistical case. Calculations are performed in the same way as for the nominal case. In the case of several channels, statistical performances must be adjusted to equalize the normal pressure drop. A special subroutine (AVERAGE) then determines the mean value and standard deviation, and thus probability functions of the most significant thermal and hydraulic results. 3 - Restrictions on the complexity of the problem: Maximum 7 fuel clusters, each divided into 10 axial zones. Fuel bundle geometries are restricted to the following models - circular pattern 6/7, 18/19, 36/67 rods, with or without fillers. The fuel temperature distribution is not studied. The probability distribution of the statistical input is assumed to be a Gaussian function. The principle of random retrieval of statistical values is correct, but some additional correlations could be found from a more
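    The statistical step described above, randomly retrieving geometrical input data under Gaussian tolerances and then computing the mean and standard deviation of the resulting thermal-hydraulic quantities, is essentially Monte Carlo tolerance propagation. A minimal sketch follows; the pressure-drop correlation and all numbers are invented for illustration and are not THESEE-3's models.

```python
import random
import statistics

def pressure_drop(diameter_mm):
    # Toy surrogate for a channel pressure-drop correlation (illustrative only):
    # for fixed flow in a smooth pipe, Delta-p scales roughly with D**-5.
    return 1.0e6 / diameter_mm ** 5

random.seed(42)
nominal_d, tolerance_sigma = 10.0, 0.05   # mm; hypothetical construction tolerance

# "Random retrieval" of the geometrical input, then re-evaluation of the result
samples = [pressure_drop(random.gauss(nominal_d, tolerance_sigma))
           for _ in range(10_000)]

mean_dp = statistics.mean(samples)
sigma_dp = statistics.stdev(samples)
print(f"mean={mean_dp:.3f}  sigma={sigma_dp:.3f}  nominal={pressure_drop(nominal_d):.3f}")
```

The standard deviation of the output quantifies how manufacturing tolerances propagate into the safety margins, which is exactly what the AVERAGE subroutine summarizes for the hot-channel factors.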

  15. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization

  16. Improved custom statistics visualization for CA Performance Center data

    CERN Document Server

    Talevi, Iacopo

    2017-01-01

    The main goal of my project is to understand and experiment with the possibilities that CA Performance Center (CA PC) offers for creating custom applications to display stored information through interesting visual means, such as maps. In particular, I have re-written some of the network statistics web pages in order to fetch data from new statistics modules in CA PC, which has its own API, and to stop using the RRD data.

  17. Development and Performance Evaluation of Image-Based Robotic Waxing System for Detailing Automobiles.

    Science.gov (United States)

    Lin, Chi-Ying; Hsu, Bing-Cheng

    2018-05-14

    Waxing is an important aspect of automobile detailing, aimed at protecting the finish of the car and preventing rust. At present, this delicate work is conducted manually due to the need for iterative adjustments to achieve acceptable quality. This paper presents a robotic waxing system in which surface images are used to evaluate the quality of the finish. An RGB-D camera is used to build a point cloud that details the sheet metal components to enable path planning for a robot manipulator. The robot is equipped with a multi-axis force sensor to measure and control the forces involved in the application and buffing of wax. Images of sheet metal components that were waxed by experienced car detailers were analyzed using image processing algorithms. A Gaussian distribution function and its parameterized values were obtained from the images for use as a performance criterion in evaluating the quality of surfaces prepared by the robotic waxing system. Waxing force and dwell time were optimized using a mathematical model based on the image-based criterion used to measure waxing performance. Experimental results demonstrate the feasibility of the proposed robotic waxing system and image-based performance evaluation scheme.

  18. Rapid prototyping in order to improve building performance simulation for detailed design support

    NARCIS (Netherlands)

    Hopfe, C.J.; Hensen, J.L.M.; Stankov, P.

    2006-01-01

    Building performance simulation (BPS) is a powerful tool to support building and system designers in emulating how orientation, building type, HVAC system etc. interacts the overall building performance. Currently BPS is used only for code compliance in the detailed design, neither to make informed

  19. Statistical and Machine Learning Models to Predict Programming Performance

    OpenAIRE

    Bergin, Susan

    2006-01-01

    This thesis details a longitudinal study on factors that influence introductory programming success and on the development of machine learning models to predict incoming student performance. Although numerous studies have developed models to predict programming success, the models struggled to achieve high accuracy in predicting the likely performance of incoming students. Our approach overcomes this by providing a machine learning technique, using a set of three significant...

  20. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  1. Federal Funds for Research and Development: Fiscal Years 1980, 1981, and 1982. Volume XXX. Detailed Statistical Tables. Surveys of Science Resources Series.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…

  2. The ‘39 steps’: an algorithm for performing statistical analysis of data on energy intake and expenditure

    Directory of Open Access Journals (Sweden)

    John R. Speakman

    2013-03-01

    The epidemics of obesity and diabetes have aroused great interest in the analysis of energy balance, with the use of organisms ranging from nematode worms to humans. Although generating energy-intake or -expenditure data is relatively straightforward, the most appropriate way to analyse the data has been an issue of contention for many decades. In the last few years, a consensus has been reached regarding the best methods for analysing such data. To facilitate using these best-practice methods, we present here an algorithm that provides a step-by-step guide for analysing energy-intake or -expenditure data. The algorithm can be used to analyse data from either humans or experimental animals, such as small mammals or invertebrates. It can be used in combination with any commercial statistics package; however, to assist with analysis, we have included detailed instructions for performing each step for three popular statistics packages (SPSS, MINITAB and R). We also provide interpretations of the results obtained at each step. We hope that this algorithm will assist in the statistically appropriate analysis of such data, a field in which there has been much confusion and some controversy.
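    One widely recommended element of such analyses is to treat body mass as a covariate in a regression (ANCOVA-style) rather than dividing intake by mass. The sketch below is a plain-Python illustration of that idea, not the paper's algorithm; the data, group labels and coefficients are invented.

```python
def ols(X, y):
    # Solve the normal equations (X'X) b = X'y by Gauss-Jordan elimination.
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))   # partial pivoting
        A[c], A[p] = A[p], A[c]
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Hypothetical energy-intake data: two groups, body mass as covariate
mass_a = [20.0, 22.0, 24.0, 26.0, 28.0]
mass_b = [21.0, 23.0, 25.0, 27.0, 29.0]
intake_a = [0.5 * m + 3.0 for m in mass_a]   # group A baseline
intake_b = [0.5 * m + 4.0 for m in mass_b]   # group B eats ~1 unit more at equal mass

# Design matrix: intercept, group dummy (0 = A, 1 = B), body mass
X = [[1.0, 0.0, m] for m in mass_a] + [[1.0, 1.0, m] for m in mass_b]
y = intake_a + intake_b
b0, group_effect, mass_slope = ols(X, y)
print(round(group_effect, 3), round(mass_slope, 3))   # -> 1.0 0.5
```

The group coefficient is the intake difference at equal body mass, which is the quantity of interest; a simple per-gram ratio would conflate it with the mass difference between groups.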

  3. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  4. Operation statistics of KEKB

    International Nuclear Information System (INIS)

    Kawasumi, Takeshi; Funakoshi, Yoshihiro

    2008-01-01

    The KEKB accelerator has been operated since December 1998. We achieved the design peak luminosity of 10.00/nb/s. The present record is 17.12/nb/s. Detailed data on KEKB operation are important to evaluate KEKB performance and to suggest directions for performance enhancement. To estimate accelerator availability, we have classified all KEKB machine time into the following seven categories: (1) Physics Run, (2) Machine Study, (3) Machine Tuning, (4) Beam Tuning, (5) Trouble, (6) Maintenance, (7) Others. In this paper we report the operation statistics of the KEKB accelerator. (author)

  5. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
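    A divergence statistic of the kind described quantifies the discrepancy between an observed empirical distribution and a theoretical "ideal" one. The VTK engine itself is C++; as an illustrative stand-in, the sketch below computes the Kullback-Leibler divergence, one standard such measure, in Python.

```python
import math
from collections import Counter

def kl_divergence(observed_counts, ideal_probs):
    """D_KL(empirical || ideal) in nats; assumes ideal_probs covers every
    observed bin with nonzero probability."""
    n = sum(observed_counts.values())
    d = 0.0
    for bin_, count in observed_counts.items():
        p = count / n           # empirical probability of this bin
        q = ideal_probs[bin_]   # theoretical ("ideal") probability
        d += p * math.log(p / q)
    return d

# Observed die rolls versus the "ideal" fair-die distribution
observed = Counter({1: 18, 2: 15, 3: 17, 4: 16, 5: 20, 6: 14})
ideal = {face: 1 / 6 for face in range(1, 7)}
print(f"D_KL = {kl_divergence(observed, ideal):.5f} nats")
```

The divergence is zero exactly when the empirical and ideal distributions agree, and grows as they diverge, which is the "distance-like" behaviour the abstract refers to (KL is not a true metric: it is asymmetric).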

  6. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  7. Testing for detailed balance in a financial market

    Science.gov (United States)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory, the term equilibrium here is tied to the returns, rather than to the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set; S is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
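    The detailed balance condition tested here states that, for a Markov chain with transition matrix P and stationary distribution pi, pi_i P_ij = pi_j P_ji for every pair of states. A minimal numerical check of that condition (illustrative only; not the paper's action-functional and simulated-annealing method) might look like:

```python
def stationary_distribution(P, iters=10_000):
    # Power iteration: propagate a uniform row vector through P until it settles.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def satisfies_detailed_balance(P, tol=1e-9):
    # Check pi_i * P[i][j] == pi_j * P[j][i] for all state pairs (i, j).
    pi = stationary_distribution(P)
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < tol
               for i in range(n) for j in range(n))

# A reversible birth-death-style chain (detailed balance holds) ...
P_rev = [[0.50, 0.50, 0.00],
         [0.25, 0.50, 0.25],
         [0.00, 0.50, 0.50]]
# ... versus a chain with a persistent cycle (it does not)
P_cyc = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [1.0, 0.0, 0.0]]
print(satisfies_detailed_balance(P_rev), satisfies_detailed_balance(P_cyc))
```

With empirical returns the transition probabilities must first be estimated from the data, and the statistical question becomes whether the observed violations of the condition exceed what sampling noise allows, which is what the paper's functional S is designed to measure.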

  8. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  9. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks.The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan

  10. The CEO performance effect : Statistical issues and a complex fit perspective

    NARCIS (Netherlands)

    Blettner, D.P.; Chaddad, F.R.; Bettis, R.

    2012-01-01

    How CEOs affect strategy and performance is important to strategic management research. We show that sophisticated statistical analysis alone is problematic for establishing the magnitude and causes of CEO impact on performance. We discuss three problem areas that substantially distort the

  11. High performance statistical computing with parallel R: applications to biology and climate modelling

    International Nuclear Information System (INIS)

    Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth

    2006-01-01

    Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable high performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines.
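    The core idea behind packages like Parallel R is that many statistics decompose into per-chunk summaries that can be computed independently on workers and then merged. The sketch below illustrates that decomposition in plain Python (not Parallel R itself), using the numerically stable pairwise mean/variance combination of Chan et al.; the chunking is run serially here for clarity.

```python
import statistics
from functools import reduce

def chunk_summary(xs):
    # Map step: per-chunk count, mean, and sum of squared deviations (M2)
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs)
    return n, mean, m2

def combine(a, b):
    # Reduce step: merge two chunk summaries without revisiting the raw data
    (na, ma, m2a), (nb, mb, m2b) = a, b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta ** 2 * na * nb / n
    return n, mean, m2

data = [float(i % 17) for i in range(1000)]
chunks = [data[i:i + 100] for i in range(0, 1000, 100)]  # pretend each runs on a worker
n, mean, m2 = reduce(combine, map(chunk_summary, chunks))

# The merged summaries reproduce the single-pass answer exactly
assert abs(mean - statistics.fmean(data)) < 1e-8
assert abs(m2 / (n - 1) - statistics.variance(data)) < 1e-8
print(n, round(mean, 4), round(m2 / (n - 1), 4))
```

Because `combine` is associative, the reduce step can be arranged as a tree across distributed workers, which is what makes such routines scale without the user parallelizing anything by hand.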

  12. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-sufficient, written in a lucid manner with the university examination system in mind. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually and thoroughly developed. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  13. Enabling Detailed Energy Analyses via the Technology Performance Exchange: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Studer, D.; Fleming, K.; Lee, E.; Livingood, W.

    2014-08-01

    One of the key tenets to increasing adoption of energy efficiency solutions in the built environment is improving confidence in energy performance. Current industry practices make extensive use of predictive modeling, often via the use of sophisticated hourly or sub-hourly energy simulation programs, to account for site-specific parameters (e.g., climate zone, hours of operation, and space type) and arrive at a performance estimate. While such methods are highly precise, they invariably provide less than ideal accuracy due to a lack of high-quality, foundational energy performance input data. The Technology Performance Exchange was constructed to allow the transparent sharing of foundational, product-specific energy performance data, and leverages significant, external engineering efforts and a modular architecture to efficiently identify and codify the minimum information necessary to accurately predict product energy performance. This strongly-typed database resource represents a novel solution to a difficult and established problem. One of the most exciting benefits is the way in which the Technology Performance Exchange's application programming interface has been leveraged to integrate contributed foundational data into the Building Component Library. Via a series of scripts, data is automatically translated and parsed into the Building Component Library in a format that is immediately usable to the energy modeling community. This paper (1) presents a high-level overview of the project drivers and the structure of the Technology Performance Exchange; (2) offers a detailed examination of how technologies are incorporated and translated into powerful energy modeling code snippets; and (3) examines several benefits of this robust workflow.

  14. Performance in College Chemistry: a Statistical Comparison Using Gender and Jungian Personality Type

    Science.gov (United States)

    Greene, Susan V.; Wheeler, Henry R.; Riley, Wayne D.

    This study sorted college introductory chemistry students by gender and Jungian personality type. It recognized differences from the general population distribution and statistically compared the students' grades with their Jungian personality types. Data from 577 female students indicated that ESFP (extroverted, sensory, feeling, perceiving) and ENFP (extroverted, intuitive, feeling, perceiving) profiles performed poorly at statistically significant levels when compared with the distribution of females enrolled in introductory chemistry. The comparable analysis using data from 422 male students indicated that the poorly performing male profiles were ISTP (introverted, sensory, thinking, perceiving) and ESTP (extroverted, sensory, thinking, perceiving). ESTJ (extroverted, sensory, thinking, judging) female students withdrew from the course at a statistically significant level. For both genders, INTJ (introverted, intuitive, thinking, judging) students were the best performers. By examining the documented characteristics of Jungian profiles that correspond with poorly performing students in chemistry, one may more effectively assist the learning process and the retention of these individuals in the fields of natural science, engineering, and technology.

  15. Kinetic energy budget details

    Indian Academy of Sciences (India)

    Abstract. This paper presents the detailed turbulent kinetic energy budget and higher order statistics of flow behind a surface-mounted rib with and without superimposed acoustic excitation. Pattern recognition technique is used to determine the large-scale structure magnitude. It is observed that most of the turbulence ...

  16. Author Details

    African Journals Online (AJOL)

    Ismail, A. Vol 9, No 3S (2017): Special Issue - Articles. Investigate of wave absorption performance for oil palm frond and empty fruit bunch at 5.8 GHz. Abstract PDF · Vol 9, No 3S (2017): Special Issue ...

  17. Illinois travel statistics, 2008

    Science.gov (United States)

    2009-01-01

    The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  18. Illinois travel statistics, 2009

    Science.gov (United States)

    2010-01-01

    The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  19. Illinois travel statistics, 2010

    Science.gov (United States)

    2011-01-01

    The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  20. Detailed Performance Study of ATLAS Endcap Muon Trigger with Beam Collision Data

    CERN Document Server

    Hayakawa, T

    2010-01-01

    In 2009 the first beam collisions occurred at the LHC, and ATLAS has been taking collision data at √s = 7 TeV since May 2010. This poster describes the measures taken to prepare the electronics of the Level-1 Endcap Muon Trigger system for beam collision data, and presents the results and a detailed study of the Level-1 Endcap Muon Trigger system performance with beam collisions.

  1. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
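    The predictive statistic described above, the mean magnitude of the AR poles, can be illustrated on a toy signal. The study fitted a 5th-order AR model to SEMG data; the sketch below fits a 2nd-order AR model via the Yule-Walker equations (so the poles come from a simple quadratic) to a synthetic quasi-periodic trace. The signal, model order and thresholds here are all illustrative, not the study's.

```python
import cmath
import math
import random

def autocorr(x, lag):
    # Biased sample autocorrelation at the given lag (guarantees a stable AR fit)
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def ar2_pole_magnitudes(signal):
    # Yule-Walker estimates for an AR(2) model x[t] = a1*x[t-1] + a2*x[t-2] + e[t]
    r1, r2 = autocorr(signal, 1), autocorr(signal, 2)
    a1 = r1 * (1 - r2) / (1 - r1 ** 2)
    a2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    # Poles are the roots of z**2 - a1*z - a2 = 0
    disc = cmath.sqrt(a1 ** 2 + 4 * a2)
    poles = [(a1 + disc) / 2, (a1 - disc) / 2]
    return [abs(p) for p in poles]

# Synthetic quasi-periodic "SEMG-like" trace (purely illustrative)
random.seed(0)
sig = [math.sin(0.3 * t) + 0.1 * random.gauss(0, 1) for t in range(2000)]
mags = ar2_pole_magnitudes(sig)
print([round(m, 3) for m in mags])
```

Tracking a summary of these magnitudes repetition by repetition is the kind of scalar feature that can then be regressed against the number of repetitions to failure, as in the study's predictive model.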

  2. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over recent years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Can We Use Polya’s Method to Improve Students’ Performance in the Statistics Classes?

    Directory of Open Access Journals (Sweden)

    Indika Wickramasinghe

    2015-01-01

    Full Text Available In this study, Polya’s problem-solving method was introduced in a statistics class in an effort to enhance students’ performance. The method was taught in one of two introductory-level statistics classes taught by the same instructor, and a comparison was made between the performances of the two classes. The results indicate a significant improvement in the students’ performance in the class in which Polya’s method was introduced.

  4. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
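The first of the three approaches, classical linear regression, can be sketched in a few lines. This is a hypothetical illustration with synthetic data (the variable names, scales and coefficients are assumptions, not the study's): operator performance is regressed on task load, message quality and working-memory capacity, with a load-by-WM interaction term capturing the modulation the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
task_load = rng.uniform(1, 5, n)      # e.g. number of UAVs supervised (hypothetical scale)
msg_quality = rng.uniform(0, 1, n)    # network message quality (hypothetical scale)
wm = rng.normal(0, 1, n)              # standardized working-memory capacity score

# Synthetic performance: load hurts, message quality helps, WM moderates the load effect
perf = 80 - 5 * task_load + 10 * msg_quality + 3 * wm + 2 * task_load * wm \
       + rng.normal(0, 2, n)

# Design matrix with an interaction term, reflecting the WM modulation
X = np.column_stack([np.ones(n), task_load, msg_quality, wm, task_load * wm])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(np.round(beta, 1))
```

The fitted coefficients recover the generating values, and the interaction coefficient is what distinguishes "WM modulates the task-load effect" from a purely additive model; the Gaussian-process and Bayesian-network approaches in the paper relax this linear form.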

  5. Business Statistics: A Comparison of Student Performance in Three Learning Modes

    Science.gov (United States)

    Simmons, Gerald R.

    2014-01-01

    The purpose of this study was to compare the performance of three teaching modes and age groups of business statistics sections in terms of course exam scores. The research questions were formulated to determine the performance of the students within each teaching mode, to compare each mode in terms of exam scores, and to compare exam scores by…

  6. A comparison of linear and nonlinear statistical techniques in performance attribution.

    Science.gov (United States)

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks, using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.

  7. Medicare and Medicaid Statistical Supplement

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics (OEDA) produced an annual Medicare and Medicaid Statistical Supplement report providing detailed statistical...

  8. Genetic relationships between detailed reproductive traits and performance traits in Holstein-Friesian dairy cattle.

    Science.gov (United States)

    Carthy, T R; Ryan, D P; Fitzgerald, A M; Evans, R D; Berry, D P

    2016-02-01

    The objective of the study was to estimate the genetic relationships between detailed reproductive traits derived from ultrasound examination of the reproductive tract and a range of performance traits in Holstein-Friesian dairy cows. The performance traits investigated included calving performance, milk production, somatic cell score (i.e., logarithm transformation of somatic cell count), carcass traits, and body-related linear type traits. Detailed reproductive traits included (1) resumed cyclicity at the time of examination, (2) multiple ovulations, (3) early ovulation, (4) heat detection, (5) ovarian cystic structures, (6) embryo loss, and (7) uterine score, measured on a 1 (little or no fluid with normal tone) to 4 (large quantity of fluid with a flaccid tone) scale, based on the tone of the uterine wall and the quantity of fluid present in the uterus. (Co)variance components were estimated using a repeatability animal linear mixed model. Genetic merit for greater milk, fat, and protein yield was associated with a reduced ability to resume cyclicity postpartum (genetic correlations ranged from -0.25 to -0.15). Higher genetic merit for milk yield was also associated with a greater genetic susceptibility to multiple ovulations. Genetic predisposition to elevated somatic cell score was associated with a decreased likelihood of cyclicity postpartum (genetic correlation of -0.32) and a greater risk of both multiple ovulations (genetic correlation of 0.25) and embryo loss (genetic correlation of 0.32). Greater body condition score was genetically associated with an increased likelihood of resumption of cyclicity postpartum (genetic correlation of 0.52). Genetically heavier, fatter carcasses with better conformation were also associated with an increased likelihood of resumed cyclicity by the time of examination (genetic correlations ranged from 0.24 to 0.41). Genetically heavier carcasses were associated with an inferior uterine score as well as a greater

  9. Handbook of Spatial Statistics

    CERN Document Server

    Gelfand, Alan E

    2010-01-01

    Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.

  10. Statistical Control Charts: Performances of Short Term Stock Trading in Croatia

    Directory of Open Access Journals (Sweden)

    Dumičić Ksenija

    2015-03-01

    Full Text Available Background: The stock exchange, as a regulated financial market, reflects the economic development level of modern economies. The stock market indicates the mood of investors in the development of a country and is an important ingredient for growth. Objectives: This paper aims to introduce an additional statistical tool to support the decision-making process in stock trading, and it investigates the use of statistical process control (SPC) methods in the stock trading process. Methods/Approach: The individual (I), exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts were used to generate trade signals. The open and average prices of CROBEX10 index stocks on the Zagreb Stock Exchange were used in the analysis. The capabilities of statistical control charts for short-run stock trading were analysed. Results: The statistical control chart analysis produced too many signals to buy or sell stocks, most of which are considered false alarms, so statistical control charts proved to be of limited use in stock trading or in portfolio analysis. Conclusions: The presence of non-normality and autocorrelation has a great impact on the performance of statistical control charts. It is assumed that if these two problems were solved, the use of statistical control charts in portfolio analysis could be greatly improved.
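An EWMA chart of the kind used in the paper can be sketched briefly (a hypothetical illustration, not the authors' code). Applied to an autocorrelated random-walk price series, the chart fires frequently because its in-control assumptions (independent, identically distributed observations) are violated, which is exactly the false-alarm behaviour the authors report.

```python
import numpy as np

def ewma_signals(prices, lam=0.2, L=3.0):
    """EWMA control chart: return indices where the statistic leaves the control limits."""
    x = np.asarray(prices, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)   # in-control estimates from the series itself
    half_width = L * sigma * np.sqrt(lam / (2 - lam))  # steady-state limit half-width
    z = mu
    out = []
    for i, xi in enumerate(x):
        z = lam * xi + (1 - lam) * z      # exponentially weighted moving average
        if abs(z - mu) > half_width:
            out.append(i)
    return out

rng = np.random.default_rng(2)
prices = 100 + np.cumsum(rng.normal(0, 0.5, 250))  # random-walk stand-in for daily prices
alarms = ewma_signals(prices)
print(len(alarms))
```

Because a random walk drifts arbitrarily far from its overall mean, a large share of observations end up outside the limits, so most "trade signals" are artifacts of autocorrelation rather than genuine regime changes.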

  11. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    Science.gov (United States)

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer signal-to-noise ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of the FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values than the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
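The ideal (Hotelling) observer SNR underlying this methodology satisfies SNR^2 = ds^T K^-1 ds, where ds is the expected signal difference (e.g. the disc's imprint on the detector) and K is the noise covariance. A toy numpy sketch follows; it is illustrative only, far from a full FFDM image model, and the two-pixel "signal" is an assumption made for brevity.

```python
import numpy as np

def hotelling_snr(delta_s, K):
    """Ideal (Hotelling) observer SNR: sqrt(ds^T K^-1 ds) for signal difference ds
    and noise covariance K."""
    delta_s = np.asarray(delta_s, dtype=float).ravel()
    return float(np.sqrt(delta_s @ np.linalg.solve(K, delta_s)))

# Toy two-pixel example; with uncorrelated noise this reduces to sqrt(sum(ds_i^2 / var_i))
ds = np.array([1.0, 2.0])
K = np.diag([1.0, 4.0])
snr = hotelling_snr(ds, K)
print(round(snr, 3))  # sqrt(1/1 + 4/4) = sqrt(2) ≈ 1.414
```

In the paper, ds and K would be derived from measured system response and noise power spectra for each disc diameter and thickness; the threshold gold thickness is then the thickness at which this SNR crosses a chosen detectability criterion.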

  12. Statistical analyses of the performance of Macedonian investment and pension funds

    Directory of Open Access Journals (Sweden)

    Petar Taleski

    2015-10-01

    Full Text Available The foundation of post-modern portfolio theory is creating a portfolio based on a desired target return. This specifically applies to the performance of investment and pension funds that must provide a rate of return meeting payment requirements. A desired target return is the goal of an investment or pension fund. It is the primary benchmark used to measure performance and to dynamically monitor and evaluate the risk-return ratio of investment funds. The analysis in this paper is based on monthly returns of Macedonian investment and pension funds (June 2011 - June 2014). The analysis utilizes basic but highly informative statistical characteristics and moments such as skewness, kurtosis, the Jarque–Bera test, and Chebyshev’s inequality. The objective of this study is to perform a thorough analysis, utilizing the above-mentioned and other statistical techniques (Sharpe, Sortino, omega, upside potential, Calmar, Sterling), to draw relevant conclusions regarding the risks and characteristic moments of Macedonian investment and pension funds. Pension funds are the second largest segment of the financial system and have great potential for further growth due to constant inflows from pension insurance. The importance of investment funds for the financial system of the Republic of Macedonia is still small, although open-end investment funds have been the fastest growing segment of the financial system. The statistical analysis shows that in the analyzed period pension funds delivered a significantly positive volatility-adjusted risk premium, more so than investment funds.
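The moment-based statistics the paper relies on can be computed directly from a return series. A minimal sketch (not the authors' code; the monthly return series here is synthetic, matching only the 36-month length of the June 2011 - June 2014 window):

```python
import numpy as np

def jarque_bera(returns):
    """Return sample skewness, kurtosis and the Jarque-Bera statistic
    JB = n/6 * (S^2 + (K - 3)^2 / 4), which is ~chi^2(2) under normality."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    m = r - r.mean()
    s2 = (m ** 2).mean()
    skew = (m ** 3).mean() / s2 ** 1.5
    kurt = (m ** 4).mean() / s2 ** 2
    jb = n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)
    return skew, kurt, jb

rng = np.random.default_rng(3)
monthly_returns = rng.normal(0.005, 0.02, 36)  # hypothetical 36 months of fund returns
s, k, jb = jarque_bera(monthly_returns)
print(round(s, 2), round(k, 2), round(jb, 2))
```

A JB value above the chi-squared(2) critical value (about 5.99 at the 5% level) would reject normality, which is why the paper supplements mean-variance measures with skewness- and tail-aware ratios such as Sortino and omega.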

  13. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  15. Teaching Statistics Online Using "Excel"

    Science.gov (United States)

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  16. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  17. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    Science.gov (United States)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-Compacting Concrete is one of the special concretes that has the ability to flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid fibers, improve concrete properties like tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability of concrete [6]. This review includes a fundamental study on fiber reinforced self-compacting concrete with admixtures: its rheological properties, its mechanical properties, and an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; a study on material properties; a review of self-compacting concrete; an overview of fiber reinforced self-compacting concrete containing admixtures; a review of design and analysis of experiments as a statistical approach; a summary of existing works on FRSCC and statistical modeling; and a literature review and conclusion. It is essential to know the recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber reinforced concrete and SCC in order to conduct effective research on fiber reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber reinforced concrete.

  18. Nursing students' attitudes toward statistics: Effect of a biostatistics course and association with examination performance.

    Science.gov (United States)

    Kiekkas, Panagiotis; Panagiotarou, Aliki; Malja, Alvaro; Tahirai, Daniela; Zykai, Rountina; Bakalis, Nick; Stefanopoulos, Nikolaos

    2015-12-01

    Although statistical knowledge and skills are necessary for promoting evidence-based practice, health sciences students have expressed anxiety about statistics courses, which may hinder their learning of statistical concepts. To evaluate the effects of a biostatistics course on nursing students' attitudes toward statistics and to explore the association between these attitudes and their performance in the course examination. One-group quasi-experimental pre-test/post-test design. Undergraduate nursing students of the fifth or higher semester of studies, who attended a biostatistics course. Participants were asked to complete the pre-test and post-test forms of The Survey of Attitudes Toward Statistics (SATS)-36 scale at the beginning and end of the course respectively. Pre-test and post-test scale scores were compared, while correlations between post-test scores and participants' examination performance were estimated. Among 156 participants, post-test scores of the overall SATS-36 scale and of the Affect, Cognitive Competence, Interest and Effort components were significantly higher than pre-test ones, indicating that the course was followed by more positive attitudes toward statistics. Among 104 students who participated in the examination, higher post-test scores of the overall SATS-36 scale and of the Affect, Difficulty, Interest and Effort components were significantly but weakly correlated with higher examination performance. Students' attitudes toward statistics can be improved through appropriate biostatistics courses, while positive attitudes contribute to higher course achievements and possibly to improved statistical skills in later professional life. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Contrast-detail analysis of three flat panel detectors for digital radiography

    International Nuclear Information System (INIS)

    Borasi, Giovanni; Samei, Ehsan; Bertolini, Marco; Nitrosi, Andrea; Tassoni, Davide

    2006-01-01

    In this paper we performed a contrast-detail analysis of three commercially available flat panel detectors, two based on the indirect detection mechanism (GE Revolution XQ/i, system A, and Trixell/Philips Pixium 4600, system B) and one based on the direct detection mechanism (Hologic DirectRay DR 1000, system C). The experiment was conducted using standard x-ray radiation quality and a widely used contrast-detail phantom. Images were evaluated using a four-alternative forced choice paradigm on a diagnostic-quality softcopy monitor. At low and intermediate exposures, systems A and B gave equivalent performance. At high dose levels, system A performed better than system B over the entire range of target sizes, even though the pixel size of system A was about 40% larger than that of system B. At all dose levels, the performance of system C (the direct system) was lower than that of systems A and B (the indirect systems). Theoretical analyses based on the Perception Statistical Model gave similar predicted SNR_T values, corresponding to an observer efficiency of about 0.08 for systems A and B and 0.05 for system C

  20. Statistical mechanics and Lorentz violation

    International Nuclear Information System (INIS)

    Colladay, Don; McDonald, Patrick

    2004-01-01

    The theory of statistical mechanics is studied in the presence of Lorentz-violating background fields. The analysis is performed using the Standard-Model Extension (SME) together with a Jaynesian formulation of statistical inference. Conventional laws of thermodynamics are obtained in the presence of a perturbed hamiltonian that contains the Lorentz-violating terms. As an example, properties of the nonrelativistic ideal gas are calculated in detail. To lowest order in Lorentz violation, the scalar thermodynamic variables are only corrected by a rotationally invariant combination of parameters that mimics a (frame dependent) effective mass. Spin-couplings can induce a temperature-independent polarization in the classical gas that is not present in the conventional case. Precision measurements in the residual expectation values of the magnetic moment of Fermi gases in the limit of high temperature may provide interesting limits on these parameters

  1. Electricity Statistics for France. Definitive results for the year 2015

    International Nuclear Information System (INIS)

    2016-01-01

    The mission of RTE, the French power transmission system operator, a public service assignment, is to balance the electricity supply and demand in real time. This report presents some detailed statistics on electricity flows in France, on electricity market mechanism and on facilities: consumption, generation, trade, RTE's network performance and evolution with respect to the previous year

  2. Electricity Statistics for France. Definitive results for the year 2013

    International Nuclear Information System (INIS)

    2014-01-01

    The mission of RTE, the French power transmission system operator, a public service assignment, is to balance the electricity supply and demand in real time. This report presents some detailed statistics on electricity flows in France, on electricity market mechanism and on facilities: consumption, generation, trade, RTE's network performance and evolution with respect to the previous year

  3. Exploring Statistics Anxiety: Contrasting Mathematical, Academic Performance and Trait Psychological Predictors

    Science.gov (United States)

    Bourne, Victoria J.

    2018-01-01

    Statistics anxiety is experienced by a large number of psychology students, and previous research has examined a range of potential correlates, including academic performance, mathematical ability and psychological predictors. These varying predictors are often considered separately, although there may be shared variance between them. In the…

  4. Statistical Analysis of EGFR Structures’ Performance in Virtual Screening

    Science.gov (United States)

    Li, Yan; Li, Xiang; Dong, Zigang

    2015-01-01

    In this work the ability of EGFR structures to distinguish true inhibitors from decoys in docking and MM-PBSA is assessed by statistical procedures. The docking performance depends critically on the receptor conformation and bound state. The enrichment of known inhibitors is well correlated with the difference between EGFR structures rather than the bound-ligand property. The optimal structures for virtual screening can be selected based purely on the complex information. And the mixed combination of distinct EGFR conformations is recommended for ensemble docking. In MM-PBSA, a variety of EGFR structures have identically good performance in the scoring and ranking of known inhibitors, indicating that the choice of the receptor structure has little effect on the screening. PMID:26476847

  5. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and an increasing number of projects, project failure is increasing; existing methods are limited and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful projects and three failed projects, are reviewed, with success and failure being defined by the owner.

  6. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might compare the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops improving his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to read the book closely, highlighter in hand. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  7. European downstream oil industry safety performance. Statistical summary of reported incidents 2009

    International Nuclear Information System (INIS)

    Burton, A.; Den Haan, K.H.

    2010-10-01

    The sixteenth such report by CONCAWE, this issue includes statistics on work-related personal injuries for the European downstream oil industry's own employees as well as contractors for the year 2009. Data were received from 33 companies representing more than 97% of the European refining capacity. Trends over the last sixteen years are highlighted and the data are also compared to similar statistics from related industries. In addition, this report presents the results of the first Process Safety Performance Indicator data gathering exercise amongst the CONCAWE membership.

  8. Statistical Diagnosis Method of Conductor Motions in Superconducting Magnets to Predict their Quench Performance

    CERN Document Server

    Khomenko, B A; Rijllart, A; Sanfilippo, S; Siemko, A

    2001-01-01

    Premature training quenches are usually caused by the transient energy released within the magnet coil as it is energised. Two distinct varieties of disturbance exist, thought to be electrical and mechanical in origin. The first type of disturbance comes from non-uniform current distribution in superconducting cables, whereas the second usually originates from conductor motions or micro-fractures of insulating materials under the action of Lorentz forces. All of these mechanical events produce, in general, a rapid variation of the voltages in the so-called quench antennas and across the magnet coil, called spikes. A statistical method to treat the spatial localisation and time occurrence of spikes is presented. It allows identification of the mechanical weak points in the magnet without the need to increase the current to provoke a quench. The prediction of the quench level from detailed analysis of the spike statistics can be expected.

  9. Probabilistic evaluation of fuel element performance by the combined use of a fast running simplistic and a detailed deterministic fuel performance code

    International Nuclear Information System (INIS)

    Misfeldt, I.

    1980-01-01

    A comprehensive evaluation of fuel element performance requires a probabilistic fuel code supported by a well-benchmarked deterministic code. This paper presents an analysis of a SGHWR ramp experiment, where the probabilistic fuel code FRP is utilized in combination with the deterministic fuel models FFRS and SLEUTH/SEER. The statistical methods employed in FRP are Monte Carlo simulation or a low-order Taylor approximation. The fast-running simplistic fuel code FFRS is used for the deterministic simulations, whereas simulations with SLEUTH/SEER are used to verify the predictions of FFRS. The ramp test was performed with a SGHWR fuel element, where 9 of the 36 fuel pins failed. There seemed to be good agreement between the deterministic simulations and the experiment, but the statistical evaluation shows that the uncertainty on the important performance parameters is too large for this "nice" result. The analysis does therefore indicate a discrepancy between the experiment and the deterministic code predictions. Possible explanations for this disagreement are discussed. (author)

  10. The Concise Encyclopedia of Statistics

    CERN Document Server

    Dodge, Yadolah

    2008-01-01

    The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,

  11. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  12. Detailed Performance Assessment for the ITER ECE Diagnostic

    Science.gov (United States)

    Rowan, W.; Austin, M.; Houshmandyar, S.; Phillips, P.; Beno, J.; Bryant, A.; Ouroua, A.; Weeks, D.; Hubbard, A.; Taylor, G.

    2017-10-01

    One of the primary diagnostics for electron temperature (Te) measurement on ITER is based on the detection of electron cyclotron emission (ECE). Here we describe the predicted performance of the newly completed ECE diagnostic design by quantitatively following the emission from the plasma to the instruments and including the calibration method to assess accuracy. Operation of the diagnostic at 5.3 T is the main interest here, but critical features of the emission spectra for 2.65 T and 1.8 T will be described. ECE will be collected by two very similar optical systems: one a radial view, the other an oblique view. Both measurements are used for Te while the oblique view also allows detection of non-thermal distortion in the electron distribution. An in-vacuum calibration source is included in the front end of each view to calibrate out the effect of any degradation of in-vessel optics. Following collection, the emission is split into orthogonal polarizations and transmitted to the detection instruments via waveguides filled with dry nitrogen, a choice that simplifies construction and analysis. Near the instruments, a switchyard is used to select which polarization and view is detected by each instrument. The design for the radiometer used for 5.3 T will be described in detail. Supported by PPPL/US-DA via subcontract S013464-H to UT Austin.

  13. Detailed noise statistics for an optically preamplified direct detection receiver

    DEFF Research Database (Denmark)

    Danielsen, Søren Lykke; Mikkelsen, Benny; Durhuus, Terji

    1995-01-01

    We describe the exact statistics of an optically preamplified direct detection receiver by means of the moment generating function. The theory allows an arbitrarily shaped electrical filter in the receiver circuit. The moment generating function (MGF) allows for a precise calculation of the error rate by using the inverse fast Fourier transform (FFT). The exact results are compared with the usual Gaussian approximation (GA), the saddlepoint approximation (SAP) and the modified Chernoff bound (MCB). This comparison shows that the noise is not Gaussian distributed for all values of the optical... and calculate the sensitivity degradation due to inter-symbol interference (ISI)...
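
    A toy version of the MGF-based tail machinery can illustrate the kind of comparison the abstract describes: the Chernoff-type bound P(X > a) <= min_s exp(-s*a) M(s) versus the exact tail probability, here for a hypothetical Gaussian decision variable where both are known in closed form (the receiver statistics in the paper are more involved and non-Gaussian):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical Gaussian decision variable; mu, sigma, threshold are
# illustrative numbers, not values from the paper.
mu, sigma, threshold = 0.0, 1.0, 4.0
mgf = lambda s: np.exp(mu * s + 0.5 * (sigma * s) ** 2)  # Gaussian MGF

# Chernoff-type bound: minimise exp(-s*a) * M(s) over s > 0 on a grid
s_grid = np.linspace(0.01, 10.0, 2000)
chernoff = np.min(np.exp(-s_grid * threshold) * mgf(s_grid))

# Exact tail probability for comparison
exact = norm.sf(threshold, loc=mu, scale=sigma)
print(f"exact tail = {exact:.3e}, Chernoff bound = {chernoff:.3e}")
```

    The bound necessarily overestimates the exact tail; quantifying that gap for the true (non-Gaussian) preamplified-receiver statistics is what the paper's MGF/inverse-FFT approach enables.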

  14. Evaluation of surface detail reproduction, dimensional stability and gypsum compatibility of monophase polyvinyl-siloxane and polyether elastomeric impression materials under dry and moist conditions.

    Science.gov (United States)

    Vadapalli, Sriharsha Babu; Atluri, Kaleswararao; Putcha, Madhu Sudhan; Kondreddi, Sirisha; Kumar, N Suman; Tadi, Durga Prasad

    2016-01-01

    This in vitro study was designed to compare polyvinyl-siloxane (PVS) monophase and polyether (PE) monophase materials under dry and moist conditions for properties such as surface detail reproduction, dimensional stability, and gypsum compatibility. Surface detail reproduction was evaluated using two criteria. Dimensional stability was evaluated according to American Dental Association (ADA) specification no. 19. Gypsum compatibility was assessed by two criteria. All the samples were evaluated, and the data obtained were analyzed by a two-way analysis of variance (ANOVA) and Pearson's Chi-square tests. When surface detail reproduction was evaluated with modification of ADA specification no. 19, both the groups under the two conditions showed no statistically significant difference. When evaluated macroscopically, both the groups showed a statistically significant difference. Results for dimensional stability showed that the deviation from standard was significant among the two groups, where the Aquasil group showed significantly more deviation compared to the Impregum group (P < 0.001). The two conditions also showed a significant difference, with the moist condition showing significantly more deviation compared to the dry condition (P < 0.001). The results of gypsum compatibility, when evaluated with modification of ADA specification no. 19 and by giving grades to the casts for both the groups and under the two conditions, showed no statistically significant difference. Regarding dimensional stability, both Impregum and Aquasil performed better in the dry condition than in the moist; Impregum performed better than Aquasil in both conditions. When tested for surface detail reproduction according to ADA specification, under dry and moist conditions both of them performed almost equally. When tested according to macroscopic evaluation, Impregum and Aquasil performed significantly better in the dry condition compared to the moist condition. In the dry condition, both the materials performed almost equally. In
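
    As a sketch of how the categorical comparisons above are typically run, here is Pearson's chi-square test on an invented contingency table of surface-detail grades; the counts are illustrative only, not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = material (PVS monophase, PE monophase),
# columns = surface-detail grade 1 / grade 2 / grade 3.
table = np.array([[12, 6, 2],
                  [10, 7, 3]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

    A large p-value on such a table would mirror the paper's "no statistically significant difference" finding for detail reproduction between the two materials.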

  15. Using the Expectancy Value Model of Motivation to Understand the Relationship between Student Attitudes and Achievement in Statistics

    Science.gov (United States)

    Hood, Michelle; Creed, Peter A.; Neumann, David L.

    2012-01-01

    We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…

  16. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  17. Introduction to applied Bayesian statistics and estimation for social scientists

    CERN Document Server

    Lynch, Scott M

    2007-01-01

    ""Introduction to Applied Bayesian Statistics and Estimation for Social Scientists"" covers the complete process of Bayesian statistical analysis in great detail from the development of a model through the process of making statistical inference. The key feature of this book is that it covers models that are most commonly used in social science research - including the linear regression model, generalized linear models, hierarchical models, and multivariate regression models - and it thoroughly develops each real-data example in painstaking detail.The first part of the book provides a detailed

  18. The Relationship between Test Anxiety and Academic Performance of Students in Vital Statistics Course

    Directory of Open Access Journals (Sweden)

    Shirin Iranfar

    2013-12-01

    Full Text Available Introduction: Test anxiety is a common phenomenon among students and is one of the problems of educational systems. The present study was conducted to investigate test anxiety in the vital statistics course and its association with academic performance of students at Kermanshah University of Medical Sciences. This study was descriptive-analytical, and the study sample included the students of the nursing and midwifery, paramedicine and health faculties who had taken the vital statistics course, selected through the census method. The Sarason questionnaire was used to measure test anxiety. Data were analyzed by descriptive and inferential statistics. The findings indicated no significant correlation between test anxiety and the score in the vital statistics course.

  19. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...

  20. Performance studies of GooFit on GPUs vs RooFit on CPUs while estimating the statistical significance of a new physical signal

    Science.gov (United States)

    Di Florio, Adriano

    2017-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
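
    The toy Monte Carlo ingredient of this comparison is framework-independent and can be sketched in plain Python under simplifying assumptions: a flat background, a Gaussian bump of known position and width standing in for the physics model, and the signal fraction fitted on a coarse grid (all numbers below are illustrative, not the CMS analysis):

```python
import numpy as np

rng = np.random.default_rng(42)

def q_statistic(data):
    """-2 ln(L_bkg / L_sig+bkg) for a toy model on [0, 1]: flat background
    versus flat background plus a Gaussian bump of known location and width,
    with the signal fraction f fitted on a coarse grid."""
    mu, sigma = 0.5, 0.05
    def nll(f):
        pdf = (1 - f) + f * np.exp(-0.5 * ((data - mu) / sigma) ** 2) \
                          / (sigma * np.sqrt(2 * np.pi))
        return -np.sum(np.log(pdf))
    fs = np.linspace(0.0, 0.5, 51)
    return 2.0 * (nll(0.0) - min(nll(f) for f in fs))

# "Observed" sample: flat background with a small injected signal
obs = np.concatenate([rng.uniform(0, 1, 950), rng.normal(0.5, 0.05, 50)])
q_obs = q_statistic(obs)

# Toy Monte Carlo distribution of q under the background-only hypothesis
q_toys = np.array([q_statistic(rng.uniform(0, 1, 1000)) for _ in range(200)])
p_value = np.mean(q_toys >= q_obs)
print(f"q_obs = {q_obs:.1f}, fraction of background-only toys above it: {p_value:.3f}")
```

    Estimating the p-value from the empirical toy distribution, rather than from the asymptotic chi-square of Wilks' theorem, is exactly what matters near a kinematical boundary where the theorem's regularity conditions fail; the GPU speed-up makes the required high-statistics toy ensembles affordable.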

  1. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  2. Performance of Generating Plant: Managing the Changes. Part 2: Thermal Generating Plant Unavailability Factors and Availability Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Curley, G. Michael [North American Electric Reliability Corporation (United States)]; Mandula, Jiri [International Atomic Energy Agency (IAEA)]

    2008-05-15

    The WEC Committee on the Performance of Generating Plant (PGP) has been collecting and analysing power plant performance statistics worldwide for more than 30 years and has produced regular reports, which include examples of advanced techniques and methods for improving power plant performance through benchmarking. A series of reports from the various working groups was issued in 2008. This reference presents the results of Working Group 2 (WG2). WG2's main task is to facilitate the collection and input on an annual basis of power plant performance data (unit-by-unit and aggregated data) into the WEC PGP database. The statistics will be collected for steam, nuclear, gas turbine and combined cycle, hydro and pumped storage plants. WG2 will also oversee the ongoing development of the availability statistics database, including the contents, the required software, security issues and other important information. The report is divided into two sections: Thermal generating, combined cycle/co-generation, combustion turbine, hydro and pumped storage unavailability factors and availability statistics; and nuclear power generating units.

  3. Magnetic resonance imaging of the wrist: Diagnostic performance statistics

    International Nuclear Information System (INIS)

    Hobby, Jonathan L.; Tom, Brian D.M.; Bearcroft, Philip W.P.; Dixon, Adrian K.

    2001-01-01

    AIM: To review the published diagnostic performance statistics for magnetic resonance imaging (MRI) of the wrist for tears of the triangular fibrocartilage complex, the intrinsic carpal ligaments, and for osteonecrosis of the carpal bones. MATERIALS AND METHODS: We used Medline and Embase to search the English language literature. Studies evaluating the diagnostic performance of MRI of the wrist in living patients with surgical confirmation of MR findings were identified. RESULTS: We identified 11 studies reporting the diagnostic performance of MRI for tears of the triangular fibrocartilage complex for a total of 410 patients, six studies for the scapho-lunate ligament (159 patients), six studies for the luno-triquetral ligament (142 patients) and four studies (56 patients) for osteonecrosis of the carpal bones. CONCLUSIONS: Magnetic resonance imaging is an accurate means of diagnosing tears of the triangular fibrocartilage and carpal osteonecrosis. Although MRI is highly specific for tears of the intrinsic carpal ligaments, its sensitivity is low. The diagnostic performance of MRI in the wrist is improved by using high-resolution T2* weighted 3D gradient echo sequences. Using current imaging techniques without intra-articular contrast medium, magnetic resonance imaging cannot reliably exclude tears of the intrinsic carpal ligaments.

  4. Statistics in the Computer Age

    DEFF Research Database (Denmark)

    Tjur, Tue

    2011-01-01

    It is a trivial observation that the computers have changed the way statistics is practiced. But has it also changed the theory of statistics and the way we teach it? I think yes—even if the changes appear to be surprisingly small in some contexts. This is an attempt to give a more detailed answer...

  5. Electrical Energy Statistics for France. Definitive results for the year 2012 - Synthesis

    International Nuclear Information System (INIS)

    2013-01-01

    The mission of RTE, the French electricity transmission grid operator, is a public service assignment: to balance the electricity supply and demand in real time. This report presents detailed statistics on electricity flows in France, on electricity market mechanisms and on facilities: consumption, generation, trade, RTE's network performance and its evolution with respect to the previous year.

  6. Examining the Performance of Statistical Downscaling Methods: Toward Matching Applications to Data Products

    Science.gov (United States)

    Dixon, K. W.; Lanzante, J. R.; Adams-Smith, D.

    2017-12-01

    Several challenges exist when seeking to use future climate model projections in a climate impacts study. A not uncommon approach is to utilize climate projection data sets derived from more than one future emissions scenario and from multiple global climate models (GCMs). The range of future climate responses represented in the set is sometimes taken to be indicative of levels of uncertainty in the projections. Yet, GCM outputs are deemed to be unsuitable for direct use in many climate impacts applications. GCM grids typically are viewed as being too coarse. Additionally, regional or local-scale biases in a GCM's simulation of the contemporary climate that may not be problematic from a global climate modeling perspective may be unacceptably large for a climate impacts application. Statistical downscaling (SD) of climate projections - a type of post-processing that uses observations to inform the refinement of GCM projections - is often used in an attempt to account for GCM biases and to provide additional spatial detail. "What downscaled climate projection is the best one to use?" is a frequently asked question, but one that is not always easy to answer, as it can be dependent on stakeholder needs and expectations. Here we present results from a perfect model experimental design illustrating how SD method performance can vary not only by SD method, but also by location, season, climate variable of interest, amount of projected climate change, SD configuration choices, and whether one is interested in central tendencies or the tails of the distribution. Awareness of these factors can be helpful when seeking to determine the suitability of downscaled climate projections for specific climate impacts applications. It also points to the potential value of considering more than one SD data product in a study, so as to acknowledge uncertainties associated with the strengths and weaknesses of different downscaling methods.

  7. Statistical performance evaluation of ECG transmission using wireless networks.

    Science.gov (United States)

    Shakhatreh, Walid; Gharaibeh, Khaled; Al-Zaben, Awad

    2013-07-01

    This paper presents simulation of the transmission of biomedical signals (using an ECG signal as an example) over wireless networks. The effects of channel impairments, including SNR, path-loss exponent and path delay, and of network impairments such as packet loss probability, on the diagnosability of the received ECG signal are investigated. The ECG signal is transmitted through a wireless network system composed of two communication protocols: an 802.15.4 (ZigBee) protocol and an 802.11b protocol. The performance of the transmission is evaluated using higher order statistics parameters such as kurtosis and negative entropy, in addition to common techniques such as the PRD, RMS and cross correlation.
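
    The higher-order statistics mentioned, kurtosis and negentropy, can be sketched as receiver-side quality metrics. The "ECG" below is a synthetic spiky pulse train, and the negentropy estimator is one common nonpolynomial approximation; neither is necessarily what the paper uses:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)
clean = np.sin(2 * np.pi * 1.2 * t) ** 64   # crude spiky, QRS-like pulse train

def quality_metrics(x):
    """Excess kurtosis plus a simple negentropy estimate
    J ~ (E[G(z)] - E[G(v)])^2 with G(u) = -exp(-u^2/2), v standard normal.
    Both are 0 for a Gaussian, so larger values mean a more structured signal."""
    z = (x - x.mean()) / x.std()
    g = lambda u: -np.exp(-u ** 2 / 2)
    v = rng.standard_normal(len(z))
    return kurtosis(z), (g(z).mean() - g(v).mean()) ** 2

results = {}
for noise in (0.05, 0.5, 2.0):                 # increasing channel noise level
    noisy = clean + noise * rng.standard_normal(len(t))
    results[noise] = quality_metrics(noisy)
    k, j = results[noise]
    print(f"noise x{noise}: excess kurtosis = {k:6.2f}, negentropy = {j:.4f}")
```

    As channel impairments push the received waveform toward Gaussian noise, both metrics fall toward zero, which is what makes them usable as diagnosability indicators.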

  8. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  9. Statistical analysis of the determinations of the Sun's Galactocentric distance

    Science.gov (United States)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, a number of studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of the data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
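
    The kind of consistency analysis described (weighted averaging of published estimates, a trend test against publication year, and a chi-square check) can be sketched as follows; the measurement compilation below is invented for illustration, not the paper's 53-point dataset:

```python
import numpy as np

# Invented stand-in for a compilation of published measurements:
# columns are (publication year, R0 estimate in kpc, quoted 1-sigma error).
data = np.array([
    (1995, 8.10, 0.40), (1998, 7.90, 0.30), (2001, 8.30, 0.35),
    (2004, 8.00, 0.25), (2007, 8.25, 0.20), (2010, 8.20, 0.15),
    (2013, 8.30, 0.10),
])
year, r0, err = data.T
w = 1.0 / err ** 2

# Weighted mean and its formal uncertainty
mean = np.sum(w * r0) / np.sum(w)
mean_err = 1.0 / np.sqrt(np.sum(w))

# Weighted least-squares trend with publication year
xw = np.sum(w * year) / np.sum(w)
dx = year - xw
slope = np.sum(w * dx * (r0 - mean)) / np.sum(w * dx ** 2)
slope_err = 1.0 / np.sqrt(np.sum(w * dx ** 2))

# Internal-consistency check: chi-square about the weighted mean
chi2_dof = np.sum(w * (r0 - mean) ** 2) / (len(r0) - 1)

print(f"weighted mean R0 = {mean:.2f} +/- {mean_err:.2f} kpc")
print(f"trend = {slope:+.4f} +/- {slope_err:.4f} kpc/yr (|t| = {abs(slope)/slope_err:.2f})")
print(f"chi2/dof about the mean = {chi2_dof:.2f}")
```

    A slope consistent with zero supports the "no bandwagon effect" conclusion, while chi-square per degree of freedom near unity indicates the quoted errors are realistic.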

  10. Introduction to Statistics for Biomedical Engineers

    CERN Document Server

    Ropella, Kristina

    2007-01-01

    There are many books written about statistics, some brief, some detailed, some humorous, some colorful, and some quite dry. Each of these texts is designed for a specific audience. Too often, texts about statistics have been rather theoretical and intimidating for those not practicing statistical analysis on a routine basis. Thus, many engineers and scientists, who need to use statistics much more frequently than calculus or differential equations, lack sufficient knowledge of the use of statistics. The audience that is addressed in this text is the university-level biomedical engineering stud

  11. Global health business: the production and performativity of statistics in Sierra Leone and Germany.

    Science.gov (United States)

    Erikson, Susan L

    2012-01-01

    The global push for health statistics and electronic digital health information systems is about more than tracking health incidence and prevalence. It is also experienced on the ground as means to develop and maintain particular norms of health business, knowledge, and decision- and profit-making that are not innocent. Statistics make possible audit and accountability logics that undergird the management of health at a distance and that are increasingly necessary to the business of health. Health statistics are inextricable from their social milieus, yet as business artifacts they operate as if they are freely formed, objectively originated, and accurate. This article explicates health statistics as cultural forms and shows how they have been produced and performed in two very different countries: Sierra Leone and Germany. In both familiar and surprising ways, this article shows how statistics and their pursuit organize and discipline human behavior, constitute subject positions, and reify existing relations of power.

  12. Einstein's statistical mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Baracca, A; Rechtman S, R

    1985-08-01

    The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject.

  13. Einstein's statistical mechanics

    International Nuclear Information System (INIS)

    Baracca, A.; Rechtman S, R.

    1985-01-01

    The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject. (author)

  14. A statistical approach to nuclear fuel design and performance

    Science.gov (United States)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences on the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10⁵ independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimension-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. 
In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance
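
    A minimal sketch of the propagation workflow (fit distributions to manufacturing data, push them through a performance model with Monte Carlo, compare the output distribution against an acceptance limit), assuming an invented linear response model; none of the coefficients, distributions, or the limit below are CANDU data:

```python
import numpy as np

rng = np.random.default_rng(7)

def performance_model(density, enrichment, clad_thickness):
    """Hypothetical scalar response (say, a peak temperature in K);
    coefficients are invented for illustration."""
    return (1200.0
            + 800.0 * (density - 10.6)
            + 150.0 * (enrichment - 0.71)
            - 900.0 * (clad_thickness - 0.42))

# Inputs sampled from distributions that would, in practice, be fitted
# to manufacturing datasets; parameters here are illustrative.
n = 100_000                                    # brute-force Monte Carlo size
density = rng.normal(10.6, 0.05, n)            # g/cm^3
enrich = rng.normal(0.71, 0.005, n)            # wt% fissile content
clad = rng.normal(0.42, 0.01, n)               # mm

out = performance_model(density, enrich, clad)
limit = 1400.0                                 # hypothetical acceptance criterion
print(f"mean = {out.mean():.1f}, std = {out.std():.1f}, "
      f"P(exceed limit) = {np.mean(out > limit):.2e}")
```

    An output distribution sitting many standard deviations below the limit is the quantitative form of the "margin exists in current fuel manufacturing and design" conclusion; the dimension-reduction methods in the abstract aim to reproduce the same output moments with far fewer model runs.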

  15. Certification of medical librarians, 1949--1977 statistical analysis.

    Science.gov (United States)

    Schmidt, D

    1979-01-01

    The Medical Library Association's Code for Training and Certification of Medical Librarians was in effect from 1949 to August 1977, a period during which 3,216 individuals were certified. Statistics on each type of certificate granted each year are provided. Because 54.5% of those granted certification were awarded it in the last three-year, two-month period of the code's existence, these applications are reviewed in greater detail. Statistics on MLA membership, sex, residence, library school, and method of meeting requirements are detailed. Questions relating to certification under the code now in existence are raised.

  16. Long-Term Propagation Statistics and Availability Performance Assessment for Simulated Terrestrial Hybrid FSO/RF System

    Directory of Open Access Journals (Sweden)

    Fiser Ondrej

    2011-01-01

    Full Text Available Long-term monthly and annual statistics of the attenuation of electromagnetic waves that have been obtained from 6 years of measurements on a free space optical path, 853 meters long, with a wavelength of 850 nm and on a precisely parallel radio path with a frequency of 58 GHz are presented. All the attenuation events observed are systematically classified according to the hydrometeor type causing the particular event. Monthly and yearly propagation statistics on the free space optical path and radio path are obtained. The influence of individual hydrometeors on attenuation is analysed. The obtained propagation statistics are compared to the calculated statistics using ITU-R models. The calculated attenuation statistics both at 850 nm and 58 GHz underestimate the measured statistics for higher attenuation levels. The availability performance of a simulated hybrid FSO/RF system is analysed based on the measured data.
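
    The yearly statistics reported above are, at heart, exceedance probabilities of a measured attenuation series. A minimal sketch of that computation, with synthetic lognormal samples standing in for the measured 850 nm or 58 GHz attenuation (the distribution and the fade margins are invented):

```python
import numpy as np

# Synthetic stand-in for a year of sampled path attenuation in dB;
# the lognormal shape is purely illustrative.
rng = np.random.default_rng(1)
atten_db = rng.lognormal(mean=0.0, sigma=1.0, size=500_000)

exceedance = {}
for margin in (3, 10, 20):                     # hypothetical fade margins, dB
    exceedance[margin] = np.mean(atten_db > margin)
    print(f"margin {margin:2d} dB: exceeded {100 * exceedance[margin]:.3f}% "
          f"of the time, availability {100 * (1 - exceedance[margin]):.3f}%")
```

    For a hybrid FSO/RF system, the joint availability is governed by how often both paths fade simultaneously, which is why classifying events by hydrometeor type (fog hits the optical path, rain hits 58 GHz) matters in the paper's analysis.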

  17. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistic and trading companies is the task of improving the operational business performance and developing the logistics support of sales. The development of logistics support of sales supposes development and implementation of a set of works for the development of the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and the types of technological zones, calculation of the required number of loading-unloading places, development of storage structures, development of pre-sales preparation zones, development of specifications of storage types, selection of loading-unloading equipment, detailed planning of the warehouse logistics system, creation of architectural-planning decisions, selection of information-processing equipment, etc. The ERP and WMS systems currently in use do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products, taking into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems is a pressing task. In this paper we suggest a system of statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The system is based on the methods of statistical data processing. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales. 
It also incorporates methods for the assessment and prediction of logistics performance, methods for determining and calculating the data required for the registration, storage and processing of metal products, as well as methods for planning the reconstruction and development

  18. Implementing academic detailing for breast cancer screening in underserved communities

    Directory of Open Access Journals (Sweden)

    Ashford Alfred R

    2007-12-01

    Full Text Available Abstract Background African American and Hispanic women, such as those living in the northern Manhattan and the South Bronx neighborhoods of New York City, are generally underserved with regard to breast cancer prevention and screening practices, even though they are more likely to die of breast cancer than are other women. Primary care physicians (PCPs) are critical for the recommendation of breast cancer screening to their patients. Academic detailing is a promising strategy for improving PCP performance in recommending breast cancer screening, yet little is known about the effects of academic detailing on breast cancer screening among physicians who practice in medically underserved areas. We assessed the effectiveness of an enhanced, multi-component academic detailing intervention in increasing recommendations for breast cancer screening within a sample of community-based urban physicians. Methods Two medically underserved communities were matched and randomized to intervention and control arms. Ninety-four primary care community (i.e., not hospital-based) physicians in northern Manhattan were compared to 74 physicians in the South Bronx neighborhoods of the New York City metropolitan area. Intervention participants received enhanced physician-directed academic detailing, using the American Cancer Society guidelines for the early detection of breast cancer. Control group physicians received no intervention. We conducted interviews to measure primary care physicians' self-reported recommendation of mammography and Clinical Breast Examination (CBE), and whether PCPs taught women how to perform breast self examination (BSE). Results Using multivariate analyses, we found a statistically significant intervention effect on the recommendation of CBE to women patients age 40 and over; mammography and breast self examination reports increased across both arms from baseline to follow-up, according to physician self-report. At post-test, physician

  19. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms

    International Nuclear Information System (INIS)

    Tang Jie; Nett, Brian E; Chen Guanghong

    2009-01-01

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit, including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution, were introduced to objectively evaluate reconstruction performance. The three algorithms are compared at a constant undersampling factor and at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.
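The relative root mean square error figure of merit mentioned above can be sketched as follows; the paper's quality factor additionally folds in noise performance and spatial resolution, so this covers only the rRMSE part, on a toy image rather than CT data:

```python
import numpy as np

def relative_rmse(recon, reference):
    """Relative root mean square error of a reconstruction against a
    gold-standard image (normalized by the reference RMS value)."""
    recon = np.asarray(recon, float)
    ref = np.asarray(reference, float)
    return np.sqrt(np.mean((recon - ref) ** 2)) / np.sqrt(np.mean(ref ** 2))

# Toy check: a reconstruction offset by a constant 10% of the signal RMS
ref = np.ones((64, 64))
rec = ref + 0.1
print(f"{relative_rmse(rec, ref):.3f}")  # 0.100
```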

  20. Telugu dependency parsing using different statistical parsers

    Directory of Open Access Journals (Sweden)

    B. Venkata Seshu Kumari

    2017-01-01

    Full Text Available In this paper we explore different statistical dependency parsers for parsing Telugu. We consider five popular dependency parsers, namely MaltParser, MSTParser, TurboParser, ZPar and Easy-First Parser. We experiment with different parser and feature settings and show their impact. We also provide a detailed analysis of the performance of all the parsers on major dependency labels. We report our results on the test data of the Telugu dependency treebank provided in the ICON 2010 tools contest on Indian languages dependency parsing. We obtain state-of-the-art performance of 91.8% in unlabeled attachment score and 70.0% in labeled attachment score. To the best of our knowledge, ours is the only work that has explored all five popular dependency parsers and compared their performance under different feature settings for Telugu.
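The unlabeled and labeled attachment scores (UAS/LAS) reported above can be computed with a small helper; the three-token example uses made-up heads and labels:

```python
def attachment_scores(gold, predicted):
    """Compute unlabeled (UAS) and labeled (LAS) attachment scores.
    Each token is a (head_index, dependency_label) pair."""
    assert len(gold) == len(predicted)
    n = len(gold)
    uas = sum(g[0] == p[0] for g, p in zip(gold, predicted)) / n
    las = sum(g == p for g, p in zip(gold, predicted)) / n
    return uas, las

# Made-up 3-token sentence: token 1 gets the right head but the wrong label
gold = [(2, "nmod"), (0, "root"), (2, "obj")]
pred = [(2, "amod"), (0, "root"), (2, "obj")]
uas, las = attachment_scores(gold, pred)
print(f"UAS={uas:.2f}, LAS={las:.2f}")  # UAS=1.00, LAS=0.67
```

UAS counts tokens whose head is correct; LAS additionally requires the dependency label to match, which is why LAS never exceeds UAS (91.8% vs 70.0% in the record above).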

  1. Multivariate statistical methods a first course

    CERN Document Server

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

  2. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  3. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  4. Research on the Hotel Image Based on the Detail Service

    Science.gov (United States)

    Li, Ban; Shenghua, Zheng; He, Yi

    Detail service management, initially developed as marketing programs to enhance customer loyalty, has now become an important part of customer relation strategy. This paper analyzes the critical factors of detail service and its influence on the hotel image. We establish a theoretical model of the factors influencing hotel image and propose corresponding hypotheses. We apply statistical methods to test and verify these hypotheses. This paper provides a foundation for further study of detail service design and planning issues.

  5. Straightforward statistics understanding the tools of research

    CERN Document Server

    Geher, Glenn

    2014-01-01

    Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied researc

  6. Fractional statistics and quantum theory

    CERN Document Server

    Khare, Avinash

    1997-01-01

    This book explains the subtleties of quantum statistical mechanics in lower dimensions and their possible ramifications in quantum theory. The discussion is at a pedagogical level and is addressed to both graduate students and advanced research workers with a reasonable background in quantum and statistical mechanics. The main emphasis will be on explaining new concepts. Topics in the first part of the book includes the flux tube model of anyons, the braid group and quantum and statistical mechanics of noninteracting anyon gas. The second part of the book provides a detailed discussion about f

  7. Transportation Statistics Annual Report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these

  8. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
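The phylogenetic likelihood computation that BEAGLE accelerates is, at its core, Felsenstein's pruning algorithm. A minimal pure-Python sketch for a two-leaf tree under the Jukes-Cantor (JC69) substitution model might look like the following; this is not BEAGLE's actual API (which is a C library), just the underlying recursion:

```python
import math

def jc69(t):
    """Jukes-Cantor (JC69) transition probability matrix for branch length t."""
    same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)
    diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)
    return [[same if i == j else diff for j in range(4)] for i in range(4)]

def leaf_partial(base):
    """Partial likelihood vector for an observed base at a leaf."""
    v = [0.0] * 4
    v["ACGT".index(base)] = 1.0
    return v

def prune(children):
    """One pruning step: for each parent state i, multiply over children
    the quantity sum_j P_ij(t) * L_child(j)."""
    out = [1.0] * 4
    for partial, branch_len in children:
        P = jc69(branch_len)
        for i in range(4):
            out[i] *= sum(P[i][j] * partial[j] for j in range(4))
    return out

# Two leaves both observing 'A', branch length 0.1 substitutions/site each
root = prune([(leaf_partial("A"), 0.1), (leaf_partial("A"), 0.1)])
site_likelihood = sum(0.25 * root[i] for i in range(4))  # uniform base freqs
print(f"{site_likelihood:.4f}")  # 0.2061
```

These per-site, per-node vector operations are exactly the fine-grained, data-parallel workload that maps well onto GPUs and SIMD units.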

  9. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  10. Statistical and Visualization Data Mining Tools for Foundry Production

    Directory of Open Access Journals (Sweden)

    M. Perzyk

    2007-07-01

    Full Text Available In recent years a rapid development of a new, interdisciplinary knowledge area called data mining has been observed. Its main task is extracting useful information from previously collected large amounts of data. The main possibilities and potential applications of data mining in the manufacturing industry are characterized. The main types of data mining techniques are briefly discussed, including statistical, artificial intelligence, database and visualization tools. The statistical and visualization methods are presented in more detail, showing their general possibilities, advantages, and characteristic examples of applications in foundry production. Results of the author’s research are presented, aimed at validation of selected statistical tools which can be easily and effectively used in the manufacturing industry. A performance analysis of ANOVA- and contingency-table-based methods, dedicated to determining the most significant process parameters and to detecting possible interactions among them, has been made. Several numerical tests have been performed using simulated data sets with assumed hidden relationships, as well as some real data related to the strength of ductile cast iron collected in a foundry. It is concluded that the statistical methods offer relatively easy and fairly reliable tools for extraction of that type of knowledge about foundry manufacturing processes. However, further research is needed, aimed at explaining some imperfections of the investigated tools and assessing their validity for more complex tasks.
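The two techniques the study validates can be sketched in plain Python: a one-way ANOVA F statistic for comparing a strength measurement across settings of one process parameter, and a Pearson chi-square for a parameter-vs-defect contingency table. All sample values below are synthetic, not the foundry data:

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def chi2_statistic(table):
    """Pearson chi-square statistic for a 2D contingency table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    return sum(
        (table[i][j] - row_tot[i] * col_tot[j] / total) ** 2
        / (row_tot[i] * col_tot[j] / total)
        for i in range(len(row_tot)) for j in range(len(col_tot))
    )

# Synthetic strength samples for three settings of one process parameter
groups = [[410, 415, 405, 420, 412],
          [432, 428, 435, 430, 426],
          [411, 409, 417, 414, 408]]
print(f"F = {one_way_anova_F(groups):.2f}")    # F = 28.66

# Synthetic parameter level (rows) vs. defect occurrence (columns)
table = [[30, 10], [12, 28]]
print(f"chi2 = {chi2_statistic(table):.2f}")   # chi2 = 16.24
```

With df = (2, 12) the 5% critical F value is about 3.89, and with df = 1 the 5% critical chi-square is 3.84, so both synthetic examples would indicate a significant parameter effect.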

  11. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1980-01-01

    This paper presents a method for performing a statistical steady state thermal analysis of a reactor core. The technique is only outlined here since detailed thermal equations are dependent on the core geometry. The method has been applied to a pressurised water reactor core and the results are presented for illustration purposes. Random hypothetical cores are generated using the Monte-Carlo method. The technique shows that by splitting the parameters into two types, denoted core-wise and in-core, the Monte Carlo method may be used inexpensively. The idea of using extremal statistics to characterise the low probability events (i.e. the tails of a distribution) is introduced together with a method of forming the final probability distribution. After establishing an acceptable probability of exceeding a thermal design criterion, the final probability distribution may be used to determine the corresponding thermal response value. If statistical and deterministic (i.e. conservative) thermal response values are compared, information on the degree of pessimism in the deterministic method of analysis may be inferred and the restrictive performance limitations imposed by this method relieved. (orig.)
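A stripped-down sketch of the Monte-Carlo step might look like the following, with a hypothetical linear thermal response standing in for the geometry-dependent thermal equations, and a plain empirical quantile standing in for the paper's extremal-statistics tail characterization:

```python
import random

def thermal_response(core_wise, in_core):
    """Hypothetical hot-channel temperature model (deg C): nominal value
    plus core-wise and in-core perturbations, standing in for the real
    geometry-dependent thermal equations."""
    return 600.0 + 25.0 * core_wise + 15.0 * in_core

def monte_carlo_limit(n_cores, accept_prob, seed=1):
    """Generate random hypothetical cores and return the thermal
    response value exceeded with probability 1 - accept_prob."""
    rng = random.Random(seed)
    responses = sorted(
        thermal_response(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0))
        for _ in range(n_cores)
    )
    return responses[int(accept_prob * (n_cores - 1))]

statistical = monte_carlo_limit(20000, 0.95)   # ~648 C at the 95th percentile
deterministic = thermal_response(2.0, 2.0)     # 680 C: 2-sigma worst-case stack-up
print(f"statistical={statistical:.1f}, deterministic={deterministic:.1f}")
```

Comparing the statistical limit with a deterministic stack-up (all parameters pushed to their conservative values simultaneously) illustrates the pessimism margin the paper refers to: the simultaneous worst case is far less likely than the target exceedance probability implies.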

  12. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference

  13. Effects of Concept Mapping Strategy on Learning Performance in Business and Economics Statistics

    Science.gov (United States)

    Chiou, Chei-Chang

    2009-01-01

    A concept map (CM) is a hierarchically arranged, graphic representation of the relationships among concepts. Concept mapping (CMING) is the process of constructing a CM. This paper examines whether a CMING strategy can be useful in helping students to improve their learning performance in a business and economics statistics course. A single…

  14. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    Science.gov (United States)

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample sizes. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and produce false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate when the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication.
We implemented the T2-statistic into an R package T2GA, which is available at https
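The idea of a Hotelling-style T2 score with an externally supplied covariance can be sketched as follows; the interaction-derived covariance and ratio values are hypothetical, and this is not the T2GA package's actual interface:

```python
import numpy as np

def t2_score(log_ratios, covariance):
    """T2-type multivariate score under a self-contained null (mean
    log-ratio vector equals zero). The covariance is supplied
    externally, e.g. built from interaction confidence scores, rather
    than estimated from the small sample itself."""
    x_bar = np.asarray(log_ratios, float).mean(axis=0)
    n = len(log_ratios)
    inv = np.linalg.inv(np.asarray(covariance, float))
    return float(n * x_bar @ inv @ x_bar)

# Hypothetical pathway of 3 proteins, 4 replicate log-ratio measurements
ratios = [[0.9, 1.1, 0.2],
          [1.0, 0.8, 0.1],
          [1.1, 1.2, 0.0],
          [0.8, 0.9, 0.3]]
# Hypothetical covariance derived from interaction confidence scores
cov = [[1.0, 0.6, 0.1],
       [0.6, 1.0, 0.1],
       [0.1, 0.1, 1.0]]
print(f"{t2_score(ratios, cov):.2f}")  # 4.77
```

Replacing the sample covariance with a knowledge-based one keeps the statistic well-defined even when the number of replicates is smaller than the number of proteins, which is the small-sample regime the abstract describes.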

  15. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    Science.gov (United States)

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  16. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    Full Text Available focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, in such a way that annual LFS data can be enlarged by detailed information from the AES at five-year intervals. This article describes the reliability of statistical data about non-formal education. This analysis is usually connected with sampling and non-sampling errors.

  17. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  18. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    Science.gov (United States)

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
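Fitting a Weibull-type saccharification curve y_max · (1 − exp(−(t/λ)^n)) to time-course yield data can be sketched with a nonlinear least-squares fit; the yields below are synthetic values generated from known parameters, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_yield(t, y_max, lam, n):
    """Cumulative saccharification yield as a Weibull curve:
    y_max * (1 - exp(-(t/lam)**n)); lam is the characteristic time."""
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

# Synthetic time-course (hours) generated from y_max=0.85, lambda=12, n=0.8
t = np.linspace(0.5, 72.0, 40)
rng = np.random.default_rng(42)
y = weibull_yield(t, 0.85, 12.0, 0.8) + rng.normal(0.0, 0.005, t.size)

popt, _ = curve_fit(weibull_yield, t, y, p0=[1.0, 10.0, 1.0])
y_max, lam, n = popt
print(f"y_max={y_max:.3f}, lambda={lam:.1f} h, n={n:.2f}")
```

Recovering λ from the fit gives the characteristic-time figure of merit the abstract proposes: a smaller λ means the system reaches a given fraction of its maximum yield sooner.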

  19. Using Microsoft Excel to Generate Usage Statistics

    Science.gov (United States)

    Spellman, Rosemary

    2011-01-01

    At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…

  20. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    Science.gov (United States)

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  1. Predicting energy performance of a net-zero energy building: A statistical approach

    International Nuclear Information System (INIS)

    Kneifel, Joshua; Webb, David

    2016-01-01

    Highlights: • A regression model is applied to actual energy data from a net-zero energy building. • The model is validated through a rigorous statistical analysis. • Comparisons are made between model predictions and those of a physics-based model. • The model is a viable baseline for evaluating future models from the energy data. - Abstract: Performance-based building requirements have become more prevalent because it gives freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid Climate Zone, and compares these
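A regression of this general shape can be sketched with ordinary least squares on two daily weather regressors; the variables and coefficients below are synthetic illustrations, not the NIST NZERTF model:

```python
import numpy as np

def fit_energy_model(temp, solar, energy):
    """Ordinary least squares for E = b0 + b1*temp + b2*solar
    (an illustrative regressor set, not the NIST model's exact form)."""
    X = np.column_stack([np.ones(len(temp)), temp, solar])
    beta, *_ = np.linalg.lstsq(X, energy, rcond=None)
    return beta

# Synthetic daily weather and energy-use data
rng = np.random.default_rng(7)
temp = rng.uniform(-5.0, 30.0, 200)    # daily mean temperature, deg C
solar = rng.uniform(0.0, 8.0, 200)     # daily solar radiation, kWh/m^2
energy = 12.0 + 0.6 * temp - 1.1 * solar + rng.normal(0.0, 0.5, 200)

b0, b1, b2 = fit_energy_model(temp, solar, energy)
print(f"intercept={b0:.2f}, temp={b1:.2f}, solar={b2:.2f}")
```

As the abstract notes, a fit like this only needs a couple of weather inputs rather than the hundreds of parameters a physics-based simulation requires, but it is only valid for weather similar to the training period.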

  2. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  3. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  4. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  5. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  6. Can quantum coherent solar cells break detailed balance?

    International Nuclear Information System (INIS)

    Kirk, Alexander P.

    2015-01-01

    Carefully engineered coherent quantum states have been proposed as a design attribute that is hypothesized to enable solar photovoltaic cells to break the detailed balance (or radiative) limit of power conversion efficiency by possibly causing radiative recombination to be suppressed. However, in full compliance with the principles of statistical mechanics and the laws of thermodynamics, specially prepared coherent quantum states do not allow a solar photovoltaic cell—a quantum threshold energy conversion device—to exceed the detailed balance limit of power conversion efficiency. At the condition given by steady-state open circuit operation with zero nonradiative recombination, the photon absorption rate (or carrier photogeneration rate) must balance the photon emission rate (or carrier radiative recombination rate) thus ensuring that detailed balance prevails. Quantum state transitions, entropy-generating hot carrier relaxation, and photon absorption and emission rate balancing are employed holistically and self-consistently along with calculations of current density, voltage, and power conversion efficiency to explain why detailed balance may not be violated in solar photovoltaic cells
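The open-circuit balance the abstract invokes (photon absorption rate equal to photon emission rate) leads to the standard detailed-balance relation for open-circuit voltage. A minimal sketch follows; the current densities are illustrative magnitudes for a roughly 1.1 eV absorber, not values from the paper.

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def open_circuit_voltage(j_sc, j0, temperature=300.0):
    """At steady-state open circuit with zero nonradiative recombination, the
    photogeneration current Jsc balances the radiative emission current,
    giving the diode relation Voc = (kT/q) * ln(Jsc/J0 + 1)."""
    v_thermal = K_B * temperature / Q_E
    return v_thermal * math.log(j_sc / j0 + 1.0)

# Illustrative magnitudes in A/m^2 (assumed, not from the paper):
voc = open_circuit_voltage(j_sc=400.0, j0=6e-13)
print(round(voc, 3))
```

No preparation of coherent quantum states changes this bookkeeping: suppressing the emission term while keeping the absorption term would be exactly the detailed-balance violation the paper rules out.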

  7. Using the expected detection delay to assess the performance of different multivariate statistical process monitoring methods for multiplicative and drift faults.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Peng, Kaixiang

    2017-03-01

    The expected detection delay (EDD) index was recently developed to measure the performance of multivariate statistical process monitoring (MSPM) methods for constant additive faults. This paper, based on a statistical investigation of the T²- and Q-test statistics, extends the EDD index to the multiplicative and drift fault cases. As well, it is used to assess the performance of common MSPM methods that adopt these two test statistics. Based on how they use the measurement space, these methods can be divided into two groups: those which consider the complete measurement space, for example, principal component analysis-based methods, and those which only consider some subspace that reflects changes in key performance indicators, such as partial least squares-based methods. Furthermore, a generic form in which they use the T²- and Q-test statistics is given. With the extended EDD index, the performance of these methods in detecting drift and multiplicative faults is assessed using both numerical simulations and the Tennessee Eastman process. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
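The EDD idea can be illustrated with a deliberately reduced sketch: a univariate alarm rule rather than the paper's multivariate T²/Q setting, with a simulated drift fault whose onset time, drift rate, and threshold are all assumed values. EDD is then just the Monte Carlo average of the delay between fault onset and the first alarm.

```python
import random

random.seed(1)  # fixed seed so the Monte Carlo estimate is reproducible

def detection_delay(threshold=3.0, drift=0.05, fault_start=50, horizon=2000):
    """Simulate one monitored signal: N(0, 1) noise before the fault, then a
    drift fault that grows the mean by `drift` per sample. Returns the number
    of samples between fault onset and the first alarm (|x| > threshold)."""
    for t in range(horizon):
        mean = drift * (t - fault_start) if t >= fault_start else 0.0
        x = random.gauss(mean, 1.0)
        if t >= fault_start and abs(x) > threshold:
            return t - fault_start
    return horizon - fault_start

# Expected detection delay (EDD): the average delay over many runs.
edd = sum(detection_delay() for _ in range(500)) / 500
print(round(edd, 1))
```

In the paper's setting the per-sample statistic would be T² or Q computed from a PCA or PLS model, but the delay-averaging logic is the same.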

  8. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when

  9. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. 
The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  10. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. 
The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  11. Statistical Review of Global LP Gas 2002

    International Nuclear Information System (INIS)

    2002-01-01

    This review provides essential production and consumption data from 1991 through 2001. A detailed breakdown of supply and sector demand is given for the year 2001, and historic data on international trade, shipping and pricing are also shown. Statistics pertaining to auto-gas are also included in this edition of the Statistical Review of Global LP Gas. (author)

  12. Statistical review of global LP gas 2001

    International Nuclear Information System (INIS)

    2001-01-01

    This review provides essential production and consumption data from 1990 through 2000. A more detailed breakdown of supply and sector demand is given for the year 2000 and historic data on international trade, shipping and pricing is also shown. Statistics pertaining to auto-gas are also included in this edition of Statistical Review of Global LP Gas 2001. (author)

  13. Resolution requirements for monitor viewing of digital flat-panel detector radiographs: a contrast detail analysis

    International Nuclear Information System (INIS)

    Peer, Siegfried; Giacomuzzi, Salvatore M.; Peer, Regina; Gassner, Eva; Steingruber, Iris; Jaschke, Werner

    2003-01-01

    With the introduction of digital flat-panel detector systems into clinical practice, the still unresolved question of resolution requirements for picture archiving and communication system (PACS) workstation monitors has gained new momentum. This contrast detail analysis was thus performed to define the differences in observer performance in the detection of small low-contrast objects on clinical 1K and 2K monitor workstations. Images of the CDRAD 2.0 phantom were acquired at varying exposures on an indirect-type digital flat-panel detector. Three observers evaluated a total of 15 images each with respect to the threshold contrast for each detail size. The numbers of correctly identified objects were determined for all image subsets. No significant difference in the correct detection ratio was detected among the observers; however, the difference between the two types of workstations (1K vs 2K monitors), although less than 3%, was significant at a 95% confidence level. Slight but statistically significant differences exist in the detection of low-contrast nodular details visualized on 1K- and 2K-monitor workstations. Further work is needed to see if this result holds true also for comparison of clinical flat-panel detector images and may, for example, exert an influence on the diagnostic accuracy of chest X-ray readings. (orig.)
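A small difference in correct-detection ratio between two monitor classes can reach significance when many detail cells are scored, which a two-proportion z-test makes concrete. The counts below are invented for illustration and are not the study's data.

```python
import math

def two_proportion_z(correct1, n1, correct2, n2):
    """Two-sided z-test for the difference between two detection proportions,
    using the pooled-proportion standard error."""
    p1, p2 = correct1 / n1, correct2 / n2
    pooled = (correct1 + correct2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the normal CDF, Phi(x) = 0.5*(1 + erf(x/sqrt(2)))
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value

# Hypothetical counts: a ~3% difference in correct-detection ratio.
z, p = two_proportion_z(1650, 2000, 1590, 2000)
print(round(z, 2), round(p, 4))
```

With 2000 scored cells per arm, even a 3% gap yields z above the 1.96 cutoff, mirroring the paper's "small but significant" finding.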

  14. Average wind statistics for SRP area meteorological towers

    International Nuclear Information System (INIS)

    Laurinat, J.E.

    1987-01-01

    A quality-assured set of average wind statistics for the seven SRP area meteorological towers has been calculated for the five-year period 1982--1986 at the request of DOE/SR. A similar set of statistics was previously compiled for the years 1975--1979. The updated wind statistics will replace the old statistics as the meteorological input for calculating atmospheric radionuclide doses from stack releases, and will be used in the annual environmental report. This report details the methods used to average the wind statistics and to screen out bad measurements, and presents wind roses generated by the averaged statistics.
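Averaging wind statistics has one well-known wrinkle: directions cannot be averaged arithmetically because of the 0/360 wrap, so observations are averaged as vector components. A minimal sketch (conventions for u/v and "direction from" vary between meteorological data sets; this uses a plain mathematical vector mean):

```python
import math

def mean_wind(speeds, directions_deg):
    """Vector-average wind observations: average the u/v components, then
    recover mean speed and direction. Arithmetic averaging of directions
    fails near the 0/360 wrap, which is why components are averaged."""
    n = len(speeds)
    u = sum(s * math.sin(math.radians(d)) for s, d in zip(speeds, directions_deg)) / n
    v = sum(s * math.cos(math.radians(d)) for s, d in zip(speeds, directions_deg)) / n
    return math.hypot(u, v), math.degrees(math.atan2(u, v)) % 360

# Two equal-speed readings straddling north: a naive arithmetic mean of the
# directions 350 and 20 gives 185 degrees; the vector mean is 5 degrees.
speed, direction = mean_wind([5.0, 5.0], [350.0, 20.0])
print(round(speed, 3), round(direction, 1))  # → 4.83 5.0
```

Note that the mean speed (4.83) is slightly below the scalar mean speed (5.0) because the two vectors partly cancel; both quantities are typically reported in wind-rose work.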

  15. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  16. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 Water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
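The PCA finding above (one component explaining a large share of the variance when parameters are strongly associated) can be sketched with a power-iteration estimate of the leading component. The three-column data below are synthetic stand-ins for correlated water-chemistry parameters, not the leachate measurements.

```python
def pca_first_component(data):
    """Estimate the leading principal component of `data` (rows = samples)
    by power iteration on the sample covariance matrix, and the fraction of
    total variance it explains."""
    n, k = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(k)]
    X = [[row[j] - means[j] for j in range(k)] for row in data]
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(k)] for a in range(k)]
    v = [1.0] * k
    for _ in range(200):  # power iteration converges to the top eigenvector
        w = [sum(C[a][b] * v[b] for b in range(k)) for a in range(k)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    top_eigenvalue = sum(v[a] * sum(C[a][b] * v[b] for b in range(k))
                         for a in range(k))
    return v, top_eigenvalue / sum(C[j][j] for j in range(k))

# Three nearly proportional columns, so PC1 should dominate.
data = [[1, 2.1, 0.9], [2, 4.2, 2.1], [3, 5.9, 2.9], [4, 8.1, 4.2], [5, 9.8, 5.1]]
vec, explained = pca_first_component(data)
print(round(explained, 3))
```

Here the explained fraction comes out near 1.0; in the study the first PC explained a smaller but still dominant share (>40%) because real water-quality parameters are only partially correlated.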

  17. LHCb: Statistical Comparison of CPU performance for LHCb applications on the Grid

    CERN Multimedia

    Graciani, R

    2009-01-01

    The usage of CPU resources by LHCb on the Grid is dominated by two different applications: Gauss and Brunel. Gauss is the application performing the Monte Carlo simulation of proton-proton collisions. Brunel is the application responsible for the reconstruction of the signals recorded by the detector, converting them into objects that can be used for later physics analysis of the data (tracks, clusters, …). Both applications are based on the Gaudi and LHCb software frameworks. Gauss uses Pythia and Geant as underlying libraries for the simulation of the collision and the later passage of the generated particles through the LHCb detector, while Brunel makes use of LHCb-specific code to process the data from each sub-detector. Both applications are CPU bound. Large Monte Carlo productions or data reconstructions running on the Grid are an ideal benchmark to compare the performance of the different CPU models for each case. Since the processed events are only statistically comparable, only statistical comparison of the...

  18. Usage statistics and demonstrator services

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    An understanding of the use of repositories and their contents is clearly desirable for authors and repository managers alike, as well as for those who are analysing the state of scholarly communications. A number of individual initiatives have produced statistics of various kinds for individual repositories, but the real challenge is to produce statistics that can be collected and compared transparently on a global scale. This presentation details the steps to be taken to address these issues and attain this capability.

  19. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
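Of the three techniques named above, control charting is the simplest to illustrate: limits derived from a stable baseline flag readings consistent with a failing thermocouple. The temperatures below are invented placeholders, not AGR data.

```python
from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Shewhart-style limits: baseline mean ± k standard deviations."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def out_of_control(readings, lo, hi):
    """Indices of readings falling outside the control limits."""
    return [i for i, x in enumerate(readings) if not lo <= x <= hi]

# Hypothetical thermocouple-style temperatures: a stable baseline,
# then a sensor that drifts low mid-stream.
baseline = [1000, 1003, 998, 1001, 999, 1002, 1000, 997, 1001, 999]
lo, hi = control_limits(baseline)
readings = [1001, 1000, 965, 960, 1002]
print(out_of_control(readings, lo, hi))  # → [2, 3]
```

The paper's correlation and regression analyses add the complementary checks: a reading can sit inside its own limits yet disagree with what the other thermocouples and the thermal code predict for it.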

  20. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  1. E-detailing: information technology applied to pharmaceutical detailing.

    Science.gov (United States)

    Montoya, Isaac D

    2008-11-01

    E-detailing can be best described as the use of information technology in the field of pharmaceutical detailing. It is becoming highly popular among pharmaceutical companies because it maximizes the time of the sales force, cuts down the cost of detailing and increases physician prescribing. Thus, the application of information technology is proving to be beneficial to both physicians and pharmaceutical companies. When e-detailing was introduced in 1996, it was limited to the US; however, numerous other countries soon adopted this novel approach to detailing and now it is popular in many developed nations. The objective of this paper is to demonstrate the rapid growth of e-detailing in the field of pharmaceutical marketing. A review of e-detailing literature was conducted in addition to personal conversations with physicians. E-detailing has the potential to reduce marketing costs, increase accessibility to physicians and offer many of the advantages of face-to-face detailing. E-detailing is gaining acceptance among physicians because they can access the information of a pharmaceutical product at their own time and convenience. However, the drug safety aspect of e-detailing has not been examined and e-detailing remains a supplement to traditional detailing and is not yet a replacement to it.

  2. Evaluation of outgassing, tear strength, and detail reproduction in alginate substitute materials.

    Science.gov (United States)

    Baxter, R T; Lawson, N C; Cakir, D; Beck, P; Ramp, L C; Burgess, J O

    2012-01-01

    To compare three alginate substitute materials to an alginate impression material for cast surface porosity (outgassing), tear strength, and detail reproduction. Detail reproduction tests were performed following American National Standards Institute/American Dental Association (ANSI/ADA) Specification No. 19. To measure tear strength, 12 samples of each material were made using a split mold, placed in a water bath until testing, and loaded in tension until failure at a rate of 500 mm/min using a universal testing machine. For cast surface porosity testing, five impressions of a Teflon mold with each material were placed in a water bath (37.8°C) for the in-mouth setting time and poured with vacuum-mixed Silky Rock die stone at 5, 10, 30, and 60 minutes from the start of mixing. The gypsum samples were analyzed with a digital microscope for surface porosity indicative of hydrogen gas release by comparing the surface obtained at each interval with four casts representing no, little, some, and significant porosity. Data analysis was performed using parametric and Kruskal-Wallis analysis of variance (ANOVA), Tukey/Kramer post-hoc tests (α=0.05), and individual Mann-Whitney U tests (α=0.0167). All alginate substitute materials passed the detail reproduction test. Tear strength of the alginate substitute materials was significantly better than that of alginate and formed three statistically different groups: AlgiNot had the lowest tear strength, Algin-X Ultra had the highest tear strength, and Position Penta Quick had intermediate tear strength. Significant variation in outgassing existed between materials and pouring times (p < 0.05). Alginate substitute materials exhibited the least outgassing and cast porosity 60 minutes after mixing. Detail reproduction and tear strength of alginate substitute materials were superior to traditional alginate. The outgassing effect was minimal for most materials tested. Alginate substitute materials are superior replacements for irreversible

  3. Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance

    Science.gov (United States)

    Whitley, Cameron T.; Dietz, Thomas

    2018-01-01

    Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…

  4. Understanding the structure of skill through a detailed analysis of individuals' performance on the Space Fortress game.

    Science.gov (United States)

    Towne, Tyler J; Boot, Walter R; Ericsson, K Anders

    2016-09-01

    In this paper we describe a novel approach to the study of individual differences in acquired skilled performance in complex laboratory tasks based on an extension of the methodology of the expert-performance approach (Ericsson & Smith, 1991) to shorter periods of training and practice. In contrast to more traditional approaches that study the average performance of groups of participants, we explored detailed behavioral changes for individual participants across their development on the Space Fortress game. We focused on dramatic individual differences in learning and skill acquisition at the individual level by analyzing the archival game data of several interesting players to uncover the specific structure of their acquired skill. Our analysis revealed that even after maximal values for game-generated subscores were reached, the most skilled participant's behaviors such as his flight path, missile firing, and mine handling continued to be refined and improved (Participant 17 from Boot et al., 2010). We contrasted this participant's behavior with the behavior of several other participants and found striking differences in the structure of their performance, which calls into question the appropriateness of averaging their data. For example, some participants engaged in different control strategies such as "world wrapping" or maintaining a finely-tuned circular flight path around the fortress (in contrast to Participant 17's angular flight path). In light of these differences, we raise fundamental questions about how skill acquisition for individual participants should be studied and described. Our data suggest that a detailed analysis of individuals' data is an essential step for generating a general theory of skill acquisition that explains improvement at the group and individual levels. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Statistical mechanics of cellular automata

    International Nuclear Information System (INIS)

    Wolfram, S.

    1983-01-01

    Cellular automata are used as simple mathematical models to investigate self-organization in statistical mechanics. A detailed analysis is given of ''elementary'' cellular automata consisting of a sequence of sites with values 0 or 1 on a line, with each site evolving deterministically in discrete time steps according to definite rules involving the values of its nearest neighbors. With simple initial configurations, the cellular automata either tend to homogeneous states, or generate self-similar patterns with fractal dimensions of approximately 1.59 or 1.69. With ''random'' initial configurations, the irreversible character of the cellular automaton evolution leads to several self-organization phenomena. Statistical properties of the structures generated are found to lie in two universality classes, independent of the details of the initial state or the cellular automaton rules. More complicated cellular automata are briefly considered, and connections with dynamical systems theory and the formal theory of computation are discussed.
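
    The ''elementary'' rule scheme described above fits in a few lines of code. The sketch below is an illustrative reconstruction (not Wolfram's own code), using rule 90, whose single-seed evolution produces the self-similar Sierpinski-like pattern with fractal dimension log 3 / log 2, approximately 1.59, mentioned in the abstract.

```python
# Elementary cellular automaton: a line of 0/1 sites, each updated in
# discrete time steps from its own value and its two nearest neighbours.
# The 8-bit rule number encodes the update table (Wolfram numbering).

def step(cells, rule):
    """Apply one synchronous update with periodic boundary conditions."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        # The 3-site neighbourhood selects one of the 8 bits of the rule.
        out.append((rule >> (left * 4 + centre * 2 + right)) & 1)
    return out

def run(rule, width=63, steps=31):
    """Evolve from a single-seed initial configuration; return all rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

if __name__ == "__main__":
    # Rule 90 (each site becomes the XOR of its neighbours) from one seed.
    for row in run(90, steps=8):
        print("".join(".#"[c] for c in row))
```

Printing more steps makes the self-similar triangle structure visible; random initial configurations instead of the single seed reproduce the irreversible coarsening behaviour the abstract describes.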

  6. Kansas's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  7. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  8. Performance demonstration tests for eddy current inspection of steam generator tubing

    International Nuclear Information System (INIS)

    Kurtz, R.J.; Heasler, P.G.; Anderson, C.M.

    1996-05-01

    This report describes the methodology and results for development of performance demonstration tests for eddy current (ET) inspection of steam generator tubes. Statistical test design principles were used to develop the performance demonstration tests. Thresholds on ET system inspection performance were selected to ensure that field inspection systems would have a high probability of detecting and correctly sizing tube degradation. The technical basis for the ET system performance thresholds is presented in detail. Statistical test design calculations for probability of detection and flaw sizing tests are described. A recommended performance demonstration test based on the design calculations is presented. A computer program for grading the probability of detection portion of the performance demonstration test is given.

  9. Performance demonstration tests for eddy current inspection of steam generator tubing

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, R.J.; Heasler, P.G.; Anderson, C.M.

    1996-05-01

    This report describes the methodology and results for development of performance demonstration tests for eddy current (ET) inspection of steam generator tubes. Statistical test design principles were used to develop the performance demonstration tests. Thresholds on ET system inspection performance were selected to ensure that field inspection systems would have a high probability of detecting and correctly sizing tube degradation. The technical basis for the ET system performance thresholds is presented in detail. Statistical test design calculations for probability of detection and flaw sizing tests are described. A recommended performance demonstration test based on the design calculations is presented. A computer program for grading the probability of detection portion of the performance demonstration test is given.
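
    The binomial test-design logic behind such probability-of-detection demonstrations can be sketched as follows. All numbers here (20 flawed tubes, POD targets of 0.9 and 0.6, 10% error rates) are illustrative assumptions, not values from the report.

```python
# Sketch of statistical test design for a detection demonstration: the test
# is specified by (n flawed specimens, pass threshold k), chosen so that a
# system with adequate POD passes with high probability while a poor system
# is likely to fail.
from math import comb

def binom_tail(n, k, p):
    """P[X >= k] for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def design_pass_threshold(n, pod_good, pod_bad, alpha=0.10):
    """Largest threshold k such that a good system (POD = pod_good) passes
    with probability >= 1 - alpha; also report the false-pass probability
    of a bad system (POD = pod_bad) under that threshold."""
    for k in range(n, -1, -1):
        if binom_tail(n, k, pod_good) >= 1 - alpha:
            return k, binom_tail(n, k, pod_bad)
    return 0, 1.0

k, false_pass = design_pass_threshold(n=20, pod_good=0.9, pod_bad=0.6)
print(k, round(false_pass, 3))
```

With these illustrative targets the design requires 16 of 20 detections to pass, and a system with true POD of only 0.6 would slip through about 5% of the time; real demonstration designs trade off specimen count against both error rates in the same way.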

  10. Statistical equilibrium equations for trace elements in stellar atmospheres

    OpenAIRE

    Kubat, Jiri

    2010-01-01

    The conditions of thermodynamic equilibrium, local thermodynamic equilibrium, and statistical equilibrium are discussed in detail. The equations of statistical equilibrium and the supplementary equations are shown together with the expressions for radiative and collisional rates, with emphasis on the solution for trace elements.

  11. Simple statistical methods for software engineering data and patterns

    CERN Document Server

    Pandian, C Ravindranath

    2015-01-01

    Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers.Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that prov...

  12. Changes in Math Prerequisites and Student Performance in Business Statistics: Do Math Prerequisites Really Matter?

    OpenAIRE

    Jeffrey J. Green; Courtenay C. Stone; Abera Zegeye; Thomas A. Charles

    2007-01-01

    We use a binary probit model to assess the impact of several changes in math prerequisites on student performance in an undergraduate business statistics course. While the initial prerequisites did not necessarily provide students with the necessary math skills, our study, the first to examine the effect of math prerequisite changes, shows that these changes were deleterious to student performance. Our results helped convince the College of Business to change the math prerequisite again begin...

  13. A laboratory evaluation of the influence of weighing gauges performance on extreme events statistics

    Science.gov (United States)

    Colli, Matteo; Lanza, Luca

    2014-05-01

    The effects of inaccurate ground-based rainfall measurements on the information derived from rain records are not yet well documented in the literature. La Barbera et al. (2002) investigated the propagation of the systematic mechanical errors of tipping bucket type rain gauges (TBR) into the most common statistics of rainfall extremes, e.g. in the assessment of the return period T (or the related non-exceedance probability) of short-duration/high-intensity events. Colli et al. (2012) and Lanza et al. (2012) extended the analysis to a 22-year long precipitation data set obtained from a virtual weighing type gauge (WG). The artificial WG time series was obtained from real precipitation data measured at the meteo-station of the University of Genova, modelling the weighing gauge output as a linear dynamic system. This approximation was previously validated with dedicated laboratory experiments and is based on the evidence that the accuracy of WG measurements under real-world/time-varying rainfall conditions is mainly affected by the dynamic response of the gauge (as revealed during the last WMO Field Intercomparison of Rainfall Intensity Gauges). The investigation is now completed by analyzing actual measurements performed by two common weighing gauges, the OTT Pluvio2 load-cell gauge and the GEONOR T-200 vibrating-wire gauge, since both instruments demonstrated very good performance in previous constant flow rate calibration efforts. A laboratory dynamic rainfall generation system has been arranged and validated in order to simulate a number of precipitation events with variable reference intensities. Such artificial events were generated from real-world rainfall intensity (RI) records obtained from the meteo-station of the University of Genova so that the statistical structure of the time series is preserved.
The influence of the WG RI measurements accuracy on the associated extreme events statistics is analyzed by comparing the original intensity

  14. A detailed statistical representation of the local structure of optical vortices in random wavefields

    International Nuclear Information System (INIS)

    Lindgren, Georg

    2012-01-01

    The statistical properties near phase singularities in a complex wavefield are here studied by means of the conditional distributions of the real and imaginary Gaussian components, given a common zero crossing point. The exact distribution is expressed as a Slepian model, where a regression term provides the main structure, with parameters given by the gradients of the Gaussian components at the singularity, and Gaussian non-stationary residuals that provide local variability. This technique differs from the linearization (Taylor expansion) technique commonly used. The empirically and theoretically verified elliptic eccentricity of the intensity contours in the vortex core is a property of the regression term, but with different normalization compared to the classical theory. The residual term models the statistical variability around these ellipses. The radii of the circular contours of the current magnitude are similarly modified by the new regression expansion and also here the random deviations are modeled by the residual field. (paper)

  15. Mini-Digest of Education Statistics, 2010. NCES 2011-016

    Science.gov (United States)

    Snyder, Thomas D.

    2011-01-01

    This pocket-sized compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mini-Digest" is designed as an easy reference for materials found in detail in the "Digest of Education Statistics". These volumes include selections of data from many…

  16. Mobile Digest of Education Statistics, 2013. NCES 2014-086

    Science.gov (United States)

    Snyder, Thomas D.

    2014-01-01

    This is the first edition of the "Mobile Digest of Education Statistics." This compact compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mobile Digest" is designed as an easy mobile reference for materials found in detail in the…

  17. Mini-Digest of Education Statistics, 2009. NCES 2010-014

    Science.gov (United States)

    Snyder, Thomas D.

    2010-01-01

    This compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mini-Digest" is designed as an easy reference for materials found in detail in the "Digest of Education Statistics, 2009". These volumes include selections of data from many…

  18. Line radiative transfer and statistical equilibrium

    NARCIS (Netherlands)

    Kamp, Inga

    Atomic and molecular line emission from protoplanetary disks contains key information of their detailed physical and chemical structures. To unravel those structures, we need to understand line radiative transfer in dusty media and the statistical equilibrium, especially of molecules. I describe...

  19. Advances in Statistical Methods for Substance Abuse Prevention Research

    Science.gov (United States)

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
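
    One of the advances listed above, routine reporting of effect size alongside hypothesis tests, can be illustrated with a stdlib-only sketch. The two score lists are invented example data, and `cohens_d` is a hypothetical helper name, not code from the paper.

```python
# Cohen's d: a standardised mean difference, the most common effect size
# for comparing a prevention (treatment) group against a control group.
from math import sqrt
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Mean difference divided by the pooled sample standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_sd = sqrt(((n1 - 1) * stdev(treatment) ** 2 +
                      (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2))
    return (mean(treatment) - mean(control)) / pooled_sd

# Invented outcome scores for two small groups.
treatment = [12, 15, 11, 14, 13, 16, 12, 15]
control   = [10, 11, 9, 12, 10, 13, 11, 10]
print(round(cohens_d(treatment, control), 2))
```

Unlike a p-value, the effect size is unaffected by sample size, which is why reporting it (and building confidence intervals around it) adds information that hypothesis testing alone does not convey.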

  20. Modeling bidding competitiveness and position performance in multi-attribute construction auctions

    Directory of Open Access Journals (Sweden)

    Pablo Ballesteros-Pérez

    2015-12-01

    This paper details a practical methodology based on simple statistical calculations for modeling the performance of a single bidder or a group of bidders, constituting a useful resource for analyzing one’s own success while benchmarking potential bidding competitors.

  1. Generic Containment: Detailed comparison of containment simulations performed on plant scale

    International Nuclear Information System (INIS)

    Kelm, St.; Klauck, M.; Beck, S.; Allelein, H.-J.; Preusser, G.; Sangiorgi, M.; Klein-Hessling, W.; Bakalov, I.; Bleyer, A.; Bentaib, A.; Kljenak, I.; Stempniewicz, M.; Kostka, P.; Morandi, S.; Ada del Corno, B.; Bratfisch, C.; Risken, T.; Denk, L.; Parduba, Z.; Paci, S.

    2014-01-01

    Highlights: • Consistent implementation of the recommendations derived from the OECD/NEA ISP-47. • Phenomenological code-to-code comparison performed on plant scale. • Systematic identification and elimination of the user effect. • Identification of fundamental differences in the model basis. • Application to PAR system analysis. - Abstract: One outcome of the OECD/NEA ISP-47 activity was the recommendation to elaborate a ‘Generic Containment’ in order to allow comparing and rating the results obtained by different lumped-parameter models on plant scale. Within the European SARNET2 project (http://www.sar-net.eu), such a Generic Containment nodalisation, based on a German PWR (1300 MWel), was defined. This agreement on the nodalisation allows investigating in detail the remaining differences among the results, especially the ‘user effect’ related to the modelling choices, as well as fundamental differences in the underlying model basis. The methodology applied in order to compare the different code predictions consisted of a series of three benchmark steps with increasing complexity as well as a systematic comparison of characteristic variables and observations. This paper summarises the benchmark series and the lessons learned during specifying the steps and comparing and discussing the results, and finally gives an outlook on future steps.

  2. Statistical Learning Theory: Models, Concepts, and Results

    OpenAIRE

    von Luxburg, Ulrike; Schoelkopf, Bernhard

    2008-01-01

    Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview over the key ideas and insights of statistical learning theory. We target at a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview on the field before diving into technical details.

  3. Details from the Dashboard: Charter Schools by Geographic Region

    Science.gov (United States)

    National Alliance for Public Charter Schools, 2012

    2012-01-01

    While a majority of charter schools nationwide operate in urban and suburban areas, charter schools exist in all corners of the nation, and are expanding into all types of communities. This "Details from the Dashboard" report presents statistics on the number of charter schools and students enrolled in charter schools by the four geographic…

  4. Statistical intervals a guide for practitioners

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    Presents a detailed exposition of statistical intervals and emphasizes applications in industry. The discussion differentiates at an elementary level among different kinds of statistical intervals and gives instruction with numerous examples and simple math on how to construct such intervals from sample data. This includes confidence intervals to contain a population percentile, confidence intervals on probability of meeting specified threshold value, and prediction intervals to include observation in a future sample. Also has an appendix containing computer subroutines for nonparametric stati...

  5. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  6. Theoretical physics 8 statistical physics

    CERN Document Server

    Nolting, Wolfgang

    2018-01-01

    This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...

  7. A statistical study of the performance of the Hakamada-Akasofu-Fry version 2 numerical model in predicting solar shock arrival times at Earth during different phases of solar cycle 23

    Energy Technology Data Exchange (ETDEWEB)

    McKenna-Lawlor, S.M.P. [National Univ. of Ireland, Maynooth, Co. Kildare (Ireland). Space Technology Ireland; Fry, C.D. [Exploration Physics International, Inc., Huntsville, AL (United States); Dryer, M. [Exploration Physics International, Inc., Huntsville, AL (United States); NOAA Space Environment Center, Boulder, CO (United States); Heynderickx, D. [D-H Consultancy, Leuven (Belgium); Kecskemety, K. [KFKI Research Institute for Particle and Nuclear Physics, Budapest (Hungary); Kudela, K. [Institute of Experimental Physics, Kosice (Slovakia); Balaz, J. [National Univ. of Ireland, Maynooth, Co. Kildare (Ireland). Space Technology Ireland; Institute of Experimental Physics, Kosice (Slovakia)

    2012-07-01

    The statistical results obtained through detailed analysis of the available data provided insights into how changing circumstances on the Sun and in interplanetary space can affect the performance of the model. Since shock arrival predictions are widely utilized in making commercially significant decisions regarding the protection of space assets, the present detailed archival studies can be useful in future operational decision making during solar cycle 24. It would be of added value in this context to use Briggs-Rupert methodology to estimate the cost to an operator of acting on an incorrect forecast. (orig.)

  8. A statistical study of the performance of the Hakamada-Akasofu-Fry version 2 numerical model in predicting solar shock arrival times at Earth during different phases of solar cycle 23

    Directory of Open Access Journals (Sweden)

    S. M. P. McKenna-Lawlor

    2012-02-01

    The performance of the Hakamada Akasofu-Fry, version 2 (HAFv.2 numerical model, which provides predictions of solar shock arrival times at Earth, was subjected to a statistical study to investigate those solar/interplanetary circumstances under which the model performed well/poorly during key phases (rise/maximum/decay of solar cycle 23. In addition to analyzing elements of the overall data set (584 selected events associated with particular cycle phases, subsets were formed such that those events making up a particular sub-set showed common characteristics. The statistical significance of the results obtained using the various sets/subsets was generally very low and these results were not significant as compared with the hit by chance rate (50%. This implies a low level of confidence in the predictions of the model with no compelling result encouraging its use. However, the data suggested that the success rates of HAFv.2 were higher when the background solar wind speed at the time of shock initiation was relatively fast. Thus, in scenarios where the background solar wind speed is elevated and the calculated success rate significantly exceeds the rate by chance, the forecasts could provide potential value to the customer. With the composite statistics available for solar cycle 23, the calculated success rate at high solar wind speed, although clearly above 50%, was indicative rather than conclusive. The RMS error estimated for shock arrival times for every cycle phase and for the composite sample was in each case significantly better than would be expected for a random data set. Also, the parameter "Probability of Detection, yes" (PODy which presents the Proportion of Yes observations that were correctly forecast (i.e. the ratio between the shocks correctly predicted and all the shocks observed, yielded values for the rise/maximum/decay phases of the cycle and using the composite sample of 0.85, 0.64, 0.79 and 0.77, respectively. 
The statistical
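
    The verification scores used in this study reduce to simple ratios over a 2x2 contingency table of forecasts versus observations. The sketch below uses counts invented purely for illustration, not the event tallies from the paper.

```python
# Forecast verification scores from a 2x2 contingency table:
#   hits          - shock observed and predicted
#   misses        - shock observed but not predicted
#   false_alarms  - shock predicted but not observed
#   correct_nulls - no shock predicted, none observed

def pod_yes(hits, misses):
    """'Probability of Detection, yes' (PODy): shocks correctly predicted
    divided by all shocks observed."""
    return hits / (hits + misses)

def success_rate(hits, correct_nulls, total):
    """Fraction of all forecasts (yes or no) that were correct, to be
    compared against the 50% hit-by-chance rate."""
    return (hits + correct_nulls) / total

# Invented illustrative counts for one cycle phase.
hits, misses, false_alarms, correct_nulls = 85, 15, 30, 70
total = hits + misses + false_alarms + correct_nulls
print(pod_yes(hits, misses), success_rate(hits, correct_nulls, total))
```

Note that PODy alone can be inflated by forecasting "yes" indiscriminately, which is why the study also weighs false alarms and compares the overall success rate against chance.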

  9. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources.

  10. Mini-Digest of Education Statistics, 2008. NCES 2009-021

    Science.gov (United States)

    Snyder, Thomas D.

    2009-01-01

    This publication is the 14th edition of the "Mini-Digest of Education Statistics," a pocket-sized compilation of statistical information covering the broad field of American education from kindergarten through graduate school. The "Mini-Digest" is designed as an easy reference for materials found in much greater detail in the…

  11. Data and statistical methods for analysis of trends and patterns

    International Nuclear Information System (INIS)

    Atwood, C.L.; Gentillon, C.D.; Wilson, G.E.

    1992-11-01

    This report summarizes topics considered at a working meeting on data and statistical methods for analysis of trends and patterns in US commercial nuclear power plants. This meeting was sponsored by the Office of Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC). Three data sets are briefly described: Nuclear Plant Reliability Data System (NPRDS), Licensee Event Report (LER) data, and Performance Indicator data. Two types of study are emphasized: screening studies, to see if any trends or patterns appear to be present; and detailed studies, which are more concerned with checking the analysis assumptions, modeling any patterns that are present, and searching for causes. A prescription is given for a screening study, and ideas are suggested for a detailed study, when the data take any of three forms: counts of events per time, counts of events per demand, and non-event data.

  12. Devil in the details? Developmental dyslexia and visual long-term memory for details.

    Science.gov (United States)

    Huestegge, Lynn; Rohrßen, Julia; van Ermingen-Marbach, Muna; Pape-Neumann, Julia; Heim, Stefan

    2014-01-01

    Cognitive theories on causes of developmental dyslexia can be divided into language-specific and general accounts. While the former assume that words are special in that associated processing problems are rooted in language-related cognition (e.g., phonology) deficits, the latter propose that dyslexia is rather rooted in a general impairment of cognitive (e.g., visual and/or auditory) processing streams. In the present study, we examined to what extent dyslexia (typically characterized by poor orthographic representations) may be associated with a general deficit in visual long-term memory (LTM) for details. We compared object- and detail-related visual LTM performance (and phonological skills) between dyslexic primary school children and IQ-, age-, and gender-matched controls. The results revealed that while the overall amount of LTM errors was comparable between groups, dyslexic children exhibited a greater portion of detail-related errors. The results suggest that not only phonological, but also general visual resolution deficits in LTM may play an important role in developmental dyslexia.

  13. Development of 4S and related technologies. (3) Statistical evaluation of safety performance of 4S on ULOF event

    International Nuclear Information System (INIS)

    Ishii, Kyoko; Matsumiya, Hisato; Horie, Hideki; Miyagi, Kazumi

    2009-01-01

    The purpose of this work is to evaluate quantitatively and statistically the safety performance of the Super-Safe, Small, and Simple reactor (4S) by analyzing it with the ARGO code, a plant dynamics code for a sodium-cooled fast reactor. In this evaluation, an Anticipated Transient Without Scram (ATWS) is assumed, and an Unprotected Loss of Flow (ULOF) event is selected as a typical ATWS case. After a metric concerned with safety design is defined as a performance factor, a Phenomena Identification Ranking Table (PIRT) is produced in order to select the plausible phenomena that affect the metric. A sensitivity analysis is then performed for the parameters related to the selected plausible phenomena. Finally, the metric is evaluated with statistical methods to determine whether it satisfies the given safety acceptance criteria. The result is as follows: the Cumulative Damage Fraction (CDF) for the cladding is defined as the metric, and the statistical estimate of the one-sided upper tolerance limit of 95 percent probability at a 95 percent confidence level on the CDF is within the safety acceptance criterion, CDF < 0.1. The result shows that the 4S safety performance is acceptable in the ULOF event. (author)
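
    A 95 percent probability / 95 percent confidence one-sided upper tolerance limit of the kind quoted above is commonly obtained from Wilks' order-statistic formula, under which the maximum of n random samples bounds the 95th percentile with 95% confidence once n >= 59. The abstract does not name the method, so treating it as Wilks' first-order approach is an assumption; the CDF samples below are synthetic stand-ins, not 4S/ARGO results.

```python
# Wilks' first-order one-sided tolerance limit: the probability that the
# sample maximum exceeds the p-quantile of the underlying distribution is
# 1 - p**n, so we need the smallest n with 1 - p**n >= conf.
import random

def wilks_first_order_n(p=0.95, conf=0.95):
    """Smallest sample size n for a (p, conf) one-sided tolerance limit."""
    n = 1
    while 1 - p**n < conf:
        n += 1
    return n

n = wilks_first_order_n()                 # classic 95/95 answer: 59
random.seed(0)
# Synthetic CDF results from n randomized code runs (invented values).
samples = [random.uniform(0.0, 0.08) for _ in range(n)]
upper_limit = max(samples)                # 95/95 one-sided upper limit
print(n, upper_limit < 0.1)               # check against criterion CDF < 0.1
```

The appeal of the method is that it is distribution-free: the limit holds regardless of how the CDF values are actually distributed, at the cost of requiring the full set of 59 (or more) code runs.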

  14. Statistical parameter characteristics of gas-phase fluctuations for gas-liquid intermittent flow

    Energy Technology Data Exchange (ETDEWEB)

    Matsui, G.; Monji, H.; Takaguchi, M. [Univ. of Tsukuba (Japan)

    1995-09-01

    This study deals with a theoretical analysis of the general behaviour of statistical parameters of gas-phase fluctuations, and with a comparison of the statistical parameter characteristics of real measured void fraction fluctuations against those of a wave form obtained by modifying the real fluctuations. In order to investigate in detail the relation between the behavior of the statistical parameters in real intermittent flow and analytical results obtained from information on the real flow, the distributions of statistical parameters for a general fundamental wave form of gas-phase fluctuations are discussed in detail. By modifying the real gas-phase fluctuations into a trapezoidal wave, the experimental results can be directly compared with the analytical results. The analytical results for intermittent flow show that the wave form parameter and the total amplitude of the void fraction fluctuations strongly affect the statistical parameter characteristics. The comparison with experiments using nitrogen gas-water intermittent flow suggests that the parameters of skewness and excess may be better indicators of flow pattern. That is, the macroscopic nature of intermittent flow can be grasped by the skewness and the excess, while the detailed flow structure may be described by the mean and the standard deviation.

  15. Statistical parameter characteristics of gas-phase fluctuations for gas-liquid intermittent flow

    International Nuclear Information System (INIS)

    Matsui, G.; Monji, H.; Takaguchi, M.

    1995-01-01

    This study deals with a theoretical analysis of the general behaviour of statistical parameters of gas-phase fluctuations, and with a comparison of the statistical parameter characteristics of real measured void fraction fluctuations against those of a wave form obtained by modifying the real fluctuations. In order to investigate in detail the relation between the behavior of the statistical parameters in real intermittent flow and analytical results obtained from information on the real flow, the distributions of statistical parameters for a general fundamental wave form of gas-phase fluctuations are discussed in detail. By modifying the real gas-phase fluctuations into a trapezoidal wave, the experimental results can be directly compared with the analytical results. The analytical results for intermittent flow show that the wave form parameter and the total amplitude of the void fraction fluctuations strongly affect the statistical parameter characteristics. The comparison with experiments using nitrogen gas-water intermittent flow suggests that the parameters of skewness and excess may be better indicators of flow pattern. That is, the macroscopic nature of intermittent flow can be grasped by the skewness and the excess, while the detailed flow structure may be described by the mean and the standard deviation.
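
    The moment statistics this record singles out, mean and standard deviation for the detailed structure, skewness and excess (kurtosis minus 3) as flow-pattern indicators, can be sketched with stdlib Python. The void-fraction trace below is a synthetic slug-flow-like signal invented for illustration, not the authors' nitrogen gas-water data.

```python
# First four moment statistics of a sampled signal, using population
# (divide-by-n) central moments.
from math import sqrt

def moments(x):
    """Return (mean, standard deviation, skewness, excess kurtosis)."""
    n = len(x)
    m = sum(x) / n
    def cm(k):  # k-th central moment
        return sum((v - m) ** k for v in x) / n
    var = cm(2)
    skew = cm(3) / var ** 1.5
    excess = cm(4) / var ** 2 - 3.0   # 0 for a Gaussian signal
    return m, sqrt(var), skew, excess

# Crude slug-flow-like void fraction trace: long high plateaus, short dips.
signal = [0.8] * 30 + [0.1] * 10 + [0.8] * 30 + [0.1] * 10
m, sd, skew, excess = moments(signal)
print(round(m, 3), round(sd, 3), round(skew, 2), round(excess, 2))
```

For this two-level trace the skewness is strongly negative and the excess is below zero, the kind of signature that distinguishes intermittent (slug/plug) flow from, say, a symmetric bubbly-flow fluctuation with near-zero skewness.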

  16. Results of Detailed Hydrologic Characterization Tests - Fiscal Year 2000

    International Nuclear Information System (INIS)

    Spane, Frank A; Thorne, Paul D; Newcomer, Darrell R

    2001-01-01

    This report provides the results of detailed hydrologic characterization tests conducted within eleven Hanford Site wells during fiscal year 2000. Detailed characterization tests performed included groundwater-flow characterization; barometric response evaluation; slug tests; single-well tracer tests; constant-rate pumping tests; and in-well, vertical flow tests. Hydraulic property estimates obtained from the detailed hydrologic tests include transmissivity; hydraulic conductivity; specific yield; effective porosity; in-well, lateral flow velocity; aquifer-flow velocity; vertical distribution of hydraulic conductivity (within the well-screen section); and in-well, vertical flow velocity. In addition, local groundwater-flow characteristics (i.e., hydraulic gradient and flow direction) were determined for four sites where detailed well testing was performed

  17. The yield of high-detail radiographic skeletal surveys in suspected infant abuse

    International Nuclear Information System (INIS)

    Barber, Ignasi; Perez-Rossello, Jeannette M.; Kleinman, Paul K.; Wilson, Celeste R.

    2015-01-01

    Skeletal surveys are routinely performed in cases of suspected child abuse, but there are limited data regarding the yield of high-detail skeletal surveys in infants. To determine the diagnostic yield of high-detail radiographic skeletal surveys in suspected infant abuse. We reviewed the high-detail American College of Radiology standardized skeletal surveys performed for suspected abuse in 567 infants (median: 4.4 months, SD 3.47; range: 4 days-12 months) at a large urban children's hospital between 2005 and 2013. Skeletal survey images, radiology reports and medical records were reviewed. A skeletal survey was considered positive when it showed at least one unsuspected fracture. In 313 of 567 infants (55%), 1,029 definite fractures were found. Twenty-one percent (119/567) of the patients had a positive skeletal survey with a total of 789 (77%) unsuspected fractures. Long-bone fractures were the most common injuries, present in 145 children (26%). The skull was the site of fracture in 138 infants (24%); rib cage in 77 (14%), clavicle in 24 (4.2%) and uncommon fractures (including spine, scapula, hands and feet and pelvis) were noted in 26 infants (4.6%). Of the 425 infants with neuroimaging, 154 (36%) had intracranial injury. No significant correlation between positive skeletal survey and associated intracranial injury was found. Scapular fractures and complex skull fractures showed a statistically significant correlation with intracranial injury (P = 0.029, P = 0.007, respectively). Previously unsuspected fractures are noted on skeletal surveys in 20% of cases of suspected infant abuse. These data may be helpful in the design and optimization of global skeletal imaging in this vulnerable population. (orig.)

  18. The yield of high-detail radiographic skeletal surveys in suspected infant abuse

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Ignasi [Hospital Vall d' Hebron, Universitat Autonoma de Barcelona, Pediatric Radiology Department, Barcelona (Spain); Perez-Rossello, Jeannette M.; Kleinman, Paul K. [Boston Children' s Hospital, Radiology Department, Boston, MA (United States); Wilson, Celeste R. [Boston Children' s Hospital, Division of General Pediatrics, Boston, MA (United States)

    2014-07-06

    Skeletal surveys are routinely performed in cases of suspected child abuse, but there are limited data regarding the yield of high-detail skeletal surveys in infants. To determine the diagnostic yield of high-detail radiographic skeletal surveys in suspected infant abuse. We reviewed the high-detail American College of Radiology standardized skeletal surveys performed for suspected abuse in 567 infants (median: 4.4 months, SD 3.47; range: 4 days-12 months) at a large urban children's hospital between 2005 and 2013. Skeletal survey images, radiology reports and medical records were reviewed. A skeletal survey was considered positive when it showed at least one unsuspected fracture. In 313 of 567 infants (55%), 1,029 definite fractures were found. Twenty-one percent (119/567) of the patients had a positive skeletal survey with a total of 789 (77%) unsuspected fractures. Long-bone fractures were the most common injuries, present in 145 children (26%). The skull was the site of fracture in 138 infants (24%); rib cage in 77 (14%), clavicle in 24 (4.2%) and uncommon fractures (including spine, scapula, hands and feet and pelvis) were noted in 26 infants (4.6%). Of the 425 infants with neuroimaging, 154 (36%) had intracranial injury. No significant correlation between positive skeletal survey and associated intracranial injury was found. Scapular fractures and complex skull fractures showed a statistically significant correlation with intracranial injury (P = 0.029, P = 0.007, respectively). Previously unsuspected fractures are noted on skeletal surveys in 20% of cases of suspected infant abuse. These data may be helpful in the design and optimization of global skeletal imaging in this vulnerable population. (orig.)

  19. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey. (eb)

  1. Analysis of relationship between registration performance of point cloud statistical model and generation method of corresponding points

    International Nuclear Information System (INIS)

    Yamaoka, Naoto; Watanabe, Wataru; Hontani, Hidekata

    2010-01-01

    Most of the time, when we construct a statistical point cloud model, we need to calculate corresponding points, and the constructed statistical model will differ depending on the method used to calculate them. This article examines the effect that different methods of calculating corresponding points have on a statistical model of a human organ. We validated the performance of each statistical model by registering the surface of an organ in a 3D medical image. We compare two methods for calculating corresponding points. The first, Generalized Multi-Dimensional Scaling (GMDS), determines the corresponding points from the shapes of two curved surfaces. The second, the entropy-based particle system, chooses corresponding points by statistically analysing a number of curved surfaces. With these methods we construct the statistical models, and using these models we perform registration with the medical image. For the estimation we use non-parametric belief propagation, which estimates not only the position of the organ but also the probability density of the organ position. We evaluate how the two methods of calculating corresponding points affect the statistical model through the change in the probability density of each point. (author)

  2. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  3. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  4. Energy Statistics Manual [Arabic version

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  5. Energy Statistics Manual; Handbuch Energiestatistik

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  6. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  7. Tube problems: worldwide statistics reviewed

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    EPRI's Steam Generator Strategic Management Project issues an annual report on the progress being made in tackling steam generator problems worldwide, containing a wealth of detailed statistics on the status of operating units and degradation mechanisms encountered. A few highlights are presented from the latest report, issued in October 1993, which covers the period to 31 December 1992. (Author)

  8. An exposure-response database for detailed toxicity data

    International Nuclear Information System (INIS)

    Woodall, George M.

    2008-01-01

    Risk assessment for human health effects often depends on evaluation of toxicological literature from a variety of sources. Risk assessors have limited resources for obtaining raw data, performing follow-on analyses or initiating new studies. These constraints must be balanced against a need to improve scientific credibility through improved statistical and analytical methods that optimize the use of available information. Computerized databases are used in toxicological risk assessment both for storing data and performing predictive analyses. Many systems provide primarily either bibliographic information or summary factual data from toxicological studies; few provide adequate information to allow application of dose-response models. The Exposure-Response database (ERDB) described here fills this gap by allowing entry of sufficiently detailed information on experimental design and results for each study, while limiting data entry to the most relevant. ERDB was designed to contain information from the open literature to support dose-response assessment and allow a high level of automation in performance of various types of dose-response analyses. Specifically, ERDB supports emerging analytical approaches for dose-response assessment, while accommodating the diverse nature of published literature. Exposure and response data are accessible in a relational multi-table design, with closely controlled standard fields for recording values and free-text fields to describe unique aspects of the study. Additional comparative analyses are made possible through summary tables and graphic representations of the data contained within ERDB

  9. Influences on physicians' adoption of electronic detailing (e-detailing).

    Science.gov (United States)

    Alkhateeb, Fadi M; Doucette, William R

    2009-01-01

    E-detailing means using digital technology: internet, video conferencing and interactive voice response. There are two types of e-detailing: interactive (virtual) and video. Currently, little is known about what factors influence physicians' adoption of e-detailing. The objectives of this study were to test a model of physicians' adoption of e-detailing and to describe physicians using e-detailing. A mail survey was sent to a random sample of 2000 physicians practicing in Iowa. Binomial logistic regression was used to test the model of influences on physician adoption of e-detailing. On the basis of Rogers' model of adoption, the independent variables included relative advantage, compatibility, complexity, peer influence, attitudes, years in practice, presence of restrictive access to traditional detailing, type of specialty, academic affiliation, type of practice setting and control variables. A total of 671 responses were received, giving a response rate of 34.7%. A total of 141 physicians (21.0%) reported using e-detailing. The overall adoption model for using either type of e-detailing was found to be significant. Relative advantage, peer influence, attitudes, type of specialty, presence of restrictive access and years of practice had significant influences on physician adoption of e-detailing. The model of adoption of innovation is useful for explaining physicians' adoption of e-detailing.
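
    As a sketch of the kind of model used here (binomial logistic regression of a 0/1 adoption outcome on predictors), the following minimal illustration fits such a model on synthetic data; the predictor, sample, and coefficients are invented and are not those of the Iowa survey:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=2000):
        """Minimal binomial logistic regression by gradient ascent on the
        log-likelihood. Illustrative only; a real analysis would use a
        statistics package and report odds ratios and p-values."""
        X = np.column_stack([np.ones(len(X)), X])   # add intercept column
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted adoption prob.
            w += lr * X.T @ (y - p) / len(y)        # average gradient step
        return w

    # Synthetic example: adoption probability rises with one predictor
    # (think "relative advantage" score); true intercept 0.5, slope 1.5.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(500, 1))
    p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x[:, 0])))
    y = (rng.random(500) < p_true).astype(float)
    w = fit_logistic(x, y)
    print("estimated intercept and slope:", w)
    ```
    
    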

  10. Black coal in Australia 1983-84: a statistical year book

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    An annual publication containing comprehensive statistical details of the Australian black coal industry. Included are statistics on coal supply and disposal, production, plant and equipment, coal preparation, manpower, exports, coal consumption, resources. Maps are included, also tables showing supply and disposal, production figures, employees, exports, consumption etc.

  11. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
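
    One widely used summary for the "repeatability" axis mentioned above is the repeatability coefficient, RC = 2.77 x within-subject SD, estimated from paired test-retest measurements. The sketch below assumes simple paired data and is not taken from the paper itself:

    ```python
    import numpy as np

    def repeatability_coefficient(test, retest):
        """Repeatability coefficient from paired test-retest measurements:
        RC = 2.77 * within-subject SD. Differences smaller than RC are
        consistent with measurement noise alone (95% level)."""
        d = np.asarray(test, float) - np.asarray(retest, float)
        wsd = np.sqrt(np.mean(d**2) / 2.0)   # within-subject standard deviation
        return 2.77 * wsd

    # Illustrative test-retest values for three subjects (made-up numbers)
    rc = repeatability_coefficient([10.0, 12.0, 9.5], [10.4, 11.2, 9.9])
    print("RC =", rc)
    ```
    
    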

  12. Detailed modeling of the statistical uncertainty of Thomson scattering measurements

    International Nuclear Information System (INIS)

    Morton, L A; Parke, E; Hartog, D J Den

    2013-01-01

    The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined
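
    The record names the ingredients of the variance formula (photon statistics from scattered light and background, scaled by the APD excess-noise factor F, plus electronic noise) without stating the formula itself. A common form of such a model, with hypothetical calibration constants k and sigma_e standing in for the paper's three measured constants, is:

    ```python
    import numpy as np

    def signal_variance(S, B, F=3.0, k=1.0, sigma_e=0.05):
        """Sketch of an APD signal-variance model: photonic noise from the
        integrated scattered signal S and background B, enhanced by the
        excess-noise factor F, plus electronic noise sigma_e. The constant
        k converts signal units to photoelectrons. All values here are
        illustrative placeholders, not MST calibrations."""
        return F * k * (S + B) + sigma_e**2

    # Example: uncertainty estimate for one scattered-light pulse
    S, B = 200.0, 50.0                      # pulse and background (arb. units)
    sigma = np.sqrt(signal_variance(S, B))
    print(f"relative uncertainty: {sigma / S:.2%}")
    ```
    
    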

  13. Performance evaluation of a digital mammography unit using a contrast-detail phantom

    Science.gov (United States)

    Elizalde-Cabrera, J.; Brandan, M.-E.

    2015-01-01

    The relation between image quality and mean glandular dose (MGD) has been studied for a Senographe 2000D mammographic unit used for research in our laboratory. The magnitudes were evaluated for a clinically relevant range of acrylic thicknesses and radiological techniques. The CDMAM phantom was used to determine the contrast-detail curve. Also, an alternative method based on the analysis of signal-to-noise (SNR) and contrast-to-noise (CNR) ratios from the CDMAM image was proposed and applied. A simple numerical model was utilized to successfully interpret the results. Optimum radiological techniques were determined using the figures of merit FOM_SNR = SNR²/MGD and FOM_CNR = CNR²/MGD. Main results were: the evaluation of the detector response flattening process (it reduces by about one half the spatial non-homogeneities due to the X-ray field), MGD measurements (the values comply with standards), and verification of the automatic exposure control performance (it is sensitive to fluence attenuation, not to contrast). For 4-5 cm phantom thicknesses, the optimum radiological techniques were Rh/Rh 34 kV to optimize SNR, and Rh/Rh 28 kV to optimize CNR.
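
    The dose-normalized figures of merit used above are straightforward to compute as FOM = metric²/MGD; the technique values in this sketch are invented placeholders, not the paper's measurements:

    ```python
    def figure_of_merit(metric, mgd):
        """Dose-normalized figure of merit: FOM = metric**2 / MGD, where
        metric is SNR or CNR and MGD is the mean glandular dose (mGy)."""
        return metric**2 / mgd

    # Illustrative comparison of two radiological techniques (made-up values)
    techniques = {
        "Rh/Rh 34 kV": {"snr": 52.0, "cnr": 4.1, "mgd": 1.6},
        "Rh/Rh 28 kV": {"snr": 45.0, "cnr": 4.8, "mgd": 1.4},
    }
    for name, t in techniques.items():
        print(name,
              "FOM_SNR =", round(figure_of_merit(t["snr"], t["mgd"]), 1),
              "FOM_CNR =", round(figure_of_merit(t["cnr"], t["mgd"]), 2))
    ```
    
    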

  14. Detailed performance analysis of realistic solar photovoltaic systems at extensive climatic conditions

    International Nuclear Information System (INIS)

    Gupta, Ankit; Chauhan, Yogesh K.

    2016-01-01

    In recent years, solar energy has been considered one of the principal renewable energy sources for electric power generation. In this paper, a single-diode photovoltaic (PV) system and a double/bypass-diode based PV system are designed in the MATLAB/Simulink environment based on their mathematical modeling and are validated against a commercially available solar panel. The novelty of the paper is to include the effect of climatic conditions, i.e. variable irradiation level, wind speed, temperature, humidity level and dust accumulation, in the modeling of both PV systems so as to represent a realistic PV system. Comprehensive investigations are made on both modeled PV systems. The obtained results show satisfactory performance for the realistic models of the PV system. Furthermore, an in-depth comparative analysis is carried out for both PV systems. - Highlights: • Modeling of single-diode and double-diode PV systems in MATLAB/Simulink software. • Validation of designed PV systems with a commercially available PV panel. • Acquisition and employment of key climatic factors in modeling of the PV systems. • Evaluation of main model parameters of both the PV systems. • Detailed comparative assessment of both the modeled PV system parameters.

  15. TRAN-STAT: statistics for environmental studies

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1984-09-01

    This issue of TRAN-STAT discusses statistical methods for assessing the uncertainty in predictions of pollutant transport models, particularly for radionuclides. Emphasis is placed on radionuclide transport models but the statistical assessment techniques also apply in general to other types of pollutants. The report begins with an outline of why an assessment of prediction uncertainties is important. This is followed by an introduction to several methods currently used in these assessments. This in turn is followed by more detailed discussion of the methods, including examples. 43 references, 2 figures

  16. Examples and problems in mathematical statistics

    CERN Document Server

    Zacks, Shelemyahu

    2013-01-01

    This book presents examples that illustrate the theory of mathematical statistics and details how to apply the methods for solving problems.  While other books on the topic contain problems and exercises, they do not focus on problem solving. This book fills an important niche in the statistical theory literature by providing a theory/example/problem approach.  Each chapter is divided into four parts: Part I provides the needed theory so readers can become familiar with the concepts, notations, and proven results; Part II presents examples from a variety of fields including engineering, mathem

  17. Foundation of statistical energy analysis in vibroacoustics

    CERN Document Server

    Le Bot, A

    2015-01-01

    This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.

  18. Use of demonstrations and experiments in teaching business statistics

    OpenAIRE

    Johnson, D. G.; John, J. A.

    2003-01-01

    The aim of a business statistics course should be to help students think statistically and to interpret and understand data, rather than to focus on mathematical detail and computation. To achieve this students must be thoroughly involved in the learning process, and encouraged to discover for themselves the meaning, importance and relevance of statistical concepts. In this paper we advocate the use of experiments and demonstrations as aids to achieving these goals. A number of demonstrations...

  19. Gentile statistics with a large maximum occupation number

    International Nuclear Information System (INIS)

    Dai Wusheng; Xie Mi

    2004-01-01

    In Gentile statistics the maximum occupation number can take on unrestricted integers: 1 < n < ∞. For fugacity z > 1, the Bose-Einstein case is not recovered from Gentile statistics as n goes to N. Attention is also concentrated on the contribution of the ground state, which was ignored in the related literature. The thermodynamic behavior of a ν-dimensional Gentile ideal gas of particles with dispersion E = p^s/2m, where ν and s are arbitrary, is analyzed in detail. Moreover, we provide an alternative derivation of the partition function for Gentile statistics.
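
    For reference, the standard mean occupation number per state in Gentile statistics with maximum occupation n (a textbook result, not quoted in the record itself) is

    ```latex
    \bar{N}_k \;=\; \frac{1}{e^{\beta(\varepsilon_k-\mu)} - 1}
              \;-\; \frac{n+1}{e^{(n+1)\beta(\varepsilon_k-\mu)} - 1},
    ```

    which reduces to the Fermi-Dirac distribution for n = 1 and, for fugacity z < 1, to the Bose-Einstein distribution as n → ∞; the record's point is that the latter limit can fail.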

  20. Using Facebook Data to Turn Introductory Statistics Students into Consultants

    Science.gov (United States)

    Childers, Adam F.

    2017-01-01

    Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…

  1. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1990-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculating availability data. For a number of years, comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1990, containing general statistical data, is to provide basic information on the existing kinds of thermal power in the countries concerned. With this information as a basis, additional and more detailed information can be exchanged in direct contacts between bodies in the above-mentioned countries, according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to the type of fuel used. The grouping of units into capacity classes has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as the basic availability data. The same applies to the preferences expressed in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. (au)

  2. Devil in the Details? Developmental Dyslexia and Visual Long-Term Memory for Details

    Directory of Open Access Journals (Sweden)

    Lynn Huestegge

    2014-07-01

    Cognitive theories on the causes of developmental dyslexia can be divided into language-specific and general accounts. While the former assume that words are special in that the associated processing problems are rooted in language-related cognition (e.g., phonology deficits), the latter propose that dyslexia is instead rooted in a general impairment of cognitive (e.g., visual and/or auditory) processing streams. In the present study, we examined to what extent dyslexia (typically characterized by poor orthographic representations) may be associated with a general deficit in visual long-term memory for details. We compared object- and detail-related visual long-term memory performance (and phonological skills) between dyslexic primary school children and IQ-, age- and gender-matched controls. The results revealed that while the overall number of long-term memory errors was comparable between groups, dyslexic children exhibited a greater proportion of detail-related errors. The results suggest that not only phonological but also general visual resolution deficits in long-term memory may play an important role in developmental dyslexia.

  3. Game Indicators Determining Sports Performance in the NBA

    OpenAIRE

    Mikołajec, Kazimierz; Maszczyk, Adam; Zając, Tomasz

    2013-01-01

    The main goal of the present study was to identify basketball game performance indicators which best determine sports level in the National Basketball Association (NBA). The research material consisted of all NBA game statistics across eight seasons (2003–11) and included 52 performance variables. Through detailed analysis, the variables with high influence on game effectiveness were selected for the final procedures. It has been proven that a limited number of factors, mostly offen...

  4. Components of the Pearson-Fisher chi-squared statistic

    Directory of Open Access Journals (Sweden)

    G. D. Rayner

    2002-01-01

    interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.

  5. Order, disorder and generalized statistics

    International Nuclear Information System (INIS)

    Marino, E.C.; Swieca, J.A.

    1980-06-01

    We generalize the prescription of Kadanoff and Ceva for the computation of disorder variables correlation functions in the Ising model for continuous field theories with U(1) symmetry. By considering the product of order and disorder variables, we obtain a path integral representation for fields with generalized statistics. We discuss in detail the cases of massless Thirring and Schwinger models. (Author) [pt

  6. Order, disorder and generalized statistics

    International Nuclear Information System (INIS)

    Marino, E.C.; Swieca, J.A.; Pontificia Universidade Catolica do Rio de Janeiro

    1980-01-01

    We generalize the prescription of Kadanoff and Ceva for the computation of disorder variable correlation functions in the Ising model for continuous field theories with U(1) symmetry. By considering the product of order and disorder variables, we obtain a path integral representation for fields with generalized statistics. We discuss in detail the cases of massless Thirring and Schwinger models. (orig.)

  7. Evaluating statistical cloud schemes

    OpenAIRE

    Grützun, Verena; Quaas, Johannes; Morcrette , Cyril J.; Ament, Felix

    2015-01-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based re...

  8. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  9. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to lesser-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
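
    As a rough illustration of the kind of tests such a toolkit bundles (using SciPy here rather than the Toolkit itself, whose API is not shown in the abstract), the following sketch runs a Kolmogorov-Smirnov and an Anderson-Darling test on a sample drawn from the null distribution:

```python
import numpy as np
from scipy import stats

# Draw a sample that genuinely comes from the null distribution
rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=500)

# Kolmogorov-Smirnov test against the standard normal CDF
ks_stat, ks_p = stats.kstest(sample, "norm")

# Anderson-Darling test: more weight on the tails, generally more powerful
ad = stats.anderson(sample, dist="norm")

print(ks_stat, ks_p, ad.statistic)
```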

  10. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    Science.gov (United States)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance that this and future space seals will satisfy or exceed design specifications.
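
    The control-chart logic described above can be sketched as follows (illustrative numbers, not the article's seal data; the control limits are assumed to come from an in-control baseline run):

```python
# Control limits taken from an assumed in-control baseline run
baseline_mean, baseline_sigma = 100.0, 0.5   # hypothetical, in kN
ucl = baseline_mean + 3 * baseline_sigma     # upper control limit, 101.5
lcl = baseline_mean - 3 * baseline_sigma     # lower control limit, 98.5

# Compression loads measured on successive test articles (hypothetical)
loads = [100.2, 99.8, 100.5, 98.9, 103.1, 100.1]

# Any point outside the 3-sigma band signals a lack of process control
out = [(i, x) for i, x in enumerate(loads) if not (lcl <= x <= ucl)]
print(out)   # [(4, 103.1)]
```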

  11. Analysing sensory panel performance in a proficiency test using the PanelCheck software

    DEFF Research Database (Denmark)

    Tomic, O.; Luciano, G.; Nilsen, A.

    2010-01-01

    Check software, a workflow is proposed that guides the user through the data analysis process. This allows practitioners and non-statisticians to get an overview over panel performances in a rapid manner without the need to be familiar with details on the statistical methods. Visualisation of data analysis...... results plays an important role as this provides a time saving and efficient way of screening and investigating sensory panel performances. Most of the statistical methods used in this paper are available in the open source software PanelCheck, which may be downloaded and used for free....

  12. Statistical estimation of nuclear reactor dynamic parameters

    International Nuclear Information System (INIS)

    Cummins, J.D.

    1962-02-01

    This report discusses the study of noise in nuclear reactors and associated power plant. The report is divided into three distinct parts. In the first part, parameters which influence the dynamic behaviour of some reactors will be specified and their effect on dynamic performance described. Methods of estimating dynamic parameters using statistical signals will be described in detail, together with descriptions of the usefulness of the results, the accuracy and related topics. Some experiments which have been and which might be performed on nuclear reactors will be described. In the second part of the report a digital computer programme will be described. The computer programme derives the correlation functions and the spectra of signals. The programme will compute the frequency response, both gain and phase, for physical items of plant for which simultaneous recordings of input and output signal variations have been made. Estimates of the accuracy of the correlation functions and the spectra may be computed using the programme, and the amplitude distribution of signals may also be computed. The programme is written in autocode for the Ferranti Mercury computer. In the third part of the report a practical example of the use of the method and the digital programme is presented. In order to eliminate difficulties of interpretation a very simple plant model was chosen, i.e. a simple first-order lag. Several interesting properties of statistical signals were measured and will be discussed. (author)
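
    The spectral method the report describes — estimating gain and phase from simultaneous input/output recordings — can be sketched with modern tools (a SciPy-based sketch, not the original Mercury autocode; the first-order lag mirrors the report's worked example):

```python
import numpy as np
from scipy import signal

# Drive a simulated "plant" with broadband noise; here the plant is a
# simple first-order lag, mirroring the report's worked example.
rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
a = 0.9
y = signal.lfilter([1 - a], [1, -a], x)   # y[n] = a*y[n-1] + (1-a)*x[n]

# Frequency response estimate H(f) = Pxy(f) / Pxx(f)
f, pxx = signal.welch(x, nperseg=1024)
_, pxy = signal.csd(x, y, nperseg=1024)
h = pxy / pxx

gain, phase = np.abs(h), np.angle(h)
print(gain[1], gain[-1])   # near-unity gain at low frequency, attenuated at high
```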

  13. Mathematical statistics essays on history and methodology

    CERN Document Server

    Pfanzagl, Johann

    2017-01-01

    This book presents a detailed description of the development of statistical theory. In the mid twentieth century, the development of mathematical statistics underwent an enduring change, due to the advent of more refined mathematical tools. New concepts like sufficiency, superefficiency, adaptivity etc. motivated scholars to reflect upon the interpretation of mathematical concepts in terms of their real-world relevance. Questions concerning the optimality of estimators, for instance, had remained unanswered for decades, because a meaningful concept of optimality (based on the regularity of the estimators, the representation of their limit distribution and assertions about their concentration by means of Anderson’s Theorem) was not yet available. The rapidly developing asymptotic theory provided approximate answers to questions for which non-asymptotic theory had found no satisfying solutions. In four engaging essays, this book presents a detailed description of how the use of mathematical methods stimulated...

  14. Exactly soluble problems in statistical mechanics

    International Nuclear Information System (INIS)

    Yang, C.N.

    1983-01-01

    In the last few years, a number of two-dimensional classical and one-dimensional quantum mechanical problems in statistical mechanics have been exactly solved. Although these problems range over models of diverse physical interest, their solutions were obtained using very similar mathematical methods. In these lectures, the main points of the methods are discussed. In this introductory lecture, an overall survey of all these problems without going into the detailed method of solution is given. In later lectures, they shall concentrate on one particular problem: the delta function interaction in one dimension, and go into the details of that problem

  15. Optimization of Biodiesel-Diesel Blended Fuel Properties and Engine Performance with Ether Additive Using Statistical Analysis and Response Surface Methods

    Directory of Open Access Journals (Sweden)

    Obed M. Ali

    2015-12-01

    In this study, the fuel properties and engine performance of blended palm biodiesel-diesel using diethyl ether as an additive have been investigated. The properties of B30 blended palm biodiesel-diesel fuel were measured and analyzed statistically with the addition of 2%, 4%, 6% and 8% (by volume) diethyl ether additive. The engine tests were conducted at increasing engine speeds from 1500 rpm to 3500 rpm under constant load. Optimization of the independent variables was performed using the desirability approach of the response surface methodology (RSM) with the goal of minimizing emissions and maximizing performance parameters. The experiments were designed using a statistical tool known as design of experiments (DoE) based on RSM.
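
    The desirability approach mentioned above can be sketched as follows (hypothetical responses and ranges, not the study's measured values; `d_max`, `d_min` and the ranges are illustrative): each response is mapped to a [0, 1] desirability, and the overall desirability is their geometric mean.

```python
def d_max(y, lo, hi):
    """Desirability of a response to be maximised (0 worst, 1 best)."""
    return min(max((y - lo) / (hi - lo), 0.0), 1.0)

def d_min(y, lo, hi):
    """Desirability of a response to be minimised (0 worst, 1 best)."""
    return min(max((hi - y) / (hi - lo), 0.0), 1.0)

# Hypothetical candidate blend: brake power 34 kW on an acceptable
# range of 25-40 kW, CO emission 0.18 % on a range of 0.1-0.4 %
d_power = d_max(34.0, 25.0, 40.0)   # 0.6
d_co = d_min(0.18, 0.1, 0.4)        # ~0.733

# Overall desirability is the geometric mean of the individual ones
overall = (d_power * d_co) ** 0.5
print(round(overall, 3))
```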

  16. The detail is dead - long live the detail!

    DEFF Research Database (Denmark)

    Larsen, Steen Nepper; Dalgaard, Kim; Kerstens, Vencent

    2018-01-01

    architecture when we look into architectural history. Two classic examples are: Adolf Loos, who provoked already in 1908 with his statement "Ornament and Crime", which contested the unconscious decorations of contemporary architects. Similarly, referring to the little need for superfluous detailing; "Less...... not change the fact that it is more important than ever to bring this 'small' architectural world to attention. Today, the construction industry is dictated by an economic management that does not leave much room for thorough studies of architectural details or visionary experiments. Today's more efficient......_Delft about the Symposium; "The Detail is Dead - Long Live the Detail". For this occasion a number of leading Danish and Northern European architects, researchers and companies were invited to discuss and suggest their 'architectural detail' and the challenges they face in today's construction. This book

  17. Energy statistics. France

    International Nuclear Information System (INIS)

    2002-10-01

    This document summarizes in a series of tables the energy statistical data for France: consumption since 1973; energy supplies (production, imports, exports, stocks) and uses (refining, power production, internal uses, sectoral consumption) for coal, petroleum, gas, electricity, and renewable energy sources; national production and consumption of primary energy; final consumption per sector and per energy source; general indicators (energy bill, US$ change rate, prices, energy independence, internal gross product); projections. Details (resources, uses, prices, imports, internal consumption) are given separately for petroleum, natural gas, electric power and solid mineral fuels. (J.S.)

  18. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    Directory of Open Access Journals (Sweden)

    Hamid Reza Marateb

    2014-01-01

    Background: selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are studied using several medical examples. We present two ordinal-variable clustering examples, as a more challenging variable type in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: by using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is guaranteed. Moreover, descriptive and inferential statistics, in addition to the modeling approach, must be selected based on the scale of the variables.
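
    The comparison against the gold standard described above reduces to a confusion-matrix calculation once clusters are relabeled to the gold-standard classes. A minimal sketch with hypothetical counts (not the actual WBCD results, though the total of 683 patients matches the database size):

```python
# Hypothetical confusion counts for a cluster-vs-gold-standard comparison
tp, fn = 230, 9    # malignant cases assigned to the "malignant" cluster / missed
tn, fp = 430, 14   # benign cases assigned to the "benign" cluster / missed

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(round(sensitivity, 3), round(specificity, 3), round(accuracy, 3))
```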

  19. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    Science.gov (United States)

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

    Background: selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are studied using several medical examples. We present two ordinal-variable clustering examples, as a more challenging variable type in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: by using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is guaranteed. Moreover, descriptive and inferential statistics, in addition to the modeling approach, must be selected based on the scale of the variables. PMID:24672565

  20. Data on the detail information of influence of substrate temperature on the film morphology and photovoltaic performance of non-fullerene organic solar cells.

    Science.gov (United States)

    Zhang, Jicheng; Xie, SuFei; Lu, Zhen; Wu, Yang; Xiao, Hongmei; Zhang, Xuejuan; Li, Guangwu; Li, Cuihong; Chen, Xuebo; Ma, Wei; Bo, Zhishan

    2017-10-01

    This data article contains additional data related to "Influence of Substrate Temperature on the Film Morphology and Photovoltaic Performance of Non-fullerene Organic Solar Cells" (Jicheng Zhang et al., in press) [1]. Data include the measurement and characterization instruments and conditions, the detailed conditions for fabricating the non-fullerene solar cell devices, and the hole-only and electron-only devices. The detailed procedure for controlling the film morphology of the devices by tuning the substrate temperature is also presented, together with further supporting data on the change in film morphology of active layers fabricated at different temperatures.

  1. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  2. Upgrade to iterative image reconstruction (IR) in MDCT imaging: a clinical study for detailed parameter optimization beyond vendor recommendations using the adaptive statistical iterative reconstruction environment (ASIR) Part2: The chest.

    Science.gov (United States)

    Mueck, F G; Michael, L; Deak, Z; Scherr, M K; Maxien, D; Geyer, L L; Reiser, M; Wirth, S

    2013-07-01

    To compare the image quality in dose-reduced 64-row CT of the chest at different levels of adaptive statistical iterative reconstruction (ASIR) with full-dose baseline examinations reconstructed solely with filtered back projection (FBP) in a realistic upgrade scenario. A waiver of consent was granted by the institutional review board (IRB). The noise index (NI) relates to the standard deviation of Hounsfield units in a water phantom. Baseline exams of the chest (NI = 29; LightSpeed VCT XT, GE Healthcare) were intra-individually compared to follow-up studies on a CT with ASIR after system upgrade (NI = 45; Discovery HD750, GE Healthcare), n = 46. Images were calculated in slice and volume mode with ASIR levels of 0 - 100 % in the standard and lung kernel. Three radiologists independently compared the image quality to the corresponding full-dose baseline examinations (-2: diagnostically inferior, -1: inferior, 0: equal, +1: superior, +2: diagnostically superior). Statistical analysis used Wilcoxon's test, the Mann-Whitney U test and the intraclass correlation coefficient (ICC). The mean CTDIvol decreased by 53 % from the FBP baseline to 8.0 ± 2.3 mGy for ASIR follow-ups. In the standard kernel, ASIR 70 % in volume mode matched the baseline image quality (-0.07 ± 0.29, p = 0.29). Concerning the lung kernel, every ASIR level outperformed the baseline image quality, with ASIR 30 % rated best (slice: 0.70 ± 0.6, volume: 0.74 ± 0.61). The vendor's recommendation of 50 % ASIR is fair. In detail, ASIR 70 % in volume mode for the standard kernel and ASIR 30 % for the lung kernel performed best, allowing for a dose reduction of approximately 50 %. © Georg Thieme Verlag KG Stuttgart · New York.

  3. Academic Performance: An Approach From Data Mining

    Directory of Open Access Journals (Sweden)

    David L. La Red Martinez

    2012-02-01

    The relatively low percentage of students promoted and regularized in the Operating Systems course of the LSI (Bachelor's Degree in Information Systems) of FaCENA (Facultad de Ciencias Exactas, Naturales y Agrimensura) of UNNE (academic success) prompted this work, whose objective is to determine the variables that affect academic performance, considering the final status of the student according to Res. 185/03 CD (scheme for evaluation and promotion: promoted, regular or free). The variables considered are: status of the student, educational level of the parents, secondary education, socio-economic level, and others. Data warehouse (DW) and data mining (DM) techniques were used to search for student profiles and to identify potential situations of academic success or failure. Classifications were carried out through clustering techniques according to different criteria, among them: classification mining by academic program, by final status of the student, and by importance given to the study, as well as demographic clustering and Kohonen clustering by final status of the student. Partition statistics, details of partitions, clusters and fields, field frequencies, overall and detailed quality of each process (precision, classification, reliability), confusion matrices, gain/lift charts, trees, node distributions, field importance, field correspondence tables and cluster statistics were produced. Once profiles of students with low academic performance are identified, actions can be taken to prevent potential academic failures. This work provides a brief description of the data warehouse built and some data mining processes developed on it.

  4. Benchmarking statistical averaging of spectra with HULLAC

    Science.gov (United States)

    Klapisch, Marcel; Busquet, Michel

    2008-11-01

    Knowledge of radiative properties of hot plasmas is important for ICF, astrophysics, etc. When mid-Z or high-Z elements are present, the spectra are so complex that one commonly uses a statistically averaged description of atomic systems [1]. In a recent experiment on Fe [2], performed under controlled conditions, high-resolution transmission spectra were obtained. The new version of HULLAC [3] allows the use of the same model with different levels of detail/averaging. We will take advantage of this feature to check the effect of averaging by comparison with experiment. [1] A Bar-Shalom, J Oreg, and M Klapisch, J. Quant. Spectros. Rad. Transf. 65, 43 (2000). [2] J. E. Bailey, G. A. Rochau, C. A. Iglesias et al., Phys. Rev. Lett. 99, 265002-4 (2007). [3] M. Klapisch, M. Busquet, and A. Bar-Shalom, AIP Conference Proceedings 926, 206-15 (2007).

  5. Oil pipeline performance review 1995, 1996, 1997, 1998 : Technical/statistical report

    International Nuclear Information System (INIS)

    2000-12-01

    This document provides a summary of the pipeline performance and reportable pipeline failures of liquid hydrocarbon pipelines in Canada for the years 1995 through 1998. The year 1994 was the last one for which the Oil Pipeline Performance Review (OPPR) was published on an annual basis. The OPPR will continue to be published until such time as the Pipeline Risk Assessment Sub-Committee (PRASC) has obtained enough pipeline failure data to be aggregated into a meaningful report. The shifts in the mix of reporting pipeline companies are apparent in the data presented, comparing the volumes transported and the traffic volume during the previous ten-year period. Another table presents a summary of the failures which occurred during the period under consideration, 1995-1998, allowing for a comparison with the data for the previous ten-year period. From the current perspective and from an historical context, this document provides a statistical review of the performance of the pipelines, covering refined petroleum product pipelines, clean oil pipelines and High Vapour Pressure (HVP) pipelines downstream of battery limits. Classified as reportable are spills of 1.5 cubic metres or more of liquid hydrocarbons, any amount of HVP material, and any incident involving an injury, a death, a fire, or an explosion. For those companies that responded to the survey, the major items, including number of failures and volumes released, are accurate. Samples of the forms used for collecting the information are provided within the document. 6 tabs., 1 fig.

  6. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. This book can serve as a textbook on R for beginners as well as more advanced users, working on Windows, MacOS or Linux OSes. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the Biostatistics field. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  7. Limit temperature for entanglement in generalized statistics

    International Nuclear Information System (INIS)

    Rossignoli, R.; Canosa, N.

    2004-01-01

    We discuss the main properties of general thermal states derived from non-additive entropic forms and their use for studying quantum entanglement. It is shown that all these states become more mixed as the temperature increases, approaching the full random state for T→∞. The formalism is then applied to examine the limit temperature for entanglement in a two-qubit XXZ Heisenberg chain, which exhibits the peculiar feature of being independent of the applied magnetic field in the conventional von Neumann based statistics. In contrast, this temperature is shown to be field dependent in a generalized statistics, even for small deviations from the standard form. Results for the Tsallis-based statistics are examined in detail
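
    The "small deviations from the standard form" can be made concrete with the Tsallis entropy examined in the abstract, S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon/von Neumann form as q → 1. A minimal numerical sketch (the probability vector is illustrative):

```python
import math

# Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1)
def tsallis(p, q):
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Shannon entropy, the q -> 1 limit of the Tsallis form
def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
print(tsallis(p, 1.001), shannon(p))   # nearly equal near q = 1
```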

  8. Cellular automata and statistical mechanical models

    International Nuclear Information System (INIS)

    Rujan, P.

    1987-01-01

    The authors elaborate on the analogy between the transfer matrix of usual lattice models and the master equation describing the time development of cellular automata. Transient and stationary properties of probabilistic automata are linked to surface and bulk properties, respectively, of restricted statistical mechanical systems. It is demonstrated that methods of statistical physics can be successfully used to describe the dynamic and the stationary behavior of such automata. Some exact results are derived, including duality transformations, exact mappings, disorder, and linear solutions. Many examples are worked out in detail to demonstrate how to use statistical physics in order to construct cellular automata with desired properties. This approach is considered to be a first step toward the design of fully parallel, probabilistic systems whose computational abilities rely on the cooperative behavior of their components
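
    A toy probabilistic automaton of the kind discussed — a deterministic local rule perturbed by thermal-like noise — can be sketched as follows (an illustrative majority-vote rule on a ring, not one of the lecture's exact models):

```python
import random

random.seed(42)
N, steps, noise = 200, 100, 0.05
cells = [random.randint(0, 1) for _ in range(N)]   # random initial state

for _ in range(steps):
    nxt = []
    for i in range(N):
        # majority vote over the cell and its two neighbours (periodic ring)
        trio = cells[i - 1] + cells[i] + cells[(i + 1) % N]
        bit = 1 if trio >= 2 else 0
        # stochastic perturbation, playing the role of thermal noise
        if random.random() < noise:
            bit ^= 1
        nxt.append(bit)
    cells = nxt

# stationary density of "up" cells after many updates
density = sum(cells) / N
print(density)
```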

  9. Quarterly coal statistics of OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    1992-04-27

    These quarterly statistics contain data from the fourth quarter 1990 to the fourth quarter 1991. The first set of tables (A1 to A30) show trends in production, trade, stock change and apparent consumption data for OECD countries. Tables B1 to B12 show detailed statistics for some major coal trade flows to and from OECD countries and average value in US dollars. A third set of tables, C1 to C12, show average import values and indices. The trade data have been extracted or derived from national and EEC customs statistics. An introductory section summarizes trends in coal supply and consumption, deliveries to thermal power stations; electricity production and final consumption of coal and tabulates EEC and Japanese steam coal and coking coal imports to major countries.

  10. Energy Statistics Manual; Manual Statistik Energi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  11. Statistics of DNA Markers - RGP gmap | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    RGP gmap: Statistics of DNA Markers. Data name: Statistics of DNA Markers. DOI: 10.18908/lsdba.nbdc00318-01-001. Description of data contents: statistics of DNA markers that were used to create t...

  12. The development and evaluation of programmatic performance indicators associated with maintenance at nuclear power plants

    International Nuclear Information System (INIS)

    Wreathall, J.; Fragola, J.; Appignani, P.; Burlile, G.; Shen, Y.

    1990-05-01

    This report summarizes the development and evaluation of programmatic performance indicators of maintenance. These indicators were selected by: (1) creating a formal framework of plant processes; (2) identifying features of plant behavior considered important to safety; (3) evaluating existing indicators against these features; and (4) performing statistical analyses for the selected indicators. The report recommends additional testing. This document provides the appendices to the report. These appendices are: synopsis of process model; detailed results of statistical analysis; and signal processing analysis of daily power loss indicator

  13. Meta-analysis of prediction model performance across multiple studies: Which scale helps ensure between-study normality for the C-statistic and calibration measures?

    Science.gov (United States)

    Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D

    2017-01-01

    If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
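
    The paper's recommendation — meta-analysing the C-statistic on the logit scale and back-transforming the pooled value — can be sketched as follows (hypothetical C-statistics and standard errors; a fixed-effect inverse-variance pool is used for brevity where the paper uses random effects, and the delta-method standard error is a standard approximation):

```python
import math

# Hypothetical per-study C-statistics and their standard errors
c_stats = [0.72, 0.68, 0.75, 0.70]
se_c =    [0.02, 0.03, 0.025, 0.02]

def logit(p):
    return math.log(p / (1 - p))

# Delta method: se(logit c) = se(c) / (c * (1 - c))
logits = [logit(c) for c in c_stats]
se_logits = [s / (c * (1 - c)) for c, s in zip(c_stats, se_c)]

# Inverse-variance weighted pool on the logit scale
weights = [1 / s ** 2 for s in se_logits]
pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)

# Back-transform the pooled value to the C-statistic scale
pooled_c = 1 / (1 + math.exp(-pooled_logit))
print(round(pooled_c, 3))
```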

  14. Patient-specific estimation of detailed cochlear shape from clinical CT images

    DEFF Research Database (Denmark)

    Kjer, H Martin; Fagertun, Jens; Wimmer, Wilhelm

    2018-01-01

    of the detailed patient-specific cochlear shape from CT images. From a collection of temporal bone [Formula: see text]CT images, we build a cochlear statistical deformation model (SDM), which is a description of how a human cochlea deforms to represent the observed anatomical variability. The model is used...... for regularization of a non-rigid image registration procedure between a patient CT scan and a [Formula: see text]CT image, allowing us to estimate the detailed patient-specific cochlear shape. We test the accuracy and precision of the predicted cochlear shape using both [Formula: see text]CT and CT images...

  15. Statistical cluster analysis and diagnosis of nuclear system level performance

    International Nuclear Information System (INIS)

    Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.

    1985-01-01

    The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations makes it desirable to complement the deterministic analyses of these plants by corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and when necessary, the control of such failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609, NUREG/CR-3026, and the present work aims at showing how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data)

  16. Contribution statistics can make to "strengthening forensic science"

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2009-08-01

    Full Text Available draw on inputs from other countries and much of the report is relevant to forensic science in other countries. The report makes thirteen detailed recommendations, several of which will require statistics and statisticians for their implementation...

  17. Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.

    Science.gov (United States)

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro

    2010-01-01

This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one module and two angles) that were analyzed by spherical statistics. A case study is presented and discussed in detail. Errors were calculated using 53 check points (CP), whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) using 3D graphics and numerical spherical statistics. Two packages in the R programming language were written to produce the graphics automatically. The results indicate that the proposed method is advantageous, as it offers a more complete analysis of positional accuracy (angular error components, uniformity of the vector distribution, and error isotropy) in addition to the modular error component given by linear statistics.
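The decomposition of 3D error vectors into one module and two angles can be illustrated as follows. This is a hypothetical sketch, not the authors' R packages; the mean resultant length is one standard spherical-statistics summary of directional concentration.

```python
import numpy as np

def spherical_error_stats(errors):
    """Decompose 3D positional error vectors into modules and angles
    and compute the mean resultant length (a concentration measure).

    errors : (n, 3) array of non-zero error vectors (dx, dy, dz)
    """
    errors = np.asarray(errors, dtype=float)
    modules = np.linalg.norm(errors, axis=1)        # modular (linear) component
    units = errors / modules[:, None]               # direction-only component
    colat = np.degrees(np.arccos(np.clip(units[:, 2], -1, 1)))  # angle from +z
    azim = np.degrees(np.arctan2(units[:, 1], units[:, 0]))     # angle in xy-plane
    r_bar = np.linalg.norm(units.mean(axis=0))      # mean resultant length in [0, 1]
    return modules, colat, azim, r_bar
```

A mean resultant length near 1 indicates anisotropic (tightly clustered) error directions, near 0 an isotropic spread.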

  18. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  19. Statistical mechanics and the foundations of thermodynamics

    International Nuclear Information System (INIS)

    Loef, A.M.

    1979-01-01

    An introduction to classical statistical mechanics and its relation to thermodynamics is presented. Emphasis is put on getting a detailed and logical presentation of the foundations of thermodynamics based on the maximum entropy principles which govern the values taken by macroscopic variables according to the laws of large numbers

  1. Annotations to quantum statistical mechanics

    CERN Document Server

    Kim, In-Gee

    2018-01-01

    This book is a rewritten and annotated version of Leo P. Kadanoff and Gordon Baym’s lectures that were presented in the book Quantum Statistical Mechanics: Green’s Function Methods in Equilibrium and Nonequilibrium Problems. The lectures were devoted to a discussion on the use of thermodynamic Green’s functions in describing the properties of many-particle systems. The functions provided a method for discussing finite-temperature problems with no more conceptual difficulty than ground-state problems, and the method was equally applicable to boson and fermion systems and equilibrium and nonequilibrium problems. The lectures also explained nonequilibrium statistical physics in a systematic way and contained essential concepts on statistical physics in terms of Green’s functions with sufficient and rigorous details. In-Gee Kim thoroughly studied the lectures during one of his research projects but found that the unspecialized method used to present them in the form of a book reduced their readability. He st...

  2. Airborne gamma-ray spectrometer and magnetometer survey, Durango D, Colorado. Final report Volume II A. Detail area

    International Nuclear Information System (INIS)

    1983-01-01

    This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data

  3. Airborne gamma-ray spectrometer and magnetometer survey, Durango B, Colorado. Final report Volume II A. Detail area

    International Nuclear Information System (INIS)

    1983-01-01

    The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data

  4. Statistical inferences for bearings life using sudden death test

    Directory of Open Access Journals (Sweden)

    Morariu Cristin-Olimpiu

    2017-01-01

Full Text Available In this paper we propose a calculation method for estimating reliability indicators and complete statistical inference for the three-parameter Weibull distribution of bearing life. Using experimental values for the durability of bearings tested on stands by sudden death tests involves a number of particularities in maximum likelihood estimation and in carrying out the statistical inference. The paper details these features and also provides an example calculation.
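As a simplified illustration of linearized Weibull life fitting: the sketch below uses median-rank regression on a complete sample with a two-parameter distribution, not the paper's three-parameter maximum-likelihood treatment of sudden-death data. Function name and approach are illustrative only.

```python
import numpy as np

def weibull_mrr(times):
    """Two-parameter Weibull fit (shape beta, scale eta) by median-rank
    regression on a complete, uncensored sample of failure times.
    """
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    f = (i - 0.3) / (n + 0.4)                 # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - f))              # linearized CDF: y = beta*x - beta*ln(eta)
    beta, intercept = np.polyfit(x, y, 1)     # slope is the shape parameter
    eta = np.exp(-intercept / beta)           # recover the scale parameter
    return beta, eta
```

Sudden-death data would instead require the censoring-aware likelihood discussed in the paper.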

  5. Oil, Gas, Coal and Electricity - Quarterly statistics. Second Quarter 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-15

    This publication provides up-to-date and detailed quarterly statistics on oil, coal, natural gas and electricity for the OECD countries. Oil statistics cover production, trade, refinery intake and output, stock changes and consumption for crude oil, NGL and nine selected oil product groups. Statistics for electricity, natural gas, hard coal and brown coal show supply and trade. Import and export data are reported by origin and destination. Moreover, oil and hard coal production are reported on a worldwide basis.

  6. PORPST: A statistical postprocessor for the PORMC computer code

    International Nuclear Information System (INIS)

    Eslinger, P.W.; Didier, B.T.

    1991-06-01

    This report describes the theory underlying the PORPST code and gives details for using the code. The PORPST code is designed to do statistical postprocessing on files written by the PORMC computer code. The data written by PORMC are summarized in terms of means, variances, standard deviations, or statistical distributions. In addition, the PORPST code provides for plotting of the results, either internal to the code or through use of the CONTOUR3 postprocessor. Section 2.0 discusses the mathematical basis of the code, and Section 3.0 discusses the code structure. Section 4.0 describes the free-format point command language. Section 5.0 describes in detail the commands to run the program. Section 6.0 provides an example program run, and Section 7.0 provides the references. 11 refs., 1 fig., 17 tabs

  7. Statistical modelling for social researchers principles and practice

    CERN Document Server

    Tarling, Roger

    2008-01-01

    This book explains the principles and theory of statistical modelling in an intelligible way for the non-mathematical social scientist looking to apply statistical modelling techniques in research. The book also serves as an introduction for those wishing to develop more detailed knowledge and skills in statistical modelling. Rather than present a limited number of statistical models in great depth, the aim is to provide a comprehensive overview of the statistical models currently adopted in social research, in order that the researcher can make appropriate choices and select the most suitable model for the research question to be addressed. To facilitate application, the book also offers practical guidance and instruction in fitting models using SPSS and Stata, the most popular statistical computer software which is available to most social researchers. Instruction in using MLwiN is also given. Models covered in the book include; multiple regression, binary, multinomial and ordered logistic regression, log-l...

  8. Preference for Well-Balanced Saliency in Details Cropped from Photographs

    Science.gov (United States)

    Abeln, Jonas; Fresz, Leonie; Amirshahi, Seyed Ali; McManus, I. Chris; Koch, Michael; Kreysa, Helene; Redies, Christoph

    2016-01-01

    Photographic cropping is the act of selecting part of a photograph to enhance its aesthetic appearance or visual impact. It is common practice with both professional (expert) and amateur (non-expert) photographers. In a psychometric study, McManus et al. (2011b) showed that participants cropped photographs confidently and reliably. Experts tended to select details from a wider range of positions than non-experts, but other croppers did not generally prefer details that were selected by experts. It remained unclear, however, on what grounds participants selected particular details from a photograph while avoiding other details. One of the factors contributing to cropping decision may be visual saliency. Indeed, various saliency-based computer algorithms are available for the automatic cropping of photographs. However, careful experimental studies on the relation between saliency and cropping are lacking to date. In the present study, we re-analyzed the data from the studies by McManus et al. (2011a,b), focusing on statistical image properties. We calculated saliency-based measures for details selected and details avoided during cropping. As expected, we found that selected details contain regions of higher saliency than avoided details on average. Moreover, the saliency center-of-mass was closer to the geometrical center in selected details than in avoided details. Results were confirmed in an eye tracking study with the same dataset of images. Interestingly, the observed regularities in cropping behavior were less pronounced for experts than for non-experts. In summary, our results suggest that, during cropping, participants tend to select salient regions and place them in an image composition that is well-balanced with respect to the distribution of saliency. Our study contributes to the knowledge of perceptual bottom-up features that are germane to aesthetic decisions in photography and their variability in non-experts and experts. PMID:26793086
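The saliency center-of-mass balance measure described above can be sketched as follows. This is a hypothetical implementation, not the authors' code; the offset is normalized by the half-diagonal so that 0 means the saliency mass sits exactly at the geometric center.

```python
import numpy as np

def saliency_center_offset(saliency):
    """Normalized distance between the saliency center-of-mass and the
    geometric center of an image.

    saliency : 2D array of non-negative saliency values
    """
    s = np.asarray(saliency, dtype=float)
    rows, cols = np.indices(s.shape)
    total = s.sum()
    com = np.array([(rows * s).sum() / total, (cols * s).sum() / total])
    center = (np.array(s.shape) - 1) / 2.0     # geometric center in pixel coords
    half_diag = np.linalg.norm(center)         # normalizer: center-to-corner distance
    return np.linalg.norm(com - center) / half_diag
```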

  9. Effect of the Target Motion Sampling Temperature Treatment Method on the Statistics and Performance

    Science.gov (United States)

    Viitanen, Tuomas; Leppänen, Jaakko

    2014-06-01

Target Motion Sampling (TMS) is a stochastic on-the-fly temperature treatment technique that is being developed as a part of the Monte Carlo reactor physics code Serpent. The method provides for modeling of arbitrary temperatures in continuous-energy Monte Carlo tracking routines with only one set of cross sections stored in the computer memory. Previously, only the performance of the TMS method in terms of CPU time per transported neutron has been discussed. Since the effective cross sections are not calculated at any point of a transport simulation with TMS, reaction rate estimators must be scored using sampled cross sections, which is expected to increase the variances and, consequently, to decrease the figures-of-merit. This paper examines the effects of TMS on the statistics and performance in practical calculations involving reaction rate estimation with collision estimators. Against all expectations, it turned out that the usage of sampled response values has no practical effect on the performance of reaction rate estimators when using TMS with elevated basis cross section temperatures (EBT), i.e. in the usual way. With 0 Kelvin cross sections a significant increase in the variances of capture rate estimators was observed right below the energy region of unresolved resonances, but at these energies the figures-of-merit could be increased using a simple resampling technique to decrease the variances of the responses. It was, however, noticed that the usage of the TMS method increases the statistical deviations of all estimators, including the flux estimator, by tens of percent in the vicinity of very strong resonances. This effect is actually not related to the usage of sampled responses, but is instead an inherent property of the TMS tracking method and concerns both EBT and 0 K calculations.

  10. Statistical strategies to reveal potential vibrational markers for in vivo analysis by confocal Raman spectroscopy

    Science.gov (United States)

    Oliveira Mendes, Thiago de; Pinto, Liliane Pereira; Santos, Laurita dos; Tippavajhala, Vamshi Krishna; Téllez Soto, Claudio Alberto; Martin, Airton Abrahão

    2016-07-01

The analysis of biological systems by spectroscopic techniques involves the evaluation of hundreds to thousands of variables. Hence, different statistical approaches are used to elucidate regions that discriminate classes of samples and to propose new vibrational markers for explaining various phenomena like disease monitoring, mechanisms of action of drugs, food, and so on. However, these statistical techniques are not always discussed in detail in the applied sciences. In this context, this work presents a detailed discussion of the various steps necessary for proper statistical analysis. It includes univariate parametric and nonparametric tests, as well as multivariate unsupervised and supervised approaches. The main objective of this study is to promote proper understanding of the application of various statistical tools in these spectroscopic methods used for the analysis of biological samples. The discussion of these methods is based on a set of in vivo confocal Raman spectra from an analysis of human skin that aims to identify skin aging markers. In the Appendix, a complete data analysis routine is implemented in free software that can be used by the scientific community involved in these studies.

  11. Airborne gamma-ray spectrometer and magnetometer survey, Durango A, Colorado. Final report Volume II A. Detail area

    International Nuclear Information System (INIS)

    1983-01-01

    This volume contains geology of the Durango A detail area, radioactive mineral occurences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data

  12. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1997-12-31

The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
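The analog method used as a baseline above can be sketched as a one-nearest-neighbour lookup. Function names are hypothetical, and Euclidean distance in the large-scale predictor space is assumed (implementations commonly work in an EOF-reduced space).

```python
import numpy as np

def analog_downscale(train_large, train_local, target_large):
    """One-nearest-neighbour analog downscaling.

    train_large  : (n_days, n_features) historical large-scale fields
    train_local  : (n_days,) co-occurring local observations (e.g. rainfall)
    target_large : (m_days, n_features) large-scale fields to downscale
    Returns the local value observed on each day's closest historical analog.
    """
    train_large = np.asarray(train_large, dtype=float)
    target_large = np.asarray(target_large, dtype=float)
    out = np.empty(len(target_large))
    for k, day in enumerate(target_large):
        d2 = np.sum((train_large - day) ** 2, axis=1)   # squared distances
        out[k] = train_local[np.argmin(d2)]             # local value of best analog
    return out
```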

  14. Atmospheric statistical dynamic models. Model performance: the Lawrence Livermore Laboratoy Zonal Atmospheric Model

    International Nuclear Information System (INIS)

    Potter, G.L.; Ellsaesser, H.W.; MacCracken, M.C.; Luther, F.M.

    1978-06-01

    Results from the zonal model indicate quite reasonable agreement with observation in terms of the parameters and processes that influence the radiation and energy balance calculations. The model produces zonal statistics similar to those from general circulation models, and has also been shown to produce similar responses in sensitivity studies. Further studies of model performance are planned, including: comparison with July data; comparison of temperature and moisture transport and wind fields for winter and summer months; and a tabulation of atmospheric energetics. Based on these preliminary performance studies, however, it appears that the zonal model can be used in conjunction with more complex models to help unravel the problems of understanding the processes governing present climate and climate change. As can be seen in the subsequent paper on model sensitivity studies, in addition to reduced cost of computation, the zonal model facilitates analysis of feedback mechanisms and simplifies analysis of the interactions between processes

  15. Statistical properties of a utility measure of observer performance compared to area under the ROC curve

    Science.gov (United States)

    Abbey, Craig K.; Samuelson, Frank W.; Gallas, Brandon D.; Boone, John M.; Niklason, Loren T.

    2013-03-01

The receiver operating characteristic (ROC) curve has become a common tool for evaluating diagnostic imaging technologies, and the primary endpoint of such evaluations is the area under the curve (AUC), which integrates sensitivity over the entire false positive range. An alternative figure of merit for ROC studies is expected utility (EU), which focuses on the relevant region of the ROC curve as defined by disease prevalence and the relative utility of the task. However, if this measure is to be used, it must also have desirable statistical properties to keep the burden of observer performance studies as low as possible. Here, we evaluate effect size and variability for EU and AUC. We use two observer performance studies recently submitted to the FDA to compare the EU and AUC endpoints. The studies were conducted using the multi-reader multi-case methodology in which all readers score all cases in all modalities. ROC curves from the study were used to generate both the AUC and EU values for each reader and modality. The EU measure was computed assuming an iso-utility slope of 1.03. We find mean effect sizes (the reader-averaged difference between modalities) to be roughly 2.0 times as large for EU as for AUC. The standard deviation across readers is roughly 1.4 times as large, suggesting better statistical properties for the EU endpoint. In a simple power analysis of paired comparisons across readers, the utility measure required 36% fewer readers on average to achieve 80% statistical power compared to AUC.
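One simplified formalization of the two figures of merit is sketched below. This is not necessarily the exact EU definition used in the study; here EU is taken as the best achievable trade-off along the empirical ROC curve, with the iso-utility slope of 1.03 taken from the abstract.

```python
import numpy as np

def auc_and_eu(fpf, tpf, slope=1.03):
    """Trapezoidal AUC and an expected-utility-style figure of merit.

    fpf, tpf : empirical ROC operating points sorted by increasing FPF,
               including the endpoints (0, 0) and (1, 1)
    slope    : iso-utility slope (depends on prevalence and task utilities)
    """
    fpf = np.asarray(fpf, dtype=float)
    tpf = np.asarray(tpf, dtype=float)
    auc = np.sum(np.diff(fpf) * (tpf[1:] + tpf[:-1]) / 2.0)  # trapezoid rule
    eu = np.max(tpf - slope * fpf)     # performance at the utility-optimal point
    return auc, eu
```

Unlike AUC, the EU value is driven only by the portion of the curve near the utility-optimal operating point.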

  16. Detailed Concepts in Performing Oversight on an Army Radiographic Inspection Site

    Science.gov (United States)

    2017-03-01

[Table residue: image acquisition / review station details (manufacturer, model no., serial no., version control no., time stamp); acquisition parameters (pixel columns x rows, acquisition / viewing software, frame averaging, digital filtering, detector calibrations, contrast-to-noise)] ... operational procedures. The system's operations should be formally documented starting from the initial startup (i.e., beginning of a shift) all the

  17. Petroleum 2006. Statistical elements

    International Nuclear Information System (INIS)

    2007-06-01

    This document gathers in 5 parts, the main existing statistical data about petroleum industry in France and in the rest of the world, together with an insight on other energy sources: 1 - petroleum in the French economy (petroleum and other energies, petroleum and transports, petroleum and energy in the industry, the residential and tertiary sectors, environment: 42 pages); 2 - the French petroleum industry (exploration, production, foreign trade, transports, refining, storage, petrochemistry: 66 pages); 3 - the French market of petroleum products (evolution of sales by product and detail by region for the past year: 38 pages); 4 - prices and taxes of petroleum products (world prices and rates for crude and refined products, evolution of fret rates, retail prices and French taxes: 28 pages); 5 - petroleum in the world (world energy production and consumption, detailed petroleum activity by main areas and for the main countries: 112 pages). (J.S.)

  18. New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics

    International Nuclear Information System (INIS)

    Akhmatskaya, Elena; Reich, Sebastian

    2011-01-01

    We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system’s size and make use of multi-scale nature of complex systems. Variants of GSHMCs were developed for atomistic simulation, particle simulation and statistics: GSHMC (thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (Metropolis corrected dissipative particle dynamics (DPD) method), and a generalized shadow Hamiltonian Monte Carlo, GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high performance computers and comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of the MTS methods and non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)

  19. Obtaining Application-based and Content-based Internet Traffic Statistics

    DEFF Research Database (Denmark)

    Bujlow, Tomasz; Pedersen, Jens Myrup

    2012-01-01

    the Volunteer-Based System for Research on the Internet, developed at Aalborg University, is capable of providing detailed statistics of Internet usage. Since an increasing amount of HTTP traffic has been observed during the last few years, the system also supports creating statistics of different kinds of HTTP...... traffic, like audio, video, file transfers, etc. All statistics can be obtained for individual users of the system, for groups of users, or for all users altogether. This paper presents results with real data collected from a limited number of real users over six months. We demonstrate that the system can...

  20. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    Energy Technology Data Exchange (ETDEWEB)

    Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
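A window-based statistical technique of the kind mentioned above might look like the following minimal z-score sketch. This is hypothetical and single-machine; the paper's Spark-based, cluster-based implementation is far richer.

```python
import numpy as np

def window_anomalies(stream, window=20, threshold=3.0):
    """Flag samples whose z-score against the preceding window exceeds threshold.

    stream : 1D sequence of metric samples (e.g. VM CPU readings)
    Returns the indices of flagged samples.
    """
    x = np.asarray(stream, dtype=float)
    flags = []
    for i in range(window, len(x)):
        ref = x[i - window:i]                # sliding reference window
        mu, sigma = ref.mean(), ref.std()
        if sigma == 0:
            sigma = 1e-12                    # guard against constant windows
        if abs(x[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags
```

In a streaming setting the same statistic would be maintained incrementally per partition rather than recomputed per sample.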

  1. Comparison of Statistical Methods for Detector Testing Programs

    Energy Technology Data Exchange (ETDEWEB)

    Rennie, John Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Abhold, Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    A typical goal for any detector testing program is to ascertain not only the performance of the detector systems under test, but also the confidence that systems accepted using that testing program’s acceptance criteria will exceed a minimum acceptable performance (which is usually expressed as the minimum acceptable success probability, p). A similar problem often arises in statistics, where we would like to ascertain the fraction, p, of a population of items that possess a property that may take one of two possible values. Typically, the problem is approached by drawing a fixed sample of size n, with the number of items out of n that possess the desired property, x, being termed successes. The sample mean gives an estimate of the population mean p ≈ x/n, although usually it is desirable to accompany such an estimate with a statement concerning the range within which p may fall and the confidence associated with that range. Procedures for establishing such ranges and confidence limits are described in detail by Clopper, Brown, and Agresti for two-sided symmetric confidence intervals.
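An exact two-sided interval of the kind described can be computed from binomial tail probabilities alone, for example by bisection. This is a stdlib-only sketch with hypothetical function names.

```python
from math import comb

def binom_tail_ge(n, x, p):
    """P(X >= x) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x, n + 1))

def clopper_pearson(x, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided confidence interval for a binomial
    success probability p, given x successes out of n trials."""
    def solve(f):
        lo, hi = 0.0, 1.0
        for _ in range(100):                 # bisection on the monotone tail
            mid = (lo + hi) / 2
            if f(mid):
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2
    # lower limit: p with P(X >= x | p) = alpha/2
    lower = 0.0 if x == 0 else solve(lambda p: binom_tail_ge(n, x, p) > alpha / 2)
    # upper limit: p with P(X <= x | p) = alpha/2
    upper = 1.0 if x == n else solve(lambda p: 1 - binom_tail_ge(n, x + 1, p) < alpha / 2)
    return lower, upper
```

For a testing program, the lower limit at the acceptance threshold gives the minimum success probability supportable at the chosen confidence.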

  2. Identification of robust statistical downscaling methods based on a comprehensive suite of performance metrics for South Korea

    Science.gov (United States)

    Eum, H. I.; Cannon, A. J.

    2015-12-01

    Climate models are a key tool for investigating the impacts of projected future climate conditions on regional hydrologic systems. However, there is a considerable mismatch in spatial resolution between GCMs and regional applications, in particular for a region characterized by complex terrain such as the Korean Peninsula. A downscaling procedure is therefore essential for assessing regional impacts of climate change. Numerous statistical downscaling methods have been used, mainly due to their computational efficiency and simplicity. In this study, four statistical downscaling methods [Bias-Correction/Spatial Disaggregation (BCSD), Bias-Correction/Constructed Analogue (BCCA), Multivariate Adaptive Constructed Analogs (MACA), and Bias-Correction/Climate Imprint (BCCI)] are applied to downscale the latest Climate Forecast System Reanalysis data to stations for precipitation, maximum temperature, and minimum temperature over South Korea. Using a split-sampling scheme, all methods are calibrated with observational station data for the 19 years from 1973 to 1991 and tested on the recent 19 years from 1992 to 2010. To assess the skill of the downscaling methods, we construct a comprehensive suite of performance metrics that measure the ability to reproduce temporal correlation, distribution, spatial correlation, and extreme events. In addition, we employ the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) to identify robust statistical downscaling methods based on the performance metrics for each season. The results show that downscaling skill is considerably affected by the skill of CFSR and that all methods lead to large improvements across all performance metrics. When TOPSIS is applied to the seasonal performance metrics, MACA is identified as the most reliable and robust method for all variables and seasons. Note that this result is derived from CFSR output, which is recognized as near-perfect climate data in climate studies. Therefore, the
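TOPSIS, as used above to rank the downscaling methods, can be sketched generically as follows (all criteria treated as benefit criteria with equal weights by default; the study's actual metrics and weighting are not reproduced here):

```python
import math

def topsis(matrix, weights=None):
    """Rank alternatives with TOPSIS (benefit criteria only in this sketch).

    matrix: one row per alternative, one column per criterion.
    Returns closeness scores in [0, 1]; higher is better.
    """
    n_crit = len(matrix[0])
    weights = weights or [1.0 / n_crit] * n_crit
    # Vector-normalize each criterion column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]  # best value per criterion
    anti = [min(col) for col in zip(*v)]   # worst value per criterion
    scores = []
    for row in v:
        d_plus = math.dist(row, ideal)     # distance to the ideal solution
        d_minus = math.dist(row, anti)     # distance to the anti-ideal solution
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

An alternative that is best on every criterion coincides with the ideal solution and receives a closeness score of 1.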

  3. Effect of Task Presentation on Students' Performances in Introductory Statistics Courses

    Science.gov (United States)

    Tomasetto, Carlo; Matteucci, Maria Cristina; Carugati, Felice; Selleri, Patrizia

    2009-01-01

    Research on academic learning indicates that many students experience major difficulties with introductory statistics and methodology courses. We hypothesized that students' difficulties may depend in part on the fact that statistics tasks are commonly viewed as related to the threatening domain of math. In two field experiments which we carried…

  4. Eulerian and Lagrangian statistics from high resolution numerical simulations of weakly compressible turbulence

    NARCIS (Netherlands)

    Benzi, R.; Biferale, L.; Fisher, R.T.; Lamb, D.Q.; Toschi, F.

    2009-01-01

    We report a detailed study of Eulerian and Lagrangian statistics from high resolution Direct Numerical Simulations of isotropic weakly compressible turbulence. The Reynolds number at the Taylor microscale is estimated to be around 600. Eulerian and Lagrangian statistics are evaluated over a huge data

  5. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. 
CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects

  6. Timber resource statistics for eastern Washington, 1995.

    Science.gov (United States)

    Neil McKay; Patricia M. Bassett; Colin D. MacLean

    1995-01-01

    This report summarizes a 1990-91 timber resource inventory of Washington east of the crest of the Cascade Range. The inventory was conducted on all private and public lands except National Forests. Timber resource statistics from National Forest inventories also are presented. Detailed tables provide estimates of forest area, timber volume, growth, mortality, and...

  7. KNOW YOUR NEIGHBORHOOD: A DETAILED MODEL ATMOSPHERE ANALYSIS OF NEARBY WHITE DWARFS

    Energy Technology Data Exchange (ETDEWEB)

    Giammichele, N.; Bergeron, P. [Kitt Peak National Observatory, National Optical Astronomy Observatory, which is operated by the Association of Universities for Research in Astronomy (AURA) under cooperative agreement with the National Science Foundation (United States); Dufour, P., E-mail: noemi.giammichele@astro.umontreal.ca, E-mail: pierre.bergeron@astro.umontreal.ca, E-mail: patrick.dufour@astro.umontreal.ca [Departement de Physique, Universite de Montreal, C.P. 6128, Succ. Centre-Ville, Montreal, Quebec H3C 3J7 (Canada)

    2012-04-01

    We present improved atmospheric parameters of nearby white dwarfs lying within 20 pc of the Sun. The aim of the current study is to obtain the best statistical model of the least-biased sample of the white dwarf population. A homogeneous analysis of the local population is performed combining detailed spectroscopic and photometric analyses based on improved model atmosphere calculations for various spectral types including DA, DB, DC, DQ, and DZ stars. The spectroscopic technique is applied to all stars in our sample for which optical spectra are available. Photometric energy distributions, when available, are also combined with trigonometric parallax measurements to derive effective temperatures, stellar radii, and atmospheric compositions. A revised catalog of white dwarfs in the solar neighborhood is presented. We provide, for the first time, a comprehensive analysis of the mass distribution and the chemical distribution of white dwarf stars in a volume-limited sample.

  8. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  9. Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators.

    Science.gov (United States)

    Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H-S; Ahn, Jaewook

    2018-05-04

    Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.

  10. Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators

    Science.gov (United States)

    Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H.-S.; Ahn, Jaewook

    2018-05-01

    Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.

  11. Applied statistics for economics and business

    CERN Document Server

    Özdemir, Durmuş

    2016-01-01

    This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.

  12. Statistical and physical evolution of QSO's

    International Nuclear Information System (INIS)

    Caditz, D.; Petrosian, V.

    1989-09-01

    The relationship between the physical evolution of discrete extragalactic sources, the statistical evolution of the observed population of sources, and the cosmological model is discussed. Three simple forms of statistical evolution: pure luminosity evolution (PLE), pure density evolution (PDE), and generalized luminosity evolution (GLE), are considered in detail together with what these forms imply about the physical evolution of individual sources. Two methods are used to analyze the statistical evolution of the observed distribution of QSO's (quasars) from combined flux limited samples. It is shown that both PLE and PDE are inconsistent with the data over the redshift range 0 less than z less than 2.2, and that a more complicated form of evolution such as GLE is required, independent of the cosmological model. This result is important for physical models of AGN, and in particular, for the accretion disk model which recent results show may be inconsistent with PLE

  13. Statistical approach to quantum field theory. An introduction

    International Nuclear Information System (INIS)

    Wipf, Andreas

    2013-01-01

    Based on course-tested notes and pedagogical in style. Authored by a leading researcher in the field. Contains end-of-chapter problems and listings of short, useful computer programs. Over the past few decades the powerful methods of statistical physics and Euclidean quantum field theory have moved closer together, with common tools based on the use of path integrals. The interpretation of Euclidean field theories as particular systems of statistical physics has opened up new avenues for understanding strongly coupled quantum systems or quantum field theories at zero or finite temperatures. Accordingly, the first chapters of this book contain a self-contained introduction to path integrals in Euclidean quantum mechanics and statistical mechanics. The resulting high-dimensional integrals can be estimated with the help of Monte Carlo simulations based on Markov processes. The most commonly used algorithms are presented in detail so as to prepare the reader for the use of high-performance computers as an ''experimental'' tool for this burgeoning field of theoretical physics. Several chapters are then devoted to an introduction to simple lattice field theories and a variety of spin systems with discrete and continuous spins, where the ubiquitous Ising model serves as an ideal guide for introducing the fascinating area of phase transitions. As an alternative to the lattice formulation of quantum field theories, variants of the flexible renormalization group methods are discussed in detail. Since, according to our present-day knowledge, all fundamental interactions in nature are described by gauge theories, the remaining chapters of the book deal with gauge theories without and with matter. This text is based on course-tested notes for graduate students and, as
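The Markov-process Monte Carlo algorithms mentioned above can be illustrated with a Metropolis update for the 2D Ising model (a minimal sketch, not code from the book; J = 1 and periodic boundaries are assumed):

```python
import math
import random

def metropolis_sweep(spins, L, T, rng):
    """One Metropolis sweep of the 2D Ising model (J = 1, periodic boundaries)."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1

def magnetization(spins):
    """Mean spin per site."""
    return sum(sum(row) for row in spins) / (len(spins) * len(spins[0]))
```

Well below the critical temperature an ordered lattice stays ordered, which is the phase-transition behavior the Ising model is used to introduce.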

  14. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  15. Swiss electricity statistics 2000

    International Nuclear Information System (INIS)

    2001-01-01

    This publication by the Association of Swiss Electricity Enterprises for the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity production, trading and consumption in Switzerland in 2000. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2000, the data being supplied for each hydrological year and the summer and winter seasons respectively. The production of power in Switzerland is examined in detail. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2000 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The final two chapters cover new and future power generation capacities and the economic considerations involved in the supply of electricity

  16. Energy Statistics Manual; Manual de Estadisticas Energeticas

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  17. Statistics of wind direction and its increments

    International Nuclear Information System (INIS)

    Doorn, Eric van; Dhruva, Brindesh; Sreenivasan, Katepalli R.; Cassella, Victor

    2000-01-01

    We study some elementary statistics of wind direction fluctuations in the atmosphere for a wide range of time scales (10^-4 s to 1 h), and in both vertical and horizontal planes. In the plane parallel to the ground surface, the direction time series consists of two parts: a constant drift due to large weather systems moving with the mean wind speed, and fluctuations about this drift. The statistics of the direction fluctuations show a rough similarity to Brownian motion but depend, in detail, on the wind speed. This dependence manifests itself quite clearly in the statistics of wind-direction increments over various intervals of time. These increments are intermittent during periods of low wind speeds but Gaussian-like during periods of high wind speeds. © 2000 American Institute of Physics
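The increment statistics described above can be illustrated on a synthetic Brownian-like series, for which the standard deviation of increments grows as the square root of the lag (illustrative Python on simulated data, not the paper's measurements):

```python
import math
import random

def increments(series, lag):
    """Differences of the series over a fixed time lag."""
    return [series[i + lag] - series[i] for i in range(len(series) - lag)]

def std(xs):
    """Population standard deviation."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# Synthetic Brownian-like direction series (random walk of Gaussian steps).
rng = random.Random(0)
direction = [0.0]
for _ in range(20000):
    direction.append(direction[-1] + rng.gauss(0.0, 1.0))

# For Brownian scaling, std at lag 16 should be about twice that at lag 4.
ratio = std(increments(direction, 16)) / std(increments(direction, 4))
```

Departures from this square-root scaling, and from Gaussian increment distributions, are the wind-speed-dependent effects the paper reports.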

  18. Energy Statistics Manual; Enerji Istatistikleri El Kitabi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect that basic energy information to be readily available and reliable. This is not always the case and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  19. Swiss electricity statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2003. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the article also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2003, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2003 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  20. Swiss electricity statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2003. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2003, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2003 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  1. Swiss electricity statistics 2002

    International Nuclear Information System (INIS)

    2003-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2002. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2002, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2002 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2009. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  2. Swiss electricity statistics 2004

    International Nuclear Information System (INIS)

    2005-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2004. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2004, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2004 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  3. Swiss electricity statistics 2002

    International Nuclear Information System (INIS)

    Swiss Federal Office of Energy, Berne

    2003-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2002. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the article also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2002, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2002 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2009. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  4. Swiss electricity statistics 2004

    International Nuclear Information System (INIS)

    2005-01-01

    This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2004. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the article also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2004, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2004 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  5. Joint statistics of partial sums of ordered exponential variates and performance of GSC RAKE receivers over rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2011-08-01

Spread spectrum receivers with generalized selection combining (GSC) RAKE reception were proposed, and have been studied as alternatives to the two classical fundamental schemes, maximal ratio combining and selection combining, because the number of diversity paths increases with the transmission bandwidth. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers remain to be solved, such as the exact performance analysis of the capture probability and an exact assessment of the impact of self-interference on GSC RAKE receivers. The major difficulty in these problems is to derive certain joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability and outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels, and compare them to those of partial RAKE receivers. © 2011 IEEE.
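The paper's contribution is exact closed-form analysis via joint order statistics of exponential variates; as a rough numerical illustration only (not the paper's derivation), the outage probability of a GSC receiver over i.i.d. Rayleigh fading can be estimated by Monte Carlo, since the per-path instantaneous SNRs are then i.i.d. exponential. All parameter values below are hypothetical.

```python
import numpy as np

# Hypothetical system parameters (illustrative, not from the paper).
L, Lc = 5, 3           # resolvable paths, paths combined by GSC
avg_snr = 2.0          # average per-path SNR (linear scale)
threshold = 4.0        # outage threshold on the combined SNR
trials = 200_000

# Over i.i.d. Rayleigh fading, per-path instantaneous SNRs are
# i.i.d. exponential random variables.
rng = np.random.default_rng(42)
snr = rng.exponential(avg_snr, size=(trials, L))

# GSC: order the path SNRs and combine the strongest Lc of them.
gsc_snr = np.sort(snr, axis=1)[:, -Lc:].sum(axis=1)

outage = np.mean(gsc_snr < threshold)
print(f"Estimated outage probability: {outage:.4f}")
```

Sorting each row and summing the top Lc entries is exactly the partial-sum-of-order-statistics quantity whose joint distribution the paper treats analytically.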

  6. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
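The distance-based TLA approach evaluated in the study can be sketched in a few lines: compute a community dissimilarity for every pair of sampling dates and regress it on the (square-root-transformed) time lag. The sketch below uses made-up data and is not the authors' code.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import linregress

# Hypothetical community matrix: 10 sampling dates x 5 species (abundances).
rng = np.random.default_rng(0)
community = rng.poisson(5.0, size=(10, 5)).astype(float)

# Bray-Curtis dissimilarity between every pair of sampling dates.
D = squareform(pdist(community, metric="braycurtis"))

# TLA: regress dissimilarity on the square root of the time lag;
# a significant positive slope indicates directional community change.
lags, dissims = [], []
for i in range(D.shape[0]):
    for j in range(i + 1, D.shape[0]):
        lags.append(np.sqrt(j - i))
        dissims.append(D[i, j])

slope, intercept, r, p, se = linregress(lags, dissims)
print(f"TLA slope = {slope:.3f} (p = {p:.3f})")
```

This makes the study's point concrete: TLA reduces the whole time series to a single slope, whereas the RDA-PCNM approach retains which species drive which temporal pattern.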

  7. [Comparison of film-screen combination in a contrast detail diagram and with interactive image analysis. 1: Contrast detail diagram].

    Science.gov (United States)

    Hagemann, G; Eichbaum, G

    1997-07-01

The following three film-screen combinations were compared: a) a combination of anticrossover film and UV-light emitting screens, b) a combination of blue-light emitting screens and film, and c) a conventional green-fluorescing screen-film combination. Radiographs of a specially designed plexiglass phantom (0.2 × 0.2 × 0.12 m³) were obtained that contained bar patterns of lead and plaster (calcium sulfate) to test high and intermediate contrast resolution, and bar patterns of air to test low contrast resolution. An aluminum step wedge was integrated to evaluate dose-density curves of the radiographs. The dose values for the various step thicknesses were measured as a percentage of the dose value in air for 60, 81, and 117 kV. Exposure conditions were the following: 12-pulse generator, 0.6 mm focus size, 4.7 mm aluminum prefilter, a grid with 40 lines/cm (12:1), and a focus-detector distance of 1.15 m. The thresholds of visible bars of the various pattern materials were assessed by seven radiologists, one technician, and the authors. The resulting contrast detail diagram could not prove any significant differences between the three tested screen-film combinations. The pairwise comparison, however, found 8 of the 18 paired differences to be statistically significant between the conventional and the two new screen-film combinations. The authors concluded that subjective visual assessment of the threshold in a contrast detail study alone is of only limited value for grading image quality if no well-defined criteria are used (BIR report 20 [1989] 137-139). The statistical approach of paired differences of the estimated means appeared to be more appropriate.

  8. Study of statistical properties of hybrid statistic in coherent multi-detector compact binary coalescences Search

    OpenAIRE

    Haris, K; Pai, Archana

    2015-01-01

In this article, we revisit the problem of the coherent multi-detector search for gravitational waves from compact binary coalescences of neutron stars and black holes using advanced interferometers like LIGO-Virgo. Based on the loss of the optimal multi-detector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this ...

  9. Airborne gamma-ray spectrometer and magnetometer survey, Durango A, B, C, and D, Colorado. Volume I. Detail area. Final report

    International Nuclear Information System (INIS)

    1983-01-01

    An airborne combined radiometric and magnetic survey was performed for the Department of Energy (DOE) over the Durango A, Durango B, Durango C, and Durango D Detail Areas of southwestern Colorado. The Durango A Detail Area is within the coverage of the Needle Mountains and Silverton 15' map sheets, and the Pole Creek Mountain, Rio Grande Pyramid, Emerald Lake, Granite Peak, Vallecito Reservoir, and Lemon Reservoir 7.5' map sheets of the National Topographic Map Series (NTMS). The Durango B Detail Area is within the coverage of the Silverton 15' map sheet and the Wetterhorn Peak, Uncompahgre Peak, Lake City, Redcloud Peak, Lake San Cristobal, Pole Creek Mountain, and Finger Mesa 7.5' map sheets of the NTMS. The Durango C Detail Area is within the coverage of the Platoro and Wolf Creek Pass 15' map sheets of the NTMS. The Durango D Detail Area is within the coverage of the Granite Lake, Cimarrona Peak, Bear Mountain, and Oakbrush Ridge 7.5' map sheets of the NTMS. Radiometric data were corrected for live time, aircraft and equipment background, cosmic background, atmospheric radon, Compton scatter, and altitude dependence. The corrected data were statistically evaluated, gridded, and contoured to produce maps of the radiometric variables, uranium, potassium, and thorium; their ratios; and the residual magnetic field. These maps have been analyzed in order to produce a multi-variant analysis contour map based on the radiometric response of the individual geological units. A geochemical analysis has been performed, using the radiometric and magnetic contour maps, the multi-variant analysis map, and factor analysis techniques, to produce a geochemical analysis map for the area

  10. On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method

    International Nuclear Information System (INIS)

    Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.

    1985-01-01

    The theoretical aspects involved in the treatment of many-body systems strongly departed from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism complemented with an ad hoc hypothesis a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with Generalized Thermodynamics and the construction of nonlinear transport equations are briefly described. (Author) [pt

  11. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-scale approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  12. Nuclear material statistical accountancy system

    International Nuclear Information System (INIS)

    Argentest, F.; Casilli, T.; Franklin, M.

    1979-01-01

The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, NUMSAS can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based; the computational formula which is used by NUMSAS in calculating the standard deviation; and the information about nuclear material measurements and the plant measurement system which is required as data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS. Each of these phases is implemented by a different computer program. The activities carried out in each phase can be summarised as follows: the pre-processing phase, the selection and update phase, the transformation phase, and the computation phase.
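The standard-deviation estimate that NUMSAS attaches to a material balance rests on ordinary error propagation: with independent measurement errors, the variances of all balance components add, whatever their sign in the balance. A minimal sketch with hypothetical entries (the report's actual statistical model and formula are more detailed):

```python
import math

# Hypothetical material balance entries: (measured value in kg, relative std dev).
# All numbers are illustrative, not taken from the report.
beginning_inventory = (120.0, 0.010)
receipts            = (300.0, 0.005)
shipments           = (290.0, 0.005)
ending_inventory    = (128.0, 0.010)

def abs_variance(entry):
    value, rel_sd = entry
    return (value * rel_sd) ** 2

# Material balance (MUF, material unaccounted for):
muf = (beginning_inventory[0] + receipts[0]
       - shipments[0] - ending_inventory[0])

# Assuming independent measurement errors, the variances of the
# components add regardless of the sign of each term in the balance.
sd_muf = math.sqrt(sum(abs_variance(e) for e in
                       (beginning_inventory, receipts, shipments, ending_inventory)))

print(f"MUF = {muf:.2f} kg, sigma(MUF) = {sd_muf:.2f} kg")
```

A non-zero MUF is only meaningful relative to sigma(MUF); here the apparent 2 kg discrepancy lies well within one standard deviation of the accumulated measurement error.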

  14. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1989-01-01

Denmark, Finland and Sweden have adopted almost the same methods of recording and calculating availability data. For a number of years, comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report presented here for 1989, containing general statistical data, is to provide basic information on the existing kinds of thermal power in the countries concerned. With this information as a basis, additional and more detailed information can be exchanged in direct contacts between bodies in the above-mentioned countries, according to forms established for that purpose. The report includes fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country, but for plants burning fossil fuel also in a joint NORDEL statistics with data grouped according to the type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as basic availability data. The same applies to the preference made in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some data based on time have been included to make possible comparisons with certain international values and for further illustration of the performance. For the values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'' have been applied. (author)

  15. Have Basic Mathematical Skills Grown Obsolete in the Computer Age: Assessing Basic Mathematical Skills and Forecasting Performance in a Business Statistics Course

    Science.gov (United States)

    Noser, Thomas C.; Tanner, John R.; Shah, Situl

    2008-01-01

    The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, and to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…

  16. Main: Clone Detail [KOME

    Lifescience Database Archive (English)

Full Text Available Clone Detail: Mapping Pseudomolecule data detail. Detail information: mapping to the TIGR japonica Pseudomolecules (kome_mapping_pseudomolecule_data_detail.zip).

  18. Design and performance characteristics of solar adsorption refrigeration system using parabolic trough collector: Experimental and statistical optimization technique

    International Nuclear Information System (INIS)

    Abu-Hamdeh, Nidal H.; Alnefaie, Khaled A.; Almitani, Khalid H.

    2013-01-01

Highlights: • The success of using olive waste/methanol as an adsorbent/adsorbate pair. • The experimental gross cycle coefficient of performance obtained was COPa = 0.75. • Optimization showed that expanding the adsorbent mass within a certain range increases the COP. • The statistical optimization led to an optimum tank volume between 0.2 and 0.3 m³. • Increasing the collector area within a certain range increased the COP. - Abstract: The current work demonstrates a developed model of a solar adsorption refrigeration system with specific requirements and specifications. The scheme can be employed as a refrigerator and cooler unit suitable for remote areas. The unit runs on a parabolic trough solar collector (PTC) and uses olive waste as adsorbent with methanol as adsorbate. Cooling production, COP (coefficient of performance) and COPa (cycle gross coefficient of performance) were used to assess the system performance. The system's optimum design parameters in this study were arrived at through statistical and experimental methods. The lowest temperature attained in the refrigerated space was 4 °C while the corresponding ambient temperature was 27 °C. The temperature started to decrease steadily at 20:30 – when the actual cooling started – until it reached 4 °C at 01:30 on the next day, when it rose again. The highest COPa obtained was 0.75
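The gross cycle coefficient of performance used in the study is simply the ratio of useful cooling to the heat input driving the cycle. The energy totals below are made up, chosen only so that the ratio illustrates the reported COPa of 0.75; they are not measurements from the study.

```python
# Hypothetical daily energy totals for one adsorption cycle (illustrative only).
q_evaporator_kj = 1500.0   # useful cooling produced at the evaporator
q_generation_kj = 2000.0   # heat delivered by the PTC during desorption

# Gross cycle coefficient of performance: cooling output per heat input.
cop_a = q_evaporator_kj / q_generation_kj
print(f"COPa = {cop_a:.2f}")
```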

  19. Selection of the Maximum Spatial Cluster Size of the Spatial Scan Statistic by Using the Maximum Clustering Set-Proportion Statistic.

    Science.gov (United States)

    Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong

    2016-01-01

    Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.

  20. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. In accordance with the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g. the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. Use of adaptive designs with interim analyses is increasing.
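Of the survival methods the review counts, the Kaplan-Meier product-limit estimator is simple enough to sketch directly: at each distinct event time, multiply the running survival probability by (1 - deaths / number at risk), with censored subjects leaving the risk set without contributing an event. The data below are hypothetical.

```python
# Hypothetical right-censored survival data: (time, event), event=1 death, 0 censored.
data = [(2, 1), (3, 0), (4, 1), (4, 1), (6, 0), (7, 1), (9, 0)]

# Kaplan-Meier product-limit estimator over the distinct event times.
event_times = sorted({t for t, e in data if e == 1})
survival = 1.0
curve = []
for t in event_times:
    at_risk = sum(1 for ti, _ in data if ti >= t)          # still in the risk set
    deaths = sum(1 for ti, e in data if ti == t and e == 1)
    survival *= 1.0 - deaths / at_risk
    curve.append((t, survival))

for t, s in curve:
    print(f"t = {t}: S(t) = {s:.3f}")
```

In practice a library implementation (e.g. in R's survival package or Python's lifelines) would be used, but the hand-rolled version shows why censoring is handled differently from events.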

  1. Changing statistics of storms in the North Atlantic?

    International Nuclear Information System (INIS)

    Storch, H. von; Guddal, J.; Iden, K.A.; Jonsson, T.; Perlwitz, J.; Reistad, M.; Ronde, J. de; Schmidt, H.; Zorita, E.

    1993-01-01

Problems in the present discussion about increasing storminess in the North Atlantic area are discussed. Observational data so far available do not indicate a change in the storm statistics. Output from climate models points to an intensified storm track in the North Atlantic, but because of the limited skill of present-day climate models in simulating high-frequency variability and regional details, any such 'forecast' has to be considered with caution. A downscaling procedure which relates large-scale, time-mean aspects of the state of the atmosphere and ocean to the local statistics of storms is proposed to reconstruct past variations of high-frequency variability in the atmosphere (storminess) and in the sea state (wave statistics). First results are presented. (orig.)

  2. Swiss energy statistics 2006

    International Nuclear Information System (INIS)

    2007-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2006. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2006 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  3. Swiss energy statistics 2004

    International Nuclear Information System (INIS)

    2005-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2004. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2004 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  4. Swiss energy statistics 2005

    International Nuclear Information System (INIS)

    2006-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2005. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2005 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  5. Swiss energy statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2003. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2003 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  6. Swiss energy statistics 2002

    International Nuclear Information System (INIS)

    2003-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2002. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2002 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  7. Statistics information of rice EST mapping results - RGP estmap2001 | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available RGP estmap2001: Statistics information of rice EST mapping results (data detail). Statistics information of rice EST mapping results - RGP estmap2001 | LSDB Archive

  8. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. An inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was drawn. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
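The decision rule described (parametric tests only when the data are normally distributed) can be sketched with SciPy for the two-independent-groups case. The Shapiro-Wilk normality check and the 0.05 threshold are common conventions, not prescriptions from the article, and the data are simulated.

```python
import numpy as np
from scipy import stats

# Simulated measurements for two independent groups (hypothetical data).
rng = np.random.default_rng(1)
group_a = rng.normal(10.0, 2.0, 30)
group_b = rng.normal(11.0, 2.0, 30)

# Check the normality assumption for each group before choosing the test.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))

if normal:
    result = stats.ttest_ind(group_a, group_b)     # parametric: Student's t-test
    test_name = "Student's t-test"
else:
    result = stats.mannwhitneyu(group_a, group_b)  # nonparametric alternative
    test_name = "Mann-Whitney U"

print(test_name, f"p = {result.pvalue:.4f}")
```

The article's full decision also weighs the research design, the number of measurements and the measurement scale; this sketch covers only the normality branch.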

  9. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

Full Text Available The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. An inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was drawn. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.

  10. Falling in the elderly: Do statistical models matter for performance criteria of fall prediction? Results from two large population-based studies.

    Science.gov (United States)

    Kabeshova, Anastasiia; Launay, Cyrille P; Gromov, Vasilii A; Fantino, Bruno; Levinoff, Elise J; Allali, Gilles; Beauchet, Olivier

    2016-01-01

    To compare performance criteria (i.e., sensitivity, specificity, positive predictive value, negative predictive value, area under receiver operating characteristic curve and accuracy) of linear and non-linear statistical models for fall risk in older community-dwellers. Participants were recruited in two large population-based studies, "Prévention des Chutes, Réseau 4" (PCR4, n=1760, cross-sectional design, retrospective collection of falls) and "Prévention des Chutes Personnes Agées" (PCPA, n=1765, cohort design, prospective collection of falls). Six linear statistical models (i.e., logistic regression, discriminant analysis, Bayes network algorithm, decision tree, random forest, boosted trees), three non-linear statistical models corresponding to artificial neural networks (multilayer perceptron, genetic algorithm and neuroevolution of augmenting topologies [NEAT]) and the adaptive neuro-fuzzy inference system (ANFIS) were used. Falls ≥1 characterizing fallers and falls ≥2 characterizing recurrent fallers were used as outcomes. Data of the studies were analyzed separately and together. NEAT and ANFIS had better performance criteria than the other models. The highest performance criteria were reported with NEAT when using the PCR4 database and falls ≥1, and with both NEAT and ANFIS when pooling data together and using falls ≥2. However, sensitivity and specificity were unbalanced. Sensitivity was higher than specificity when identifying fallers, whereas the converse was found when predicting recurrent fallers. Our results showed that NEAT and ANFIS were the non-linear statistical models with the best performance criteria for the prediction of falls, but their sensitivity and specificity were unbalanced, underscoring that the models should be used respectively for the screening of fallers and the diagnosis of recurrent fallers. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
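Most of the performance criteria being compared are simple functions of confusion-matrix counts; a minimal sketch, with invented counts (nothing here is from the study's data):

```python
# Toy computation of the criteria named in the abstract (sensitivity,
# specificity, PPV, NPV, accuracy) from confusion-matrix counts.
# The counts are made up for illustration.
def performance(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),          # true-positive rate
        "specificity": tn / (tn + fp),          # true-negative rate
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

m = performance(tp=80, fp=30, tn=70, fn=20)
print(m["sensitivity"], m["specificity"], m["accuracy"])  # 0.8 0.7 0.75
```

The "unbalanced" finding in the abstract corresponds to a large gap between the first two numbers; a screening tool favours sensitivity, a diagnostic tool specificity.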

  11. Line radiative transfer and statistical equilibrium*

    Directory of Open Access Journals (Sweden)

    Kamp Inga

    2015-01-01

    Full Text Available Atomic and molecular line emission from protoplanetary disks contains key information of their detailed physical and chemical structures. To unravel those structures, we need to understand line radiative transfer in dusty media and the statistical equilibrium, especially of molecules. I describe here the basic principles of statistical equilibrium and illustrate them through the two-level atom. In a second part, the fundamentals of line radiative transfer are introduced along with the various broadening mechanisms. I explain general solution methods with their drawbacks and also specific difficulties encountered in solving the line radiative transfer equation in disks (e.g. velocity gradients. I am closing with a few special cases of line emission from disks: Radiative pumping, masers and resonance scattering.
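For the two-level atom used above as the illustrative case, statistical equilibrium reduces to a single balance equation. In standard notation (not taken from the article), with level populations n_l and n_u, Einstein coefficients A_ul, B_ul, B_lu, collisional rates C_lu, C_ul, and mean line intensity J-bar:

```latex
n_l \left( B_{lu} \bar{J} + C_{lu} \right) = n_u \left( A_{ul} + B_{ul} \bar{J} + C_{ul} \right)
```

Radiative and collisional excitation of the lower level balances spontaneous emission plus radiative and collisional de-excitation of the upper level; in the collision-dominated limit the population ratio reduces to the Boltzmann factor (LTE).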

  12. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

    Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 - ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
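The Metropolis update at the heart of such Ising simulations fits in a few lines; the sketch below uses a tiny 4x4x4 three-dimensional lattice, and the inverse temperature and sweep count are illustrative toys, not the paper's parameters:

```python
# Minimal Metropolis Monte Carlo sketch for a 3D Ising model with
# periodic boundaries (J = 1, so the flip cost is dE = 2 s * sum of
# neighbour spins). Lattice size and beta are assumptions for the demo.
import random, math

L, beta = 4, 0.22
random.seed(0)
spin = [[[1 for _ in range(L)] for _ in range(L)] for _ in range(L)]

def neighbours_sum(x, y, z):
    s = 0
    for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
        s += spin[(x+dx) % L][(y+dy) % L][(z+dz) % L]
    return s

def sweep():
    for x in range(L):
        for y in range(L):
            for z in range(L):
                dE = 2 * spin[x][y][z] * neighbours_sum(x, y, z)
                # Metropolis acceptance: always flip if dE <= 0,
                # otherwise with probability exp(-beta * dE)
                if dE <= 0 or random.random() < math.exp(-beta * dE):
                    spin[x][y][z] *= -1

for _ in range(20):
    sweep()
m = abs(sum(spin[x][y][z] for x in range(L)
            for y in range(L) for z in range(L))) / L**3
print(0.0 <= m <= 1.0)  # True
```

A production run like the one described would of course use far larger lattices, many more sweeps, and careful equilibration and error analysis.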

  13. Advanced statistics for tokamak transport colinearity and tokamak to tokamak variation

    International Nuclear Information System (INIS)

    Riedel, K.S.

    1989-03-01

    This is a compendium of three separate articles on the statistical analysis of tokamak transport. The first article is an expository introduction to advanced statistics and scaling laws. The second analyzes in detail two important problems of tokamak data: colinearity and tokamak-to-tokamak variation. The third article generalizes the Swamy random coefficient model to the case of degenerate matrices. The three papers have been processed separately.

  14. Author Details

    African Journals Online (AJOL)

    An Overview of Africa's Marine Resources: Their Utilization and Sustainable Management Details · Vol 12, No 3 (2000) - Articles EDITORIAL Ganoderma Lucidum - Paramount among Medicinal Mushrooms. Details · Vol 15, No 3 (2003) - Articles Editorial: Africa's Mushrooms: A neglected bioresource whose time has come

  15. Detailed impedance characterization of a well performing and durable Ni:CGO infiltrated cermet anode for metal-supported solid oxide fuel cells

    DEFF Research Database (Denmark)

    Nielsen, Jimmi; Klemensø, Trine; Blennow Tullmar, Peter

    2012-01-01

    Further knowledge of the novel, well performing and durable Ni:CGO infiltrated cermet anode for metal supported fuel cells has been acquired by means of a detailed impedance spectroscopy study. The anode impedance was shown to consist of three arcs. Porous electrode theory (PET) represented...... as a transmission line response could account for the intermediate frequency arc. The PET model enabled a detailed insight into the effect of adding minor amounts of Ni into the infiltrated CGO and allowed an estimation of important characteristics such as the electrochemical utilization thickness of the anode...... of the infiltrated submicron sized particles was surprisingly robust. TEM analysis revealed the nano sized Ni particles to be trapped within the CGO matrix, which along the self limiting grain growth of the CGO seem to be able to stabilize the submicron structured anode....

  16. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    Unknown

    3Human Genetics Division, School of Medicine, University of Southampton, Southampton SO16 6YD, UK. Abstract ... that the statistic S-#alleles gives good performance for recessive ... (H50) of the families are linked to the single marker. The.

  17. Statistical Research on the Bioactivity of New Marine Natural Products Discovered during the 28 Years from 1985 to 2012

    Science.gov (United States)

    Hu, Yiwen; Chen, Jiahui; Hu, Guping; Yu, Jianchen; Zhu, Xun; Lin, Yongcheng; Chen, Shengping; Yuan, Jie

    2015-01-01

    Every year, hundreds of new compounds are discovered from the metabolites of marine organisms. Finding new and useful compounds is one of the crucial drivers for this field of research. Here we describe the statistics of bioactive compounds discovered from marine organisms from 1985 to 2012. This work is based on our database, which contains information on more than 15,000 chemical substances including 4196 bioactive marine natural products. We performed a comprehensive statistical analysis to understand the characteristics of the novel bioactive compounds and detail temporal trends, chemical structures, species distribution, and research progress. We hope this meta-analysis will provide useful information for research into the bioactivity of marine natural products and drug development. PMID:25574736

  18. Statistical Physics of Neural Systems with Nonadditive Dendritic Coupling

    Directory of Open Access Journals (Sweden)

    David Breuer

    2014-03-01

    Full Text Available How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such nonadditive dendritic processing on single-neuron responses and the performance of associative-memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
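The supralinear summation described above can be caricatured with a per-branch threshold nonlinearity. This is a toy model of my own construction, not the paper's: each dendritic branch adds a fixed boost once its summed input crosses a threshold, so synchronous, spatially close-by input is amplified relative to the same total input spread across branches.

```python
# Toy illustration (not the paper's model) of supralinear dendritic
# summation: a branch's output jumps by a fixed boost above threshold.
def branch_output(inputs, threshold=2.0, boost=1.5):
    s = sum(inputs)
    return s + boost if s >= threshold else s  # supralinear above threshold

def soma(branches):
    # The soma sums the (possibly boosted) branch outputs linearly.
    return sum(branch_output(b) for b in branches)

# Same total synaptic input, different spatial grouping:
clustered = soma([[1.0, 1.0], [0.0]])   # one branch crosses threshold
dispersed = soma([[1.0], [1.0]])        # neither branch does
print(clustered, dispersed)  # 3.5 2.0
```

The clustered arrangement produces the larger somatic response, which is the qualitative effect the statistical-physics treatment in the article builds on.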

  19. Swiss electricity statistics 2001

    International Nuclear Information System (INIS)

    2002-01-01

    This publication by the Association of Swiss Electricity Enterprises for the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity production, trading and consumption in Switzerland in 2001. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2001, the data being supplied for each hydrological year and the summer and winter seasons respectively. The production of power in Switzerland is examined in detail. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2001 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The final two chapters cover new and future power generation capacities and the economic considerations involved in the supply of electricity.

  20. Statistical Estimation of Heterogeneities: A New Frontier in Well Testing

    Science.gov (United States)

    Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.

    2001-12-01

    Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.

  1. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  2. Detailed analysis of the KAERI nTOF facility

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Woon; Lee, Young Ouk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-06-15

    A project to build a neutron time-of-flight (nTOF) facility is in progress, with construction expected to start in early 2016. Beforehand, a detailed simulation based on the current architectural drawings was performed to optimize the performance of the facility. Several parts have been modified or changed from the original design to reflect requirements such as the layout of the electron beam line, the shape of the vacuum chamber producing the neutron beam, and the underground layout of the nTOF facility. Detailed analysis of these modifications has been done with MCNP simulation. An overview of the photo-neutron source and the KAERI nTOF facility is given. Numerical simulations for heat deposition, the source term, and radiation shielding of the KAERI nTOF facility were performed and the results are discussed; they will be used as basic data for the construction.

  3. 2012 Aerospace Medical Certification Statistical Handbook

    Science.gov (United States)

    2013-12-01

    2012 Aerospace Medical Certification Statistical Handbook. Valerie J. Skaggs and Ann I. Norris, Civil Aerospace Medical Institute, Federal Aviation Administration, December 2013. [Report-documentation-page and table residue; the recoverable fragment lists medical conditions among certified airmen with counts and percentages, e.g. hay fever 14,477 (2.49%), asthma 12,558 (2.16%), other general heart pathology (abnormal ECG, open heart surgery, etc.), and Wolff-Parkinson-White syndrome.]

  4. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment's grid workflow system routinely manages 250,000 to 500,000 concurrently running production and analysis jobs to process simulation and detector data. In total, more than 300 PB of data are distributed over more than 150 sites in the WLCG. At this scale, small improvements in software and computing performance and in workflows can lead to significant resource usage gains. ATLAS is reviewing, together with CERN IT experts, several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, and disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation reviews and explains the performance gains of several ATLAS simulation and data processing workflows and presents analytics studies of the ATLAS grid workflows.

  5. Author Details

    African Journals Online (AJOL)

    Petrology of the Cenomanian Upper Member of the Mamfe Embayment, southwestern Cameroon Details · Vol 38, No 1 (2002) - Articles Sequence stratigraphy of Iso field, western onshore Niger Delta, Nigeria Details · Vol 39, No 2 (2003) - Articles Preliminary studies on the lithostratigraphy and depositional environment of ...

  6. Introduction to quantum statistical mechanics

    CERN Document Server

    Bogolyubov, N N

    2010-01-01

    Introduction to Quantum Statistical Mechanics (Second Edition) may be used as an advanced textbook by graduate students, even ambitious undergraduates in physics. It is also suitable for non-experts in physics who wish to have an overview of some of the classic and fundamental quantum models in the subject. The explanation in the book is detailed enough to capture the interest of the reader, and complete enough to provide the necessary background material needed to delve further into the subject and explore the research literature.

  7. Computational statistics and biometry: which discipline drives which?

    Directory of Open Access Journals (Sweden)

    Edler, Lutz

    2005-06-01

    Full Text Available A biometrician's work is defined by the biological or medical problem and by the mathematical and statistical methods needed for its solution. In most instances this requires statistical data analysis and the use of methods of computational statistics. At first it seems obvious that the computational needs of the biometric problem determine what the discipline of computational statistics has to develop. However, viewing the development of biometry and computational statistics in Germany over the past decades in more detail reveals an interesting interaction between the activities of the German Region of the International Biometric Society and groups engaged in computational statistics within Germany. Exact methods of statistical inference and permutation tests, simulations and the use of the bootstrap, and interactive graphical statistical methods are examples of this fruitful reciprocal benefit. This contribution therefore examines the relationships between the historical development of biometry and computational statistics in Germany, using as sources of information contributions to the scientific literature as well as presentations and sessions at scientific conferences on biometry and on computational statistics that influenced the development of both disciplines and exhibit a reciprocal dependency. The annual workshops organized at the Reisensburg for more than 30 years are recognized as an outstanding factor in this interrelationship. This work aims to define the present status of computational statistics in the German Region of the International Biometric Society and intends to guide and foster the discussion of the future development of this discipline among biometricians.
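Of the computational methods named above, the bootstrap is the easiest to illustrate; a minimal percentile-interval sketch with invented data (standard-library Python only, not tied to any particular biometric application):

```python
# Percentile bootstrap sketch: resample the data with replacement many
# times and take empirical quantiles of the resampled means as an
# approximate 95% confidence interval. The data values are made up.
import random, statistics

random.seed(1)
data = [4.1, 5.0, 4.7, 5.3, 4.9, 5.6, 4.4, 5.1]

B = 2000
means = []
for _ in range(B):
    resample = [random.choice(data) for _ in data]
    means.append(statistics.mean(resample))
means.sort()

lo, hi = means[int(0.025 * B)], means[int(0.975 * B)]
print(lo < statistics.mean(data) < hi)  # True
```

The appeal for biometry is that the same recipe works for statistics (medians, ratios, correlation coefficients) whose sampling distributions have no convenient closed form.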

  8. Excel 2016 for advertising statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2017-01-01

    This text is a step-by-step guide for students taking a first course in statistics for advertising and for advertising managers and practitioners who want to learn how to use Excel to solve practical statistics problems in the workplace, whether or not they have taken a course in statistics. Excel 2016 for Advertising Statistics explains statistical formulas and offers practical examples of how students can solve real-world advertising statistics problems. This book leaves detailed explanations of statistical theory to other statistics textbooks and focuses entirely on practical, real-world problem solving. Each chapter briefly explains a topic and then demonstrates how to use Excel commands and formulas to solve specific advertising statistics problems.  This book gives practice in using Excel in two different ways:  (1) writing formulas (e.g., confidence interval about the mean, one-group t-test, two-group t-test, correlation) and (2) using Excel's drop-down formula menus (e.g., simple linear regres...

  9. Wooden houses in detail. Holzhaeuser im Detail

    Energy Technology Data Exchange (ETDEWEB)

    Ruske, W. (ed.)

    1986-01-01

    Under the serial title 'Planning and construction of wooden houses', WEKA will publish a number of books of which this is the first. Details of design and construction are presented, e.g.: Details of modern one-family houses; Fundamentals of design and hints for planning of wooden houses and compact wooden structures; Constructional ecology, wood protection, thermal insulation, sound insulation; Modular systems for domestic buildings; The 'bookshelf-type' house at the Berlin International Construction Exhibition (IBA); Experience with do-it-yourself systems. With 439 figs.

  10. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
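A standard pooling step that such a meta-analysis might build on is the DerSimonian-Laird random-effects estimator. The sketch below uses invented per-study estimates and variances and is not the paper's methodology verbatim (the paper specifically discusses alternatives for when standard techniques break down with small studies):

```python
# DerSimonian-Laird random-effects pooling sketch. Study estimates and
# within-study variances below are invented for illustration.
est = [0.10, 0.25, 0.15, 0.30]      # per-study performance metric
var = [0.004, 0.006, 0.003, 0.008]  # per-study variances (assumed known)

w = [1 / v for v in var]                                   # fixed-effect weights
fixed = sum(wi * ei for wi, ei in zip(w, est)) / sum(w)
Q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, est))  # heterogeneity statistic
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(est) - 1)) / c)                  # between-study variance

w_re = [1 / (v + tau2) for v in var]                       # random-effects weights
pooled = sum(wi * ei for wi, ei in zip(w_re, est)) / sum(w_re)
print(round(pooled, 3))  # 0.183
```

With only a handful of small studies, as the paper stresses, the Q-based estimate of the between-study variance is itself noisy, which is what motivates the alternative approaches evaluated there.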

  11. [''R"--project for statistical computing

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28.

  12. The statistics of dose/cure relationships for irradiated tumours

    International Nuclear Information System (INIS)

    Porter, E.H.

    1980-01-01

    Attention is given to the statistical analysis of dose/cure experiments. The simplest possible theory is developed in detail with special attention to experimental design and to the range of validity of the methods advocated. Explanations are aimed at the mathematics-tolerant, not at the mathematician. (author)

  13. Author Details

    African Journals Online (AJOL)

    Author Details. Journal Home > Advanced Search > Author Details. Log in or Register to get access to full text downloads. ... Singh, J. Vol 3, No 2 (2011) - Articles Plane waves in a rotating generalized thermo-elastic solid with voids. Abstract PDF. ISSN: 2141-2839. AJOL African Journals Online. HOW TO USE AJOL.

  14. Author Details

    African Journals Online (AJOL)

    Author Details. Journal Home > Advanced Search > Author Details. Log in or Register to get access to full text downloads. ... Vol 12 (2008) - Articles On the wave equations of shallow water with rough bottom topography. Abstract · Vol 14 (2009) - Articles Energy generation in a plant due to variable sunlight intensity

  15. Author Details

    African Journals Online (AJOL)

    Author Details. Journal Home > Advanced Search > Author Details. Log in or Register to get access to full text downloads. ... Vol 45 (2016) - Articles From vectors to waves and streams: An alternative approach to semantic maps1. Abstract PDF · Vol 48 (2017) - Articles Introduction: 'n Klein ietsie for Johan Oosthuizen

  16. Author Details

    African Journals Online (AJOL)

    Author Details. Journal Home > Advanced Search > Author Details. Log in or Register to get access to full text downloads. ... to blast loadings. Abstract PDF · Vol 9, No 3S (2017): Special Issue - Articles Experimental and numerical investigation on blast wave propagation in soil structure. Abstract PDF. ISSN: 1112-9867.

  17. [The Devil in the Details: Women's Right to Abortion and Health Organization].

    Science.gov (United States)

    Pioggia, Alessandra

    A woman's right to terminate a pregnancy for health reasons is often treated as fulfilled simply by performing the intervention. Today, however, there is no doubt that effective protection of health requires health organizations to deliver services that also address other aspects: taking charge of the woman, information about services, respect for women's dignity and autonomy, and so on. One could say these are details compared with the final performance; but, as we know, the devil is often in the details.

  18. Phase flow and statistical structure of Galton-board systems

    International Nuclear Information System (INIS)

    Lue, A.; Brenner, H.

    1993-01-01

    Galton boards, found in museum exhibits devoted to science and technology, are often used to demonstrate visually the ubiquity of so-called "laws of probability" via an experimental realization of normal distributions. A detailed theoretical study of Galton-board phase-space dynamics and statistical behavior is presented. The study is based on a simple inelastic-collision model employing a particle falling through a spatially periodic lattice of rigid, convex scatterers. We show that such systems exhibit indeterminate behavior through the presence of strange attractors or strange repellers in phase space; nevertheless, we also show that these systems exhibit regular and predictable behavior under specific circumstances. Phase-space strange attractors, periodic attractors, and strange repellers are present in numerical simulations, confirming results anticipated from geometric analysis. The system's geometry (dictated by lattice geometry and density as well as the direction of gravity) is observed to play a dominant role in stability, phase-flow topology, and statistical observations. Smale horseshoes appear to exist in the low-lattice-density limit and may exist in other regimes. These horseshoes are generated by homoclinic orbits whose existence is dictated by system characteristics. The horseshoes lead directly to deterministic chaos in the system. Strong evidence exists for ergodicity in all attractors. Phase-space complexities are manifested at all observed levels, particularly statistical ones. Consequently, statistical observations are critically dependent upon system details. Under well-defined circumstances, these observations display behavior which does not constitute a realization of the "laws of probability."
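The museum demonstration the abstract starts from can be mimicked with an idealized independent-bounce model (deliberately not the paper's inelastic-collision lattice model, whose departures from this idealization are the point of the study): each ball makes a fair left/right choice at every row, so the bin counts follow a binomial and hence near-normal distribution.

```python
# Idealized Galton board: `rows` independent 50/50 bounces per ball.
# The landing bin is the number of rightward bounces, so counts are
# binomial(rows, 0.5); row and ball counts are arbitrary demo values.
import random

random.seed(0)
rows, balls = 10, 5000
bins = [0] * (rows + 1)
for _ in range(balls):
    bins[sum(random.random() < 0.5 for _ in range(rows))] += 1

peak = max(range(rows + 1), key=bins.__getitem__)
print(peak in (4, 5, 6))  # True: mass concentrates near the centre
```

The article's result is precisely that a mechanically realistic board need not reproduce this picture: deterministic dynamics on the scatterer lattice can make the observed statistics depend critically on system details.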

  19. Enhanced visual statistical learning in adults with autism

    Science.gov (United States)

    Roser, Matthew E.; Aslin, Richard N.; McKenzie, Rebecca; Zahra, Daniel; Fiser, József

    2014-01-01

    Individuals with autism spectrum disorder (ASD) are often characterized as having social engagement and language deficiencies, but a sparing of visuo-spatial processing and short-term memory, with some evidence of supra-normal levels of performance in these domains. The present study expanded on this evidence by investigating the observational learning of visuospatial concepts from patterns of covariation across multiple exemplars. Child and adult participants with ASD, and age-matched control participants, viewed multi-shape arrays composed from a random combination of pairs of shapes that were each positioned in a fixed spatial arrangement. After this passive exposure phase, a post-test revealed that all participant groups could discriminate pairs of shapes with high covariation from randomly paired shapes with low covariation. Moreover, learning these shape-pairs with high covariation was superior in adults with ASD than in age-matched controls, while performance in children with ASD was no different than controls. These results extend previous observations of visuospatial enhancement in ASD into the domain of learning, and suggest that enhanced visual statistical learning may have arisen from a sustained bias to attend to local details in complex arrays of visual features. PMID:25151115

  20. Introduction of a Journal Excerpt Activity Improves Undergraduate Students' Performance in Statistics

    Science.gov (United States)

    Rabin, Laura A.; Nutter-Upham, Katherine E.

    2010-01-01

    We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…

  6. Propagation of statistical and nuclear data uncertainties in Monte Carlo burn-up calculations

    International Nuclear Information System (INIS)

    Garcia-Herranz, Nuria; Cabellos, Oscar; Sanz, Javier; Juan, Jesus; Kuijper, Jim C.

    2008-01-01

    Two methodologies to propagate the uncertainties on the nuclide inventory in combined Monte Carlo-spectrum and burn-up calculations are presented, based on sensitivity/uncertainty and random sampling techniques (uncertainty Monte Carlo method). Both enable the assessment of the impact of uncertainties in the nuclear data as well as uncertainties due to the statistical nature of the Monte Carlo neutron transport calculation. The methodologies are implemented in our MCNP-ACAB system, which combines the neutron transport code MCNP-4C and the inventory code ACAB. A high burn-up benchmark problem is used to test the MCNP-ACAB performance in inventory predictions, with no uncertainties. A good agreement is found with the results of other participants. This benchmark problem is also used to assess the impact of nuclear data uncertainties and statistical flux errors in high burn-up applications. A detailed calculation is performed to evaluate the effect of cross-section uncertainties in the inventory prediction, taking into account the temporal evolution of the neutron flux level and spectrum. Very large uncertainties are found at the unusually high burn-up of this exercise (800 MWd/kgHM). To compare the impact of the statistical errors in the calculated flux with respect to the cross-section uncertainties, a simplified problem is considered, taking a constant neutron flux level and spectrum. It is shown that, provided that the flux statistical deviations in the Monte Carlo transport calculation do not exceed a given value, the effect of the flux errors on the calculated isotopic inventory is negligible (even at very high burn-up) compared to the effect of the large cross-section uncertainties available at present in the data files.
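
The random-sampling (uncertainty Monte Carlo) approach described above can be sketched with a toy one-nuclide depletion model: sample the uncertain cross section, propagate each sample through the depletion equation, and read the inventory uncertainty off the resulting distribution. All numbers below are illustrative assumptions, not MCNP-ACAB inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy one-nuclide depletion: N(t) = N0 * exp(-sigma * phi * t)
# (hypothetical values; the MCNP-ACAB system handles full transmutation chains)
N0 = 1.0e24          # initial atoms
phi = 1.0e14         # neutron flux, n/cm^2/s (constant, as in the simplified problem)
t = 3.0e7            # irradiation time, s
sigma0 = 2.0e-22     # nominal cross section, cm^2 (200 barn, toy value)
rel_unc = 0.10       # assumed 10% relative (1-sigma) cross-section uncertainty

# Uncertainty Monte Carlo: sample the cross section, propagate each sample
sigmas = rng.normal(sigma0, rel_unc * sigma0, size=10_000)
inventory = N0 * np.exp(-sigmas * phi * t)

mean = inventory.mean()
rel_spread = inventory.std() / mean
print(f"mean inventory: {mean:.3e}, relative uncertainty: {rel_spread:.1%}")
```

For a single exponential the relative inventory uncertainty is roughly the depletion exponent times the cross-section uncertainty, which the sampled spread reproduces.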

  7. Propagation of statistical and nuclear data uncertainties in Monte Carlo burn-up calculations

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Herranz, Nuria [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid, UPM (Spain)], E-mail: nuria@din.upm.es; Cabellos, Oscar [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid, UPM (Spain); Sanz, Javier [Departamento de Ingenieria Energetica, Universidad Nacional de Educacion a Distancia, UNED (Spain); Juan, Jesus [Laboratorio de Estadistica, Universidad Politecnica de Madrid, UPM (Spain); Kuijper, Jim C. [NRG - Fuels, Actinides and Isotopes Group, Petten (Netherlands)

    2008-04-15

    Two methodologies to propagate the uncertainties on the nuclide inventory in combined Monte Carlo-spectrum and burn-up calculations are presented, based on sensitivity/uncertainty and random sampling techniques (uncertainty Monte Carlo method). Both enable the assessment of the impact of uncertainties in the nuclear data as well as uncertainties due to the statistical nature of the Monte Carlo neutron transport calculation. The methodologies are implemented in our MCNP-ACAB system, which combines the neutron transport code MCNP-4C and the inventory code ACAB. A high burn-up benchmark problem is used to test the MCNP-ACAB performance in inventory predictions, with no uncertainties. A good agreement is found with the results of other participants. This benchmark problem is also used to assess the impact of nuclear data uncertainties and statistical flux errors in high burn-up applications. A detailed calculation is performed to evaluate the effect of cross-section uncertainties in the inventory prediction, taking into account the temporal evolution of the neutron flux level and spectrum. Very large uncertainties are found at the unusually high burn-up of this exercise (800 MWd/kgHM). To compare the impact of the statistical errors in the calculated flux with respect to the cross-section uncertainties, a simplified problem is considered, taking a constant neutron flux level and spectrum. It is shown that, provided that the flux statistical deviations in the Monte Carlo transport calculation do not exceed a given value, the effect of the flux errors on the calculated isotopic inventory is negligible (even at very high burn-up) compared to the effect of the large cross-section uncertainties available at present in the data files.

  8. Methods library of embedded R functions at Statistics Norway

    Directory of Open Access Journals (Sweden)

    Øyvind Langsrud

    2017-11-01

    Full Text Available Statistics Norway is modernising its production processes. An important element in this work is a library of functions for statistical computations. In principle, the functions in such a methods library can be programmed in several languages. A modernised production environment demands that these functions can be reused for different statistics products, and that they are embedded within a common IT system. The embedding should be done in such a way that the users of the methods do not need to know the underlying programming language. As a proof of concept, Statistics Norway has now established a methods library offering a limited number of methods for macro-editing, imputation and confidentiality. This is done within an area of municipal statistics with R as the only programming language. This paper presents the details and experiences from this work. The problem of fitting real-world applications to simple and strict standards is discussed and exemplified by the development of solutions to regression imputation and table suppression.
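
Regression imputation of the kind mentioned above can be sketched in a few lines: fit a regression on the complete records and fill the missing values with its predictions. The sketch below is an illustrative numpy analogue under invented data (the actual methods library is implemented in R, and the variables here are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy municipal dataset: impute missing y from an auxiliary variable x
# via a regression fitted on the complete records only.
x = rng.uniform(100, 1000, size=50)               # auxiliary variable, fully observed
y = 2.5 * x + rng.normal(0, 30, size=50)          # target variable
missing = np.zeros(50, dtype=bool)
missing[[3, 17, 41]] = True                       # records with missing y

# Fit y = a + b*x by least squares on complete records
A = np.column_stack([np.ones(missing.size), x])
coef, *_ = np.linalg.lstsq(A[~missing], y[~missing], rcond=None)

# Replace missing values with regression predictions
y_imputed = y.copy()
y_imputed[missing] = A[missing] @ coef
print("fitted slope:", coef[1])
```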

  10. Polychronakos fractional statistics with a complex-valued parameter

    International Nuclear Information System (INIS)

    Rovenchak, Andrij

    2012-01-01

    A generalization of quantum statistics is proposed in a fashion similar to the suggestion of Polychronakos [Phys. Lett. B 365, 202 (1996)] with the parameter α varying between −1 (fermionic case) and +1 (bosonic case). However, unlike the original formulation, it is suggested that intermediate values are located on the unit circle in the complex plane. In doing so one can avoid the case α = 0 corresponding to the Boltzmann statistics, which is not a quantum one. The limits of α → +1 and α → −1, reproducing small deviations from the Bose and Fermi statistics, respectively, are studied in detail. The equivalence between the statistics parameter and a possible dissipative part of the excitation spectrum is established. The case of a non-conserved number of excitations is analyzed; the number is determined from the condition that the real part of the chemical potential equals zero. Thermodynamic quantities of a model system of two-dimensional harmonic oscillators are calculated.
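
The interpolation can be illustrated with the Polychronakos-type occupation number n(ε) = 1/(e^{β(ε−μ)} − α): α = +1 reproduces Bose-Einstein, α = −1 Fermi-Dirac, and intermediate values sit on the unit circle. The parameterization α = exp(iπν) below is an assumed labeling for illustration, not necessarily the paper's convention.

```python
import numpy as np

def polychronakos_occupation(eps, mu, beta, alpha):
    """Mean occupation n = 1 / (exp(beta*(eps - mu)) - alpha).

    alpha = +1 gives Bose-Einstein, alpha = -1 Fermi-Dirac; intermediate
    values are placed on the unit circle, alpha = exp(i*pi*nu)
    (the nu labeling here is an illustrative assumption)."""
    return 1.0 / (np.exp(beta * (eps - mu)) - alpha)

beta, mu, eps = 1.0, 0.0, 2.0

n_bose  = polychronakos_occupation(eps, mu, beta, +1.0)
n_fermi = polychronakos_occupation(eps, mu, beta, -1.0)
n_mid   = polychronakos_occupation(eps, mu, beta, np.exp(1j * np.pi * 0.5))

# Intermediate alpha yields a complex occupation; its imaginary part can be
# read as a dissipative contribution, as the abstract suggests.
print(n_bose, n_fermi, n_mid)
```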

  11. Aerial gamma ray and magnetic survey: Powder River R and D Project, Arminto Detail, Wyoming. Final report

    International Nuclear Information System (INIS)

    1979-05-01

    The small detail area, 18 miles by 18 miles, lying near the center of the Powder River Basin, is covered entirely by sediments of the Eocene Wasatch Formation. Historically, economic uranium deposits have been worked in the southeast corner of the area, which includes the northern extremity of the Pumpkin Buttes district. In total, 127 statistical uranium anomalies were generated for the study area, based on area-wide statistics.
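
As a rough sketch of how area-wide statistics can flag anomalies, the following assumes a simple two-standard-deviation threshold on simulated readings; the survey's actual anomaly criteria and units are not specified in the record, so everything here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated equivalent-uranium readings over a survey grid; flag a reading
# as a statistical anomaly when it exceeds the area-wide mean by more than
# two standard deviations (an illustrative threshold).
readings = rng.normal(3.0, 0.8, size=2000)                   # background, ppm eU
readings[rng.choice(2000, size=25, replace=False)] += 5.0    # seeded anomalies

mean, std = readings.mean(), readings.std()
anomalies = np.flatnonzero(readings > mean + 2.0 * std)
print(f"{anomalies.size} anomalies flagged out of {readings.size} readings")
```

A fixed z-score cut like this flags all seeded anomalies plus a small, predictable tail of background readings.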

  12. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  13. Detailed services in a spatial data infrastructure from the computation viewpoint

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2009-11-01

    Full Text Available the detailed services that are performed within each of these components, and the roles played by these components in the different phases of establishing and using an SDI. The matrix of these detailed services is too large for inclusion in this conference...

  14. Excel 2016 for social work statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2017-01-01

    This text is a step-by-step guide for students taking a first course in statistics for social work and for social work managers and practitioners who want to learn how to use Excel to solve practical statistics problems in the workplace, whether or not they have taken a course in statistics. There is no other text for a first course in social work statistics that teaches students, step-by-step, how to use Excel to solve interesting social work statistics problems. Excel 2016 for Social Work Statistics explains statistical formulas and offers practical examples for how students can solve real-world social work statistics problems. This book leaves detailed explanations of statistical theory to other statistics textbooks and focuses entirely on practical, real-world problem solving. Each chapter briefly explains a topic and then demonstrates how to use Excel commands and formulas to solve specific social work statistics problems.  This book gives practice in using Excel in two different ways:  (1) writing ...

  15. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Science.gov (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  16. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
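
The statistical decomposition step can be illustrated with a minimal sketch: a systematic within-chip (pattern-dependent) component repeats at the same sites on every chip, so averaging site-wise across many chips isolates it from the random component. The data and magnitudes below are invented, not CMP measurements.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy ILD-thickness data: n_chips chips, n_sites measurement sites per chip.
# The systematic pattern repeats across chips; the random part does not.
n_chips, n_sites = 40, 25
systematic = rng.normal(0, 30, size=n_sites)            # nm, fixed site pattern
data = 800 + systematic + rng.normal(0, 10, size=(n_chips, n_sites))

site_means = data.mean(axis=0)                          # estimates 800 + pattern
systematic_est = site_means - site_means.mean()         # within-chip systematic part
random_est = data - site_means                          # residual, random part

print("systematic std (nm):", systematic_est.std())
print("random std (nm):", random_est.std())
```

The recovered standard deviations approximate the injected 30 nm systematic and 10 nm random components, showing why the within-chip pattern dominates.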

  18. Statistical nuclear reactions

    International Nuclear Information System (INIS)

    Hilaire, S.

    2001-01-01

    A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)

  19. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
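
For the two-sample mean difference test discussed, the analytical power under a normal approximation can be computed directly. This is the standard textbook formula, not the simulation design of the study itself.

```python
from scipy.stats import norm

def two_sample_power(delta, sigma, n_per_group, alpha=0.05):
    """Analytical power of a two-sided two-sample z-test for a mean
    difference delta with common SD sigma (normal approximation)."""
    se = sigma * (2.0 / n_per_group) ** 0.5     # SE of the mean difference
    z_crit = norm.ppf(1.0 - alpha / 2.0)        # two-sided critical value
    z = delta / se                              # standardized effect
    return norm.cdf(z - z_crit) + norm.cdf(-z - z_crit)

# Example: medium effect (delta/sigma = 0.5), 64 subjects per group
print(round(two_sample_power(0.5, 1.0, 64), 3))
```

With 64 per group and a standardized difference of 0.5, power comes out close to the conventional 0.80 target, and it increases monotonically with sample size.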

  20. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team, as specified in the study protocol and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  1. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    Science.gov (United States)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. Comprehensive introduction to an active and fascinating area of research. Clear exposition that builds to the state of the art in the mathematics of spin glasses. Written by a well-known and active researcher in the field.

  2. Applied statistics for agriculture, veterinary, fishery, dairy and allied fields

    CERN Document Server

    Sahu, Pradip Kumar

    2016-01-01

    This book is aimed at a wide range of readers who lack confidence in the mathematical and statistical sciences, particularly in the fields of Agriculture, Veterinary, Fishery, Dairy and other related areas. Its goal is to present the subject of statistics and its useful tools in various disciplines in such a manner that, after reading the book, readers will be equipped to apply the statistical tools to extract otherwise hidden information from their data sets with confidence. Starting with the meaning of statistics, the book introduces measures of central tendency, dispersion, association, sampling methods, probability, inference, designs of experiments and many other subjects of interest in a step-by-step and lucid manner. The relevant theories are described in detail, followed by a broad range of real-world worked-out examples, solved either manually or with the help of statistical packages. In closing, the book also includes a chapter on which statistical packages to use, depending on the user’s respecti...

  3. Planck 2013 results. XXIII. Isotropy and Statistics of the CMB

    DEFF Research Database (Denmark)

    Planck Collaboration,; Ade, P. A. R.; Aghanim, N.

    2013-01-01

    The two fundamental assumptions of the standard cosmological model - that the initial fluctuations are statistically isotropic and Gaussian - are rigorously tested using maps of the CMB anisotropy from the Planck satellite. The detailed results are based on studies of four independent estimates...

  4. Evaluation of the performance of Moses statistical engine adapted to ...

    African Journals Online (AJOL)

    ... of Moses statistical engine adapted to English-Arabic language combination. ... of Artificial Intelligence (AI) dedicated to Natural Language Processing (NLP). ... and focuses on SMT, then introducing the features of the open source Moses ...

  5. Review of Statistical Analyses Resulting from Performance of HLDWD-DWPF-005

    International Nuclear Information System (INIS)

    Beck, R.S.

    1997-01-01

    The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for analysis of small sample inserts (references 1 & 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide analyses equivalent to those of the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed method compares favorably with the sum of oxides for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will be required at first to provide an accurate analysis which will routinely meet the 95% and 105% average sum of oxides limits for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) Determine calcine/vitrification factor for radioactive feed; (2) Evaluate covariance matrix change against process operating ranges to determine optimum sample size; (3) Evaluate sources for low sum of oxides; and (4) Improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of

  6. FREQFIT: Computer program which performs numerical regression and statistical chi-squared goodness of fit analysis

    International Nuclear Information System (INIS)

    Hofland, G.S.; Barton, C.C.

    1990-01-01

    The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig
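
A minimal sketch of the FREQFIT-style workflow - regression followed by a chi-squared goodness-of-fit test - using hypothetical frequency-size data. The program itself is QuickBASIC; this is an illustrative Python analogue, not its actual algorithm.

```python
import numpy as np
from scipy.stats import chi2, linregress

# Hypothetical fracture frequency-size data: fit a power law by linear
# regression in log-log space, then test the fit with a Pearson
# chi-squared goodness-of-fit statistic.
size = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
freq = np.array([980, 510, 240, 130, 60, 33, 14], dtype=float)

res = linregress(np.log(size), np.log(freq))
model = np.exp(res.intercept) * size ** res.slope

# Chi-squared with counts treated as Poisson-like observations
chisq = np.sum((freq - model) ** 2 / model)
dof = size.size - 2                         # two fitted parameters
p_value = chi2.sf(chisq, dof)
print(f"slope = {res.slope:.2f}, chi2 = {chisq:.2f}, p = {p_value:.3f}")
```

A large p-value here indicates the power-law model is consistent with the binned counts; a small one would reject the fitted distribution.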

  7. Statistical analyses of variability/reproducibility of environmentally assisted cyclic crack growth rate data utilizing JAERI Material Performance Database (JMPD)

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo

    1993-05-01

    Statistical analyses were conducted by using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons were made on variability and/or reproducibility of the data between obtained by ΔK-increasing and by ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type ones from the viewpoint of variability and/or reproducibility of the data. Such a tendency was more pronounced in the tests conducted in simulated LWR primary coolants than those in air. (author)
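
Variability of crack growth rate data across replicate tests can be illustrated with a toy Paris-law model, da/dN = C(ΔK)^m, with invented lognormal scatter; the spread of the fitted exponent m across replicates is one simple reproducibility measure. This is a hypothetical sketch, not JMPD data or the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Replicate da/dN vs Delta-K curves following da/dN = C * (Delta K)^m
# with assumed test-to-test lognormal scatter.
dK = np.linspace(10, 40, 30)                      # MPa*sqrt(m)
C_true, m_true = 1e-11, 3.0
replicates = [C_true * dK**m_true * rng.lognormal(0, 0.15, dK.size)
              for _ in range(8)]

# Fit log(da/dN) = log C + m * log(dK) per replicate; the spread of the
# fitted m across replicates quantifies reproducibility.
slopes = [np.polyfit(np.log(dK), np.log(r), 1)[0] for r in replicates]
print(f"mean m = {np.mean(slopes):.2f}, std m = {np.std(slopes):.3f}")
```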

  8. Results of Detailed Hydrologic Characterization Tests—Fiscal and Calendar Year 2005

    Energy Technology Data Exchange (ETDEWEB)

    Spane, Frank A.; Newcomer, Darrell R.

    2008-02-27

    This report provides the results of detailed hydrologic characterization tests conducted within selected Hanford Site wells during fiscal and calendar year 2005. Detailed characterization tests performed included groundwater-flow characterization, barometric response evaluation, slug tests, in-well vertical groundwater-flow assessments, and a single-well tracer and constant-rate pumping test. Hydraulic property estimates obtained from the detailed hydrologic tests include hydraulic conductivity, transmissivity, specific yield, effective porosity, in-well lateral and vertical groundwater-flow velocity, aquifer groundwater-flow velocity, and depth-distribution profiles of hydraulic conductivity. In addition, local groundwater-flow characteristics (i.e., hydraulic gradient and flow direction) were determined for a site where detailed well testing was performed. Results obtained from these tests provide hydrologic information that supports the needs of Resource Conservation and Recovery Act waste management area characterization as well as sitewide groundwater monitoring and modeling programs. These results also reduce the uncertainty of groundwater-flow conditions at selected locations on the Hanford Site.

  9. Optimization of the gas turbine-modular helium reactor using statistical methods to maximize performance without compromising system design margins

    International Nuclear Information System (INIS)

    Lommers, L.J.; Parme, L.L.; Shenoy, A.S.

    1995-07-01

    This paper describes a statistical approach for determining the impact of system performance and design uncertainties on power plant performance. The objectives of this design approach are to ensure that adequate margin is provided, that excess margin is minimized, and that full advantage can be taken of unconsumed margin. It is applicable to any thermal system in which these factors are important. The method is demonstrated using the Gas Turbine Modular Helium Reactor as an example. The quantitative approach described allows the characterization of plant performance and the specification of the system design requirements necessary to achieve the desired performance with high confidence. Performance variations due to design evolution, inservice degradation, and basic performance uncertainties are considered. The impact of all performance variabilities is combined using Monte Carlo analysis to predict the range of expected operation
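
The Monte Carlo combination of performance variabilities can be sketched as follows: sample each uncertain component performance, aggregate to a plant figure of merit, and compare a lower percentile against the requirement to quantify margin. The component uncertainties and the aggregation formula below are invented toy assumptions, not GT-MHR design values.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy margin analysis: net plant efficiency depends on several uncertain
# component performances; Monte Carlo gives the efficiency distribution.
n = 100_000
compressor_eff = rng.normal(0.89, 0.01, n)    # assumed component uncertainties
turbine_eff    = rng.normal(0.92, 0.01, n)
recuperator    = rng.normal(0.95, 0.005, n)

# Toy aggregation: plant efficiency proportional to the product
plant_eff = 0.55 * compressor_eff * turbine_eff * recuperator

p5 = np.percentile(plant_eff, 5)      # value achieved with ~95% confidence
requirement = 0.40                    # hypothetical design requirement
print(f"5th percentile efficiency: {p5:.3f}, margin: {p5 - requirement:+.3f}")
```

Reading margin off a percentile of the combined distribution, rather than stacking worst-case allowances, is what lets excess margin be identified and reclaimed.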

  10. Stable statistical representations facilitate visual search.

    Science.gov (United States)

    Corbett, Jennifer E; Melcher, David

    2014-10-01

    Observers represent the average properties of object ensembles even when they cannot identify individual elements. To investigate the functional role of ensemble statistics, we examined how modulating statistical stability affects visual search. We varied the mean and/or individual sizes of an array of Gabor patches while observers searched for a tilted target. In "stable" blocks, the mean and/or local sizes of the Gabors were constant over successive displays, whereas in "unstable" baseline blocks they changed from trial to trial. Although there was no relationship between the context and the spatial location of the target, observers found targets faster (as indexed by faster correct responses and fewer saccades) as the global mean size became stable over several displays. Building statistical stability also facilitated scanning the scene, as measured by larger saccadic amplitudes, faster saccadic reaction times, and shorter fixation durations. These findings suggest a central role for peripheral visual information, creating context to free resources for detailed processing of salient targets and maintaining the illusion of visual stability.

  11. The use of statistics in real and simulated investigations performed by undergraduate health sciences' students

    OpenAIRE

    Pimenta, Rui; Nascimento, Ana; Vieira, Margarida; Costa, Elísio

    2010-01-01

    In previous works, we evaluated the statistical reasoning ability acquired by health sciences’ students carrying out their final undergraduate project. We found that these students achieved a good level of statistical literacy and reasoning in descriptive statistics. However, concerning inferential statistics the students did not reach a similar level. Statistics educators therefore call for more effective ways to learn statistics, such as project-based investigations. These can be simulat...

  12. Effect of the Target Motion Sampling temperature treatment method on the statistics and performance

    International Nuclear Information System (INIS)

    Viitanen, Tuomas; Leppänen, Jaakko

    2015-01-01

    Highlights: • Use of the Target Motion Sampling (TMS) method with collision estimators is studied. • The expected values of the estimators agree with the NJOY-based reference. • In most practical cases also the variances of the estimators are unaffected by TMS. • Transport calculation slow-down due to TMS dominates the impact on figures-of-merit. - Abstract: Target Motion Sampling (TMS) is a stochastic on-the-fly temperature treatment technique that is being developed as a part of the Monte Carlo reactor physics code Serpent. The method provides for modeling of arbitrary temperatures in continuous-energy Monte Carlo tracking routines with only one set of cross sections stored in the computer memory. Previously, only the performance of the TMS method in terms of CPU time per transported neutron has been discussed. Since the effective cross sections are not calculated at any point of a transport simulation with TMS, reaction rate estimators must be scored using sampled cross sections, which is expected to increase the variances and, consequently, to decrease the figures-of-merit. This paper examines the effects of the TMS on the statistics and performance in practical calculations involving reaction rate estimation with collision estimators. Against all expectations it turned out that the usage of sampled response values has no practical effect on the performance of reaction rate estimators when using TMS with elevated basis cross section temperatures (EBT), i.e. the usual way. With 0 Kelvin cross sections a significant increase in the variances of capture rate estimators was observed right below the energy region of unresolved resonances, but at these energies the figures-of-merit could be increased using a simple resampling technique to decrease the variances of the responses. It was, however, noticed that the usage of the TMS method increases the statistical deviations of all estimators, including the flux estimator, by tens of percent in the vicinity of very

  13. Full counting statistics of multiple Andreev reflections in incoherent diffusive superconducting junctions

    International Nuclear Information System (INIS)

    Samuelsson, P.

    2007-01-01

    We present a theory for the full distribution of current fluctuations in incoherent diffusive superconducting junctions, subjected to a voltage bias. This theory of full counting statistics of incoherent multiple Andreev reflections is valid for an arbitrary applied voltage. We present a detailed discussion of the properties of the first four cumulants as well as the low and high voltage regimes of the full counting statistics. (orig.)

  14. A new formalism for nonextensive physical systems: Tsallis thermostatistics

    International Nuclear Information System (INIS)

    Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.

    1999-01-01

    Although Boltzmann-Gibbs (BG) statistics provides a suitable tool that enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a nonextensive thermostatistics was proposed by C. Tsallis to handle nonextensive physical systems, and up to now, besides the generalization of some of the conventional concepts, the formalism has been successful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems by noting the recent developments along this line.

  15. Statistical learning in social action contexts.

    Science.gov (United States)

    Monroy, Claire; Meyer, Marlene; Gerson, Sarah; Hunnius, Sabine

    2017-01-01

    Sensitivity to the regularities and structure contained within sequential, goal-directed actions is an important building block for generating expectations about the actions we observe. Until now, research on statistical learning for actions has solely focused on individual action sequences, but many actions in daily life involve multiple actors in various interaction contexts. The current study is the first to investigate the role of statistical learning in tracking regularities between actions performed by different actors, and whether the social context characterizing their interaction influences learning. That is, are observers more likely to track regularities across actors if they are perceived as acting jointly as opposed to in parallel? We tested adults and toddlers to explore whether social context guides statistical learning and, if so, whether it does so from early in development. In a between-subjects eye-tracking experiment, participants were primed with a social context cue between two actors who either shared a goal of playing together ('Joint' condition) or stated the intention to act alone ('Parallel' condition). In subsequent videos, the actors performed sequential actions in which, for certain action pairs, the first actor's action reliably predicted the second actor's action. We analyzed predictive eye movements to upcoming actions as a measure of learning, and found that both adults and toddlers learned the statistical regularities across actors when their actions caused an effect. Further, adults with high statistical learning performance were sensitive to social context: those who observed actors with a shared goal were more likely to correctly predict upcoming actions. In contrast, there was no effect of social context in the toddler group, regardless of learning performance. These findings shed light on how adults and toddlers perceive statistical regularities across actors depending on the nature of the observed social situation and the

  16. Blindness to a simultaneous change of all elements in a scene, unless there is a change in summary statistics.

    Science.gov (United States)

    Saiki, Jun; Holcombe, Alex O

    2012-03-06

    Sudden change of every object in a display is typically conspicuous. We find however that in the presence of a secondary task, with a display of moving dots, it can be difficult to detect a sudden change in color of all the dots. A field of 200 dots, half red and half green, half moving rightward and half moving leftward, gave the appearance of two surfaces. When all 200 dots simultaneously switched color between red and green, performance in detecting the switch was very poor. A key display characteristic was that the color proportions on each surface (summary statistics) were not affected by the color switch. When the color switch is accompanied by a change in these summary statistics, people perform well in detecting the switch, suggesting that the secondary task does not disrupt the availability of this statistical information. These findings suggest that when the change is missed, the old and new colors were represented, but the color-location pattern (binding of colors to locations) was not represented or not compared. Even after extended viewing, changes to the individual color-location pattern are not available, suggesting that the feeling of seeing these details is misleading.

  17. Nuclear medicine statistics

    International Nuclear Information System (INIS)

    Martin, P.M.

    1977-01-01

    Numerical description of medical and biologic phenomena is proliferating. Laboratory studies on patients now yield measurements of at least a dozen indices, each with its own normal limits. Within nuclear medicine, numerical analysis as well as numerical measurement and the use of computers are becoming more common. While the digital computer has proved to be a valuable tool for measurment and analysis of imaging and radioimmunoassay data, it has created more work in that users now ask for more detailed calculations and for indices that measure the reliability of quantified observations. The following material is presented with the intention of providing a straight-forward methodology to determine values for some useful parameters and to estimate the errors involved. The process used is that of asking relevant questions and then providing answers by illustrations. It is hoped that this will help the reader avoid an error of the third kind, that is, the error of statistical misrepresentation or inadvertent deception. This occurs most frequently in cases where the right answer is found to the wrong question. The purposes of this chapter are: (1) to provide some relevant statistical theory, using a terminology suitable for the nuclear medicine field; (2) to demonstrate the application of a number of statistical methods to the kinds of data commonly encountered in nuclear medicine; (3) to provide a framework to assist the experimenter in choosing the method and the questions most suitable for the experiment at hand; and (4) to present a simple approach for a quantitative quality control program for scintillation cameras and other radiation detectors

  18. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compared the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
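A rank-based scan of the kind described above can be illustrated with a small sketch. The 1-D layout, fixed window size, and data below are hypothetical stand-ins for the paper's spatial windows, and the significance assessment (normally done by Monte Carlo permutation) is omitted:

```python
# Sketch of a Wilcoxon rank-sum scan over a toy 1-D arrangement of locations.
# Data, window size, and layout are illustrative, not the paper's setup.

def rank_sum_statistic(inside, outside):
    """Wilcoxon rank-sum statistic W for the 'inside' sample (ties averaged)."""
    combined = sorted(inside + outside)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2.0  # mean of 1-based ranks i+1..j
        i = j
    return sum(ranks[x] for x in inside)

# Scan: slide a window of 3 locations, keep the most extreme statistic.
values = [1.0, 1.2, 0.9, 5.1, 4.8, 5.3, 1.1, 0.8]
best = None
for start in range(len(values) - 2):
    window = values[start:start + 3]
    rest = values[:start] + values[start + 3:]
    w = rank_sum_statistic(window, rest)
    if best is None or w > best[0]:
        best = (w, start)
print(best)  # (21.0, 3): the window covering the elevated cluster
```

In a full implementation the observed maximum would be compared against maxima from permuted data sets to obtain a p-value.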

  19. Woods and Russell, Hill, and the emergence of medical statistics.

    Science.gov (United States)

    Farewell, Vern; Johnson, Tony

    2010-06-30

    In 1937, Austin Bradford Hill wrote Principles of Medical Statistics (Lancet: London, 1937) that became renowned throughout the world and is widely associated with the birth of modern medical statistics. Some 6 years earlier Hilda Mary Woods and William Thomas Russell, colleagues of Hill at the London School of Hygiene and Tropical Medicine, wrote a similar book An Introduction to Medical Statistics (PS King and Son: London, 1931) that is little known today. We trace the origins of these two books from the foundations of early demography and vital statistics, and make a detailed examination of some of their chapters. It is clear that these texts mark a watershed in the history of medical statistics that demarcates the vital statistics of the nineteenth and early twentieth centuries from the modern discipline. Moreover, we consider that the book by Woods and Russell is of some importance in the development of medical statistics and we describe and acknowledge their place in the history of this discipline. (c) 2010 John Wiley & Sons, Ltd.

  20. Effect of altitude on physiological performance: a statistical analysis using results of international football games.

    Science.gov (United States)

    McSharry, Patrick E

    2007-12-22

    To assess the effect of altitude on match results and physiological performance of a large and diverse population of professional athletes. Statistical analysis of international football (soccer) scores and results. FIFA's extensive database of 1460 football matches in 10 countries spanning over 100 years. Altitude had a significant negative impact on physiological performance, as revealed through the overall underperformance of low altitude teams when playing against high altitude teams in South America. High altitude teams score more and concede fewer goals with increasing altitude difference. Each additional 1000 m of altitude difference increases the goal difference by about half of a goal. The probability of the home team winning for two teams from the same altitude is 0.537, whereas this rises to 0.825 for a home team with an altitude difference of 3695 m (such as Bolivia v Brazil) and falls to 0.213 when the altitude difference is -3695 m (such as Brazil v Bolivia). Altitude provides a significant advantage for high altitude teams when playing international football games at both low and high altitudes. Lowland teams are unable to acclimatise to high altitude, reducing physiological performance. As physiological performance does not protect against the effect of altitude, better predictors of individual susceptibility to altitude illness would facilitate team selection.
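The win probabilities quoted above can be reproduced by a simple logistic curve in altitude difference. The functional form here is an assumption for illustration, calibrated only to two of the reported figures; it is not necessarily the model used in the paper:

```python
import math

# Hedged sketch: a two-parameter logistic calibrated to the abstract's
# reported win probabilities (0.537 at 0 m, 0.825 at +3695 m).

def logit(p):
    return math.log(p / (1.0 - p))

a = logit(0.537)                    # intercept from equal-altitude games
b = (logit(0.825) - a) / 3695.0     # slope per metre of altitude difference

def p_home_win(delta_h_m):
    """Probability the home team wins, given altitude difference in metres."""
    return 1.0 / (1.0 + math.exp(-(a + b * delta_h_m)))

print(round(p_home_win(0), 3))      # 0.537 by construction
print(round(p_home_win(-3695), 3))  # ~0.222, close to the reported 0.213
```

That the third reported figure (0.213 at -3695 m) falls close to the curve suggests a roughly symmetric altitude effect on the log-odds scale.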

  1. Performance/Design Requirements and Detailed Technical Description for a Computer-Directed Training Subsystem for Integration into the Air Force Phase II Base Level System.

    Science.gov (United States)

    Butler, A. K.; And Others

    The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…

  2. Northern Ireland annual abstract of statistics: no. 9

    International Nuclear Information System (INIS)

    1990-01-01

    These 1990 statistics from the Policy Planning and Research Unit at Stormont, which form part of a larger non-nuclear collection, detail radioactive contamination of fish and seaweeds in the Irish Sea, in terms of the concentrations of K-40, Cs-134 and Cs-137 in becquerels per kilogram. Gamma doses in intertidal sediments of sand and mud are also recorded. (UK)

  3. The Physical Models and Statistical Procedures Used in the RACER Monte Carlo Code

    International Nuclear Information System (INIS)

    Sutton, T.M.; Brown, F.B.; Bischoff, F.G.; MacMillan, D.B.; Ellis, C.L.; Ward, J.T.; Ballinger, C.T.; Kelly, D.J.; Schindler, L.

    1999-01-01

    capability of performing iterated-source (criticality), multiplied-fixed-source, and fixed-source calculations. MCV uses a highly detailed continuous-energy (as opposed to multigroup) representation of neutron histories and cross section data. The spatial modeling is fully three-dimensional (3-D), and any geometrical region that can be described by quadric surfaces may be represented. The primary results are region-wise reaction rates, neutron production rates, slowing-down-densities, fluxes, leakages, and when appropriate the eigenvalue or multiplication factor. Region-wise nuclidic reaction rates are also computed, which may then be used by other modules in the system to determine time-dependent nuclide inventories so that RACER can perform depletion calculations. Furthermore, derived quantities such as ratios and sums of primary quantities and/or other derived quantities may also be calculated. MCV performs statistical analyses on output quantities, computing estimates of the 95% confidence intervals as well as indicators as to the reliability of these estimates. The remainder of this chapter provides an overview of the MCV algorithm. The following three chapters describe the MCV mathematical, physical, and statistical treatments in more detail. Specifically, Chapter 2 discusses topics related to tracking the histories including: geometry modeling, how histories are moved through the geometry, and variance reduction techniques related to the tracking process. Chapter 3 describes the nuclear data and physical models employed by MCV. Chapter 4 discusses the tallies, statistical analyses, and edits. Chapter 5 provides some guidance as to how to run the code, and Chapter 6 is a list of the code input options
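The tally statistics the chapter describes, 95% confidence intervals plus reliability indicators, follow standard Monte Carlo practice, which can be sketched as follows. This is an illustration with synthetic per-history scores, not RACER/MCV code:

```python
import math
import random

# Illustrative sketch (not RACER/MCV code): estimate a tally and its 95%
# confidence interval from per-history scores, as is standard practice.
random.seed(1)
scores = [random.expovariate(1.0) for _ in range(10_000)]  # synthetic tallies

n = len(scores)
mean = sum(scores) / n
var = sum((x - mean) ** 2 for x in scores) / (n - 1)   # sample variance
std_err = math.sqrt(var / n)                           # standard error of mean
ci95 = 1.96 * std_err                                  # normal approx., large n

rel_err = std_err / mean   # a common indicator of estimate reliability
print(f"tally = {mean:.4f} +/- {ci95:.4f} (rel. err. {rel_err:.4f})")
```

A relative error well below a few percent is the usual signal that the confidence interval itself can be trusted; codes typically flag tallies whose relative error exceeds a threshold.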

  4. Introduction to mathematical statistical physics

    CERN Document Server

    Minlos, R A

    1999-01-01

    This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...

  5. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics

    Directory of Open Access Journals (Sweden)

    Haejoon Jung

    2018-01-01

    Full Text Available As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.

  6. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics.

    Science.gov (United States)

    Jung, Haejoon; Lee, In-Ho

    2018-01-12

    As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.

  7. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...

  8. Multimodal integration in statistical learning

    DEFF Research Database (Denmark)

    Mitchell, Aaron; Christiansen, Morten Hyllekvist; Weiss, Dan

    2014-01-01

    Recent advances in the field of statistical learning have established that learners are able to track regularities of multimodal stimuli, yet it is unknown whether the statistical computations are performed on integrated representations or on separate, unimodal representations. In the present study, we investigated the ability of adults to integrate audio and visual input during statistical learning. We presented learners with a speech stream synchronized with a video of a speaker's face. In the critical condition, the visual (e.g., /gi/) and auditory (e.g., /mi/) signals were occasionally … facilitated participants' ability to segment the speech stream. Our results therefore demonstrate that participants can integrate audio and visual input to perceive the McGurk illusion during statistical learning. We interpret our findings as support for modality-interactive accounts of statistical learning.

  9. Mathematical methods in quantum and statistical mechanics

    International Nuclear Information System (INIS)

    Fishman, L.

    1977-01-01

    The mathematical structure and closed-form solutions pertaining to several physical problems in quantum and statistical mechanics are examined in some detail. The J-matrix method, introduced previously for s-wave scattering and based upon well-established Hilbert space theory and related generalized integral transformation techniques, is extended to treat the ℓth partial wave kinetic energy and Coulomb Hamiltonians within the context of square-integrable (L²), Laguerre (Slater), and oscillator (Gaussian) basis sets. The theory of relaxation in statistical mechanics is examined within the context of the theory of linear integro-differential equations of the Master Equation type and their corresponding Markov processes. Several topics of a mathematical nature concerning various computational aspects of the L² approach to quantum scattering theory are discussed.

  10. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  11. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-01-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208Pb using the experimental levels of 207Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  12. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-02-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208Pb using the experimental levels of 207Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  13. Swiss energy statistics 2010

    International Nuclear Information System (INIS)

    2011-01-01

    This comprehensive report presents the Swiss Federal Office of Energy's statistics on energy production and consumption in Switzerland in 2010. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed, as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2010 and energy use in various sectors are presented. The Swiss energy balance with reference to the use of renewable sources of energy such as solar energy, biomass, wastes and ambient heat is discussed, and figures are presented on the contribution of renewables to heating and the generation of electrical power. In the third chapter, details are given for each energy carrier. The final chapter deals with economic and environmental aspects.

  14. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and

  15. Mathematical and statistical applications in life sciences and engineering

    CERN Document Server

    Adhikari, Mahima; Chaubey, Yogendra

    2017-01-01

    The book includes articles from eminent international scientists discussing a wide spectrum of topics of current importance in mathematics and statistics and their applications. It presents state-of-the-art material along with a clear and detailed review of the relevant topics and issues concerned. The topics discussed include message transmission, colouring problem, control of stochastic structures and information dynamics, image denoising, life testing and reliability, survival and frailty models, analysis of drought periods, prediction of genomic profiles, competing risks, environmental applications and chronic disease control. It is a valuable resource for researchers and practitioners in the relevant areas of mathematics and statistics.

  16. The statistical chopper in the time-of-flight technique

    International Nuclear Information System (INIS)

    Albuquerque Vieira, J. de.

    1975-12-01

    A detailed study of the 'statistical' chopper and of the method of analysis of the data obtained by this technique is made. The study includes the basic ideas behind correlation methods applied in time-of-flight techniques; comparisons with the conventional chopper made by an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; the presentation of the related fields of work to demonstrate the potential of this method and suggestions for future study together with the criteria for a time-of-flight experiment using the method being studied [pt
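The correlation method behind the statistical chopper can be sketched numerically: the detector record is the circular convolution of the time-of-flight spectrum with a pseudorandom open/closed chopper sequence, and cross-correlating the record with that sequence recovers the spectrum, because a maximal-length binary sequence has a two-valued autocorrelation. The 7-bit sequence and toy spectrum below are illustrative stand-ins:

```python
# Illustrative sketch of the correlation method (not the thesis code).
seq = [1, 1, 1, 0, 1, 0, 0]   # 7-bit maximal-length chopper sequence, 1 = open
N = len(seq)
spectrum = [0.0, 4.0, 9.0, 4.0, 1.0, 0.0, 0.0]  # toy time-of-flight spectrum
S = sum(spectrum)

# Detector record: circular convolution of the spectrum with the sequence.
counts = [sum(spectrum[k] * seq[(t - k) % N] for k in range(N))
          for t in range(N)]

# Circular cross-correlation of the record with the sequence. For this
# m-sequence the autocorrelation is 4 at lag 0 and 2 at every other lag,
# so corr[tau] = 4*spectrum[tau] + 2*(S - spectrum[tau]).
corr = [sum(counts[t] * seq[(t - tau) % N] for t in range(N))
        for tau in range(N)]

# Invert the two-valued autocorrelation to recover the spectrum.
recovered = [(c - 2 * S) / (4 - 2) for c in corr]
print(recovered)  # reproduces the toy spectrum
```

Because the chopper is open roughly half the time instead of a single narrow slot, the counting rate (and hence the statistical accuracy per unit beam time) is much higher than with a conventional chopper, which is the method's main appeal.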

  17. Game Indicators Determining Sports Performance in the NBA.

    Science.gov (United States)

    Mikołajec, Kazimierz; Maszczyk, Adam; Zając, Tomasz

    2013-01-01

    The main goal of the present study was to identify basketball game performance indicators which best determine sports level in the National Basketball Association (NBA) league. The research material consisted of all NBA game statistics across eight seasons (2003-11) and included 52 performance variables. Through detailed analysis, the variables with high influence on game effectiveness were selected for the final procedures. It has been shown that a limited number of factors, mostly offensive, determines sports performance in the NBA. The most critical indicators are: Win%, Offensive EFF, 3rd Quarter PPG, Win% CG, Avg Fouls and Avg Steals. In practical applications these results, connected with top teams and elite players, may help coaches to design better training programs.

  18. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow an algorithm to be built in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have stood out through their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a

  19. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available from which an algorithm could be built in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine that implements a fast statistical learning method. But even in applications where a classical algorithm had done a good job, statistical learning methods have stood out by their remarkable performance. This thesis gives an introduction to statistical learning methods and to their correct application in physics analysis. Their flexibility and high performance are demonstrated with results from high energy and astrophysics, including the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice wherever an algorithm based on prior knowledge exists. Some examples are found in physics analyses where these methods are not used in the right way, leading either to wrong predictions or to bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled …

  20. Statistical significance estimation of a signal within the GooFit framework on GPUs

    Directory of Open Access Journals (Sweden)

    Cristella Leonardo

    2017-01-01

    Full Text Available In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks, with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematic boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on NVIDIA GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi-Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood-ratio test statistic in different situations in which Wilks' theorem may or may not apply because its regularity conditions are not satisfied.
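The toy Monte Carlo idea above can be sketched in miniature. The snippet below uses a trivial Gaussian-mean test (an assumption made so each toy is cheap; the real study fits the full J/ψϕ mass model per toy in RooFit/GooFit) to build the null distribution of a likelihood-ratio statistic and compare the toy p-value with the Wilks chi-squared approximation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy Monte Carlo estimate of the null distribution of a likelihood-ratio
# test statistic q.  Model: H0: mu = 0 vs. free mu, known sigma = 1, for
# which -2 ln(lambda) reduces analytically to n * mu_hat^2.
def q_statistic(sample):
    return len(sample) * sample.mean() ** 2

q_null = np.array([q_statistic(rng.normal(0.0, 1.0, 100))
                   for _ in range(20000)])

q_obs = 9.0                                   # hypothetical observed value
p_toy = float(np.mean(q_null >= q_obs))       # empirical p-value from toys
p_wilks = stats.chi2.sf(q_obs, df=1)          # Wilks: q ~ chi2(1) if regular
```

For this regular model the two p-values agree; when the regularity conditions fail (e.g. a signal parameter pinned at a boundary), `p_toy` and `p_wilks` diverge, which is exactly what the toy-based approach is designed to detect.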

  1. CMHC research project: Testing of air barrier construction details, II: Report

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    Air leakage control through the building envelope of wood framed houses is more important than ever. The leakage of air is controlled by the air barrier system. There are several new technologies to construct an air barrier system for the building envelope. These are the Poly Approach, the Air Drywall Approach and the EASE system. The development of these systems was undertaken primarily by the building community without significant research and development. The purpose of this study was to determine the actual performance of several different types of construction details for each of the different approaches. Each of these details was designed and constructed using one of the air barrier methods and tested in the laboratory. The test details included the sill plate, the partition wall, the stair stringer, the electrical outlets, the bathtub detail, the plumbing stack detail, the metal chimney detail, the bathroom fan detail and the EASE wall system.

  2. Performance Monitoring System: Summary of Lock Statistics. Revision 1.

    Science.gov (United States)

    1985-12-01

    [Scanned tabular data: upbound/downbound lockage and tonnage statistics for individual locks (e.g. on the Arkansas River); the figures did not survive digitization.]

  3. Statistical handbook for Canada's upstream petroleum industry: '96 updates

    International Nuclear Information System (INIS)

    1997-01-01

    The Statistical Handbook of CAPP is an annual compilation of useful information about the Canadian petroleum and natural gas industry. It has been published since 1955, and is a key source of upstream petroleum statistics. It presents a historical summary of the petroleum industry's progress and provides detailed statistical information on the production and consumption of petroleum, petroleum products, natural gas and natural gas liquids, imports and exports, land sales, pipelines, reserves, drilling and refinery activities, and prices in Canada. The information, mostly in tabular form, is based on the latest available data (generally up to and including 1996). For the first time in 1997, the Handbook is also made available in CD-ROM format (EXCEL 5.0). Plans are also underway to publish the Handbook on a secure site on the Internet.

  4. Extreme event statistics in a drifting Markov chain

    Science.gov (United States)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces, we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present a detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
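The Sparre Andersen theorem invoked above states that for a zero-drift random walk with symmetric, continuous jumps, the probability of staying positive for n steps is C(2n, n)/4^n, independent of the jump distribution. A minimal synthetic check (illustrative walks, not the atomic-trace data; the Gaussian and Laplace jump laws are arbitrary choices):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n_steps, n_walks = 10, 20000

# Universal survival probability predicted by the Sparre Andersen theorem
p_exact = comb(2 * n_steps, n_steps) / 4 ** n_steps

survival = []
for jumps in (rng.normal(size=(n_walks, n_steps)),     # Gaussian jumps
              rng.laplace(size=(n_walks, n_steps))):   # Laplace jumps
    walks = jumps.cumsum(axis=1)
    # fraction of walks with all partial sums positive
    survival.append(float(np.mean((walks > 0).all(axis=1))))
```

Both empirical survival fractions land on the same universal value (about 0.176 for n = 10), regardless of the jump law, which is the universality that the experiment tests in the presence of a drift.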

  5. Statistical properties of chaotic dynamical systems which exhibit strange attractors

    International Nuclear Information System (INIS)

    Jensen, R.V.; Oberman, C.R.

    1981-07-01

    A path integral method is developed for the calculation of the statistical properties of turbulent dynamical systems. The method is applicable to conservative systems which exhibit a transition to stochasticity as well as dissipative systems which exhibit strange attractors. A specific dissipative mapping is considered in detail which models the dynamics of a Brownian particle in a wave field with a broad frequency spectrum. Results are presented for the low order statistical moments for three turbulent regimes which exhibit strange attractors corresponding to strong, intermediate, and weak collisional damping.

  6. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    Science.gov (United States)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate the dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model, and a simple analytical expression is derived to estimate the after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and the electric-field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and the behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating the high simulation accuracy.
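The statistical effect of after-pulsing can be illustrated as a branching process: each avalanche (primary dark count or after-pulse) may release a trapped carrier that re-triggers the device. The sketch below is a toy Monte Carlo with invented rates, not the paper's Verilog-A model, whose trapping parameters are physics-based and field-dependent:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Monte Carlo: primary dark counts are Poisson events; every count
# (primary or after-pulse) spawns one more after-pulse with probability
# p_ap.  The numbers are illustrative, not fitted to any device.
rate_dcr, p_ap, T = 1e3, 0.05, 10.0        # primary DCR (Hz), AP prob., time (s)

n_primary = rng.poisson(rate_dcr * T)      # primary dark counts in T
total, pending = 0, n_primary
while pending:
    total += pending
    pending = rng.binomial(pending, p_ap)  # next generation of after-pulses

ratio = total / n_primary                  # expected ~ 1 / (1 - p_ap)
```

The measured count rate is inflated over the primary dark count rate by roughly 1/(1 - p_ap), which is why an analytical after-pulsing probability expression is enough to correct simulated count statistics.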

  7. Forecasting of a ground-coupled heat pump performance using neural networks with statistical data weighting pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Esen, Hikmet; Esen, Mehmet [Department of Mechanical Education, Faculty of Technical Education, Firat University, 23119 Elazig (Turkey); Inalli, Mustafa [Department of Mechanical Engineering, Faculty of Engineering, Firat University, 23279 Elazig (Turkey); Sengur, Abdulkadir [Department of Electronic and Computer Science, Faculty of Technical Education, Firat University, 23119 Elazig (Turkey)

    2008-04-15

    The objective of this work is to improve the performance of an artificial neural network (ANN) with a statistical weighted pre-processing (SWP) method in learning to predict the performance of ground-coupled heat pump (GCHP) systems with a minimum data set. Experimental studies were completed to obtain training and test data. Air temperatures entering/leaving the condenser unit, water-antifreeze solution temperatures entering/leaving the horizontal ground heat exchangers, and ground temperatures (at 1 and 2 m) were used as the input layer, while the output is the coefficient of performance (COP) of the system. Statistical measures, such as the root-mean squared error (RMS), the coefficient of multiple determination (R{sup 2}) and the coefficient of variation (cov), are used to compare predicted and actual values for model validation. It is found that the RMS value is 0.074, the R{sup 2} value is 0.9999 and the cov value is 2.22 for the SCG6 algorithm of the ANN-only structure, whereas the RMS value is 0.002, the R{sup 2} value is 0.9999 and the cov value is 0.076 for the SCG6 algorithm of the SWP-ANN structure. The simulation results show that SWP-based networks can be used as an alternative in these systems. Therefore, instead of the limited experimental data found in the literature, faster and simpler solutions are obtained using hybridized structures such as SWP-ANN. (author)
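The three validation statistics quoted above can be computed in a few lines. Note that defining cov as the RMS error relative to the mean measured value is one common convention in this literature and is an assumption here; the COP numbers are invented for illustration:

```python
import numpy as np

def validation_metrics(y_true, y_pred):
    """RMS error, coefficient of multiple determination (R^2) and
    coefficient of variation (cov, %) for comparing predicted and
    measured values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rms = np.sqrt(np.mean((y_true - y_pred) ** 2))
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot            # coefficient of determination
    cov = 100.0 * rms / y_true.mean()     # assumed convention: RMS / mean, in %
    return rms, r2, cov

# made-up measured vs. predicted COP values, for illustration only
rms, r2, cov = validation_metrics([3.1, 3.4, 3.0, 3.6],
                                  [3.08, 3.42, 3.05, 3.55])
```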

  8. STATISTICAL EVALUATION OF SMALL SCALE MIXING DEMONSTRATION SAMPLING AND BATCH TRANSFER PERFORMANCE - 12093

    Energy Technology Data Exchange (ETDEWEB)

    GREER DA; THIEN MG

    2012-01-12

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), has previously presented the results of mixing performance in two different sizes of small scale DSTs to support scale-up estimates of full-scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high level waste to the WTP. WRPS has performed small scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results, which leads to the conclusion that the two scales of small DST behave similarly and that full-scale performance is predictable, will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing from the small scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data which is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests.

  9. Applying Statistical Mechanics to pixel detectors

    International Nuclear Information System (INIS)

    Pindo, Massimiliano

    2002-01-01

    Pixel detectors, being made of a large number of active cells of the same kind, can be considered as significant sets to which Statistical Mechanics variables and methods can be applied. By properly redefining well-known statistical parameters so that they match those that actually characterize pixel detectors, an analysis of the way they work can be performed from a totally new perspective. A deeper understanding of pixel detectors is attained, helping in the evaluation and comparison of their intrinsic characteristics and performance.

  10. The quantum theory of statistical multistep nucleus reactions

    CERN Document Server

    Zhivopistsev, F A

    2002-01-01

    The phenomenological models and quantum approaches to the description of statistical multistep nuclear reactions are discussed. The basic advantages and deficiencies of various modifications of the quantum theory of statistical multistep direct reactions, the Feshbach-Kerman-Koonin formalism and the generalized model of statistical multistep reactions (GMSMR), are considered in detail. The possibility of obtaining a consistent description of the experimental spectra for reactions with nucleons is shown by particular examples. Further improvement and development of the quantum formalism, for a more complete and consistent description of the various mechanisms of composite-particle formation in the output channel and a correct treatment of the unbound-state densities of the intermediate and final nuclei, is needed for the analysis of inclusive reactions with participation of composite particles (also taking into account the contributions of the nuclear cluster and shell regions to the cross sections)...

  11. Development of a statistical shape model of multi-organ and its performance evaluation

    International Nuclear Information System (INIS)

    Nakada, Misaki; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2010-01-01

    Existing statistical shape modeling methods for an organ cannot take into account the correlation between neighboring organs. This study focuses on a level set distribution model and proposes two modeling methods for multiple organs that can take this correlation into account. The first method combines the level set functions of multiple organs into a vector; it then analyses the distribution of these vectors over a training dataset by principal component analysis and builds a statistical shape model of the multiple organs. The second method constructs a statistical shape model for each organ independently and assembles the component scores of the different organs in the training dataset into a vector; it then analyses the distribution of these vectors to build a statistical shape model of the multiple organs. This paper shows results of applying the proposed methods, trained on 15 abdominal CT volumes, to 8 unknown CT volumes. (author)
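The first modelling idea can be sketched with synthetic data: concatenate the flattened level-set functions of two neighbouring organs per training case, then run PCA so the modes capture inter-organ correlation. Random vectors stand in for real level-set functions, and the correlation coefficient is invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins for the flattened level-set functions of two organs
# over 15 training cases (mirroring the 15 abdominal CT volumes).
n_cases, n_voxels = 15, 50
organ_a = rng.normal(size=(n_cases, n_voxels))
organ_b = 0.8 * organ_a + 0.2 * rng.normal(size=(n_cases, n_voxels))

X = np.hstack([organ_a, organ_b])        # one combined vector per case
X0 = X - X.mean(axis=0)                  # centre the training set

# PCA via singular value decomposition of the centred data matrix
U, s, Vt = np.linalg.svd(X0, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)      # fraction of variance per mode
scores = X0 @ Vt[:3].T                   # shape parameters, first 3 modes
```

Because both organs sit in the same vector, the leading modes deform them jointly; modelling each organ separately and only then assembling the component scores (the second method) moves the correlation analysis to score space instead.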

  12. Financial statistics of major publicly owned electric utilities, 1991

    International Nuclear Information System (INIS)

    1993-01-01

    The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.

  13. Financial statistics of major publicly owned electric utilities, 1991

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-31

    The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.

  14. Statistical properties of laser light scattering in Brownian medium

    International Nuclear Information System (INIS)

    Suwono; Santoso, Budi; Baiquni, A.

    1983-01-01

    The relationship between the statistical properties of laser light scattering in a Brownian medium and photon-counting distributions is described in detail. A coherent optical detection system has been constructed, and by using the photon-counting technique the ensemble distribution of the scattered field within space and time coherence has been measured. Good agreement between theory and experiment is shown. (author)

  15. A performance study on the synchronisation of heterogeneous Grid databases using CONStanza

    CERN Document Server

    Pucciani, G; Domenici, Andrea; Stockinger, Heinz

    2010-01-01

    In Grid environments, several heterogeneous database management systems are used in various administrative domains. However, data exchange and synchronisation need to be available across different sites and different database systems. In this article we present our data consistency service CONStanza and give details on how we achieve relaxed update synchronisation between different database implementations. The integration in existing Grid environments is one of the major goals of the system. Performance tests have been executed following a factorial approach. Detailed experimental results and a statistical analysis are presented to evaluate the system components and drive future developments. (C) 2010 Elsevier B.V. All rights reserved.

  16. Curve fitting and modeling with splines using statistical variable selection techniques

    Science.gov (United States)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
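The backward-elimination idea for knot selection can be sketched with scipy's least-squares B-spline fitter. This is a simplified stand-in for the FORTRAN program described above: the 10% degradation stopping rule and the test function are assumptions, not the report's statistical criterion:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)

def sse(knots):
    # residual sum of squares of a cubic B-spline least-squares fit
    spl = LSQUnivariateSpline(x, y, knots, k=3)
    return float(np.sum((y - spl(x)) ** 2))

# Backward elimination over interior knots: repeatedly drop the knot whose
# removal increases the residual sum of squares the least, stopping when
# any removal would degrade the fit by more than 10% (ad hoc threshold).
knots = list(np.linspace(0.1, 0.9, 9))
while len(knots) > 2:
    trial = [sse(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
    best = int(np.argmin(trial))
    if trial[best] > 1.10 * sse(knots):
        break
    knots.pop(best)
```

A statistically grounded version would replace the fixed threshold with an F-test on the change in residual sum of squares, which is closer in spirit to the variable-selection machinery the paper borrows from regression.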

  17. Photo-Realistic Statistical Skull Morphotypes: New Exemplars for Ancestry and Sex Estimation in Forensic Anthropology.

    Science.gov (United States)

    Caple, Jodi; Stephan, Carl N

    2017-05-01

    Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars, and their statistical exaggerations or extremes, retain the high-resolution detail of the original photographic dataset, making them ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.

  18. Detail study of SiC MOSFET switching characteristics

    DEFF Research Database (Denmark)

    Li, Helong; Munk-Nielsen, Stig

    2014-01-01

    This paper makes a detailed study of the switching characteristics of the latest SiC MOSFETs in relation to gate-driver maximum current, gate resistance, common-source inductance and parasitic switching-loop inductance. The switching performance of SiC MOSFETs in terms of turn-on and turn-off voltage …

  19. A Survey of Statistical Capstone Projects

    Science.gov (United States)

    Martonosi, Susan E.; Williams, Talithia D.

    2016-01-01

    In this article, we highlight the advantages of incorporating a statistical capstone experience in the undergraduate curriculum, where students perform an in-depth analysis of real-world data. Capstone experiences develop statistical thinking by allowing students to engage in a consulting-like experience that requires skills outside the scope of…

  20. Statistical Data Editing in Scientific Articles.

    Science.gov (United States)

    Habibzadeh, Farrokh

    2017-07-01

    Scientific journals are important scholarly forums for sharing research findings. Editors have important roles in safeguarding standards of scientific publication and should be familiar with correct presentation of results, among other core competencies. Editors do not have access to the raw data and should thus rely on clues in the submitted manuscripts. To identify probable errors, they should look for inconsistencies in presented results. Common statistical problems that can be picked up by a knowledgeable manuscript editor are discussed in this article. Manuscripts should contain a detailed section on statistical analyses of the data. Numbers should be reported with appropriate precisions. Standard error of the mean (SEM) should not be reported as an index of data dispersion. Mean (standard deviation [SD]) and median (interquartile range [IQR]) should be used for description of normally and non-normally distributed data, respectively. If possible, it is better to report 95% confidence interval (CI) for statistics, at least for main outcome variables. And, P values should be presented, and interpreted with caution, if there is a hypothesis. To advance knowledge and skills of their members, associations of journal editors are better to develop training courses on basic statistics and research methodology for non-experts. This would in turn improve research reporting and safeguard the body of scientific evidence. © 2017 The Korean Academy of Medical Sciences.
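The reporting rules summarized above (SD, not SEM, as the dispersion index; mean (SD) for normal data, median [IQR] for skewed data, CI for main outcomes) can be made concrete in a few lines. Variable names and numbers below are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
normal_var = rng.normal(120.0, 15.0, 200)   # e.g. systolic BP (invented)
skewed_var = rng.lognormal(1.0, 0.8, 200)   # e.g. length of stay (invented)

# Mean (SD) for approximately normal data.  SEM measures the precision of
# the mean, not the dispersion of the data, so it is not a substitute
# for SD; it belongs inside a confidence interval instead.
mean, sd = normal_var.mean(), normal_var.std(ddof=1)
sem = sd / np.sqrt(len(normal_var))
ci95 = stats.t.interval(0.95, len(normal_var) - 1, loc=mean, scale=sem)

# Median [IQR] for skewed data
q1, med, q3 = np.percentile(skewed_var, [25, 50, 75])
```

An editor checking a manuscript can apply the same logic in reverse: an implausibly small "SD" that shrinks with sample size is usually an SEM in disguise.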

  1. Modern applied statistics with S-plus

    CERN Document Server

    Venables, W N

    1994-01-01

    S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.

  2. Heuristic versus statistical physics approach to optimization problems

    International Nuclear Information System (INIS)

    Jedrzejek, C.; Cieplinski, L.

    1995-01-01

    Optimization is a crucial ingredient of many calculation schemes in science and engineering. In this paper we assess several classes of methods: heuristic algorithms; methods directly relying on statistical physics, such as the mean-field method and simulated annealing; and Hopfield-type neural networks and genetic algorithms, partly related to statistical physics. We perform the analysis for three types of problems: (1) the Travelling Salesman Problem, (2) vector quantization, and (3) a traffic control problem in a multistage interconnection network. In general, heuristic algorithms perform better (except for genetic algorithms) and much faster, but have to be specific for every problem. The key to improving the performance could be to include heuristic features into general purpose statistical physics methods. (author)
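Simulated annealing, the statistical-physics method named above, can be sketched for the Travelling Salesman Problem in a few lines. This is a minimal illustration under assumed parameters (20 random cities, geometric cooling, 2-opt moves), not the paper's benchmark setup:

```python
import numpy as np

rng = np.random.default_rng(6)
cities = rng.random((20, 2))                 # random cities in the unit square

def tour_length(order):
    closed = cities[np.r_[order, order[:1]]] # close the tour
    return float(np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1)))

# Simulated annealing: propose 2-opt segment reversals, accept uphill
# moves with Metropolis probability exp(-dE/T), cool T geometrically.
order = np.arange(len(cities))
T = 1.0
for _ in range(20000):
    i, j = sorted(rng.integers(0, len(cities), size=2))
    if i == j:
        continue
    cand = order.copy()
    cand[i:j + 1] = cand[i:j + 1][::-1]      # 2-opt reversal
    dE = tour_length(cand) - tour_length(order)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        order = cand
    T *= 0.9995

final_length = tour_length(order)
```

A pure heuristic such as greedy 2-opt descent is the same loop with T fixed at zero; annealing's only addition is the controlled acceptance of uphill moves, which is what lets it escape poor local optima.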

  3. South African coal statistics 2006. Marketing manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-08-15

    The report shows that South African thermal exports increased 5% from 66.6Mt to 69.9Mt in 2005 and that the country was the world's third largest seaborne exporter of thermal coal last year. Covering local coal consumption, South African coal imports, exports, prices and qualities, the report offers a complete statistical review of 2005. The report also includes details on labour, individual collieries, export and rail infrastructure and Black Empowerment (BEE) companies.

  4. The statistical mechanics of financial markets

    CERN Document Server

    Voit, Johannes

    2003-01-01

    From the reviews of the first edition - "Provides an excellent introduction for physicists interested in the statistical properties of financial markets. Appropriately early in the book the basic financial terms such as shorts, limit orders, puts, calls, and other terms are clearly defined. Examples, often with graphs, augment the reader’s understanding of what may be a plethora of new terms and ideas… [This is] an excellent starting point for the physicist interested in the subject. Some of the book’s strongest features are its careful definitions, its detailed examples, and the connection it establishes to physical systems." PHYSICS TODAY "This book is excellent at illustrating the similarities of financial markets with other non-equilibrium physical systems. [...] In summary, a very good book that offers more than just qualitative comparisons of physics and finance." (www.quantnotes.com) This highly-praised introductory treatment describes parallels between statistical physics and finance - both thos...

  5. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull…
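Fitting the candidate distribution families can be sketched with scipy. Synthetic lognormal "strengths" stand in for the ~6700 measured specimens (the parent distribution and parameters are assumptions for illustration); a Kolmogorov-Smirnov statistic gives a rough goodness-of-fit comparison:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic stand-in for timber bending strengths in MPa
strength = rng.lognormal(mean=3.6, sigma=0.25, size=1000)

dists = {"normal": stats.norm,
         "lognormal": stats.lognorm,
         "2p-weibull": stats.weibull_min}
fits, ks = {}, {}
for name, dist in dists.items():
    # location fixed at 0 for the 2-parameter lognormal and Weibull fits
    params = dist.fit(strength) if name == "normal" else dist.fit(strength, floc=0)
    fits[name] = params
    ks[name], _ = stats.kstest(strength, dist.cdf, args=params)
```

For structural design what matters is the lower tail (characteristic 5th-percentile strength), so tail-sensitive diagnostics or censored fits are typically examined alongside a whole-sample KS statistic.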

  6. Analysis of Detailed Energy Audits and Energy Use Measures of University Buildings

    Directory of Open Access Journals (Sweden)

    Kęstutis Valančius

    2011-12-01

    Full Text Available The paper presents the results of a detailed energy audit of the buildings of Vilnius Gediminas Technical University. The energy audits were performed with reference to an international scientific project. The article presents the methodology and results of detailed measurements of energy balance characteristics. Article in Lithuanian.

  7. Energy statistics. France; Statistiques energetiques. France

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-10-01

    This document summarizes in a series of tables the energy statistical data for France: consumption since 1973; energy supplies (production, imports, exports, stocks) and uses (refining, power production, internal uses, sectoral consumption) for coal, petroleum, gas, electricity, and renewable energy sources; national production and consumption of primary energy; final consumption per sector and per energy source; general indicators (energy bill, US$ change rate, prices, energy independence, internal gross product); projections. Details (resources, uses, prices, imports, internal consumption) are given separately for petroleum, natural gas, electric power and solid mineral fuels. (J.S.)

  8. Statistical mechanics of neocortical interactions: Path-integral evolution of short-term memory

    Science.gov (United States)

    Ingber, Lester

    1994-05-01

    Previous papers in this series of statistical mechanics of neocortical interactions (SMNI) have detailed a development from the relatively microscopic scales of neurons up to the macroscopic scales as recorded by electroencephalography (EEG), requiring an intermediate mesocolumnar scale to be developed at the scale of minicolumns (~10^2 neurons) and macrocolumns (~10^5 neurons). Opportunity was taken to view SMNI as sets of statistical constraints, not necessarily describing specific synaptic or neuronal mechanisms, on neuronal interactions, on some aspects of short-term memory (STM), e.g., its capacity, stability, and duration. A recently developed C-language code, pathint, provides a non-Monte Carlo technique for calculating the dynamic evolution of arbitrary-dimension (subject to computer resources) nonlinear Lagrangians, such as derived for the two-variable SMNI problem. Here, pathint is used to explicitly detail the evolution of the SMNI constraints on STM.

  9. The statistical-inference approach to generalized thermodynamics

    International Nuclear Information System (INIS)

    Lavenda, B.H.; Scherer, C.

    1987-01-01

    Limit theorems, such as the central-limit theorem and the weak law of large numbers, are applicable to statistical thermodynamics for sufficiently large sample sizes of independent and identically distributed observations performed on extensive thermodynamic (chance) variables. The estimation of the intensive thermodynamic quantities is a problem in parametric statistical estimation. The normal approximation to the Gibbs' distribution is justified by the analysis of large deviations. Statistical thermodynamics is generalized to include the statistical estimation of variance as well as mean values.
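The central-limit argument underpinning the normal approximation can be demonstrated numerically: sample means of a decidedly non-Gaussian variable become approximately normal with the predicted width. The exponential distribution and sizes below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(8)

# CLT illustration: means of n i.i.d. exponential draws (scale 2, so
# mean 2, sd 2) should be approximately N(2, 2/sqrt(n)) for large n.
n, reps = 500, 5000
means = rng.exponential(scale=2.0, size=(reps, n)).mean(axis=1)

mu_hat, sd_hat = means.mean(), means.std(ddof=1)
# CLT prediction for the spread of the sample mean
sd_clt = 2.0 / np.sqrt(n)
```

In the thermodynamic reading, the sample mean plays the role of an extensive variable per degree of freedom, and the 1/sqrt(n) shrinkage of its fluctuations is the statistical content of the normal approximation to the Gibbs distribution.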

  10. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; second, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation by nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect-free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be accurately determined from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.
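The core of statistical (grid) nanoindentation is deconvolution: indentation moduli measured over a heterogeneous microstructure form a mixture, and fitting the mixture separates the mechanical phases. A minimal 1-D two-component Gaussian-mixture fit by expectation-maximization, with invented phase parameters standing in for e.g. low- and high-density C-S-H:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic indentation moduli (GPa) from two "phases" (invented numbers)
E = np.concatenate([rng.normal(20.0, 2.0, 300),
                    rng.normal(30.0, 2.5, 200)])

# Two-component Gaussian-mixture fit by expectation-maximization
w = np.array([0.5, 0.5])                  # phase (volume) fractions
mu = np.array([15.0, 35.0])               # initial phase means
sig = np.array([5.0, 5.0])                # initial standard deviations
for _ in range(200):
    pdf = w * np.exp(-0.5 * ((E[:, None] - mu) / sig) ** 2) \
            / (sig * np.sqrt(2.0 * np.pi))
    r = pdf / pdf.sum(axis=1, keepdims=True)      # E-step: responsibilities
    w = r.mean(axis=0)                            # M-step: update fractions,
    mu = (r * E[:, None]).sum(axis=0) / r.sum(axis=0)    # means,
    sig = np.sqrt((r * (E[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))  # sds
```

The fitted weights estimate the phase volume fractions and the fitted means their indentation moduli, which are exactly the inputs the continuum-micromechanics upscaling step consumes.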

  11. Detailed climate-change projections for urban land-use change and green-house gas increases for Belgium with COSMO-CLM coupled to TERRA_URB

    Science.gov (United States)

    Wouters, Hendrik; Vanden Broucke, Sam; van Lipzig, Nicole; Demuzere, Matthias

    2016-04-01

    Recent research clearly shows that climate modelling at high resolution - which resolves deep convection, detailed orography and land use including urbanization - leads to better modelling performance with respect to temperatures, the boundary layer, clouds and precipitation. Increasing computational power enables the climate research community to address climate-change projections with higher accuracy and much more detail. In the framework of the CORDEX.be project, which aims at coherent high-resolution micro-ensemble projections for Belgium employing different GCMs and RCMs, KU Leuven contributes the downscaling of EC-EARTH global climate model projections (provided by the Royal Meteorological Institute of the Netherlands) to the Belgian domain. The downscaling is obtained with regional climate simulations at 12.5 km resolution over Europe (CORDEX-EU domain) and at 2.8 km resolution over Belgium (CORDEX.be domain) using COSMO-CLM coupled to the urban land-surface parametrization TERRA_URB. This is done for the present day (1975-2005) and the future (2040-2070 and 2070-2100). In these high-resolution runs, both GHG changes (in accordance with RCP8.5) and urban land-use changes (in accordance with a business-as-usual urban expansion scenario) are taken into account. Based on these simulations, it is shown how climate-change statistics are modified when going from coarse-resolution to high-resolution modelling. The climate-change statistics of particular interest are the changes in the number of extreme precipitation events and extreme heat waves in cities. The robustness of the signal change between the coarse and high resolution is further investigated, as is whether a (statistical) translation is possible. The different simulations also make it possible to address the relative impact of, and synergy between, urban expansion and increased GHG on the climate-change statistics. Hereby, it is investigated for which climate-change statistics the

  12. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
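The q → 1 equivalence with the Boltzmann-Gibbs (Shannon) entropy can be checked numerically for any discrete distribution; the probabilities below are an arbitrary example, not from the paper:

```python
import math

def renyi_entropy(p, q):
    # Renyi entropy of order q (q > 0, q != 1) of a discrete distribution p.
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    # Boltzmann-Gibbs / Shannon entropy, the q -> 1 limit of the Renyi entropy.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.5, 0.9, 0.99, 0.999):
    print(q, round(renyi_entropy(p, q), 4))
print("q -> 1 limit:", round(shannon_entropy(p), 4))
```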

  13. Data Collection Manual for Academic and Research Library Network Statistics and Performance Measures.

    Science.gov (United States)

    Shim, Wonsik "Jeff"; McClure, Charles R.; Fraser, Bruce T.; Bertot, John Carlo

    This manual provides a beginning approach for research libraries to better describe the use and users of their networked services. The manual also aims to increase the visibility and importance of developing such statistics and measures. Specific objectives are: to identify selected key statistics and measures that can describe use and users of…

  14. Surface detail reproduction and dimensional accuracy of stone models: influence of disinfectant solutions and alginate impression materials.

    Science.gov (United States)

    Guiraldo, Ricardo Danil; Borsato, Thaís Teixeira; Berger, Sandrine Bittencourt; Lopes, Murilo Baena; Gonini, Alcides; Sinhoreti, Mário Alexandre Coelho

    2012-01-01

    This study compared the surface detail reproduction and dimensional accuracy of stone models obtained from molds disinfected with 2% sodium hypochlorite, 2% chlorhexidine digluconate or 0.2% peracetic acid to models produced from molds which were not disinfected, with 3 alginate materials (Cavex ColorChange, Hydrogum 5 and Jeltrate Plus). The molds were prepared over a matrix containing 20-, 50-, and 75-µm lines, under pressure with a perforated metal tray. The molds were removed following gelation and either disinfected (by spraying one of the solutions followed by storage in closed jars for 15 min) or not disinfected. The samples were divided into 12 groups (n=5). Molds were filled with dental gypsum Durone IV, and 1 h after the start of the stone mixing the models were separated from the tray. Surface detail reproduction and dimensional accuracy were evaluated using optical microscopy on the 50-µm line, 25 mm in length, in accordance with the ISO 1563 standard. The dimensional accuracy results (%) were subjected to ANOVA. The 50-µm line was completely reproduced by all alginate impression materials regardless of the disinfection procedure. There was no statistically significant difference in the mean dimensional accuracy values for combinations of disinfection procedure and alginate impression material (p=0.2130) or for the independent factors. The disinfectant solutions and alginate materials used in this study are not determining factors for the surface detail reproduction and dimensional accuracy of stone models.
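The ANOVA applied to the dimensional accuracy data reduces, in its simplest one-way form, to comparing between-group and within-group mean squares. A minimal sketch with hypothetical (invented) accuracy values, not the study's data:

```python
def one_way_anova(groups):
    # F statistic for a one-way ANOVA over a list of sample groups:
    # ratio of between-group to within-group mean squares.
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical dimensional-accuracy values (%) for three disinfection
# conditions; illustrative only.
no_disinfect = [0.12, 0.15, 0.11, 0.14, 0.13]
hypochlorite = [0.13, 0.16, 0.12, 0.15, 0.14]
peracetic    = [0.14, 0.13, 0.15, 0.12, 0.16]

F = one_way_anova([no_disinfect, hypochlorite, peracetic])
print(round(F, 3))
```

The F value would then be compared against the F(k-1, n-k) distribution to decide significance, as in the study's p=0.2130 result.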

  15. Financial statistics of major investor-owned electric utilities, 1991

    International Nuclear Information System (INIS)

    1993-01-01

    The Financial Statistics of major Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues

  16. Financial statistics of selected investor-owned electric utilities, 1989

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The Financial Statistics of Selected Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.

  17. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    Science.gov (United States)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, in which supervised image classification techniques play a central role. Hence, using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, Bagged CART, stochastic gradient boosting and neural network with feature extraction, were tested and compared with two prevalent methods: random forest and support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross validation, independent validation and validation with the total training data. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey tests. In general, the results showed that random forest, with a marginal difference over Bagged CART and stochastic gradient boosting, is the best performing method, whereas based on independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.

  18. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    Science.gov (United States)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor in car interior aerodynamic noise control at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After the subsystems contributing most sensitively to car interior noise power are identified by comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with the unsteady aerodynamic pressure calculation on body surfaces and improved sound/damping properties of the materials. A reduction of more than 2 dB is achieved at centre frequencies in the spectrum above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and acoustic contribution sensitivity analysis.
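The SEA power balance underlying such a model can be illustrated for just two coupled subsystems; all numerical values below are invented for illustration and bear no relation to the paper's 80-subsystem vehicle model:

```python
import math

# Two-subsystem SEA power balance. omega: band centre frequency (rad/s);
# eta1, eta2: damping loss factors; eta12, eta21: coupling loss factors.
omega = 2 * math.pi * 1000.0   # 1 kHz band
eta1, eta2 = 0.02, 0.05
eta12, eta21 = 0.004, 0.002
P1, P2 = 1.0, 0.0              # input powers (W), e.g. aerodynamic loading

# Power balance equations:
#   P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
#   P2 = omega * (-eta12 * E1 + (eta2 + eta21) * E2)
# Solve the 2x2 linear system for the subsystem energies E1, E2.
a, b = omega * (eta1 + eta12), -omega * eta21
c, d = -omega * eta12, omega * (eta2 + eta21)
det = a * d - b * c
E1 = (P1 * d - b * P2) / det
E2 = (a * P2 - c * P1) / det

print(E1, E2)  # raising the receiver's damping eta2 lowers E2
```

At steady state the dissipated power omega*(eta1*E1 + eta2*E2) equals the total input power, which is a useful sanity check on any SEA solve.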

  19. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively based genetic processes, increasingly at the forefront of modern genetics. To this end, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. The sample comprised 42 students, and a supplementary interview was administered to a selected 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used genetics textbooks, as well as the genetics syllabi used by instructors, do not help the issue. The textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also true of the genetics syllabi reviewed for this study. Nonetheless

  20. Energy statistics. France. August 2001

    International Nuclear Information System (INIS)

    2001-08-01

    This document summarizes in a series of tables the statistical data on the production, consumption, supplies, resources, and prices of energy in France: 1 - all energies (coal, oil, gas, electric power, renewable energies): supplies, uses per sector, national production and consumption of primary energies, final consumption, general indicators (energy bill, US$ exchange rate, price indices, prices of imported crude oil, energy independence, gross domestic product, evolution between 1973 and 2000, and projections for 2020); 2 - detailed data per energy source (petroleum, natural gas, electric power, solid mineral fuels): resources, uses, and prices. An indicative comparison is made with the other countries of the European Union. (J.S.)

  1. Exchange of availability/performance data on base-load gas turbine and combined cycle plant

    Energy Technology Data Exchange (ETDEWEB)

    Jesuthasan, D.K.; Kaupang, B.M. (Tenaga Nasional Berhad (Malaysia))

    1992-09-01

    This paper describes the recommendations developed to facilitate the international exchange of availability/performance data on base-load gas turbines and combined cycle plant. Standardized formats for the collection of plant availability statistics, recognizing the inherent characteristics of gas turbines in simple and combined cycle plants, are presented. The formats also allow for a logical expansion of the data collection detail as that becomes desirable. To assist developing countries in particular, the approach includes basic formats for data collection needed for international reporting. In addition, the participating utilities will have a meaningful database for internal use. As experience is gained with this data collection system, it is expected that additional detail may be accommodated to enable further in-depth performance analysis at the plant and utility levels. 2 refs., 2 tabs., 11 apps.

  2. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating

  3. Oil companies' customer records as a source of petroleum statistics; Oljeselskapenes kunderegistre som kilde i petroleumsstatistikken

    Energy Technology Data Exchange (ETDEWEB)

    Isaksen, Elisabeth Thuestad; Hoeie, Henning; Flugsrud, Ketil

    2012-10-15

    Detailed sales data from oil companies' customer records are considered a better source for sales statistics on petroleum products than today's more aggregated sources. Using detailed data from sales transactions allows for a more reliable, more detailed and more consistent industry classification and geographic distribution of sales than is possible with current practice. Particularly for sales to transport and the public sector, the detailed data could enable a more accurate distribution of sales. (eb)

  4. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

    Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which observations are made and by the limited counting statistics, α. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed, and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt

  5. Shell model in large spaces and statistical spectroscopy

    International Nuclear Information System (INIS)

    Kota, V.K.B.

    1996-01-01

    For many nuclear structure problems of current interest it is essential to work with the shell model in large spaces. For this, three different approaches are now in use, two of which are: (i) the conventional shell model diagonalization approach, taking advantage of recent advances in computer technology; (ii) the shell model Monte Carlo method. A brief overview of these two methods is given. Large-space shell model studies raise fundamental questions regarding the information content of the shell model spectrum of complex nuclei. This led to the third approach: statistical spectroscopy methods. The principles of statistical spectroscopy have their basis in nuclear quantum chaos; they are described in some detail and substantiated by large-scale shell model calculations. (author)

  6. The statistics of sputtering

    International Nuclear Information System (INIS)

    Robinson, M.T.

    1993-01-01

    The MARLOWE program was used to study the statistics of sputtering using the example of 1- to 100-keV Au atoms normally incident on static (001) and (111) Au crystals. The yield of sputtered atoms was examined as a function of the impact point of the incident particles (''ions'') on the target surfaces. There were variations on two scales. The effects of the axial and planar channeling of the ions could be traced, the details depending on the orientation of the target and the energies of the ions. Locally, the sputtering yield was very sensitive to the impact point, small changes in position often producing large changes in yield. The results indicate strongly that the sputtering yield is a random (''chaotic'') function of the impact point

  7. Statistical report of A-bomb survivors detailed health examinations October 197 - March 1976

    International Nuclear Information System (INIS)

    Mori, Hiroyuki; Nakamura, Takeshi; Hosono, Chiharu; Inomata, Mariko; Okajima, Shunzo

    1978-01-01

    The subjects were 82,705 persons; females outnumbered males by about 16,000. The number of cases exposed to the atomic bomb within 2.0 km of the hypocenter was 476 (212 males and 264 females), 10.2% of the total. Regarding the correlations among the general examination items estimated from the statistical values, the mean age of males was 52.9 years, and the correlations of age with erythrocyte count, blood sedimentation and hemoglobin were high. The mean age of females was 53.3 years, and the correlation of age with maximum blood pressure was high, while the correlations of age with blood sedimentation and hemoglobin were not. Leukocyte count was directly proportional to urine sugar only in males. Correlation coefficients between urobilinogen and protein in urine were low in both sexes. The correlation between maximum and minimum blood pressure was fairly high, and maximum blood pressure in both sexes was directly proportional to age. In females, both maximum and minimum blood pressures were directly proportional to erythrocyte count and hemoglobin. The highest correlation was between the distance from the hypocenter and minimum blood pressure, in both sexes. Factor analysis based on the above correlation matrix demonstrated that the first factor was erythrocytes and the second factor was blood pressure. (Kanao, N.)

  8. Validation of risk-based performance indicators: Safety system function trends

    International Nuclear Information System (INIS)

    Boccio, J.L.; Vesely, W.E.; Azarm, M.A.; Carbonaro, J.F.; Usher, J.L.; Oden, N.

    1989-10-01

    This report describes and applies a process for validating a model for a risk-based performance indicator. The purpose of the risk-based indicator evaluated, Safety System Function Trend (SSFT), is to monitor the unavailability of selected safety systems. Interim validation of this indicator rests on three aspects: a theoretical basis, an empirical basis relying on statistical correlations, and case studies employing 25 plant-years of historical data collected from five plants for a number of safety systems. Results using the SSFT model are encouraging. Application of the model in case studies of the performance of important safety systems shows that statistically significant trends in, and levels of, system performance can be discerned, thereby providing leading indications of degrading and/or improving performance. Methods for developing system performance tolerance bounds are discussed and applied to aid in the interpretation of trends in this risk-based indicator. Some additional characteristics of the SSFT indicator, learned through the data-collection efforts and subsequent data analyses, are also discussed. The usefulness and practicality of other data sources for validation purposes are explored, and further validation of this indicator is noted. Additional research is underway to develop a more detailed estimator of system unavailability. 9 refs., 18 figs., 5 tabs

  9. Statistical monitoring of linear antenna arrays

    KAUST Repository

    Harrou, Fouzi

    2016-11-03

    The paper concerns the problem of monitoring linear antenna arrays using the generalized likelihood ratio (GLR) test. When an abnormal event (fault) affects an array of antenna elements, the radiation pattern changes and significant deviations from the desired design performance specifications can result. In this paper, the detection of faults is addressed from a statistical point of view as a fault detection problem. Specifically, a statistical method based on the GLR principle is used to detect potential faults in linear arrays. To assess the strength of the GLR-based monitoring scheme, three case studies involving different types of faults were performed. Simulation results clearly show the effectiveness of the GLR-based fault-detection method for monitoring the performance of linear antenna arrays.
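For a single monitored residual with known variance, the GLR statistic for a mean shift has a simple closed form. The sketch below is a generic illustration of the GLR principle, not the paper's array model; the shift size, sample sizes, and threshold are assumptions:

```python
import random

random.seed(1)

def glr_mean_shift(x, sigma=1.0):
    # GLR statistic for H0: mean = 0 vs H1: mean != 0 with known variance.
    # 2*log(likelihood ratio) = n * xbar**2 / sigma**2, chi-square(1) under H0.
    n = len(x)
    xbar = sum(x) / n
    return n * xbar * xbar / (sigma * sigma)

# Synthetic residuals: a healthy element scatters around 0, a faulty one drifts.
healthy = [random.gauss(0.0, 1.0) for _ in range(100)]
faulty = [random.gauss(0.8, 1.0) for _ in range(100)]

threshold = 3.84  # 95% quantile of chi-square(1)
print("healthy GLR:", round(glr_mean_shift(healthy), 2))
print("faulty GLR:", round(glr_mean_shift(faulty), 2))  # far above the threshold
```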

  10. Statistical assessment of numerous Monte Carlo tallies

    International Nuclear Information System (INIS)

    Kiedrowski, Brian C.; Solomon, Clell J.

    2011-01-01

    Four tests are developed to assess the statistical reliability of collections of tallies that number in thousands or greater. To this end, the relative-variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality. (author)

  11. Ridge Distance Estimation in Fingerprint Images: Algorithm and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Tian Jie

    2004-01-01

    Full Text Available It is important to estimate accurately the ridge distance, an intrinsic texture property of a fingerprint image. Up to now, only a few articles have touched directly upon ridge distance estimation, and little has been published providing a detailed evaluation of methods for ridge distance estimation, in particular the traditional spectral analysis method applied in the frequency domain. In this paper, a novel method operating on nonoverlapping blocks, called the statistical method, is presented to estimate the ridge distance. Direct estimation ratio (DER) and estimation accuracy (EA) are defined and used as parameters, along with time consumption (TC), to evaluate the performance of these two methods for ridge distance estimation. Based on a comparison of the performances of these two methods, a third, hybrid method is developed to combine the merits of both. Experimental results indicate that DER is 44.7%, 63.8%, and 80.6%; EA is 84%, 93%, and 91%; and TC is , , and seconds, with the spectral analysis method, statistical method, and hybrid method, respectively.

  12. Massive Memory Revisited: Limitations on Storage Capacity for Object Details in Visual Long-Term Memory

    Science.gov (United States)

    Cunningham, Corbin A.; Yassa, Michael A.; Egeth, Howard E.

    2015-01-01

    Previous work suggests that visual long-term memory (VLTM) is highly detailed and has a massive capacity. However, memory performance is subject to the effects of the type of testing procedure used. The current study examines detail memory performance by probing the same memories within the same subjects, but using divergent probing methods. The…

  13. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  14. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

    Explores computer-intensive probability and statistics for ecosystem management decision making Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises - making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...

  15. Statistical shape and appearance models of bones.

    Science.gov (United States)

    Sarkalkan, Nazli; Weinans, Harrie; Zadpoor, Amir A

    2014-03-01

    When applied to bones, statistical shape models (SSM) and statistical appearance models (SAM) respectively describe the mean shape and mean density distribution of bones within a certain population as well as the main modes of variations of shape and density distribution from their mean values. The availability of this quantitative information regarding the detailed anatomy of bones provides new opportunities for diagnosis, evaluation, and treatment of skeletal diseases. The potential of SSM and SAM has been recently recognized within the bone research community. For example, these models have been applied for studying the effects of bone shape on the etiology of osteoarthritis, improving the accuracy of clinical osteoporotic fracture prediction techniques, design of orthopedic implants, and surgery planning. This paper reviews the main concepts, methods, and applications of SSM and SAM as applied to bone. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  17. Airborne gamma-ray spectrometer and magnetometer survey, Durango C, Colorado. Final report Volume II A. Detail area

    International Nuclear Information System (INIS)

    1983-01-01

    Geology of Durango C detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation are included in this report. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, magnetic and ancillary profiles, and test line data

  18. Energy Statistics Manual; Manuel sur les statistiques de l'energie

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at the country level as well as at the international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market for oil -- the most heavily traded commodity worldwide -- needs to be closely monitored so that all market players know at any time what is produced, traded, stocked and consumed, and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  19. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    Science.gov (United States)

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  20. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    Science.gov (United States)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases") due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There is an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment was performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble.
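The abstract above lists several regression families for refining raw NWP output. As a rough illustration of such a MOS-style comparison (all data here are synthetic, and the model choices and hyperparameters are assumptions for the sketch, not the study's actual configuration), one might fit a few scikit-learn regressors mapping a raw forecast variable to observed generation and compare out-of-sample errors:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

# Synthetic stand-in for raw NWP wind forecasts and observed generation:
rng = np.random.default_rng(42)
nwp_wind = rng.uniform(0, 25, size=(2000, 1))        # raw forecast wind speed (m/s)
power = np.clip((nwp_wind[:, 0] / 12.0) ** 3, 0, 1)  # idealized turbine power curve
obs = power + rng.normal(0, 0.05, size=2000)         # "observed" normalized generation

X_tr, X_te, y_tr, y_te = train_test_split(nwp_wind, obs, random_state=0)

models = {
    "linear (baseline)": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    "support vector regression": SVR(),
}
mae = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mae[name] = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name:26s} MAE = {mae[name]:.3f}")
```

Because the synthetic power curve is strongly nonlinear, the tree-based and kernel methods should beat the linear baseline here, which mirrors the motivation for testing "machine learning" MOS methods in the study.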

  1. TRAN-STAT, Issue No. 3, January 1978. Topics discussed: some statistical aspects of compositing field samples

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1978-01-01

    Some statistical aspects of compositing field samples of soils for determining the content of Pu are discussed. Some of the potential problems involved in pooling samples are reviewed. This is followed by more detailed discussions and examples of compositing designs, adequacy of mixing, statistical models and their role in compositing, and related topics
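One basic statistical aspect of compositing is easy to illustrate: under perfect mixing, the measured value of a composite of k aliquots behaves like the mean of k individual samples, so variance among composites drops by roughly a factor of k, while within-composite heterogeneity is hidden. A toy simulation (all values synthetic; the lognormal model is an assumption chosen only because field concentration data are often right-skewed):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 4  # number of field samples (aliquots) pooled into each composite

# Hypothetical skewed concentration values: 200 composites of k samples each
individual = rng.lognormal(mean=0.0, sigma=1.0, size=(200, k))

var_individual = individual.var(ddof=1)  # variance among single samples
composite = individual.mean(axis=1)      # value of a perfectly mixed composite
var_composite = composite.var(ddof=1)    # variance among composite values

print(f"single-sample variance: {var_individual:.2f}")
print(f"composite variance    : {var_composite:.2f}")
print(f"ratio (expect ~1/{k}) : {var_composite / var_individual:.2f}")
```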

  2. Massive memory revisited: Limitations on storage capacity for object details in visual long-term memory

    OpenAIRE

    Cunningham, Corbin A.; Yassa, Michael A.; Egeth, Howard E.

    2015-01-01

    Previous work suggests that visual long-term memory (VLTM) is highly detailed and has a massive capacity. However, memory performance is subject to the effects of the type of testing procedure used. The current study examines detail memory performance by probing the same memories within the same subjects, but using divergent probing methods. The results reveal that while VLTM representations are typically sufficient to support performance when the procedure probes gist-based information, they...

  3. Detailed free span assessment for Mexilhao flow lines

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Antonio; Franco, Luciano; Eigbe, Uwa; BomfimSilva, Carlos [INTECSEA, Houston, TX (United States); Escudero, Carlos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The subsea gas production system of Mexilhao Field SPS-35, Santos Basin, offshore Brazil, is composed basically of two rigid 12.75-inch production flow lines, approximately 21 km long, installed on a fairly rough seabed. During the basic design, the free span assessment was performed considering the maximum allowable free span length determined by the response model proposed by DNV-RP-F105. This approach resulted in a large number of predicted free spans requiring correction, leading to a higher capital cost for the project. A detailed free span VIV fatigue assessment was therefore proposed, considering multi-span and multi-mode effects as well as the post-lay survey data. The assessment followed the DNV-RP-F105 recommendations for multi-span and multi-mode effects, using Finite Element Analysis to determine the natural frequencies, mode shapes and corresponding stresses associated with the mode shapes. The assessment was performed in three stages, the first during the detailed design as part of the bottom roughness analysis using the expected residual pipelay tension. The second stage was performed after pipelay, considering the post-lay survey data, where the actual requirements for span correction were determined. Actual pipelay tension was used and seabed soil stiffness was adjusted in the model to match the as-laid pipeline profile obtained from the survey data. The first and second stage assessments are seamlessly automated to speed up the evaluation process and allow for quick response in the field, which was important to keep the construction vessel time minimized. The third stage was performed once the corrections of the spans were made, and its purpose was to confirm that the new pipeline configuration along the supported spans had sufficient fatigue life for the temporary and operational phases. For the assessment of all three stages, the probability of occurrence and directionality of the near-bottom current was considered to improve prediction of the

  4. Pauli Paramagnetic Susceptibility of an Ideal Anyon Gas within Haldane Fractional Exclusion Statistics

    International Nuclear Information System (INIS)

    Qin Fang; Chen Jisheng

    2012-01-01

    The finite-temperature Pauli paramagnetic susceptibility of a three-dimensional ideal anyon gas obeying Haldane fractional exclusion statistics is studied analytically. Different from the result for an ideal Fermi gas, the susceptibility of an ideal anyon gas depends on a statistical factor g in the Haldane statistics model. The low-temperature and high-temperature behaviors of the susceptibility are investigated in detail. The Pauli paramagnetic susceptibility of the two-dimensional ideal anyons is also derived. It is found that the reciprocal of the susceptibility has a similar factorizable property to that exhibited by some thermodynamic quantities in two dimensions.

  5. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  6. Swiss energy statistics 2007

    International Nuclear Information System (INIS)

    2008-01-01

    This comprehensive report presents the Swiss Federal Office of Energy's statistics on energy production and consumption in Switzerland in 2007. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The article also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2007 and energy use in various sectors are presented. Finally, the Swiss energy balance with reference to the use of renewable sources of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power

  7. Swiss energy statistics 2000

    International Nuclear Information System (INIS)

    2001-01-01

    This comprehensive report presents the Swiss Federal Office of Energy's statistics on energy production and consumption in Switzerland in 2000. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The article also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2000 and energy use in various sectors are presented. Finally, the Swiss energy balance with reference to the use of renewable sources of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power

  8. Swiss energy statistics 2001

    International Nuclear Information System (INIS)

    2002-01-01

    This comprehensive report presents the Swiss Federal Office of Energy's statistics on energy production and consumption in Switzerland in 2001. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The article also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2001 and energy use in various sectors are presented. Finally, the Swiss energy balance with reference to the use of renewable sources of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power

  9. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTL) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil (samples collected at depths from zero to one-half foot) and deep soil (samples collected from 3 to 5 feet). These data sets were tested for statistical outliers and underlying distributions were determined by using the chi-squared test for goodness-of-fit. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% non-detects, non-parametric UTLs were computed
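A normal-theory 95%/95% UTL of the kind described can be sketched as follows. The tolerance-factor formula via the noncentral t distribution is the standard one for normally distributed data, but the function names and the example concentrations below are illustrative assumptions, not the facility's data:

```python
import numpy as np
from scipy import stats

def utl_95_95(x):
    """One-sided 95% upper tolerance limit covering 95% of a normally
    distributed background population (sketch of the standard formula)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # Tolerance factor k = t'_{0.95, n-1}(ncp) / sqrt(n), where t' is the
    # noncentral t distribution with noncentrality ncp = z_0.95 * sqrt(n).
    ncp = stats.norm.ppf(0.95) * np.sqrt(n)
    k = stats.nct.ppf(0.95, df=n - 1, nc=ncp) / np.sqrt(n)
    return x.mean() + k * x.std(ddof=1)

def utl_95_95_lognormal(x):
    """For lognormal data, compute the UTL on log concentrations and
    back-transform, as is common for environmental background data."""
    return np.exp(utl_95_95(np.log(x)))

# Hypothetical background concentrations (pCi/g) -- illustrative only:
background = np.array([1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.4, 1.0, 1.2, 1.1])
print(f"95%/95% UTL: {utl_95_95(background):.2f}")
```

For data sets dominated by non-detects, a non-parametric UTL (e.g., an upper order statistic) would replace this formula, as the abstract notes.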

  10. Generic Reliability-Based Inspection Planning for Fatigue Sensitive Details

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Straub, Daniel; Faber, Michael Havbro

    2005-01-01

    The generic approach for planning of in-service NDT inspections is extended to cover the case where the fatigue load is modified during the design lifetime of the structure. Generic reliability-based inspection planning has been developed as a practical approach to perform inspection planning of fatigue sensitive details in fixed offshore steel jacket platforms and FPSO ship structures. Inspection and maintenance activities are planned such that code based requirements to the safety of personnel and environment for the considered structure are fulfilled and at the same time such that the overall expected costs for design, inspections, repairs and failures are minimized. The method is based on the assumption of “no-finds” of cracks during inspections. Each fatigue sensitive detail is categorized according to their type of details (SN curves), FDF values, RSR values, inspection, repair and failure...

  11. The effect of giving detailed information about intravenous radiopharmaceutical administration on the anxiety level of patients who request more information

    International Nuclear Information System (INIS)

    Kaya, E.; Ciftci, I.; Demirel, R.; Gecici, O.; Cigerci, Y.

    2010-01-01

    Nuclear medicine procedures use radiopharmaceuticals, which produce radiation and potential adverse reactions, albeit at a low rate. It is the patient's ethical, legal, and medical right to be informed of the potential side effects of procedures applied to them. Our purpose was to determine the effect of providing information about intravenous radiopharmaceutical administration on the anxiety level of patients who request more information. This study was completed in two separate Nuclear Medicine Departments. The study included 620 (247 M, 373 F) patients who had been referred for myocardial perfusion, bone, dynamic renal, and thyroid scintigraphic examinations. The patients were divided into two groups according to whether or not they requested more information. Group 1 consisted of 388 patients who wanted to receive more information about the procedure, while Group 2 consisted of 232 patients who did not request additional information. The State-Trait Anxiety Inventory (STAI-S and STAI-T) was used to determine each patient's anxiety level. After simple information was given, state and trait anxiety levels were measured in both groups. We then gave detailed information to the patients in Group 1 and measured state anxiety again. Detailed information included an explanation of the radiopharmaceutical risk and probable side effects of the scan procedure. There was no statistical difference between Groups 1 and 2 in STAI-T or STAI-S scores after simple information was given (p=0.741 and p=0.945, respectively). The mean STAI-S score increased after the provision of detailed information, and the difference between the STAI-S scores measured after simple and after detailed information was statistically significant (p<0.001). The STAI-S score increased in 246 patients and decreased in 110 patients after detailed information, while there was no change in 32 patients. After detailed information, the greatest increase in STAI-S score was seen in the

  12. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  13. Student and Professor Gender Effects in Introductory Business Statistics

    Science.gov (United States)

    Haley, M. Ryan; Johnson, Marianne F.; Kuennen, Eric W.

    2007-01-01

    Studies have yielded highly mixed results as to differences in male and female student performance in statistics courses; the role that professors play in these differences is even less clear. In this paper, we consider the impact of professor and student gender on student performance in an introductory business statistics course taught by…

  14. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    Science.gov (United States)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
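The core of statistical optimization as described here is a covariance-weighted combination of an observed profile with a background profile, x = b + B(B + O)⁻¹(o − b). A minimal numerical sketch (all profiles and covariances below are synthetic, idealized assumptions; the actual algorithm estimates geographically varying uncertainty profiles and full correlation matrices):

```python
import numpy as np

def optimize_profile(obs, bg, O, B):
    """Statistically optimal combination x = bg + B (B + O)^-1 (obs - bg)."""
    K = B @ np.linalg.inv(B + O)   # gain matrix weighting obs vs background
    return bg + K @ (obs - bg)

n = 50
z = np.linspace(40, 80, n)            # impact altitude grid (km)
truth = np.exp(-z / 7.0)              # idealized bending-angle decay (rad)
bg = 1.05 * truth                     # background climatology with a 5% bias

rng = np.random.default_rng(1)
obs_sigma = 0.02 * truth[0] * (1 + (z - 40) / 20)  # noise grows with altitude
obs = truth + rng.normal(0.0, 1.0, n) * obs_sigma  # noisy observed profile

O = np.diag(obs_sigma ** 2)           # observation error covariance (diagonal)
B = np.diag((0.05 * truth) ** 2)      # background error covariance (diagonal)

opt = optimize_profile(obs, bg, O, B)
rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
print(f"RMSE raw obs  : {rmse(obs):.2e}")
print(f"RMSE optimized: {rmse(opt):.2e}")
```

At low altitudes the observation dominates (small O relative to B); at high altitudes, where observation noise swamps the signal, the result relaxes toward the background, which is exactly the high-altitude initialization role the abstract describes.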

  15. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  16. Statistical bootstrap approach to hadronic matter and multiparticle reactions

    International Nuclear Information System (INIS)

    Ilgenfritz, E.M.; Kripfganz, J.; Moehring, H.J.

    1977-01-01

    The authors present the main ideas behind the statistical bootstrap model and recent developments within this model related to the description of fireball cascade decay. Mathematical methods developed in this model might be useful in other phenomenological schemes of strong interaction physics; they are described in detail. The present status of applications of the model to various hadronic reactions is discussed. When discussing the relations of the statistical bootstrap model to other models of hadron physics the authors point out possibly fruitful analogies and dynamical mechanisms which are modelled by the bootstrap dynamics under definite conditions. This offers interpretations for the critical temperature typical for the model and indicates further fields of application. (author)

  17. Quantum-statistical kinetic equations

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P q -rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived

  18. High Resolution 3D Experimental Investigation of Flow Structures and Turbulence Statistics in the Viscous and Buffer Layer

    Science.gov (United States)

    Sheng, Jian; Malkiel, Edwin; Katz, Joseph

    2006-11-01

    Digital Holographic Microscopy is implemented to perform 3D velocity measurement in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Reτ=1,400. The measurements are performed at a resolution of ˜1μm over a sample volume of 1.5x2x1.5mm (x^+=50, y^+=60, z^+=50), sufficient for resolving buffer layer structures and for measuring the instantaneous wall shear stress distributions from velocity gradients in the sublayer. The data provides detailed statistics on the spatial distribution of both wall shear stress components along with the characteristic flow structures, including streamwise counter-rotating vortex pairs, multiple streamwise vortices, and rare hairpins. Conditional sampling identifies characteristic length scales of 70 wall units in the spanwise and 10 wall units in the wall-normal direction. In regions of high stress, the conditionally averaged flow consists of a stagnation-like sweeping motion induced by a counter-rotating pair of streamwise vortices. Regions with low stress are associated with ejection motion, also generated by pairs of counter-rotating vortices. Statistics on the local strain and on the geometric alignment between strain and vorticity show that the high-shear-generating vortices are inclined at 45° to the streamwise direction, indicating that the vortices are being stretched. On-going analysis examines statistics of helicity and strain and the impact of near-wall structures.

  19. On Detailing in Contemporary Architecture

    DEFF Research Database (Denmark)

    Kristensen, Claus; Kirkegaard, Poul Henning

    2010-01-01

    Details in architecture have a significant influence on how architecture is experienced. One can touch the materials and analyse the detailing - thus details give valuable information about the architectural scheme as a whole. The absence of perceptual stimulation like details and materiality / tactility can blur the meaning of the architecture and turn it into an empty statement. The present paper will outline detailing in contemporary architecture and discuss the issue with respect to architectural quality. Architectural cases considered as sublime pieces of architecture will be presented...

  20. Quarterly oil statistics. First quarter 1978

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    The aim of this report is to provide rapid, accurate and detailed statistics on oil supply and demand in the OECD area. Main components of the system are: complete balances of production, trade, refinery intake and output, final consumption, stock levels and changes; separate data for crude oil, NGL, feedstocks and nine product groups; separate trade data for main product groups, LPG and naphtha; imports for 41 origins; exports for 29 destinations; marine bunkers and deliveries to international civil aviation by product group; aggregates of quarterly data to annual totals; and natural gas supply and consumption.