WorldWideScience

Sample records for preliminary statistical analyses

  1. Preliminary results of statistical dynamic experiments on a heat exchanger

    International Nuclear Information System (INIS)

    Corran, E.R.; Cummins, J.D.

    1962-10-01

    The inherent noise signals present in a heat exchanger have been recorded and analysed in order to determine some of the statistical dynamic characteristics of the heat exchanger. These preliminary results show that the primary side temperature frequency response may be determined by analysing the inherent noise. The secondary side temperature frequency response and cross coupled temperature frequency responses between primary and secondary are poorly determined because of the presence of a non-stationary noise source in the secondary circuit of this heat exchanger. This may be overcome by correlating the dependent variables with an externally applied noise signal. Some preliminary experiments with an externally applied random telegraph type of signal are reported. (author)

  2. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analyses capability, based on a code package consisting of the best estimate 3D neutronic (PANTHER), system thermal hydraulic (RELAP5), core sub-channel thermal hydraulic (COBRA-3C), and fuel thermal mechanic (FRAPCON/FRAPTRAN) codes. A series of methodologies has been developed to perform and license reactor safety analysis and core reload design, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analyses capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)

  3. The SNS target station preliminary Title I shielding analyses

    International Nuclear Information System (INIS)

    Johnson, J.O.; Santoro, R.T.; Lillie, R.A.; Barnes, J.M.; McNeilly, G.S.

    2000-01-01

The Department of Energy (DOE) has given the Spallation Neutron Source (SNS) project approval to begin Title I design of the proposed facility to be built at Oak Ridge National Laboratory (ORNL). During the conceptual design phase of the SNS project, the target station bulk-biological shield was characterized and the activation of the major target station components was calculated. Shielding requirements were assessed with respect to weight, space, and dose-rate constraints for operating, shut-down, and accident conditions utilizing the SNS shield design criteria, DOE Order 5480.25, and requirements specified in 10 CFR 835. Since completion of the conceptual design phase, there have been major design changes to the target station as a result of the initial shielding and activation analyses, modifications brought about by engineering concerns, and feedback from numerous external review committees. These design changes have impacted the results of the conceptual design analyses and consequently have required a re-investigation of the new design. Furthermore, the conceptual design shielding analysis did not address many of the details associated with the engineering design of the target station. In this paper, some of the proposed SNS target station preliminary Title I shielding design analyses will be presented. The SNS facility (with emphasis on the target station), shielding design requirements, calculational strategy, and source terms used in the analyses will be described. Preliminary results and conclusions, along with recommendations for additional analyses, will also be presented. (author)

  4. Statistical and extra-statistical considerations in differential item functioning analyses

    Directory of Open Access Journals (Sweden)

    G. K. Huysamen

    2004-10-01

Full Text Available This article briefly describes the main procedures for performing differential item functioning (DIF) analyses and points out some of the statistical and extra-statistical implications of these methods. Research findings on the sources of DIF, including those associated with translated tests, are reviewed. As DIF analyses are oblivious of correlations between a test and relevant criteria, the elimination of differentially functioning items does not necessarily improve predictive validity or reduce any predictive bias. The implications of the results of past DIF research for test development in the multilingual and multi-cultural South African society are considered.
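One of the most common DIF procedures is the Mantel-Haenszel test, which compares item performance of a reference and a focal group after matching on total score. The sketch below is our illustration, not the article's method, and all counts are hypothetical:

```python
# A minimal Mantel-Haenszel DIF sketch (hypothetical counts, not data
# from the article). Each stratum is a 2x2 table
# [[ref_correct, ref_wrong], [focal_correct, focal_wrong]]
# for examinees matched on total test score.
import numpy as np

def mantel_haenszel_chi2(tables):
    """MH chi-square with continuity correction over 2x2 score strata."""
    a_sum = e_sum = v_sum = 0.0
    for t in (np.asarray(t, dtype=float) for t in tables):
        n_ref, n_foc = t[0].sum(), t[1].sum()   # group sizes in stratum
        m_right, m_wrong = t[:, 0].sum(), t[:, 1].sum()
        total = n_ref + n_foc
        a_sum += t[0, 0]                        # observed ref correct
        e_sum += n_ref * m_right / total        # expected under no DIF
        v_sum += (n_ref * n_foc * m_right * m_wrong
                  / (total ** 2 * (total - 1)))
    return (abs(a_sum - e_sum) - 0.5) ** 2 / v_sum

strata = [
    [[30, 10], [20, 20]],   # low-score stratum
    [[35, 5], [25, 15]],    # middle stratum
    [[38, 2], [30, 10]],    # high-score stratum
]
chi2 = mantel_haenszel_chi2(strata)
print(f"MH chi-square = {chi2:.2f}")  # compare with 3.84 (df=1, alpha=.05)
```

A value above the chi-square critical value flags the item as potentially functioning differentially; as the article stresses, such a flag says nothing about predictive validity.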

  5. Preliminary analyses of AP600 using RELAP5

    International Nuclear Information System (INIS)

    Modro, S.M.; Beelman, R.J.; Fisher, J.E.

    1991-01-01

This paper presents results of preliminary analyses of the proposed Westinghouse Electric Corporation AP600 design. AP600 is a two loop, 600 MW(e) pressurized water reactor (PWR) arranged in a two hot leg, four cold leg nuclear steam supply system (NSSS) configuration. In contrast to the present generation of PWRs, it is equipped with passive emergency core coolant (ECC) systems. Also, the containment and the safety systems of the AP600 interact with the reactor coolant system and each other in a more integral fashion than in present day PWRs. The containment in this design is the ultimate heat sink for removal of decay heat to the environment. Idaho National Engineering Laboratory (INEL) has studied the applicability of the RELAP5 code to AP600 safety analysis and has developed a model of the AP600 for the Nuclear Regulatory Commission. The model incorporates integral modeling of the containment, NSSS, and passive safety systems. The best available preliminary design data were used. Nodalization sensitivity studies were conducted to gain experience in modeling of systems and conditions which are beyond the applicability of previously established RELAP5 modeling guidelines or experience. Exploratory analyses were then undertaken to investigate AP600 system response during postulated accident conditions. Four small break LOCA calculations and two large break LOCA calculations were conducted.

  6. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
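The parametric/non-parametric pairing described above can be sketched in a few lines. SOCR Analyses itself is a Java toolkit; this is a Python analogue on two made-up samples:

```python
# A Python sketch of the parametric/non-parametric pair above (SOCR
# Analyses itself is Java; the two samples are hypothetical).
from scipy import stats

a = [5.1, 4.9, 5.0, 5.3, 4.8]    # group A measurements (hypothetical)
b = [6.2, 6.0, 6.5, 5.9, 6.1]    # group B measurements (hypothetical)

t_stat, t_p = stats.ttest_ind(a, b)    # parametric two-sample t-test
z_stat, w_p = stats.ranksums(a, b)     # non-parametric Wilcoxon rank-sum
print(f"t-test p = {t_p:.4f}, rank-sum p = {w_p:.4f}")
```

Both tests ask whether the two samples differ in location; the rank-sum version drops the normality assumption at some cost in power.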

  7. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

Full Text Available The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. Together with the already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Some examples of SOCR Analyses in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses.

  8. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  9. Systematic Mapping and Statistical Analyses of Valley Landform and Vegetation Asymmetries Across Hydroclimatic Gradients

    Science.gov (United States)

    Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.

    2015-12-01

    Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior works indicate that reduced insolation on northern (pole-facing) aspects prolongs snow pack persistence, and is associated with thicker, finer-grained soils, that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water-balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random forest based statistical model to predict valley slope asymmetry based upon numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. When slope-related statistics are excluded, due to possible autocorrelation, valley
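The random-forest modeling step described above can be sketched as follows. This is illustrative only, on synthetic data with hypothetical variable names, not the study's ~300 landscape measures:

```python
# Illustrative only: a random forest ranking terrain predictors of an
# asymmetry response on synthetic data (names are hypothetical, not the
# study's actual measures).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500
drainage_slope = rng.uniform(0, 1, n)
vegetation_cover = rng.uniform(0, 1, n)
unrelated_noise = rng.uniform(0, 1, n)
# Response dominated by drainage slope, mirroring the reported coupling.
asymmetry = (2.0 * drainage_slope + 0.3 * vegetation_cover
             + rng.normal(0, 0.1, n))

X = np.column_stack([drainage_slope, vegetation_cover, unrelated_noise])
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, asymmetry)
names = ["drainage_slope", "vegetation_cover", "unrelated_noise"]
for name, imp in zip(names, model.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

The `feature_importances_` ranking plays the role of the study's predictor screening; as the abstract notes, correlated predictors (e.g. slope-related statistics) can share or inflate importance and may need to be excluded.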

  10. European passive plant program preliminary safety analyses to support system design

    International Nuclear Information System (INIS)

    Saiu, Gianfranco; Barucca, Luciana; King, K.J.

    1999-01-01

In 1994, a group of European utilities, together with Westinghouse and its industrial partner GENESI (an Italian consortium including ANSALDO and FIAT), initiated a program designated EPP (European Passive Plant) to evaluate Westinghouse passive nuclear plant technology for application in Europe. In Phase 1 of the European Passive Plant Program, which was completed in 1996, a 1000 MWe passive plant reference design (EP1000) was established which conforms to the European Utility Requirements (EUR) and is expected to meet the European safety authorities' requirements. Phase 2 of the program was initiated in 1997 with the objective of developing the Nuclear Island design details and performing supporting analyses to start development of the Safety Case Report (SCR) for submittal to European licensing authorities. The first part of Phase 2, the 'Design Definition' phase (Phase 2A), was completed at the end of 1998, the main efforts being design definition of key systems and structures, development of the Nuclear Island layout, and preliminary safety analyses to support design efforts. Incorporation of the EUR has been a key design requirement for the EP1000 from the beginning of the program. Detailed design solutions to meet the EUR have been defined and the safety approach has also been developed based on the EUR guidelines. The present paper describes the EP1000 approach to safety analysis and, in particular, to the Design Extension Conditions that, according to the EUR, represent the preferred method for giving consideration to Complex Sequences and Severe Accidents at the design stage without including them in the design basis conditions. Preliminary results of some DEC analyses and an overview of the probabilistic safety assessment (PSA) are also presented. (author)

  11. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
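The "what if" exercise the paper describes in Excel and R can be reproduced in a few lines: hold the effect size fixed and watch the p-value change with sample size alone (our sketch, with an assumed effect size):

```python
# A minimal "what if" sketch in Python (the paper uses Excel and R):
# with the effect size held fixed, the p-value of a two-sample t-test
# is driven by sample size alone.
from scipy import stats

effect = 0.3                         # assumed standardized mean difference
p_by_n = {}
for n in (10, 50, 200):              # per-group sample sizes to try
    t = effect / (2 / n) ** 0.5      # t statistic implied by the effect
    p_by_n[n] = 2 * stats.t.sf(t, df=2 * n - 2)
    print(f"n={n:4d}  t={t:.2f}  p={p_by_n[n]:.4f}")
```

The same fixed effect is "not significant" at n = 10 and "significant" at n = 200, which is exactly the lesson the "what if" analyses aim to teach.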

  12. Statistical analysis of lightning electric field measured under Malaysian condition

    Science.gov (United States)

    Salimi, Behnam; Mehranzamir, Kamyar; Abdul-Malek, Zulkurnain

    2014-02-01

Lightning is an electrical discharge during thunderstorms that can occur either within clouds (Inter-Cloud) or between clouds and ground (Cloud-Ground). Lightning characteristics and their statistical information are the foundation for the design of lightning protection systems as well as for the calculation of lightning radiated fields. Nowadays, there are various techniques to detect lightning signals and to determine various parameters produced by a lightning flash, each with its own claimed performance. In this paper, the characteristics of captured broadband electric fields generated by cloud-to-ground lightning discharges in the south of Malaysia are analyzed. A total of 130 cloud-to-ground lightning flashes from 3 separate thunderstorm events (each event lasting about 4-5 hours) were examined. Statistical analyses of the following signal parameters are presented: preliminary breakdown pulse train time duration, time interval between preliminary breakdowns and return stroke, stroke multiplicity, and percentage of single-stroke flashes. The BIL model is also introduced to characterize the lightning signature patterns. The statistical analyses show that about 79% of lightning signals fit well with the BIL model. The maximum and minimum preliminary breakdown time durations of the observed lightning signals are 84 ms and 560 µs, respectively. The statistical results show that 7.6% of the flashes were single-stroke flashes, and the maximum number of strokes recorded was 14 strokes per flash. A preliminary breakdown signature can be identified in more than 95% of the flashes.

  13. Statistical analyses of digital collections: Using a large corpus of systematic reviews to study non-citations

    DEFF Research Database (Denmark)

    Frandsen, Tove Faber; Nicolaisen, Jeppe

    2017-01-01

    Using statistical methods to analyse digital material for patterns makes it possible to detect patterns in big data that we would otherwise not be able to detect. This paper seeks to exemplify this fact by statistically analysing a large corpus of references in systematic reviews. The aim...

  14. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
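For the correlation case, the power calculation G*Power automates is commonly approximated with Fisher's z transform. The sketch below is our normal-approximation version, not G*Power's exact routine:

```python
# Hedged sketch, not G*Power itself: approximate two-sided power for
# detecting a Pearson correlation rho (vs. rho = 0) via Fisher's z.
import math
from scipy.stats import norm

def correlation_power(rho, n, alpha=0.05):
    """Normal-approximation power of the test of H0: rho = 0."""
    z_rho = math.atanh(rho)              # Fisher z of the true correlation
    se = 1.0 / math.sqrt(n - 3)          # standard error of the estimate
    z_crit = norm.ppf(1 - alpha / 2)
    return (norm.sf(z_crit - z_rho / se)
            + norm.cdf(-z_crit - z_rho / se))

print(round(correlation_power(rho=0.3, n=84), 3))   # close to 0.80
```

The classic benchmark that n of roughly 84 gives 80% power to detect r = .3 at alpha = .05 falls out of this approximation.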

  15. Preliminary thermal-hydraulic and structural strength analyses for pre-moderator of cold moderator

    International Nuclear Information System (INIS)

    Aso, Tomokazu; Kaminaga, Masanori; Terada, Atsuhiko; Hino, Ryutaro

    2001-08-01

A light-water cooled pre-moderator with a thin-walled structure made of aluminum alloy is installed around a liquid hydrogen moderator in order to enhance the neutron performance of a MW-scale spallation target system being developed at the Japan Atomic Energy Research Institute (JAERI). Since the pre-moderator needs to be located close to a target working as a neutron source, it is indispensable to remove nuclear heat deposition in the pre-moderator effectively by means of smooth water flow without flow stagnation. Also, the structural integrity of the thin-walled structure should be maintained against the water pressure. Preliminary thermal-hydraulic analytical results showed that the water temperature rise could be suppressed to less than 1°C while keeping the smooth water flow, which would assure the expected neutron performance. As for the structural integrity, several measures to meet the allowable stress conditions of the aluminum alloy were proposed on the basis of the preliminary structural strength analyses. (author)

  16. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
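The time-lagged cross-correlation technique emphasized above can be sketched on synthetic series; real studies would correlate solar wind drivers against geomagnetic indices, and the recovered lag estimates the magnetospheric response time:

```python
# A minimal sketch of time-lagged cross-correlation on synthetic series
# (real analyses would use solar wind inputs vs. geomagnetic indices).
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 1000, 5
driver = rng.normal(size=n)                       # e.g. a solar wind input
response = np.roll(driver, true_lag) + 0.5 * rng.normal(size=n)

def lagged_corr(x, y, lag):
    """Pearson correlation of x[t] with y[t + lag], lag >= 0."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

corrs = [lagged_corr(driver, response, k) for k in range(11)]
best = int(np.argmax(corrs))
print(f"best lag = {best}, correlation = {corrs[best]:.2f}")
```

The peak of the lagged correlation function recovers the built-in delay; as the review cautions, a significant peak alone does not establish a physical coupling mechanism.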

  17. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
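The first three of the five pattern tests enumerated above can be sketched on a single synthetic scatterplot (our toy data, not the two-phase fluid flow model's output):

```python
# Sketch of the first three pattern tests on one synthetic scatterplot:
# (1) linear association, (2) monotonic association, (3) trends in
# central tendency across bins of the independent variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.uniform(0, 1, 300)
y = np.sqrt(x) + rng.normal(0, 0.1, 300)     # monotonic, mildly nonlinear

r, _ = stats.pearsonr(x, y)                  # (1) correlation coefficient
rho, _ = stats.spearmanr(x, y)               # (2) rank correlation
bins = np.digitize(x, [1 / 3, 2 / 3])        # three equal-width x bins
h, p_kw = stats.kruskal(*(y[bins == b] for b in range(3)))  # (3)
print(f"pearson={r:.2f}  spearman={rho:.2f}  kruskal p={p_kw:.2g}")
```

Running the battery in order of increasing complexity mirrors the paper's strategy: a relationship missed by the linear test can still be caught by the rank or binned tests, and any flagged variable should then be checked for a physical explanation.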

  18. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project ''Statistical analyses of extreme food habits'', which was ordered from the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposition by emission of radioactive substances from facilities of nuclear technology''. Its aim is to show if the calculation of the radiation ingested by 95% of the population by food intake, like it is planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, the dimension of it should be determined. It was possible to prove the existence of this overestimation but its dimension could only roughly be estimated. To identify the real extent of it, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition to this the report shows how the amounts of food consumption of different groups of foods influence each other and which connections between these amounts should be taken into account, in order to estimate the radiation exposition as precise as possible. (orig.) [de

  19. Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA

    Science.gov (United States)

    Thorndahl, S.; Smith, J. A.; Krajewski, W. F.

    2012-04-01

During the last two decades, the mid-western states of the United States of America have been largely afflicted by heavy flood-producing rainfall. Several of these storms seem to have similar hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether it is possible to derive general characteristics of the space-time structures of these heavy storms. This is important in order to understand hydrometeorological features, e.g. how storms evolve and with what frequency we can expect extreme storms to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, high-quality radar observation periods now exceeding two decades, it is possible to do long-term spatio-temporal statistical analyses of extremes. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures which cause floods. However, statistical frequency analyses of rainfall based on radar observations introduce challenges in converting radar reflectivity observations to "true" rainfall that do not arise in traditional analyses of rain gauge data. It is, for example, difficult to distinguish reflectivity from high-intensity rain from reflectivity from other hydrometeors such as hail, especially using single-polarization radars, which are used in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded and anomalous propagation should be corrected in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, ZR-relationships, etc. 
The present study analyzes radar rainfall observations from 1996 to 2011 based on the American NEXRAD network of radars over an area covering parts of Iowa, Wisconsin, Illinois, and

  20. Chemical analyses of rocks, minerals, and detritus, Yucca Mountain--Preliminary report, special report No. 11

    International Nuclear Information System (INIS)

    Hill, C.A.; Livingston, D.E.

    1993-09-01

This chemical analysis study is part of the research program of the Yucca Mountain Project intended to provide the State of Nevada with a detailed assessment of the geology and geochemistry of Yucca Mountain and adjacent regions. This report is preliminary in the sense that more chemical analyses may be needed in the future and also in the sense that these chemical analyses should be considered as a small part of a much larger geological data base. The interpretations discussed herein may be modified as that larger data base is examined and established. All of the chemical analyses performed to date are shown in Table 1. There are three parts to this table: (1) trace element analyses on rocks (limestone and tuff) and minerals (calcite/opal), (2) rare earth analyses on rocks (tuff) and minerals (calcite/opal), and (3) major element analyses plus CO2 on rocks (tuff) and detritus sand. In this report, for each of the three parts of the table, the data and its possible significance will be discussed first, then some overall conclusions will be made, and finally some recommendations for future work will be offered.

  1. Statistical analyses of the data on occupational radiation expousure at JPDR

    International Nuclear Information System (INIS)

    Kato, Shohei; Anazawa, Yutaka; Matsuno, Kenji; Furuta, Toshishiro; Akiyama, Isamu

    1980-01-01

In the statistical analyses of the data on occupational radiation exposure at JPDR, the following statistical features were obtained. (1) The individual doses followed a log-normal distribution. (2) In the distribution of doses from one job in the controlled area, the logarithm of the mean (μ) depended on the exposure rate γ (mR/h), while σ correlated with the nature of the job and was normally distributed. These relations were as follows: μ = 0.48 ln γ - 0.24, σ = 1.2 ± 0.58. (3) For data containing different groups, the distribution of doses showed a polygonal line on log-normal probability paper. (4) Under dose limitation, the distribution of doses showed an asymptotic curve along the limit on log-normal probability paper. (author)
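The log-normality finding is easy to verify on simulated data: the logarithms of log-normal doses should recover the stated μ and σ. The sketch below uses made-up parameter values, not the JPDR records:

```python
# Synthetic check of the reported log-normality (assumed parameters,
# not the JPDR records): draw log-normal doses, then recover mu and
# sigma from the logarithms, the quantities discussed in the abstract.
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 0.5, 1.2                     # assumed parameters of ln(dose)
doses = rng.lognormal(mean=mu, sigma=sigma, size=5000)

log_doses = np.log(doses)
print(f"estimated mu = {log_doses.mean():.2f}, "
      f"sigma = {log_doses.std():.2f}")
```

On log-normal probability paper, such a sample plots as a straight line; mixing groups with different μ would produce the polygonal line noted in point (3).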

  2. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
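The core of the automated consistency check is simple to sketch: recompute the p-value implied by a reported test statistic and degrees of freedom, and compare it with the reported p. This is our illustration in the spirit of that procedure, not the authors' tool:

```python
# A minimal consistency check in the spirit of the automated procedure
# described above (our sketch, not the authors' tool): recompute the
# two-sided p-value from a reported t statistic and degrees of freedom.
from scipy import stats

def p_consistent(t, df, reported_p, tol=0.005):
    """True if the reported p matches the recomputed two-sided p-value."""
    recomputed = 2 * stats.t.sf(abs(t), df)
    return abs(recomputed - reported_p) < tol

print(p_consistent(t=2.10, df=28, reported_p=0.045))   # consistent
print(p_consistent(t=2.10, df=28, reported_p=0.30))    # inconsistent
```

A full implementation would also parse statistics out of article text and distinguish mere rounding slips from errors that flip the significance decision, the distinction the study reports separately.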

  3. Statistical analyses of conserved features of genomic islands in bacteria.

    Science.gov (United States)

    Guo, F-B; Xia, Z-K; Wei, W; Zhao, H-L

    2014-03-17

    We performed statistical analyses of five conserved features of genomic islands in bacteria. The analyses were based on 104 known genomic islands, which were identified by comparative methods. Four of these features, sequence size, abnormal G+C content, a flanking tRNA gene, and an embedded mobility gene, are frequently investigated; one relatively new feature, G+C homogeneity, was also investigated. Among the 104 known genomic islands, 88.5% fell within the typical length of 10-200 kb and 80.8% had G+C deviations with absolute values larger than 2%. Of the 88 genomic islands whose hosts have been sequenced and annotated, 52.3% had flanking tRNA genes and 64.7% had embedded mobility genes. For the homogeneity feature, 85% had an h homogeneity index of less than 0.1, indicating that their G+C content is relatively uniform. Taking all five features into account, 87.5% of the 88 genomic islands had three of them. Only one genomic island had just one conserved feature, and none had zero. These statistical results should help in understanding the general structure of known genomic islands. We also found that larger genomic islands tend to have small absolute G+C deviations; for example, the absolute G+C deviations of the 9 genomic islands longer than 100,000 bp were all less than 5%. This is a novel but reasonable result, given that larger genomic islands should face greater restrictions on their G+C content in order to maintain the stable G+C content of the recipient genome.

  4. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to the Electric Power Research Institute, started a one-year program to develop methodology for the statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods that can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors, and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were (i) to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given by a closed-form relationship of the input variables, and (ii) to repeat the study for a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of this report is to document the results of the investigations completed under these tasks, giving the rationale for the choices of techniques and problems, and to present interim conclusions.
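    The first task's setting — a closed-form output with uncertain inputs — admits a minimal Monte Carlo sketch. The relationship and the input distributions below are invented for illustration; the report's actual thermal-hydraulic problem is not reproduced here:

```python
import random
import statistics

def coolant_temp_rise(q, m_dot, cp):
    """Hypothetical closed-form output variable: dT = q / (m_dot * cp),
    a stand-in for the simple thermal-hydraulic problem."""
    return q / (m_dot * cp)

def monte_carlo(n=20000, seed=1):
    """Propagate assumed input uncertainties to the output distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        q = rng.gauss(1.0e6, 5.0e4)     # heat input, W (assumed 5% s.d.)
        m_dot = rng.gauss(50.0, 2.0)    # coolant mass flow, kg/s
        cp = rng.gauss(4180.0, 40.0)    # specific heat, J/(kg.K)
        samples.append(coolant_temp_rise(q, m_dot, cp))
    samples.sort()
    return statistics.mean(samples), samples[int(0.95 * n)]

mean_dt, dt95 = monte_carlo()           # mean and 95th-percentile dT, K
```

Analytical techniques (e.g. first-order variance propagation) would be benchmarked against such sampled distributions.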

  5. Preliminary Results of Ancillary Safety Analyses Supporting TREAT LEU Conversion Activities

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, A. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Fei, T. [Argonne National Lab. (ANL), Argonne, IL (United States); Strons, P. S. [Argonne National Lab. (ANL), Argonne, IL (United States); Papadias, D. D. [Argonne National Lab. (ANL), Argonne, IL (United States); Hoffman, E. A. [Argonne National Lab. (ANL), Argonne, IL (United States); Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-01

    Report (FSAR) [3]. Depending on the availability of historical data derived from HEU TREAT operation, results calculated for the LEU core are compared to measurements obtained from HEU TREAT operation. While all analyses in this report are largely considered complete and have been reviewed for technical content, it is important to note that all topics will be revisited once the LEU design approaches its final stages of maturity. For most safety significant issues, it is expected that the analyses presented here will be bounding, but additional calculations will be performed as necessary to support safety analyses and safety documentation. It should also be noted that these analyses were completed as the LEU design evolved, and therefore utilized different LEU reference designs. Preliminary shielding, neutronic, and thermal hydraulic analyses have been completed and have generally demonstrated that the various LEU core designs will satisfy existing safety limits and standards also satisfied by the existing HEU core. These analyses include the assessment of the dose rate in the hodoscope room, near a loaded fuel transfer cask, above the fuel storage area, and near the HEPA filters. The potential change in the concentration of tramp uranium and change in neutron flux reaching instrumentation has also been assessed. Safety-significant thermal hydraulic items addressed in this report include thermally-induced mechanical distortion of the grid plate, and heating in the radial reflector.

  6. Statistical reporting errors and collaboration on statistical analyses in psychological science

    NARCIS (Netherlands)

    Veldkamp, C.L.S.; Nuijten, M.B.; Dominguez Alvarez, L.; van Assen, M.A.L.M.; Wicherts, J.M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we

  7. A weighted U-statistic for genetic association analyses of sequencing data.

    Science.gov (United States)

    Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing

    2014-12-01

    With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. The association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
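    A generic weighted U-statistic averages a phenotype kernel over all subject pairs, weighted by genotype similarity. The sketch below shows only that generic form; WU-SEQ's actual weight and kernel choices are not reproduced here, and the toy kernels are assumptions:

```python
def weighted_u(phenotypes, genotypes, pheno_kernel, geno_weight):
    """Generic weighted U-statistic: weighted average of a phenotype
    kernel over all subject pairs, with weights from genotype similarity."""
    n = len(phenotypes)
    num = 0.0
    den = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            w = geno_weight(genotypes[i], genotypes[j])
            num += w * pheno_kernel(phenotypes[i], phenotypes[j])
            den += w
    return num / den if den else 0.0

# toy kernels: phenotype closeness and shared rare-allele count
pheno = [0.1, 0.2, 2.5, 2.6]
geno = [(1, 0), (1, 0), (0, 1), (0, 1)]
u = weighted_u(
    pheno, geno,
    pheno_kernel=lambda a, b: -abs(a - b),
    geno_weight=lambda g, h: sum(x and y for x, y in zip(g, h)),
)
```

Here only genotype-similar pairs contribute, so subjects sharing rare variants dominate the statistic.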

  8. Translation of the Manchester Clinical Supervision Scale (MCSS) into Danish and a preliminary psychometric validation

    DEFF Research Database (Denmark)

    Buus, Niels; Gonge, Henrik

    2013-01-01

    for the translation of the MCSS from English into Danish and to present a preliminary psychometric validation of the Danish version of the scale. Methods included a formal translation/back-translation procedure and statistical analyses. The sample consisted of MCSS scores from 139 Danish mental health nursing staff...

  9. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, and the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies as well as proper calibration of the models to the data from training sample sets.

  10. Applied statistics a handbook of BMDP analyses

    CERN Document Server

    Snell, E J

    1987-01-01

    This handbook is a realization of a long-term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth to the point where it can serve many of the needs of accomplished statisticians, it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics: Principles and Examples by Cox and Snell appeared, we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...

  11. Preliminary results of ⁴⁰Ca(e,e'c) reaction analysis, c = p, α, based on the statistical model

    International Nuclear Information System (INIS)

    Herdade, S.B.; Emrich, H.J.

    1990-01-01

    Statistical model calculations for the reactions ⁴⁰Ca(e,e'p)³⁹K and ⁴⁰Ca(e,e'p₀)³⁹K(g.s.), using a modified version of the program STAPRE, are compared with experimental results obtained from coincidence experiments carried out at the Mainz microtron MAMI A. Preliminary results indicate that the statistical decay of a 1⁻ level in the ⁴⁰Ca compound nucleus, at an excitation energy of about 20 MeV, to the ground state of the ³⁹K residual nucleus accounts for only about 15% of the total decay, indicating that direct and/or semi-direct mechanisms contribute the major part of the decay. (author)

  12. Preliminary Statistics from the NASA Alphasat Beacon Receiver in Milan, Italy

    Science.gov (United States)

    Nessel, James; Zemba, Michael; Morse, Jacquelynne; Luini, Lorenzo; Riva, Carlo

    2015-01-01

    NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have initiated a joint propagation campaign within the framework of the Alphasat propagation experiment to characterize rain attenuation, scintillation, and gaseous absorption effects of the atmosphere in the 40 gigahertz band. NASA GRC has developed and installed a K/Q-band (20/40 gigahertz) beacon receiver at the POLIMI campus in Milan, Italy, which receives the 20/40 gigahertz signals broadcast from the Alphasat Aldo Paraboni TDP no. 5 beacon payload. The primary goal of these measurements is to develop a physical model to improve predictions of communications systems performance within the Q-band. Herein, we provide an overview of the design and data calibration procedure, and present 6 months of preliminary statistics of the NASA propagation terminal, which has been installed and operating in Milan since May 2014. The Q-band receiver has demonstrated a dynamic range of 40 decibels at an 8-hertz sampling rate. A weather station with an optical disdrometer is also installed to characterize rain drop size distribution for correlation with physical based models.

  13. Preliminary analyses on hydrogen diffusion through small break of thermo-chemical IS process hydrogen plant

    International Nuclear Information System (INIS)

    Somolova, Marketa; Terada, Atsuhiko; Takegami, Hiroaki; Iwatsuki, Jin

    2008-12-01

    Japan Atomic Energy Agency has been conducting a conceptual design study of a nuclear hydrogen demonstration plant, that is, a thermochemical IS process hydrogen plant coupled with the High Temperature Engineering Test Reactor (HTTR-IS), which is planned to produce a large amount of hydrogen, up to 1000 m³/h. As part of the conceptual design work on the HTTR-IS system, preliminary analyses of a small break of a hydrogen pipeline in the IS process hydrogen plant were carried out as a first step of the safety analyses. This report presents analytical results for hydrogen diffusion behavior predicted with a CFD code, in which a diffusion model focused on the turbulent Schmidt number was incorporated. By modifying the diffusion model, in particular a constant accompanying the turbulent Schmidt number in the diffusion term, the analytical results were brought into good agreement with the experimental results. (author)

  14. Statistical reliability analyses of two wood plastic composite extrusion processes

    International Nuclear Information System (INIS)

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
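    A minimal Kaplan-Meier estimator of the kind used for the survival-curve comparisons described above can be written in a few lines. This is an illustrative sketch, not the authors' analysis; for the WPC data, "time" would be the strength value at failure, with censored observations for unfailed specimens:

```python
def kaplan_meier(values, events):
    """Kaplan-Meier survival steps. values: observed strength/lifetime;
    events: 1 = failure observed, 0 = right-censored (no failure)."""
    data = sorted(zip(values, events))
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for v, e in data if v == t and e == 1)
        at_risk = sum(1 for v, _ in data if v >= t)
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        while i < len(data) and data[i][0] == t:   # skip ties at t
            i += 1
    return curve

# toy data: failures at 2, 3, 5; censored observations at 3 and 8
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0])
```

Comparing such curves between the two extrusion lines at chosen fractiles mirrors the non-parametric comparison described in the abstract.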

  15. Preliminary drift design analyses for nuclear waste repository in tuff

    International Nuclear Information System (INIS)

    Hardy, M.P.; Brechtel, C.E.; Goodrich, R.R.; Bauer, S.J.

    1990-01-01

    The Yucca Mountain Project (YMP) is examining the feasibility of siting a repository for high-level nuclear waste at Yucca Mountain, on and adjacent to the Nevada Test Site (NTS). The proposed repository will be excavated in the Topopah Spring Member, which is a moderately fractured, unsaturated, welded tuff. Excavation stability will be required during construction, waste emplacement, retrieval (if required), and closure to ensure worker safety. The subsurface excavations will be subject to stress changes resulting from thermal expansion of the rock mass and seismic events associated with regional tectonic activity and underground nuclear explosions (UNEs). Analyses of drift stability are required to assess the acceptable waste emplacement density, to design the drift shapes and ground support systems, and to establish schedules and cost of construction. This paper outlines the proposed methodology to assess drift stability and then focuses on an example of its application to the YMP repository drifts based on preliminary site data. Because site characterization activities have not begun, the database currently lacks the extensive site-specific field and laboratory data needed to form conclusions as to the final ground support requirements. This drift design methodology will be applied and refined as more site-specific data are generated and as analytical techniques and methodologies are verified during the site characterization process

  16. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four received low scores; only a minority of the studies used appropriate statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  17. Characteristics of electrostatic solitary waves observed in the plasma sheet boundary: Statistical analyses

    Directory of Open Access Journals (Sweden)

    H. Kojima

    1999-01-01

    Full Text Available We present the characteristics of the Electrostatic Solitary Waves (ESW) observed by the Geotail spacecraft in the plasma sheet boundary layer, based on statistical analyses. We also discuss the results with reference to a model of ESW generation due to electron beams, which has been proposed on the basis of computer simulations. In this generation model, the nonlinear evolution of Langmuir waves excited by electron bump-on-tail instabilities leads to the formation of isolated electrostatic potential structures corresponding to "electron holes" in phase space. The statistical analyses of the Geotail data, which we conducted under the assumption that the polarity of ESW potentials is positive, show that most ESW propagate in the same direction as the electron beams that are simultaneously observed by the plasma instrument. Further, we find that the ESW potential energy is much smaller than the background electron thermal energy and that the ESW potential widths are typically shorter than 60 times the local electron Debye length when we assume that the ESW potentials travel at the same velocity as the electron beams. These results are very consistent with the ESW generation model in which the nonlinear evolution of the electron bump-on-tail instability leads to the formation of electron holes in phase space.

  18. SARDA HITL Preliminary Human Factors Measures and Analyses

    Science.gov (United States)

    Hyashi, Miwa; Dulchinos, Victoria

    2012-01-01

    Human factors data collected during the SARDA HITL simulation experiment include a variety of subjective measures, including the NASA TLX and questionnaire items regarding situational awareness, advisory usefulness, UI usability, and controller trust. Preliminary analysis of the TLX data indicates that workload may not be adversely affected by use of the advisories; additionally, the controllers' subjective ratings of the advisories may suggest acceptance of the tool.

  19. arXiv Statistical Analyses of Higgs- and Z-Portal Dark Matter Models

    CERN Document Server

    Ellis, John; Marzola, Luca; Raidal, Martti

    2018-06-12

    We perform frequentist and Bayesian statistical analyses of Higgs- and Z-portal models of dark matter particles with spin 0, 1/2 and 1. Our analyses incorporate data from direct detection and indirect detection experiments, as well as LHC searches for monojet and monophoton events, and we also analyze the potential impacts of future direct detection experiments. We find acceptable regions of the parameter spaces for Higgs-portal models with real scalar, neutral vector, Majorana or Dirac fermion dark matter particles, and Z-portal models with Majorana or Dirac fermion dark matter particles. In many of these cases, there are interesting prospects for discovering dark matter particles in Higgs or Z decays, as well as dark matter particles weighing $\\gtrsim 100$ GeV. Negative results from planned direct detection experiments would still allow acceptable regions for Higgs- and Z-portal models with Majorana or Dirac fermion dark matter particles.

  20. Marine Traffic Density Over Port Klang, Malaysia Using Statistical Analysis of AIS Data: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Masnawi MUSTAFFA

    2016-12-01

    Full Text Available Port Klang, Malaysia is the 13th busiest port in the world and the busiest port in Malaysia; its capacity is expected to meet demand until 2018. Although statistics published by the Port Klang Authority show that many ships use this port, those numbers are based only on ships entering Port Klang. No study has yet investigated how dense the traffic is in Port Klang and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang and its surrounding waters using statistical analysis of AIS data. As a preliminary study, AIS data were collected for 7 days to represent weekly traffic. As a result, hourly and daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.

  1. Preliminary Analyses Showed Short-Term Mental Health Improvements after a Single-Day Manager Training.

    Science.gov (United States)

    Boysen, Elena; Schiller, Birgitta; Mörtl, Kathrin; Gündel, Harald; Hölzer, Michael

    2018-01-10

    Psychosocial working conditions attract more and more attention when it comes to mental health in the workplace. To support managers in dealing with their own as well as their employees' psychological risk factors, we conducted a specific manager training. Within this investigation, we wanted to learn about the training's effects and acceptance. A single-day manager training was provided in a large industrial company in Germany. The participants were asked to fill out questionnaires regarding their own physical and mental health condition as well as their working situation. Questionnaires were distributed at baseline and at 3-month and 12-month follow-up. At this point in time the investigation is still ongoing; the current article focuses on short-term preliminary effects. Analyses only included participants who had already completed baseline and the three-month follow-up. Preliminary results from the three-month follow-up survey (n = 33, n_male = 30, M_age = 47.5) indicated positive changes in the managers' mental health condition as measured by the Patient Health Questionnaire for depression (PHQ-9: M_t1 = 3.82, M_t2 = 3.15). Training managers about common mental disorders and risk factors at the workplace within a single-day workshop seems to promote positive effects on their own mental health. In particular, working with the managers on their own early stress symptoms might have been an important element.

  2. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
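    Procedure (iii) — detecting a trend in central tendency with the Kruskal-Wallis statistic — can be sketched by binning the scatterplot on the input variable and ranking the outputs. This is an illustrative implementation without the tie correction, not the authors' code:

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for y-values binned on x (no tie
    correction); a large H signals a trend in central tendency."""
    pooled = sorted((y, g) for g, ys in enumerate(groups) for y in ys)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, g) in enumerate(pooled, start=1):
        rank_sums[g] += rank
    s = sum(rank_sums[g] ** 2 / len(ys) for g, ys in enumerate(groups))
    return 12.0 / (n * (n + 1)) * s - 3.0 * (n + 1)

# y shifts clearly upward across three x-bins, so H is large
h = kruskal_wallis_h([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
```

Comparing H against a chi-square reference distribution (df = number of bins minus one) then gives the significance test used to flag an input as important.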

  3. Preliminary study of energy confinement data with a statistical analysis system in HL-2A tokamak

    International Nuclear Information System (INIS)

    Xu Yuan; Cui Zhengying; Ji Xiaoquan; Dong Chunfeng; Yang Qingwei; O J W F Kardaun

    2010-01-01

    Taking advantage of HL-2A experimental data, an energy confinement database following the ITER L-mode DB2.0 format has been established. For this database, the widely used Statistical Analysis System (SAS) has been adopted for the first time to analyze and evaluate the confinement data from HL-2A, and research on scaling laws of the energy confinement time with respect to plasma density has been carried out, with some preliminary results achieved. Finally, through comparison with both the ITER scaling law and the earlier ASDEX database, the L-mode confinement quality on HL-2A and the influence of temperature on the Spitzer resistivity are discussed. (authors)

  4. Scripts for TRUMP data analyses. Part II (HLA-related data): statistical analyses specific for hematopoietic stem cell transplantation.

    Science.gov (United States)

    Kanda, Junya

    2016-01-01

    The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.

  5. NWS Weather Fatality, Injury and Damage Statistics

    Science.gov (United States)

    Natural Hazard Statistics: U.S. Summaries, 78-Year List of Severe Weather Fatalities, and Preliminary Hazardous Weather Statistics for 2017.

  6. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses

    OpenAIRE

    Buttigieg, Pier Luigi; Ramette, Alban Nicolas

    2014-01-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynami...

  7. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
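    Procedures (i) and (ii) differ exactly when a relationship is monotonic but nonlinear, as a small sketch shows (illustrative implementations; ties in the rank computation are ignored):

```python
import math

def pearson(xs, ys):
    """Linear (Pearson) correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sxy / (sx * sy)

def spearman(xs, ys):
    """Rank correlation: Pearson applied to ranks (ties ignored)."""
    def ranks(v):
        order = sorted(v)
        return [order.index(x) + 1 for x in v]
    return pearson(ranks(xs), ranks(ys))

# monotonic but nonlinear relationship: the rank correlation is perfect,
# while the linear correlation coefficient falls short of 1
xs = [1, 2, 3, 4, 5]
ys = [x ** 3 for x in xs]
r, rho = pearson(xs, ys), spearman(xs, ys)
```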

  8. The Relationship Between Radiative Forcing and Temperature. What Do Statistical Analyses of the Instrumental Temperature Record Measure?

    International Nuclear Information System (INIS)

    Kaufmann, R.K.; Kauppi, H.; Stock, J.H.

    2006-01-01

    Comparing statistical estimates of the long-run temperature effect of doubled CO2 with those generated by climate models raises the question: is the long-run temperature effect of doubled CO2 estimated from the instrumental temperature record using statistical techniques consistent with the transient climate response, the equilibrium climate sensitivity, or the effective climate sensitivity? Here, we attempt to answer the question of what statistical analyses of the observational record measure by using these same statistical techniques to estimate the temperature effect of a doubling in the atmospheric concentration of carbon dioxide from seventeen simulations run for the Coupled Model Intercomparison Project 2 (CMIP2). The results indicate that the temperature effect estimated by the statistical methodology is consistent with the transient climate response and that this consistency is relatively unaffected by sample size or the increase in radiative forcing in the sample.

  9. Statistical analyses of variability/reproducibility of environmentally assisted cyclic crack growth rate data utilizing JAERI Material Performance Database (JMPD)

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo

    1993-05-01

    Statistical analyses were conducted using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons were made of the variability and/or reproducibility of data obtained by ΔK-increasing and by ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type from the viewpoint of variability and/or reproducibility of the data. This tendency was more pronounced in tests conducted in simulated LWR primary coolants than in those in air. (author)

  10. Statistical analyses to support guidelines for marine avian sampling. Final report

    Science.gov (United States)

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
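
    The two core ideas in points 4-5, choosing a count distribution and then estimating power to detect a hotspot, can be sketched as below. This is a hedged illustration: the counts, the 3x hotspot effect size, the moment-matched negative binomial fit, and the Mann-Whitney test are all assumptions, not the report's actual methodology.

```python
# Pick Poisson vs negative binomial for survey counts by AIC, then
# estimate power to flag a hotspot (3x the regional mean) by simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
counts = rng.negative_binomial(n=2, p=0.25, size=400)  # overdispersed counts

# Poisson MLE is the sample mean; NB fit here is moment-matched.
lam = counts.mean()
aic_pois = 2 * 1 - 2 * stats.poisson.logpmf(counts, lam).sum()
mean, var = counts.mean(), counts.var(ddof=1)
n_nb = mean**2 / (var - mean)           # NB size from mean/variance
p_nb = n_nb / (n_nb + mean)
aic_nb = 2 * 2 - 2 * stats.nbinom.logpmf(counts, n_nb, p_nb).sum()
best = "nbinom" if aic_nb < aic_pois else "poisson"

def hotspot_power(n_segments, n_sims=2000, alpha=0.05):
    """How often a one-sided test flags a block with 3x the mean count."""
    hits = 0
    for _ in range(n_sims):
        block = rng.negative_binomial(n_nb, n_nb / (n_nb + 3 * mean), n_segments)
        _, p = stats.mannwhitneyu(block, counts, alternative="greater")
        hits += p < alpha
    return hits / n_sims

print(best, round(hotspot_power(20), 2))
```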

  11. Long-term gas and brine migration at the Waste Isolation Pilot Plant: Preliminary sensitivity analyses for post-closure 40 CFR 268 (RCRA), May 1992

    International Nuclear Information System (INIS)

    1992-12-01

    This report describes preliminary probabilistic sensitivity analyses of long term gas and brine migration at the Waste Isolation Pilot Plant (WIPP). Because gas and brine are potential transport media for organic compounds and heavy metals, understanding two-phase flow in the repository and the surrounding Salado Formation is essential to evaluating long-term compliance with 40 CFR 268.6, which is the portion of the Land Disposal Restrictions of the Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act that states the conditions for disposal of specified hazardous wastes. Calculations described here are designed to provide guidance to the WIPP Project by identifying important parameters and helping to recognize processes not yet modeled that may affect compliance. Based on these analyses, performance is sensitive to shaft-seal permeabilities, parameters affecting gas generation, and the conceptual model used for the disturbed rock zone surrounding the excavation. Brine migration is less likely to affect compliance with 40 CFR 268.6 than gas migration. However, results are preliminary, and additional iterations of uncertainty and sensitivity analyses will be required to provide the confidence needed for a defensible compliance evaluation. Specifically, subsequent analyses will explicitly include effects of salt creep and, when conceptual and computational models are available, pressure-dependent fracturing of anhydrite marker beds.

  12. Multivariate statistical analyses demonstrate unique host immune responses to single and dual lentiviral infection.

    Directory of Open Access Journals (Sweden)

    Sunando Roy

    2009-10-01

    Full Text Available Feline immunodeficiency virus (FIV) and human immunodeficiency virus (HIV) are recently identified lentiviruses that cause progressive immune decline and ultimately death in infected cats and humans. It is of great interest to understand how to prevent immune system collapse caused by these lentiviruses. We recently described that disease caused by a virulent FIV strain in cats can be attenuated if animals are first infected with a feline immunodeficiency virus derived from a wild cougar. The detailed temporal tracking of cat immunological parameters in response to two viral infections resulted in high-dimensional datasets containing variables that exhibit strong co-variation. Initial analyses of these complex data using univariate statistical techniques did not account for interactions among immunological response variables and therefore potentially obscured significant effects between infection state and immunological parameters. Here, we apply a suite of multivariate statistical tools, including Principal Component Analysis, MANOVA and Linear Discriminant Analysis, to temporal immunological data resulting from FIV superinfection in domestic cats. We investigated the co-variation among immunological responses, the differences in immune parameters among four groups of five cats each (uninfected, single and dual infected animals), and the "immune profiles" that discriminate among them over the first four weeks following superinfection. Dual infected cats mount an immune response by 24 days post superinfection that is characterized by elevated levels of CD8 and CD25 cells and increased expression of IL4, IFNgamma and FAS. This profile discriminates dual infected cats from cats infected with FIV alone, which show high IL-10 and lower numbers of CD8 and CD25 cells. Multivariate statistical analyses demonstrate both the dynamic nature of the immune response to FIV single and dual infection and the development of a unique immunological profile in dual
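
    The multivariate workflow the abstract describes, ordination of co-varying immune variables followed by a discriminant step, can be sketched with plain NumPy. Everything here is synthetic and illustrative: the group labels, the 8 hypothetical immune variables, and the nearest-centroid classifier (a simple stand-in for the authors' Linear Discriminant Analysis).

```python
# PCA to summarise co-variation among immune variables, then a
# centroid-based check of how well the four groups separate.
import numpy as np

rng = np.random.default_rng(2)
groups = np.repeat([0, 1, 2, 3], 5)          # uninfected / single / single / dual
centers = rng.normal(0, 3, size=(4, 8))      # 8 immune variables per cat
X = centers[groups] + rng.normal(0, 1, size=(20, 8))

# PCA by SVD of the centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var = s**2 / (s**2).sum()
explained = var[:3].sum()                    # variance captured by 3 PCs
scores = Xc @ Vt[:3].T                       # cat positions in PC space

# assign each cat to the nearest group centroid in PC space
centroids = np.array([scores[groups == g].mean(axis=0) for g in range(4)])
pred = np.argmin(((scores[:, None, :] - centroids)**2).sum(-1), axis=1)
accuracy = (pred == groups).mean()
print(f"variance captured by 3 PCs: {explained:.2f}, accuracy: {accuracy:.2f}")
```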

  13. A simple and robust statistical framework for planning, analysing and interpreting faecal egg count reduction test (FECRT) studies

    DEFF Research Database (Denmark)

    Denwood, M.J.; McKendrick, I.J.; Matthews, L.

    Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has ... been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power ... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple ...
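
    A minimal FECRT-style calculation can be sketched as below. This is not the authors' framework: the egg counts, the bootstrap interval, and the 99% expected-efficacy threshold are invented for illustration of the two hypotheses being contrasted.

```python
# Observed percentage reduction in faecal egg counts, a bootstrap
# interval, and a check of the "inferior to expected efficacy" question.
import numpy as np

rng = np.random.default_rng(3)
pre = rng.poisson(300, size=20)     # eggs per gram, pre-treatment
post = rng.poisson(9, size=20)      # ~97% true reduction

observed = 1.0 - post.mean() / pre.mean()

boot = []
for _ in range(5000):
    i = rng.integers(0, 20, 20)     # resample animals with replacement
    j = rng.integers(0, 20, 20)
    boot.append(1.0 - post[i].mean() / pre[j].mean())
lo, hi = np.percentile(boot, [2.5, 97.5])

expected = 0.99
inferior = hi < expected            # whole interval below expected efficacy?
print(f"reduction {observed:.3f}, 95% CI ({lo:.3f}, {hi:.3f}), inferior: {inferior}")
```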

  14. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    2017-10-01

    Full Text Available Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.
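
    The core data transformation of the intervals' method, turning per-element stress values into area-percentage variables, can be sketched directly from the description above. The stress distributions, element areas and interval edges below are invented; only the transformation itself follows the abstract.

```python
# Convert each model's per-element stress values into variables giving
# the percentage of mandible area in fixed stress intervals.
import numpy as np

rng = np.random.default_rng(4)

def interval_variables(stress, area, edges):
    """Percentage of total area falling in each stress interval."""
    idx = np.digitize(stress, edges)          # bin index per element
    total = area.sum()
    return np.array([area[idx == k].sum() / total * 100
                     for k in range(len(edges) + 1)])

edges = np.array([10.0, 20.0, 30.0, 40.0])    # MPa, assumed interval edges
rows = []
for scale in (12.0, 22.0):                    # two hypothetical mandibles
    stress = rng.gamma(2.0, scale, 1000)      # von Mises stress per element
    area = rng.uniform(0.5, 1.5, 1000)        # finite element areas
    rows.append(interval_variables(stress, area, edges))
X = np.vstack(rows)                           # rows are now ready for PCA etc.

print(np.round(X, 1))
```

Because each row sums to 100% regardless of how many elements the mesh has, the resulting variables are comparable across models, which is why the method is insensitive to mesh characteristics.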

  15. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Science.gov (United States)

    De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep

    2017-01-01

    Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107

  16. Preliminary analyses of scenarios for potential human interference for repositories in three salt formations

    International Nuclear Information System (INIS)

    1985-10-01

    Preliminary analyses of scenarios for human interference with the performance of a radioactive waste repository in a deep salt formation are presented. The following scenarios are analyzed: (1) the U-Tube Connection Scenario involving multiple connections between the repository and the overlying aquifer system; (2) the Single Borehole Intrusion Scenario involving penetration of the repository by an exploratory borehole that simultaneously connects the repository with overlying and underlying aquifers; and (3) the Pressure Release Scenario involving inflow of water to saturate any void space in the repository prior to creep closure with subsequent release under near lithostatic pressures following creep closure. The methodology to evaluate repository performance in these scenarios is described and applied to reference systems in three candidate formations: bedded salt in the Palo Duro Basin, Texas; bedded salt in the Paradox Basin, Utah; and the Richton Salt Dome, Mississippi, of the Gulf Coast Salt Dome Basin.

  17. "Who Was 'Shadow'?" The Computer Knows: Applying Grammar-Program Statistics in Content Analyses to Solve Mysteries about Authorship.

    Science.gov (United States)

    Ellis, Barbara G.; Dick, Steven J.

    1996-01-01

    Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)

  18. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
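
    As a quick consistency check, the reported spectrum P(m) = 8πε²a³/(1 + a²m²)² can be evaluated numerically; the sketch below (plain NumPy, not the authors' code) confirms the flat level below the corner wavenumber 1/a and the steep roll-off above it.

```python
# Evaluate the heterogeneity power spectrum with the reported
# parameters: eps = 0.05 (fractional fluctuation), a = 3.1 km.
import numpy as np

eps, a = 0.05, 3.1

def P(m):
    return 8 * np.pi * eps**2 * a**3 / (1 + (a * m)**2)**2

m_corner = 1 / a                             # corner wavenumber, 1/km
flat_ratio = P(m_corner / 100) / P(0)        # ~1: flat below the corner
rolloff = P(10 * m_corner) / P(m_corner)     # (2/101)^2: m^-4 decay above it
print(f"corner at {m_corner:.3f} 1/km, flat ratio {flat_ratio:.4f}, "
      f"roll-off {rolloff:.4f}")
```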

  19. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    Science.gov (United States)

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed two/three approaches, by sigmoid fitting, by independent and by dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data which are independent of the "user-skills" and subjectivity of the operator, which is also urgently needed to evaluate dynamic measurements of contact angles. We will show in this contribution that the slightly modified procedures are also applicable to find specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop obtained by high-precision drop shape analysis are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database

    International Nuclear Information System (INIS)

    Lam, Chio; Zhou, Wenxing

    2016-01-01

    This article reports statistical analyses of the mileage and pipe-related incidents data corresponding to the onshore gas transmission pipelines in the US between 2002 and 2013 collected by the Pipeline Hazardous Material Safety Administration of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of them more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% of the pipelines are Classes 2 and 3 pipelines. It is found that the third-party excavation, external corrosion, material failure and internal corrosion are the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rate of rupture equals 3.1 × 10⁻⁵ per km-year due to all failure causes combined. External corrosion is the leading cause for ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to the third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for the quantitative risk assessments of such pipelines. - Highlights: • Analyze PHMSA pipeline mileage and incident data between 2002 and 2013. • Focus on gas transmission pipelines. • Leading causes for pipeline failures are identified. • Provide baseline failure statistics for risk assessments of gas transmission pipelines.
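
    The quoted average rupture rate is simple exposure arithmetic: incidents divided by accumulated pipeline service in km-years. In the sketch below, only the mileage and the 12-year window come from the abstract; the rupture count is invented to reproduce the reported order of magnitude.

```python
# Back-of-envelope rupture rate per km-year of pipeline exposure.
mileage_km = 480_000            # onshore gas transmission pipelines (abstract)
years = 12                      # 2002-2013 study window
exposure = mileage_km * years   # km-years of service

ruptures_all = 179              # hypothetical count of ruptures, all causes
rate_all = ruptures_all / exposure
print(f"rupture rate: {rate_all:.1e} per km-year")  # ~3.1e-05, as reported
```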

  1. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several month time period or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in

  2. Development and preliminary analyses of material balance evaluation model in nuclear fuel cycle

    International Nuclear Information System (INIS)

    Matsumura, Tetsuo

    1994-01-01

    A material balance evaluation model for the nuclear fuel cycle has been developed using the ORIGEN-2 code as its basic engine. The model can treat more than 1000 nuclides, including minor actinides and fission products, and offers flexible modeling and graph output on an engineering workstation. A preliminary calculation was made of the effect of high LWR fuel burnup (reloading fuel average burnup of 60 GWd/t) on the nuclear fuel cycle. The preliminary calculation shows that high LWR fuel burnup has a large effect on the Japanese Pu balance problem. (author)

  3. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  4. Preliminary Accident Analyses for Conversion of the Massachusetts Institute of Technology Reactor (MITR) from Highly Enriched to Low Enriched Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Floyd E. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Wilson, Erik H. [Argonne National Lab. (ANL), Argonne, IL (United States); Sun, Kaichao S. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Newton, Jr., Thomas H. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Hu, Lin-wen [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2013-09-30

    The Massachusetts Institute of Technology Reactor (MITR-II) is a research reactor in Cambridge, Massachusetts designed primarily for experiments using neutron beam and in-core irradiation facilities. It delivers a neutron flux comparable to current LWR power reactors in a compact 6 MW core using Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context most research and test reactors, both domestic and international, have started a program of conversion to the use of LEU fuel. A new type of LEU fuel based on an alloy of uranium and molybdenum (U-Mo) is expected to allow the conversion of U.S. domestic high performance reactors like MITR. This report presents the preliminary accident analyses for MITR cores fueled with LEU monolithic U-Mo alloy fuel with 10 wt% Mo. Preliminary results demonstrate adequate performance, including thermal margin to expected safety limits, for the LEU accident scenarios analyzed.

  5. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.

  6. A preliminary study on identification of Thai rice samples by INAA and statistical analysis

    Science.gov (United States)

    Kongsri, S.; Kukusamude, C.

    2017-09-01

    This study aims to investigate the elemental compositions of 93 Thai rice samples using instrumental neutron activation analysis (INAA) and to identify rice according to type and cultivar using statistical analysis. As, Mg, Cl, Al, Br, Mn, K, Rb and Zn in Thai jasmine rice and Sung Yod rice samples were successfully determined by INAA. The accuracy and precision of the INAA method were verified with SRM 1568a Rice Flour. All elements were found to be in good agreement with the certified values, with precisions in terms of %RSD lower than 7%. The LODs were in the range of 0.01 to 29 mg kg-1. The concentrations of the 9 elements distributed in the Thai rice samples were evaluated and used as chemical indicators to identify the type of rice sample. The results showed that Mg, Cl, As, Br, Mn, K, Rb and Zn concentrations in Thai jasmine rice samples differ significantly from those in Sung Yod rice samples, whereas there was no evidence that Al differs significantly, at the 95% confidence level. Our results may provide preliminary information for the discrimination of rice samples and a useful database for Thai rice.
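
    The per-element significance test behind such discrimination can be sketched as a one-way ANOVA between the two rice types. The concentrations below are simulated, not the paper's measurements; the element name and values are assumptions.

```python
# One-way ANOVA: does one element's concentration differ between types?
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# mg/kg concentrations of one element in two rice types (invented values)
jasmine = rng.normal(12.0, 1.5, 30)    # e.g. Mn in jasmine rice
sung_yod = rng.normal(15.0, 1.5, 30)   # e.g. Mn in Sung Yod rice

f_stat, p = stats.f_oneway(jasmine, sung_yod)
different = p < 0.05                   # significant at the 95% level
print(f"F = {f_stat:.1f}, p = {p:.2g}, discriminates: {different}")
```

With two groups this is equivalent to a two-sample t-test; running it per element yields exactly the kind of element-by-element verdict reported in the abstract.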

  7. Preliminary design report for the NAC combined transport cask

    International Nuclear Information System (INIS)

    1990-04-01

    Nuclear Assurance Corporation (NAC) is under contract to the United States Department of Energy (DOE) to design, license, develop and test models, and fabricate a prototype cask transportation system for nuclear spent fuel. The design of this combined transport (rail/barge) transportation system has been divided into two phases, a preliminary design phase and a final design phase. This Preliminary Design Package (PDP) describes the NAC Combined Transport Cask (NAC-CTC), the results of work completed during the preliminary design phase and identifies the additional detailed analyses, which will be performed during final design. Preliminary analytical results are presented in the appropriate sections and supplemented by summaries of procedures and assumptions for performing the additional detailed analyses of the final design. 60 refs., 1 fig., 2 tabs

  8. The quality improvement attitude survey: Development and preliminary psychometric characteristics.

    Science.gov (United States)

    Dunagan, Pamela B

    2017-12-01

    To report the development of a tool to measure nurses' attitudes about quality improvement in their practice setting and to examine preliminary psychometric characteristics of the Quality Improvement Nursing Attitude Scale. Human factors such as nursing attitudes of complacency have been identified as root causes of sentinel events. Attitudes of nurses concerning use of Quality and Safety Education for Nurses competencies can be most challenging to teach and to change. No tool has been developed measuring attitudes of nurses concerning their role in quality improvement. A descriptive study design with preliminary psychometric evaluation was used to examine the preliminary psychometric characteristics of the Quality Improvement Nursing Attitude Scale. Registered bedside clinical nurses comprised the sample for the study (n = 57). Quantitative data were analysed using descriptive statistics and Cronbach's alpha reliability. Total score and individual item statistics were evaluated. Two open-ended items were used to collect statements about nurses' feelings regarding their experience in quality improvement efforts. Strong support for the internal consistency reliability and face validity of the Quality Improvement Nursing Attitude Scale was found. Total scale scores were high, indicating nurse participants valued Quality and Safety Education for Nurses competencies in practice. However, item-level statistics indicated nurses felt powerless when other nurses deviate from care standards. Additionally, the sample indicated they did not consistently report patient safety issues and did not have a feeling of value in efforts to improve care. Findings suggested organisational culture fosters nurses' reporting safety issues and feeling valued in efforts to improve care. Participants' narrative comments and item analysis revealed the need to generate new items for the Quality Improvement Nursing Attitude Scale focused on nurses' perception of their importance in quality and

  9. Influence of peer review on the reporting of primary outcome(s) and statistical analyses of randomised trials.

    Science.gov (United States)

    Hopewell, Sally; Witt, Claudia M; Linde, Klaus; Icke, Katja; Adedire, Olubusola; Kirtley, Shona; Altman, Douglas G

    2018-01-11

    Selective reporting of outcomes in clinical trials is a serious problem. We aimed to investigate the influence of the peer review process within biomedical journals on reporting of primary outcome(s) and statistical analyses within reports of randomised trials. Each month, PubMed (May 2014 to April 2015) was searched to identify primary reports of randomised trials published in six high-impact general and 12 high-impact specialty journals. The corresponding author of each trial was invited to complete an online survey asking authors about changes made to their manuscript as part of the peer review process. Our main outcomes were to assess: (1) the nature and extent of changes as part of the peer review process, in relation to reporting of the primary outcome(s) and/or primary statistical analysis; (2) how often authors followed these requests; and (3) whether this was related to specific journal or trial characteristics. Of 893 corresponding authors who were invited to take part in the online survey, 258 (29%) responded. The majority of trials were multicentre (n = 191; 74%); median sample size 325 (IQR 138 to 1010). The primary outcome was clearly defined in 92% (n = 238), of which the direction of treatment effect was statistically significant in 49%. The majority responded (on a 1-10 Likert scale) that they were satisfied with the overall handling (mean 8.6, SD 1.5) and quality of peer review (mean 8.5, SD 1.5) of their manuscript. Only 3% (n = 8) said that the editor or peer reviewers had asked them to change or clarify the trial's primary outcome. However, 27% (n = 69) reported they were asked to change or clarify the statistical analysis of the primary outcome; most had fulfilled the request, the main motivation being to improve the statistical methods (n = 38; 55%) or avoid rejection (n = 30; 44%). Overall, there was little association between authors being asked to make this change and the type of journal, intervention, significance of the

  10. A weighted U statistic for association analyses considering genetic heterogeneity.

    Science.gov (United States)

    Wei, Changshuai; Elston, Robert C; Lu, Qing

    2016-07-20

    Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most of the existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
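
    The HWU implementation itself is not reproduced in this record; the following is a minimal sketch of a generic weighted U statistic of the kind the abstract describes, with made-up toy data and a hypothetical `weighted_u` helper (not the authors' code):

```python
import numpy as np

def weighted_u(genotype, phenotype, weights):
    """Generic weighted U statistic: sum over subject pairs of a
    genotype-similarity kernel times a phenotype-similarity score,
    scaled by pair weights (a simplification, not the exact HWU)."""
    n = len(phenotype)
    u = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            geno_sim = genotype[i] * genotype[j]      # cross-product kernel
            pheno_sim = phenotype[i] * phenotype[j]   # centred phenotypes
            u += weights[i, j] * geno_sim * pheno_sim
    return u

# toy data: 4 subjects, centred phenotypes, allele counts 0/1/2
g = np.array([0, 1, 2, 1])
y = np.array([-1.0, 0.5, 1.0, -0.5])
w = np.ones((4, 4))  # uniform weights reduce this to an unweighted U
print(weighted_u(g, y, w))
```

    In the actual method the weights would encode phenotype-based heterogeneity between subjects; uniform weights are used here purely for illustration.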

  11. Preliminary conceptual design and analysis on KALIMER reactor structures

    International Nuclear Information System (INIS)

    Kim, Jong Bum

    1996-10-01

    The objectives of this study are to perform preliminary conceptual design and structural analyses for KALIMER (Korea Advanced Liquid Metal Reactor) reactor structures, to assess the design feasibility and to identify detailed analysis requirements. Since KALIMER thermal hydraulic system analysis results and neutronic analysis results are not available at present, only limited preliminary structural analyses have been performed, with assumptions made on the thermal loads. The assumed thermal responses of the reactor vessel and reactor internal structures were based on the temperature difference between core inlet and outlet and on engineering judgment. Thermal stresses for the assumed temperatures were calculated using the ANSYS code through parametric finite element heat transfer and elastic stress analyses. Based on the results of the preliminary conceptual design and structural analyses, the ASME Code limits for the reactor structures were satisfied for the pressure boundary, but the need for inelastic analyses was indicated for evaluation of the design adequacy of the support barrel and the thermal liner. To reduce thermal striping effects in the bottom area of the UIS due to up-flowing sodium from the reactor core, installation of an Inconel-718 liner in the bottom area was proposed, and to mitigate thermal shock loads an additional stainless steel liner was also suggested. The design feasibility of these measures was validated through simplified preliminary analyses. In the conceptual design phase, these results will be implemented in the design of the reactor structures and the reactor internal structures in conjunction with the thermal hydraulic, neutronic, and seismic analysis results. 4 tabs., 24 figs., 4 refs. (Author)

  12. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  13. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  14. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  15. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
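
    The 42 ± 5% food-waste estimate above is the kind of per-fraction summary such a sorting campaign yields. A minimal sketch, using hypothetical sub-area shares (the numbers are invented, not the study's data):

```python
import numpy as np

# Hypothetical food-waste shares (% wet mass) from 10 sub-areas
shares = np.array([38., 45., 41., 47., 39., 44., 42., 40., 46., 43.])

mean = shares.mean()
# 95% CI half-width using the t quantile for n-1 = 9 df (t ≈ 2.262)
sem = shares.std(ddof=1) / np.sqrt(len(shares))
half = 2.262 * sem
print(f"food waste: {mean:.1f} ± {half:.1f} % (95% CI)")
```

    Comparing such per-municipality intervals is one simple way to judge whether compositions are "statistically similar" across areas, as the abstract reports.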

  16. Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.

    Science.gov (United States)

    Deng, Yangqing; Pan, Wei

    2017-12-01

    There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical and other reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and for these patients it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the
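
    The paper's exact test statistic is not given in this record; one common formalisation of the union-intersection idea for marginal pleiotropy declares pleiotropy only when the variant is associated with every trait, so the combined p-value is the maximum of the per-trait p-values. A sketch with hypothetical summary z-scores:

```python
import math

def z_to_p(z):
    """Two-sided p-value for a z statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def pleiotropy_p(z_scores):
    """Union-intersection style test (sketch): reject the null of
    'no effect on at least one trait' only if every per-trait test
    rejects, i.e. take the maximum per-trait p-value."""
    return max(z_to_p(z) for z in z_scores)

# toy GWAS summary z-scores for one SNP on two traits (made up)
p = pleiotropy_p([4.2, 3.1])
print(p)  # driven by the weaker association (z = 3.1)
```

    The conditional version in the paper additionally adjusts one trait's summary statistics for the others, which this sketch does not attempt.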

  17. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    Science.gov (United States)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

    Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics, expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analyses. First, using the time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the dynamics of the hydrologic system. Second, spatial statistical analysis shows that the increment of groundwater level fluctuations exhibits a heavy tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., fractal non-Gaussian property) of groundwater level, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics therefore may provide useful information and quantification to further understand the nature of complex dynamics in hydrology.
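
    The TS-LHE procedure itself is not reproduced here; the sketch below illustrates the underlying idea by estimating a (global) Hurst exponent from the scaling of increment variance, Var[X(t+lag) - X(t)] ~ lag^(2H), via a simple log-log regression (an assumed simplification, not the paper's method):

```python
import numpy as np

def hurst_from_increments(series, lags=(1, 2, 4, 8, 16)):
    """Estimate a global Hurst exponent from increment-variance
    scaling; a local, time-varying version of this idea underlies
    the time-scale local Hurst exponent (TS-LHE)."""
    logv = [np.log(np.var(series[lag:] - series[:-lag])) for lag in lags]
    slope = np.polyfit(np.log(lags), logv, 1)[0]
    return slope / 2.0

# sanity check: ordinary Brownian motion should give H close to 0.5
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(20000))
h = hurst_from_increments(bm)
print(h)
```

    A TS-LHE analysis would evaluate such a slope within a sliding window, so that H becomes a function of time.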

  18. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Directory of Open Access Journals (Sweden)

    Brady T West

    Full Text Available Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
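
    To illustrate the kind of analytic error discussed, the sketch below contrasts a naive simple-random-sample standard error with a crude between-cluster ("ultimate cluster") standard error on hypothetical clustered data (not SESTAT data, and not the paper's exact estimators):

```python
import numpy as np

def naive_and_cluster_se(values, clusters):
    """Contrast the SRS standard error of a mean with a simple
    between-cluster standard error; ignoring the clustered design
    typically understates the true uncertainty."""
    values = np.asarray(values, dtype=float)
    naive_se = values.std(ddof=1) / np.sqrt(len(values))
    cluster_ids = np.unique(clusters)
    cluster_means = np.array([values[clusters == c].mean() for c in cluster_ids])
    cluster_se = cluster_means.std(ddof=1) / np.sqrt(len(cluster_ids))
    return naive_se, cluster_se

# toy data: strong within-cluster homogeneity inflates the true SE
vals = np.array([10, 11, 10, 30, 31, 29, 20, 21, 19], dtype=float)
cl = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
naive, clustered = naive_and_cluster_se(vals, cl)
print(naive, clustered)  # the clustered SE is larger here
```

    Production survey analysis would instead use the design weights, strata, and PSU identifiers with proper design-based variance estimation (e.g. Taylor linearization or replication).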

  19. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Science.gov (United States)

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817

  20. Review of Statistical Analyses Resulting from Performance of HLDWD-DWPF-005

    International Nuclear Information System (INIS)

    Beck, R.S.

    1997-01-01

    The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for analysis of small sample inserts (references 1 & 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide analyses equivalent to the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed method compares favorably with the sum of oxides for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will initially be required to provide an accurate analysis which will routinely meet the 95% to 105% average sum of oxides limits for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) Determine the calcine/vitrification factor for radioactive feed; (2) Evaluate the covariance matrix change against process operating ranges to determine optimum sample size; (3) Evaluate sources of low sum of oxides; and (4) Improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of
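
    The 95%-105% sum-of-oxides acceptance band lends itself to a trivial check; `passes_pccs` is a hypothetical helper written for illustration, not DWPF code:

```python
def passes_pccs(sum_of_oxides_pct, low=95.0, high=105.0):
    """Sketch of the PCCS acceptance criterion: the average sum of
    oxides must fall within 95-105% (limits quoted in the review)."""
    return low <= sum_of_oxides_pct <= high

print(passes_pccs(94.3))   # Cold Chem average from the review -> fails
print(passes_pccs(99.8))   # a passing hypothetical value
```
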

  1. Authentication by Keystroke Timing: Some Preliminary Results

    National Research Council Canada - National Science Library

    Gaines, R. S; Lisowski, William; Press, S. J; Shapiro, Norman

    1980-01-01

    ... of an individual seeking access to the computer. This report summarizes preliminary efforts to establish whether an individual can be identified by the statistical characteristics of his or her typing...

  2. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    International Nuclear Information System (INIS)

    Clerc, F; Njiki-Menga, G-H; Witschger, O

    2013-01-01

    Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. This ranges from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search continues for an appropriate and robust method. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data have been used from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading in a
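
    The study's Bayesian model is only described qualitatively here; the sketch below illustrates the general idea with a conjugate Poisson-Gamma model comparing hypothetical near-field and far-field particle counts (an assumed model for illustration, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical particle counts per sampling interval from two handheld CPCs
near_counts = np.array([120, 135, 128, 140, 131])  # near-field (source)
far_counts = np.array([95, 102, 99, 97, 101])      # far-field (background)

def posterior_rate_samples(counts, a0=1.0, b0=1e-3, size=50_000):
    """Gamma(a0, b0) prior + Poisson likelihood -> Gamma posterior
    over the mean count rate (standard conjugate update)."""
    a = a0 + counts.sum()
    b = b0 + len(counts)
    return rng.gamma(a, 1.0 / b, size)

near = posterior_rate_samples(near_counts)
far = posterior_rate_samples(far_counts)
prob = (near > far).mean()
print(prob)  # posterior probability that the source rate exceeds background
```

    Comparing the two posterior distributions, rather than raw time series, is the essence of the probabilistic approach the abstract describes.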

  3. A multi-criteria evaluation system for marine litter pollution based on statistical analyses of OSPAR beach litter monitoring time series.

    Science.gov (United States)

    Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael

    2013-12-01

    During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
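
    The significant rank correlations used for trend detection can be illustrated with a Spearman coefficient on hypothetical annual litter counts (invented numbers, not OSPAR data):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no tie handling): the Pearson
    correlation of the rank vectors. A significant rho over years
    indicates a monotonic litter trend."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

years = np.arange(2001, 2011)
# hypothetical annual litter counts on one beach (items per 100 m)
counts = np.array([310, 295, 330, 350, 345, 360, 372, 365, 390, 410])
rho = spearman_rho(years, counts)
print(rho)
```

    In practice one would also compute a p-value for rho and, as in the paper, combine such trend flags with a cluster-based beach classification.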

  4. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees

  5. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  6. Conceptual aspects: analyses of legal, ethical, human, technical, and social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary findings

    NARCIS (Netherlands)

    Kommers, Petrus A.M.; Smyrnova-Trybulska, Eugenia; Morze, Natalia; Issa, Tomayess; Issa, Theodora

    2015-01-01

    This paper, prepared by an international team of authors, focuses on the conceptual aspects: analyses of legal, ethical, human, technical, and social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary

  7. A preliminary comparison between TOVS and GOME level 2 ozone data

    Science.gov (United States)

    Rathman, William; Monks, Paul S.; Llewellyn-Jones, David; Burrows, John P.

    1997-09-01

    A preliminary comparison between total column ozone concentration values derived from the TIROS Operational Vertical Sounder (TOVS) and the Global Ozone Monitoring Experiment (GOME) has been carried out. Two comparisons of ozone datasets have been made: a) TOVS ozone analysis maps vs. GOME level 2 data; b) TOVS data located at Northern Hemisphere Ground Ozone Stations (NHGOS) vs. GOME data. Both analyses consistently showed an offset in the value of the total column ozone between the datasets [for analysis a), 35 Dobson Units (DU); for analysis b), 10 DU], despite a good correlation between the spatial and temporal features of the datasets. A noticeably poor correlation in the latitudinal bands 10°/20° North and 10°/20° South was observed; the reasons for this are discussed. The smallest region statistically representative of the ozone value correlation dataset of TOVS data at NHGOS and GOME level 2 data was determined to be a region enclosed by an effective radius of 0.75 arc-degrees (83.5 km).

  8. Building-related symptoms among U.S. office workers and risk factors for moisture and contamination: Preliminary analyses of U.S. EPA BASE Data

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Cozen, Myrna

    2002-09-01

    The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of a humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in the study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
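
    The adjusted ORs and confidence limits quoted above come from logistic regression; converting a coefficient and its standard error to an OR with a 95% CI is a one-line transformation, sketched here with a hypothetical coefficient (not a value from the study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Adjusted OR and 95% CI from a logistic-regression coefficient;
    the interval excludes 1.0 exactly when the association is
    significant at the 5% level (the criterion used above)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# hypothetical coefficient for "annual vs daily air-handler inspection"
or_, lo, hi = odds_ratio_ci(beta=0.47, se=0.18)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```
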

  9. Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd

    2011-01-01

    The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…

  10. Preliminary results from NOAMP deep drifting floats

    International Nuclear Information System (INIS)

    Ollitrault, M.

    1989-01-01

    This paper is a very brief and preliminary outline of the first results obtained with deep SOFAR floats in the NOAMP area. The work is now moving toward more precise statistical estimations of mean and variable currents, together with better tracking to resolve submesoscales and estimate diffusivities due to mesoscale and smaller-scale motions. However, the preliminary results confirm that the NOAMP region (and surroundings) has a deep mesoscale eddy field that is considerably more energetic than the mean field (r.m.s. velocities are of order 5 cm s⁻¹), although both values are diminished compared to the western basin. A data report containing trajectories and statistics is scheduled to be published by IFREMER in the near future. The project's main task is, in particular, to study the dispersion of radioactive substances

  11. A QUANTITATIVE METHOD FOR ANALYSING 3-D BRANCHING IN EMBRYONIC KIDNEYS: DEVELOPMENT OF A TECHNIQUE AND PRELIMINARY DATA

    Directory of Open Access Journals (Sweden)

    Gabriel Fricout

    2011-05-01

    Full Text Available The normal human adult kidney contains between 300,000 and 1 million nephrons (the functional units of the kidney). Nephrons develop at the tips of the branching ureteric duct, and therefore ureteric duct branching morphogenesis is critical for normal kidney development. Current methods for analysing ureteric branching are mostly qualitative, and those quantitative methods that do exist do not account for the 3-dimensional (3D) shape of the ureteric "tree". We have developed a method for measuring the total length of the ureteric tree in 3D. This method is described and preliminary data are presented. The algorithm performs a semi-automatic segmentation of a set of grey-level confocal images and an automatic skeletonisation of the resulting binary object. Measurements of length are obtained automatically, and numbers of branch points are counted manually. The final representation can be reconstructed by means of 3D volume-rendering software, providing a fully rotating 3D perspective of the skeletonised tree and making it possible to identify and accurately measure branch lengths. Preliminary data show the total length estimates obtained with the technique to be highly reproducible: repeat estimates of total tree length vary by just 1-2%. We will now use this technique to further define the growth of the ureteric tree in vitro, under both normal culture conditions and in the presence of various levels of specific molecules suspected of regulating ureteric growth. The data obtained will provide fundamental information on the development of renal architecture, as well as the regulation of nephron number.
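Once the tree is skeletonised, the total-length measurement described above reduces to summing Euclidean steps between successive skeleton voxels. A simplified sketch for a single branch given as voxel coordinates (an illustration of the principle, not the authors' actual pipeline):

```python
import math

def branch_length(voxels, spacing=(1.0, 1.0, 1.0)):
    """Sum Euclidean distances between consecutive skeleton voxels.
    spacing converts voxel indices to physical units (z, y, x)."""
    total = 0.0
    for (z1, y1, x1), (z2, y2, x2) in zip(voxels, voxels[1:]):
        total += math.sqrt(((z2 - z1) * spacing[0]) ** 2 +
                           ((y2 - y1) * spacing[1]) ** 2 +
                           ((x2 - x1) * spacing[2]) ** 2)
    return total

# A toy branch: two axial steps followed by one diagonal step.
path = [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 3)]
length = branch_length(path)  # 2 + sqrt(2)
print(round(length, 4))
```

Summing such per-branch lengths over all branches of the skeleton gives the total tree length; anisotropic confocal stacks are handled by the `spacing` argument.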

  12. Halo statistics analysis within medium volume cosmological N-body simulation

    Directory of Open Access Journals (Sweden)

    Martinović N.

    2015-01-01

    Full Text Available In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well-known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. Visualization of the simulation is portrayed as well. At redshifts z = 0−7 halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 − 10^14 M⊙/h. [176021 ’Visible and invisible matter in nearby galaxies: theory and observations
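The exponent n in the scaling (1 + z)^n quoted above is typically recovered by a straight-line fit in log–log space, since log(rate) = log A + n·log(1 + z). A toy sketch with synthetic data (not the simulation's actual output):

```python
import numpy as np

# Synthetic merger rates following rate = A * (1 + z)^n with n = 2.4.
z = np.linspace(0.0, 7.0, 50)
rate = 0.03 * (1.0 + z) ** 2.4

# Least-squares slope of log(rate) versus log(1 + z) estimates n.
n_fit, log_A = np.polyfit(np.log(1.0 + z), np.log(rate), 1)
print(round(n_fit, 2))  # 2.4
```

With noisy merger-rate measurements the same fit would return n with a scatter-dependent uncertainty rather than the exact input value.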

  13. Reproduction of the Yucca Mountain Project TSPA-LA Uncertainty and Sensitivity Analyses and Preliminary Upgrade of Models

    Energy Technology Data Exchange (ETDEWEB)

    Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis; Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis

    2016-09-01

    Sandia National Laboratories (SNL) continued its evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and the knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest version 11.1. All the TSPA-LA uncertainty and sensitivity analyses modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling-case output generated in FY15, based on GoldSim Version 9.60.300, as documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  14. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right, with rewarding examples, including the authors' work with galaxy and Gamma Ray Burst data, to engage the reader. It offers a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomena will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  15. Using statistical inference for decision making in best estimate analyses

    International Nuclear Information System (INIS)

    Sermer, P.; Weaver, K.; Hoppe, F.; Olive, C.; Quach, D.

    2008-01-01

    For broad classes of safety analysis problems, one needs to make decisions when faced with randomly varying quantities which are also subject to errors. The means for doing this involves a statistical approach which takes into account the nature of the physical problems, and the statistical constraints they impose. We describe the methodology for doing this which has been developed at Nuclear Safety Solutions, and we draw some comparisons to other methods which are commonly used in Canada and internationally. Our methodology has the advantages of being robust and accurate and compares favourably to other best estimate methods. (author)
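One widely used non-parametric tool in best-estimate-plus-uncertainty decision making of this kind is Wilks' order-statistic criterion (also named, as the non-parametric order statistic method, in the Tractebel paper above; whether the NSS methodology uses it is not stated here). The smallest number of code runs n for which the largest result bounds the 95th percentile with 95% confidence satisfies 1 − 0.95^n ≥ 0.95. A short check of the standard first-order, one-sided formula:

```python
def wilks_n(gamma=0.95, beta=0.95):
    """Smallest n with 1 - gamma**n >= beta (first-order, one-sided
    Wilks criterion): n runs such that the maximum bounds the
    gamma-quantile with confidence beta."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

print(wilks_n())  # 59
```

The familiar "59 runs" figure for 95%/95% statements follows directly; tightening the confidence to 99% raises the requirement to 90 runs.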

  16. 78 FR 13563 - Energy Conservation Program: Availability of the Preliminary Technical Support Document for...

    Science.gov (United States)

    2013-02-28

    ... identify and resolve issues involved in the preliminary analyses. Chapter 2 of the preliminary technical... DOE conducted in-depth technical analyses in the following areas for GSFLs and IRLs currently under... also begun work on the manufacturer impact analysis and identified the methods to be used for the LCC...

  17. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  18. Essentials of Excel, Excel VBA, SAS and Minitab for statistical and financial analyses

    CERN Document Server

    Lee, Cheng-Few; Chang, Jow-Ran; Tai, Tzu

    2016-01-01

    This introductory textbook for business statistics teaches statistical analysis and research methods via business case studies and financial data using Excel, MINITAB, and SAS. Every chapter in this textbook engages the reader with data of individual stock, stock indices, options, and futures. One studies and uses statistics to learn how to study, analyze, and understand a data set of particular interest. Some of the more popular statistical programs that have been developed to use statistical and computational methods to analyze data sets are SAS, SPSS, and MINITAB. Of those, we look at MINITAB and SAS in this textbook. One of the main reasons to use MINITAB is that it is the easiest to use among the popular statistical programs. We look at SAS because it is the leading statistical package used in industry. We also utilize the much less costly and ubiquitous Microsoft Excel to do statistical analysis, as the benefits of Excel have become widely recognized in the academic world and its analytical capabilities...

  19. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  20. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. R statistical application development by example : beginner's guide

    CERN Document Server

    Tattar, Narayanachart Prabhanjan

    2013-01-01

    Full of screenshots and examples, this Beginner's Guide by Example will teach you practically everything you need to know about R statistical application development from scratch. You will begin by learning the first concepts of statistics in R, which is vital in this fast-paced era, and it is also a bargain, as you do not need a preliminary course on the subject.

  3. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  4. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses.

    Science.gov (United States)

    Buttigieg, Pier Luigi; Ramette, Alban

    2014-12-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.

  5. Multivariate statistical characterization of groundwater quality in Ain ...

    African Journals Online (AJOL)

    Administrator

    depends much on the sustainability of the available water resources. Water of .... 18 wells currently in use were selected based on the preliminary field survey carried out to ... In recent times, multivariate statistical methods have been applied ...

  6. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  7. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women; mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequently mentioned ES statistics were Cohen's d and R²/η², which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not…

  8. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    Science.gov (United States)

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

    Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible, and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail, which are applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), but also the dependent analysis, will be explained in detail for the first time. These approaches lead to contact angle data and different access to specific contact angles which are independent of "user skills" and the subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is measured dynamically by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, and the downhill (advancing motion) and uphill (receding motion) angles obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Due to the small covered distance for the dependent analysis (…) contact angle determinations are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for the drop on inclined surfaces and detailed relations about the reactivity of the freshly cleaned silicon wafer surface resulting in acceleration…

  9. Methodology in robust and nonparametric statistics

    CERN Document Server

    Jurecková, Jana; Picek, Jan

    2012-01-01

    Introduction and Synopsis: Introduction; Synopsis. Preliminaries: Introduction; Inference in Linear Models; Robustness Concepts; Robust and Minimax Estimation of Location; Clippings from Probability and Asymptotic Theory; Problems. Robust Estimation of Location and Regression: Introduction; M-Estimators; L-Estimators; R-Estimators; Minimum Distance and Pitman Estimators; Differentiable Statistical Functions; Problems. Asymptotic Representations for L-Estimators…

  10. Preliminary In Vivo Experiments on Adhesion of Geckos

    OpenAIRE

    Lepore, E.; Brianza, S.; Antoniolli, F.; Buono, M.; Carpinteri, A.; Pugno, N.

    2008-01-01

    We performed preliminary experiments on the adhesion of a Tokay gecko on surfaces of different roughness, with or without particles of significantly different granulometry, before, after, or during the moult. The results were analyzed using Weibull statistics.
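Weibull analysis of adhesion data of the kind mentioned above typically linearises the cumulative distribution: with F estimated by median ranks, ln(−ln(1 − F)) plotted against ln(x) is a straight line whose slope is the Weibull modulus m. A generic sketch with synthetic "adhesion forces" (not the gecko measurements or the paper's actual procedure):

```python
import math

def weibull_modulus(values):
    """Estimate the Weibull shape parameter m by least squares on
    ln(-ln(1 - F_i)) versus ln(x_i), with median ranks
    F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(values)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
           for i, x in enumerate(xs, start=1)]
    mx = sum(px for px, _ in pts) / n
    my = sum(py for _, py in pts) / n
    num = sum((px - mx) * (py - my) for px, py in pts)
    den = sum((px - mx) ** 2 for px, _ in pts)
    return num / den  # regression slope = Weibull modulus

# Synthetic forces generated by inverse-CDF sampling of a Weibull
# distribution with shape m = 3 at the median-rank probabilities,
# so the fit recovers m almost exactly.
n = 20
sample = [(-math.log(1.0 - (i - 0.3) / (n + 0.4))) ** (1.0 / 3.0)
          for i in range(1, n + 1)]
print(round(weibull_modulus(sample), 3))  # 3.0
```

A high modulus indicates narrowly scattered adhesion forces; a low modulus indicates the broad scatter typical of rough or contaminated contacts.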

  11. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  12. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  13. Transformation (normalization) of slope gradient and surface curvatures, automated for statistical analyses from DEMs

    Science.gov (United States)

    Csillik, O.; Evans, I. S.; Drăguţ, L.

    2015-03-01

    Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
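The slope transformation described above selects a Box-Cox exponent that minimises skewness of the transformed values. A compact illustration of that idea on a synthetic, positively skewed slope-gradient sample (a sketch of the principle only, not the published ArcGIS/Python tool):

```python
import numpy as np

def skewness(x):
    x = np.asarray(x, dtype=float)
    return float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)

def boxcox(x, lam):
    """Box-Cox transform; lam = 0 reduces to the natural logarithm."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def best_lambda(x, grid=None):
    """Grid-search the Box-Cox exponent that minimises |skewness|."""
    if grid is None:
        grid = np.round(np.arange(-1.0, 2.001, 0.01), 2)
    return min(grid, key=lambda lam: abs(skewness(boxcox(x, lam))))

# Lognormal 'slope gradients' are strongly right-skewed; the search
# should pick an exponent near 0, i.e. essentially a log transform.
rng = np.random.default_rng(42)
slopes = np.exp(rng.normal(1.0, 0.5, size=2000))
lam = best_lambda(slopes)
print(skewness(slopes) > 1.0, abs(lam) < 0.3)
```

Evaluating the frequency distribution first and then choosing the exponent, as here, is what distinguishes the paper's adaptive approach from applying one fixed transform (sine, square root of sine, or log tangent) everywhere.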

  14. Influence of study satisfaction on academic procrastination in psychology students: a preliminary study

    Directory of Open Access Journals (Sweden)

    Sergio Alexis Dominguez-Lara

    2017-06-01

    Full Text Available The aim of this predictive study was to analyze the degree of influence of study satisfaction (SS) on academic procrastination (AP). One hundred forty-eight (148) psychology students (111 women), between 18 and 32 years old (M = 22.41), were evaluated using the Brief Scale of Study Satisfaction and the Academic Procrastination Scale. After preliminary analyses focused on the scores' reliability (α > 0.70) and correlations between dimensions, a regression analysis was performed to determine how much of the variability in the AP dimensions' scores is explained by the variations in SS. For that purpose, a method that uses bivariate correlations corrected for attenuation and provides confidence intervals under a bootstrap approach for the associated statistics was applied. All analyses were assessed from an effect size approach. The results indicate that the influence of SS on AP was not significant. These findings provide new ways to implement studies in order to understand procrastinating behavior in the university setting.
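The attenuation-corrected correlations used in the study above follow Spearman's classic formula: the observed correlation is divided by the square root of the product of the two scores' reliabilities, r_corrected = r_xy / sqrt(r_xx · r_yy). A one-function sketch with illustrative values (not the study's data):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Correct an observed correlation for measurement error using
    the reliabilities of the two scores (Spearman's correction for
    attenuation)."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Observed r = .30 with alpha reliabilities of .80 and .75 (illustrative).
print(round(disattenuate(0.30, 0.80, 0.75), 3))  # 0.387
```

Because reliabilities are at most 1, the corrected correlation is always at least as large in magnitude as the observed one; bootstrap resampling, as in the study, then supplies confidence intervals around such corrected estimates.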

  15. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    Science.gov (United States)

    Southard, Rodney E.

    2013-01-01

    estimates on one of these streams can be calculated at an ungaged location that has a drainage area that is between 40 percent of the drainage area of the farthest upstream streamgage and within 150 percent of the drainage area of the farthest downstream streamgage along the stream of interest. The second method may be used on any stream with a streamgage that has operated for 10 years or longer and for which anthropogenic effects have not changed the low-flow characteristics at the ungaged location since collection of the streamflow data. A ratio of drainage area of the stream at the ungaged location to the drainage area of the stream at the streamgage was computed to estimate the statistic at the ungaged location. The range of applicability is between 40- and 150-percent of the drainage area of the streamgage, and the ungaged location must be located on the same stream as the streamgage. The third method uses regional regression equations to estimate selected low-flow frequency statistics for unregulated streams in Missouri. This report presents regression equations to estimate frequency statistics for the 10-year recurrence interval and for the N-day durations of 1, 2, 3, 7, 10, 30, and 60 days. Basin and climatic characteristics were computed using geographic information system software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses based on existing digital geospatial data and previous studies. Spatial analyses for geographical bias in the predictive accuracy of the regional regression equations defined three low-flow regions with the State representing the three major physiographic provinces in Missouri. Region 1 includes the Central Lowlands, Region 2 includes the Ozark Plateaus, and Region 3 includes the Mississippi Alluvial Plain. A total of 207 streamgages were used in the regression analyses for the regional equations. Of the 207 U.S. 
Geological Survey streamgages, 77 were
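
    The drainage-area-ratio transfer described in the second method can be sketched in a few lines. This is a hypothetical illustration: the function name, variable names, and example values are invented, not taken from the report; only the ratio formula and the 40-150 percent applicability range come from the abstract.

    ```python
    def estimate_low_flow(q_gaged, area_gaged, area_ungaged):
        """Drainage-area-ratio estimate of a low-flow statistic at an
        ungaged site on the same stream as a streamgage.

        Applicable only when the ungaged drainage area is between 40%
        and 150% of the gaged drainage area (the report's criterion).
        """
        ratio = area_ungaged / area_gaged
        if not 0.40 <= ratio <= 1.50:
            raise ValueError("ungaged area outside 40-150% applicability range")
        return q_gaged * ratio

    # A low-flow statistic of 12.0 cfs at a gage draining 250 mi^2,
    # transferred to an ungaged site draining 200 mi^2 on the same stream:
    q_est = estimate_low_flow(12.0, 250.0, 200.0)  # 12.0 * 0.8 = 9.6 cfs
    ```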

  16. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    Science.gov (United States)

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in British Journal of Ophthalmology ( BJO ) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical approaches to analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at ocular level, one study (1%) analysed the overall summary of ocular finding per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
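
    Why ignoring the intereye correlation matters can be shown with a minimal simulation sketch. The correlation value, sample size, and simulation setup below are assumptions for illustration, not data from the paper; the design-effect formula 1 + (m - 1)ρ for m observations per cluster is the standard clustered-data result.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_patients, rho, sigma = 100, 0.6, 1.0

    # Simulate a measurement on both eyes of each patient, with intereye
    # correlation rho induced by a shared patient effect.
    patient = rng.normal(0, np.sqrt(rho) * sigma, n_patients)
    eyes = patient[:, None] + rng.normal(0, np.sqrt(1 - rho) * sigma, (n_patients, 2))

    # Naive SE of the mean: treat all 2n eyes as independent observations.
    se_naive = eyes.std(ddof=1) / np.sqrt(2 * n_patients)

    # Design-effect correction for m = 2 eyes per patient:
    # SE_true = SE_naive * sqrt(1 + (m - 1) * rho)
    se_corrected = se_naive * np.sqrt(1 + rho)

    # Ignoring the correlation understates the standard error, so confidence
    # intervals come out too narrow and P values too small.
    ```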

  17. Influence of Immersion Conditions on The Tensile Strength of Recycled Kevlar®/Polyester/Low-Melting-Point Polyester Nonwoven Geotextiles through Applying Statistical Analyses

    Directory of Open Access Journals (Sweden)

    Jing-Chzi Hsieh

    2016-05-01

Full Text Available The recycled Kevlar®/polyester/low-melting-point polyester (recycled Kevlar®/PET/LPET) nonwoven geotextiles are immersed in neutral, strong acid, and strong alkali solutions, respectively, at different temperatures for four months. Their tensile strength is then tested after the various immersion periods at the various temperatures, in order to determine their durability to chemicals. To analyze the factors that may influence the mechanical properties of geotextiles under diverse environmental conditions, the experimental results and statistical analyses are combined in this study. The influences of recycled Kevlar® fiber content, thermal treatment, and immersion period on the tensile strength of recycled Kevlar®/PET/LPET nonwoven geotextiles are examined, after which their influence levels are statistically determined by multiple regression analyses. According to the results, the tensile strength of nonwoven geotextiles can be enhanced by adding recycled Kevlar® fibers and applying thermal treatment.

  18. Preliminary In Vivo Experiments on Adhesion of Geckos

    Directory of Open Access Journals (Sweden)

    E. Lepore

    2008-01-01

Full Text Available We performed preliminary experiments on the adhesion of a Tokay gecko on surfaces with different roughness, with or without particles of significantly different granulometry, before/after or during the moult. The results were analyzed using Weibull statistics.

  19. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

Statistics are widely employed within archaeological research. This is becoming increasingly so as user-friendly statistical packages make increasingly sophisticated analyses available to non-statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of the underlying assumptions, there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology, and it is illustrated here with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab

  20. Births: Preliminary Data for 2011. National Vital Statistics Reports. Volume 61, Number 5

    Science.gov (United States)

    Hamilton, Brady E.; Martin, Joyce A.; Ventura, Stephanie J.

    2012-01-01

    Objectives: This report presents preliminary data for 2011 on births in the United States. U.S. data on births are shown by age, live-birth order, race, and Hispanic origin of mother. Data on marital status, cesarean delivery, preterm births, and low birthweight are also presented. Methods: Data in this report are based on approximately 100…

  1. Statistical parametric mapping and statistical probabilistic anatomical mapping analyses of basal/acetazolamide Tc-99m ECD brain SPECT for efficacy assessment of endovascular stent placement for middle cerebral artery stenosis

    International Nuclear Information System (INIS)

    Lee, Tae-Hong; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Kyung-Pil

    2007-01-01

Statistical parametric mapping (SPM) and statistical probabilistic anatomical mapping (SPAM) were applied to basal/acetazolamide Tc-99m ECD brain perfusion SPECT images in patients with middle cerebral artery (MCA) stenosis to assess the efficacy of endovascular stenting of the MCA. Enrolled in the study were 11 patients (8 men and 3 women, mean age 54.2 ± 6.2 years) who had undergone endovascular stent placement for MCA stenosis. Using SPM and SPAM analyses, we compared the number of significant voxels and cerebral counts in basal and acetazolamide SPECT images before and after stenting, and assessed the perfusion changes and cerebral vascular reserve index (CVRI). The numbers of hypoperfusion voxels in SPECT images were decreased from 10,083 ± 8,326 to 4,531 ± 5,091 in basal images (P = 0.0317) and from 13,398 ± 14,222 to 7,699 ± 10,199 in acetazolamide images (P = 0.0142) after MCA stenting. On SPAM analysis, the increases in cerebral counts were significant in acetazolamide images (90.9 ± 2.2 to 93.5 ± 2.3, P = 0.0098) but not in basal images (91 ± 2.7 to 92 ± 2.6, P = 0.1602). The CVRI also showed a statistically significant increase from before stenting (median 0.32; 95% CI -2.19 to 2.37) to after stenting (median 1.59; 95% CI -0.85 to 4.16; P = 0.0068). This study revealed the usefulness of voxel-based analysis of basal/acetazolamide brain perfusion SPECT after MCA stent placement. This study showed that SPM and SPAM analyses of basal/acetazolamide Tc-99m brain SPECT could be used to evaluate the short-term hemodynamic efficacy of successful MCA stent placement. (orig.)

  2. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distributions types have been investigated: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...
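
    The kind of distribution comparison this abstract describes can be sketched with SciPy: fit each candidate distribution by maximum likelihood and rank the fits by log-likelihood. The data below are a synthetic stand-in (the actual study used roughly 6700 real timber specimens), and the choice of log-likelihood as the ranking criterion is an assumption; the thesis may have used other goodness-of-fit measures.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic stand-in for timber bending-strength data (MPa).
    strength = rng.lognormal(mean=3.7, sigma=0.4, size=500)

    fits = {}  # maximized log-likelihood per candidate distribution
    fits["normal"] = stats.norm.logpdf(strength, *stats.norm.fit(strength)).sum()
    fits["lognormal"] = stats.lognorm.logpdf(
        strength, *stats.lognorm.fit(strength, floc=0)).sum()
    fits["weibull_2p"] = stats.weibull_min.logpdf(
        strength, *stats.weibull_min.fit(strength, floc=0)).sum()  # location fixed at 0
    fits["weibull_3p"] = stats.weibull_min.logpdf(
        strength, *stats.weibull_min.fit(strength)).sum()          # location free

    best = max(fits, key=fits.get)  # distribution with the highest log-likelihood
    ```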

  3. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

Statistical methods take a prominent place in psychologists' educational programmes. Known as difficult to understand and hard to learn, these contents are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled contents. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only insofar as it commands respect from other professions, namely physicians. For their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented in psychotherapeutic and psychological research. To this end, we analysed 46 original papers from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write, and to critically read, original papers as the backbone of research presupposes a high degree of statistical education. To ignore statistics means to ignore research and, not least, to leave one's own professional work to arbitrariness.

  4. Dispensing processes impact apparent biological activity as determined by computational and statistical analyses.

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.

  5. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    Science.gov (United States)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapted the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is a generally accepted measure of it. We also introduce a new teaching method for the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature shows that this new teaching method works very well in increasing students' understanding of statistics.
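
    A minimal example of simulation-based inference for hypothesis testing is the two-sample permutation test, one common classroom variant; the paper does not specify this exact procedure, so the function below is an illustrative sketch, not the authors' method.

    ```python
    import numpy as np

    def permutation_test(a, b, n_resamples=10_000, seed=0):
        """Two-sample permutation test of the difference in means.
        Instead of appealing to a t distribution, the null distribution is
        built by repeatedly shuffling group labels -- the simulation-based
        approach to inference."""
        rng = np.random.default_rng(seed)
        observed = a.mean() - b.mean()
        pooled = np.concatenate([a, b])
        count = 0
        for _ in range(n_resamples):
            rng.shuffle(pooled)  # random reassignment of labels
            diff = pooled[: len(a)].mean() - pooled[len(a):].mean()
            if abs(diff) >= abs(observed):
                count += 1
        # Add-one correction keeps the p-value strictly positive.
        return (count + 1) / (n_resamples + 1)

    rng = np.random.default_rng(1)
    p = permutation_test(rng.normal(0.8, 1, 30), rng.normal(0.0, 1, 30))
    ```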

  6. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees

  7. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  8. Modern applied statistics with S-plus

    CERN Document Server

    Venables, W N

    1994-01-01

    S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.

  9. Preliminary thermal/thermomechanical analyses of the Site Characterization Plan's Conceptual Design for a repository containing horizontally emplaced waste packages at the Deaf Smith County site

    International Nuclear Information System (INIS)

    Ghantous, N.Y.; Raines, G.E.

    1987-10-01

    This report presents thermal/thermomechanical analyses of the Site Characterization Plan Conceptual Design for horizontal package emplacement at the Deaf Smith County site, Texas. The repository was divided into three geometric regions. Then two-dimensional finite-element models were set up to approximate the three-dimensional nature of each region. Thermal and quasistatic thermomechanical finite-element analyses were performed to evaluate the thermal/thermomechanical responses of the three regions. The exponential-time creep law was used to represent the creep behavior of salt rock. The repository design was evaluated by comparing the thermal/thermomechanical responses obtained for the three regions with interim performance constraints. The preliminary results show that all the performance constraints are met except for those of the waste package. The following factors were considered in interpreting these results: (1) the qualitative description of the analytical responses; (2) the limitations of the analyses; and (3) either the conclusions based on overall evaluation of limitations and analytical results or the conclusions based on the fact that the repository design may be evaluated only after further analyses. Furthermore, a parametric analysis was performed to estimate the effect of material parameters on the predicted thermal/thermomechanical response. 23 refs., 34 figs., 9 tabs

  10. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  11. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  12. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  13. Structural brain alterations of Down's syndrome in early childhood: evaluation by DTI and volumetric analyses

    International Nuclear Information System (INIS)

    Gunbey, Hediye Pinar; Bilgici, Meltem Ceyhan; Aslan, Kerim; Incesu, Lutfi; Has, Arzu Ceylan; Ogur, Methiye Gonul; Alhan, Aslihan

    2017-01-01

    To provide an initial assessment of white matter (WM) integrity with diffusion tensor imaging (DTI) and the accompanying volumetric changes in WM and grey matter (GM) through volumetric analyses of young children with Down's syndrome (DS). Ten children with DS and eight healthy control subjects were included in the study. Tract-based spatial statistics (TBSS) were used in the DTI study for whole-brain voxelwise analysis of fractional anisotropy (FA) and mean diffusivity (MD) of WM. Volumetric analyses were performed with an automated segmentation method to obtain regional measurements of cortical volumes. Children with DS showed significantly reduced FA in association tracts of the fronto-temporo-occipital regions as well as the corpus callosum (CC) and anterior limb of the internal capsule (p < 0.05). Volumetric reductions included total cortical GM, cerebellar GM and WM volume, basal ganglia, thalamus, brainstem and CC in DS compared with controls (p < 0.05). These preliminary results suggest that DTI and volumetric analyses may reflect the earliest complementary changes of the neurodevelopmental delay in children with DS and can serve as surrogate biomarkers of the specific elements of WM and GM integrity for cognitive development. (orig.)

  14. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
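
    A 3-sigma Shewhart control chart of the kind described can be sketched as follows. The baseline values below are hypothetical bias measurements (differences between measured and certified values of a standard), not data from the report, and the plain 3-sigma rule is one of several limit-setting conventions.

    ```python
    import numpy as np

    def shewhart_limits(baseline):
        """3-sigma Shewhart control limits estimated from baseline
        measurements of a known standard (bias control chart)."""
        mu, sd = baseline.mean(), baseline.std(ddof=1)
        return mu - 3 * sd, mu + 3 * sd

    def out_of_control(values, lo, hi):
        """Indices of points falling outside the control limits."""
        return [i for i, v in enumerate(values) if not lo <= v <= hi]

    # Hypothetical baseline biases (measured minus certified value).
    baseline = np.array([0.02, -0.01, 0.00, 0.03, -0.02, 0.01, 0.00, -0.03])
    lo, hi = shewhart_limits(baseline)          # mean 0.0, sd 0.02 -> limits +/-0.06
    alarms = out_of_control([0.01, -0.02, 0.15], lo, hi)  # only the 0.15 point alarms
    ```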

  15. Safety culture and learning from incidents: the role of incident reporting and causal analyses

    International Nuclear Information System (INIS)

    Wilpert, B.

    1994-01-01

Nuclear industry, more than any other industrial branch, has developed and used predictive risk analysis as a method of feedforward control of safety and reliability. Systematic evaluation of operating experience, statistical documentation of component failures, and systematic documentation and analysis of incidents are important complementary elements of feedback control: we are dealing here with adjustment and learning from experience, in particular from past incidents. Using preliminary findings from ongoing research at the Research Center Systems Safety at the Berlin University of Technology, the contribution discusses preconditions for an effective use of lessons to be learnt from closely matched incident reporting and in-depth analyses of causal chains leading to incidents. Such conditions include, in particular, standardized methods for documenting, reporting and analyzing incidents; structured information flows and feedback loops; abstaining from the search for culprits; mutual trust between employees and management; and the willingness of all concerned to continually evaluate and optimize the established learning system. Thus, incident-related reporting and causal analyses contribute to safety culture, which is seen to emerge from tightly coupled organizational measures and corresponding changes in attitudes and behaviour. (author) 2 figs., 7 refs

  16. Does environmental data collection need statistics?

    NARCIS (Netherlands)

    Pulles, M.P.J.

    1998-01-01

    The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental

  17. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality in recent years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes over the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality ranges from low to medium. Although the number of MAs of medium and high quality seems to have risen lately, several other aspects need improvement to increase their overall quality.

  18. Preliminary Results of the Louisiana Sex Offender Treatment Program

    Directory of Open Access Journals (Sweden)

    Lee A. Underwood

    2015-12-01

Full Text Available The purpose of this study was to offer preliminary support for the Louisiana Sex Offender Treatment Program (LSOTP) in addressing the needs of juvenile sex offenders. Research objectives were (1) to offer statistical evidence for reductions in anxiety, depression, cognitive distortion and negative attitudes towards women, comparing a group of 21 adolescents, 12 of whom received services as usual and nine of whom participated in the LSOTP. A controlled experimental evaluation design was utilized. The juvenile sex offenders were randomly assigned to the experimental group, which received treatment services for 12 weeks, or to a control group receiving care “as usual” in a residential group care program. Participants in the experimental group experienced statistically significant decreases in cognitive distortions related specifically to rape and molestation. The results of this study offer preliminary support for the LSOTP as a best-practices alternative to other treatment modalities.

  19. Statistics for NAEG: past efforts, new results, and future plans

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.; Engel, D.W.

    1983-06-01

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given

  20. Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh

This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of the thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach based on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...

  1. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident

  2. Statistical analyses of the performance of Macedonian investment and pension funds

    Directory of Open Access Journals (Sweden)

    Petar Taleski

    2015-10-01

Full Text Available The foundation of post-modern portfolio theory is creating a portfolio based on a desired target return. This specifically applies to the performance of investment and pension funds that must provide a rate of return meeting payment requirements. A desired target return is the goal of an investment or pension fund. It is the primary benchmark used to measure performance and for dynamic monitoring and evaluation of the risk-return ratio of investment funds. The analysis in this paper is based on monthly returns of Macedonian investment and pension funds (June 2011 - June 2014). The analysis utilizes basic but highly informative statistical characteristics and moments such as skewness, kurtosis, the Jarque-Bera statistic, and Chebyshev's Inequality. The objective of this study is to perform a thorough analysis, utilizing the above-mentioned and other statistical techniques (Sharpe, Sortino, omega, upside potential, Calmar, Sterling), to draw relevant conclusions regarding the risks and characteristic moments of Macedonian investment and pension funds. Pension funds are the second largest segment of the financial system and have great potential for further growth due to constant inflows from pension insurance. The importance of investment funds for the financial system in the Republic of Macedonia is still small, although open-end investment funds have been the fastest growing segment of the financial system. Statistical analysis has shown that pension funds delivered a significantly positive volatility-adjusted risk premium in the analyzed period, more so than investment funds.
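
    Two of the performance measures named, Sharpe and Sortino, can be computed from monthly returns as follows. The return series below is illustrative, not the Macedonian fund data, and the zero risk-free rate and zero target return are simplifying assumptions.

    ```python
    import numpy as np

    def sharpe(returns, rf=0.0):
        """Sharpe ratio: mean excess return per unit of total volatility."""
        excess = returns - rf
        return excess.mean() / excess.std(ddof=1)

    def sortino(returns, target=0.0):
        """Sortino ratio: mean excess return per unit of downside deviation,
        penalising only returns below the target."""
        excess = returns - target
        downside = np.minimum(excess, 0.0)
        downside_dev = np.sqrt((downside ** 2).mean())
        return excess.mean() / downside_dev

    # Hypothetical monthly fund returns (the study used June 2011 - June 2014 data).
    monthly = np.array([0.012, -0.004, 0.008, 0.015, -0.010, 0.006,
                        0.009, 0.003, -0.002, 0.011, 0.007, 0.004])
    s = sharpe(monthly)
    so = sortino(monthly)
    ```

    Because the downside deviation counts only below-target months, the Sortino ratio exceeds the Sharpe ratio whenever losses are infrequent relative to total volatility, which is why the two measures can rank funds differently.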

  3. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Science.gov (United States)

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
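The kind of power simulation described here can be illustrated with a deliberately simplified Monte Carlo sketch: a step shift in white noise detected by a pre/post mean comparison, rather than the paper's full time-series intervention models. All parameter values are illustrative:

```python
import math
import random

def detection_power(shift, n=60, t0=40, sims=2000, crit=1.96, seed=1):
    """Monte Carlo power of detecting a step intervention at time t0 in
    unit-variance white noise, using a z-type pre/post mean comparison.
    (A simplified stand-in for ARIMA intervention modelling.)"""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        y = [rng.gauss(0.0, 1.0) for _ in range(n)]
        for t in range(t0, n):
            y[t] += shift            # the intervention: a level shift
        pre, post = y[:t0], y[t0:]
        m1 = sum(pre) / len(pre)
        m2 = sum(post) / len(post)
        se = math.sqrt(1.0 / len(pre) + 1.0 / len(post))  # known sigma = 1
        if abs(m2 - m1) / se > crit:
            hits += 1
    return hits / sims
```

With shift = 0 the rejection rate approximates the nominal 5% size of the test; power rises quickly with the size of the shock, which is the trade-off the simulations in the paper quantify for the more realistic intervention estimators.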

  4. Neutronic analyses of the preliminary design of a DCLL blanket for the EUROfusion DEMO power plant

    Energy Technology Data Exchange (ETDEWEB)

    Palermo, Iole, E-mail: iole.palermo@ciemat.es; Fernández, Iván; Rapisarda, David; Ibarra, Angel

    2016-11-01

    Highlights: • We perform neutronic calculations for the preliminary DCLL Blanket design. • We study the tritium breeding capability of the reactor. • We determine the nuclear heating in the main components. • We verify whether the shielding of the TF coil is maintained. - Abstract: In the frame of the newly established EUROfusion WPBB Project for the period 2014–2018, four breeding blanket options are being investigated for use in the fusion power demonstration plant DEMO. CIEMAT is leading the development of the conceptual design of the Dual Coolant Lithium Lead (DCLL) breeding blanket. The primary roles of the blanket are energy extraction, tritium production, and radiation shielding. To these ends the DCLL uses LiPb as primary coolant, tritium breeder and neutron multiplier, and Eurofer as structural material. A preliminary blanket model has been designed with the achievement of the fundamental neutronic responses as its focus. Detailed 3D neutronic models of the whole blanket modules have been generated, arranged in a specific DCLL segmentation and integrated in the generic DEMO model. The initial design has been studied to demonstrate its viability: the neutronic behaviour of the blanket and shield systems in terms of tritium breeding capability, power generation and shielding efficiency is assessed in this paper. The results demonstrate that the primary nuclear performances are already satisfactory at this preliminary stage of the design, with tritium self-sufficiency and adequate shielding achieved.

  5. Preliminary sensitivity analyses of corrosion models for BWIP [Basalt Waste Isolation Project] container materials

    International Nuclear Information System (INIS)

    Anantatmula, R.P.

    1984-01-01

    A preliminary sensitivity analysis was performed for the corrosion models developed for Basalt Waste Isolation Project container materials. The models describe corrosion behavior of the candidate container materials (low carbon steel and Fe9Cr1Mo), in various environments that are expected in the vicinity of the waste package, by separate equations. The present sensitivity analysis yields an uncertainty in total uniform corrosion on the basis of assumed uncertainties in the parameters comprising the corrosion equations. Based on the sample scenario and the preliminary corrosion models, the uncertainty in total uniform corrosion of low carbon steel and Fe9Cr1Mo for the 1000 yr containment period are 20% and 15%, respectively. For containment periods ≥ 1000 yr, the uncertainty in corrosion during the post-closure aqueous periods controls the uncertainty in total uniform corrosion for both low carbon steel and Fe9Cr1Mo. The key parameters controlling the corrosion behavior of candidate container materials are temperature, radiation, groundwater species, etc. Tests are planned in the Basalt Waste Isolation Project containment materials test program to determine in detail the sensitivity of corrosion to these parameters. We also plan to expand the sensitivity analysis to include sensitivity coefficients and other parameters in future studies. 6 refs., 3 figs., 9 tabs

  6. Notices about using elementary statistics in psychology

    OpenAIRE

    松田, 文子; 三宅, 幹子; 橋本, 優花里; 山崎, 理央; 森田, 愛子; 小嶋, 佳子

    2003-01-01

    Improper uses of elementary statistics that were often observed in beginners' manuscripts and papers were collected, and better practices were suggested. This paper consists of three parts: descriptive statistics, multivariate analyses, and statistical tests.

  7. Statistical studies on quasars and active nuclei of galaxies

    International Nuclear Information System (INIS)

    Stasinska, G.

    1987-01-01

    A catalogue of optical, radio and X-ray properties of quasars and other active galactic nuclei, currently in preparation, is presented. This catalogue may serve as a database for statistical studies. As an example, we give some preliminary results concerning the determination of quasar masses.

  8. Preliminary study to characterize plastic polymers using elemental analyser/isotope ratio mass spectrometry (EA/IRMS).

    Science.gov (United States)

    Berto, Daniela; Rampazzo, Federico; Gion, Claudia; Noventa, Seta; Ronchi, Francesca; Traldi, Umberto; Giorgi, Giordano; Cicero, Anna Maria; Giovanardi, Otello

    2017-06-01

    Plastic waste is a growing global environmental problem, particularly in marine ecosystems, given its persistence. Monitoring of plastic waste has become a global issue, as reported in several surveillance guidelines proposed by Regional Sea Conventions (OSPAR, UNEP) and appointed by the EU Marine Strategy Framework Directive. Policy responses to plastic waste vary at many levels, ranging from beach clean-ups to bans on the commercialization of plastic bags and to Regional Plans for waste management and recycling. Moreover, in recent years, the production of plant-derived biodegradable plastic polymers has assumed increasing importance. This study reports the first preliminary characterization of carbon stable isotopes (δ¹³C) of different plastic polymers (petroleum- and plant-derived) in order to enlarge the dataset of isotopic values as a tool for further investigation in different fields of polymer research as well as in marine environment surveillance. The δ¹³C values determined in different packaging for food uses reflect the plant origin of "BIO" materials, whereas recycled plastic materials displayed δ¹³C signatures between those of plant- and petroleum-derived polymers. In a preliminary estimation, the different colours of plastic did not affect the variability of δ¹³C values, whereas plastic materials collected on beaches and in seawater, which had undergone abiotic and biotic degradation, showed less negative δ¹³C values. A preliminary experimental field test confirmed these results. The advantages offered by isotope ratio mass spectrometry with respect to other analytical methods used to characterize the composition of plastic polymers are: high sensitivity, the small amount of material required, rapidity of analysis, low cost, and no limitation on black/dark samples compared with spectroscopic analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
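The δ¹³C values reported here are conventional per-mil deviations of the ¹³C/¹²C ratio from a reference standard. The conversion is a one-liner; the default reference ratio below is the commonly cited VPDB value, an assumption on my part rather than a number from the paper:

```python
def delta13C(r_sample, r_standard=0.0111802):
    """delta-13C in per mil: relative deviation of the sample's
    13C/12C ratio from the reference (VPDB) ratio, times 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

For example, a sample ratio 2.5% below the standard corresponds to δ¹³C = -25 per mil, in the range often reported for petroleum-derived polymers, while less negative values are characteristic of the degraded beach samples the study describes.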

  9. Integrating Statistical Visualization Research into the Political Science Classroom

    Science.gov (United States)

    Draper, Geoffrey M.; Liu, Baodong; Riesenfeld, Richard F.

    2011-01-01

    The use of computer software to facilitate learning in political science courses is well established. However, the statistical software packages used in many political science courses can be difficult to use and counter-intuitive. We describe the results of a preliminary user study suggesting that visually-oriented analysis software can help…

  10. The mobility of Atlantic baric depressions leading to intense precipitation over Italy: a preliminary statistical analysis

    Directory of Open Access Journals (Sweden)

    N. Tartaglione

    2006-01-01

    Full Text Available The speed of Atlantic surface depressions that occurred during the autumn and winter seasons and led to intense precipitation over Italy from 1951 to 2000 was investigated. Italy was divided into 5 regions, as documented in previous climatological studies (based on Principal Component Analysis). Intense precipitation events were selected on the basis of in situ rain gauge data and clustered according to the region that they hit. For each intense precipitation event we tried to identify an associated surface depression and we tracked it, within a large domain covering the Mediterranean and Atlantic regions, from its formation to cyclolysis in order to estimate its speed. Depression speeds were estimated with 6-h resolution and clustered into slow and non-slow classes by means of a threshold coinciding with the first quartile of the speed distribution; depression centre speeds were associated with their positions. Slow speeds occurring over an area including Italy and the western Mediterranean basin showed frequencies higher than 25% for all the Italian regions but one. The probability of obtaining the observed success rate of more than 25% by chance was estimated by means of a binomial distribution. The statistical reliability of the result is confirmed for only one region; for Italy as a whole, results were confirmed at the 95% confidence level. The stability of the statistical inference with respect to errors in estimating depression speed and changes in the threshold defining slow depressions was analysed and essentially confirmed the previous results.
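The by-chance probability the abstract computes via a binomial distribution can be sketched directly: with the slow class defined by the first quartile, each speed estimate is "slow" with p = 0.25 under the null hypothesis, and the chance of seeing at least k slow cases out of n by luck alone is the binomial upper tail:

```python
from math import comb

def binom_tail(k, n, p=0.25):
    """P(X >= k) for X ~ Binomial(n, p): the probability of at least
    k slow-depression cases by chance when the slow threshold is the
    first quartile of the speed distribution (p = 0.25)."""
    return sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i)
               for i in range(k, n + 1))
```

A small tail probability means the observed excess of slow depressions over a region is unlikely to be a sampling artefact, which is the logic behind the paper's 95% confidence statement. The sample sizes here are invented for illustration.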

  11. Structural brain alterations of Down's syndrome in early childhood evaluation by DTI and volumetric analyses

    Energy Technology Data Exchange (ETDEWEB)

    Gunbey, Hediye Pinar; Bilgici, Meltem Ceyhan; Aslan, Kerim; Incesu, Lutfi [Ondokuz Mayis University, Faculty of Medicine, Department of Radiology, Kurupelit, Samsun (Turkey); Has, Arzu Ceylan [Bilkent University, National Magnetic Resonance Research Center, Ankara (Turkey); Ogur, Methiye Gonul [Ondokuz Mayis University, Department of Genetics, Samsun (Turkey); Alhan, Aslihan [Ufuk University, Department of Statistics, Ankara (Turkey)

    2017-07-15

    To provide an initial assessment of white matter (WM) integrity with diffusion tensor imaging (DTI) and the accompanying volumetric changes in WM and grey matter (GM) through volumetric analyses of young children with Down's syndrome (DS). Ten children with DS and eight healthy control subjects were included in the study. Tract-based spatial statistics (TBSS) were used in the DTI study for whole-brain voxelwise analysis of fractional anisotropy (FA) and mean diffusivity (MD) of WM. Volumetric analyses were performed with an automated segmentation method to obtain regional measurements of cortical volumes. Children with DS showed significantly reduced FA in association tracts of the fronto-temporo-occipital regions as well as the corpus callosum (CC) and anterior limb of the internal capsule (p < 0.05). Volumetric reductions included total cortical GM, cerebellar GM and WM volume, basal ganglia, thalamus, brainstem and CC in DS compared with controls (p < 0.05). These preliminary results suggest that DTI and volumetric analyses may reflect the earliest complementary changes of the neurodevelopmental delay in children with DS and can serve as surrogate biomarkers of the specific elements of WM and GM integrity for cognitive development. (orig.)

  12. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
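The familiar between-groups d that the authors' single-case statistic is calibrated to match is the mean difference divided by the pooled standard deviation. The sketch below is the standard textbook formula, not the authors' single-case estimator (which requires multilevel machinery):

```python
import math

def cohens_d(group1, group2):
    """Usual between-groups standardised mean difference:
    d = (m1 - m2) / pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    s1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    s2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    return (m1 - m2) / sp
```

Because the single-case d is on this same scale, effects from single-case designs can be pooled in a meta-analysis alongside between-groups experiments, which is the point the abstract emphasises.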

  13. Preliminary Test for Nonlinear Input Output Relations in SISO Systems

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2000-01-01

    This paper discusses and develops preliminary statistical tests for detecting nonlinearities in the deterministic part of SISO systems with noise. The most referenced method is unreliable for common noise processes, e.g. colored noise. Therefore two new methods based on superposition and sinus input...

  14. Modern applied statistics with s-plus

    CERN Document Server

    Venables, W N

    1997-01-01

    S-PLUS is a powerful environment for the statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-PLUS to perform statistical analyses and provides both an introduction to the use of S-PLUS and a course in modern statistical methods. S-PLUS is available for both Windows and UNIX workstations, and both versions are covered in depth. The aim of the book is to show how to use S-PLUS as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-PLUS, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets. Many of the methods discussed are state-of-the-art approaches to topics such as linear and non-linear regression models, robust a...

  15. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    Science.gov (United States)

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  16. Testing statistical isotropy in cosmic microwave background polarization maps

    Science.gov (United States)

    Rath, Pranati K.; Samal, Pramoda Kumar; Panda, Srikanta; Mishra, Debesh D.; Aluri, Pavan K.

    2018-04-01

    We apply our symmetry-based Power tensor technique to test the conformity of PLANCK polarization maps with statistical isotropy. On a wide range of angular scales (l = 40 - 150), our preliminary analysis detects many statistically anisotropic multipoles in the foreground-cleaned full-sky PLANCK polarization maps, viz. COMMANDER and NILC. We also study the effect of residual foregrounds that may still be present in the Galactic plane, using both the common UPB77 polarization mask and the polarization masks specific to each component separation method. However, some of the statistically anisotropic modes still persist, significantly so in the NILC map. We further probed the data for any coherent alignments across multipoles in several bins from the chosen multipole range.

  17. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. It discusses the methodology for collecting the data, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.

  18. Authigenic oxide Neodymium Isotopic composition as a proxy of seawater: applying multivariate statistical analyses.

    Science.gov (United States)

    McKinley, C. C.; Scudder, R.; Thomas, D. J.

    2016-12-01

    The Neodymium isotopic composition (Nd IC) of oxide coatings has been applied as a tracer of water mass composition and used to address fundamental questions about past ocean conditions. The leached authigenic oxide coating from marine sediment is widely assumed to reflect the dissolved trace metal composition of the bottom water interacting with sediment at the seafloor. However, recent studies have shown that readily reducible sediment components, in addition to trace metal fluxes from the pore water, are incorporated into the bottom water, influencing the trace metal composition of leached oxide coatings. This challenges the prevailing application of the authigenic oxide Nd IC as a proxy of seawater composition. Therefore, it is important to identify the component end-members that create sediments of different lithology and determine if, or how, they might contribute to the Nd IC of oxide coatings. To investigate the lithologic influence on the results of sequential leaching, we selected two sites with complete bulk sediment statistical characterization. Site U1370 in the South Pacific Gyre is predominantly composed of rhyolite (~60%) and has a distinguishable (~10%) Fe-Mn oxyhydroxide component (Dunlea et al., 2015). Site 1149 near the Izu-Bonin Arc is predominantly composed of dispersed ash (~20-50%) and eolian dust from Asia (~50-80%) (Scudder et al., 2014). We perform a two-step leaching procedure: 14 mL of 0.02 M hydroxylamine hydrochloride (HH) in 20% acetic acid buffered to pH 4 for one hour, targeting metals bound to the Fe- and Mn-oxide fractions, and a second HH leach for 12 hours, designed to remove any remaining oxides from the residual component. We analyze all three resulting fractions for a large suite of major, trace and rare earth elements; a subset of the samples are also analyzed for Nd IC. We use multivariate statistical analyses of the resulting geochemical data to identify how each component of the sediment partitions across the sequential

  19. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  20. 47 CFR 1.363 - Introduction of statistical data.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall be...

  1. WATER POLO GAME-RELATED STATISTICS IN WOMEN'S INTERNATIONAL CHAMPIONSHIPS: DIFFERENCES AND DISCRIMINATORY POWER

    Directory of Open Access Journals (Sweden)

    Yolanda Escalante

    2012-09-01

    Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances in each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared statistic. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.
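The chi-squared comparison of winning and losing teams can be sketched for any cross-tabulation of game actions by match outcome. The formula is the standard Pearson statistic; the example counts in the test are invented for illustration:

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table,
    e.g. action counts (rows) cross-tabulated by match outcome
    (columns: winning vs losing teams)."""
    rows = [sum(r) for r in table]          # row totals
    cols = [sum(c) for c in zip(*table)]    # column totals
    total = sum(rows)
    return sum((table[i][j] - rows[i] * cols[j] / total) ** 2
               / (rows[i] * cols[j] / total)
               for i in range(len(rows)) for j in range(len(cols)))
```

A statistic near zero means winners and losers produce the action in similar proportions; large values flag the differentiating variables (centre goals, goalkeeper-blocked shots, etc.) the study reports.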

  2. The Use of Statistical Process Control Tools for Analysing Financial Statements

    Directory of Open Access Journals (Sweden)

    Niezgoda Janusz

    2017-06-01

    Full Text Available This article presents a proposed application of a modified Shewhart control chart to the monitoring of changes in the aggregated level of financial ratios. The x̅ control chart has been used as the basis of the analysis; the variable it normally plots is the sample arithmetic mean. The author proposes to substitute it with a synthetic measure determined from the selected ratios. As these ratios are expressed in different units and scales, the author applies standardisation. The results of selected comparative analyses are presented for both bankrupt and non-bankrupt firms. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
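A minimal individuals-style version of such a chart, plotting a synthetic measure and flagging points outside three-sigma limits, might look like the sketch below. This is a generic Shewhart sketch, not the author's exact chart constants:

```python
import statistics

def control_limits(values):
    """Three-sigma Shewhart-style limits for a plotted statistic
    (e.g. a standardised synthetic measure of financial ratios);
    returns (LCL, UCL, indices of out-of-control points)."""
    center = statistics.fmean(values)
    s = statistics.stdev(values)
    lcl, ucl = center - 3.0 * s, center + 3.0 * s
    flags = [i for i, v in enumerate(values) if not lcl <= v <= ucl]
    return lcl, ucl, flags
```

A period in which the synthetic measure breaches a limit would be the chart's signal that the firm's aggregated financial condition has shifted, the role the article assigns to the chart in bankruptcy monitoring.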

  3. Inferring the origin of rare fruit distillates from compositional data using multivariate statistical analyses and the identification of new flavour constituents.

    Science.gov (United States)

    Mihajilov-Krstev, Tatjana M; Denić, Marija S; Zlatković, Bojan K; Stankov-Jovanović, Vesna P; Mitić, Violeta D; Stojanović, Gordana S; Radulović, Niko S

    2015-04-01

    In Serbia, delicatessen fruit alcoholic drinks are produced from autochthonous fruit-bearing species such as cornelian cherry, blackberry, elderberry, wild strawberry, European wild apple, European blueberry and blackthorn fruits. There are no chemical data on many of these and herein we analysed volatile minor constituents of these rare fruit distillates. Our second goal was to determine possible chemical markers of these distillates through a statistical/multivariate treatment of the herein obtained and previously reported data. Detailed chemical analyses revealed a complex volatile profile of all studied fruit distillates with 371 identified compounds. A number of constituents were recognised as marker compounds for a particular distillate. Moreover, 33 of them represent newly detected flavour constituents in alcoholic beverages or, in general, in foodstuffs. With the aid of multivariate analyses, these volatile profiles were successfully exploited to infer the origin of raw materials used in the production of these spirits. It was also shown that all fruit distillates possessed weak antimicrobial properties. It seems that the aroma of these highly esteemed wild-fruit spirits depends on the subtle balance of various minor volatile compounds, whereby some of them are specific to a certain type of fruit distillate and enable their mutual distinction. © 2014 Society of Chemical Industry.

  4. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

    This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat

  5. Sewage Solids Irradiator Transportation System (SSITS) cask: preliminary design description

    International Nuclear Information System (INIS)

    Eakes, R.G.; Kempka, S.N.; Lamoreaux, G.H.; Sutherland, S.H.

    1983-02-01

    The preliminary design of the Sewage Solids Irradiator Transportation System (SSITS) Cask is presented in this document. The SSITS cask is to be used for the transport of radioactive cesium chloride and strontium fluoride capsules which are of use in irradiators or as heat sources. The SSITS cask is approximately 1.4 m in diameter, 1.3 m high, weighs roughly 9 t, provides 33 cm of steel shielding, and can dissipate up to 5.2 kW of decay heat. The cask design criteria are identified and a description of the cask design and operation is provided. Detailed analyses of the design were performed to demonstrate licensability of the cask by the Nuclear Regulatory Commission (NRC). Results of the analyses indicate that the preliminary design is in compliance with the pertinent regulatory requirements for licensing of a radioactive material transportation container

  6. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  7. Practical Recommendations for the Preliminary Design Analysis of ...

    African Journals Online (AJOL)

    Interior-to-exterior shear ratios for equal and unequal bay frames, as well as column inflection points were obtained to serve as practical aids for preliminary analysis/design of fixed-feet multistory sway frames. Equal and unequal bay five story frames were analysed to show the validity of the recommended design ...

  8. First-Generation Transgenic Plants and Statistics

    NARCIS (Netherlands)

    Nap, Jan-Peter; Keizer, Paul; Jansen, Ritsert

    1993-01-01

    The statistical analyses of populations of first-generation transgenic plants are commonly based on mean and variance and generally require a test of normality. Since in many cases the assumptions of normality are not met, analyses can result in erroneous conclusions. Transformation of data to

  9. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  10. Consumer Loyalty and Loyalty Programs: a topographic examination of the scientific literature using bibliometrics, spatial statistics and network analyses

    Directory of Open Access Journals (Sweden)

    Viviane Moura Rocha

    2015-04-01

    Full Text Available This paper presents a topographic analysis of the fields of consumer loyalty and loyalty programs, vastly studied in recent decades and still relevant in the marketing literature. After the identification of 250 scientific papers published in the last ten years in indexed journals, a subset of 76 was chosen and their 3223 references were extracted. The journals in which these papers were published, their keywords, abstracts, authors, institutions of origin and citation patterns were identified and analyzed using bibliometrics, spatial statistics techniques and network analyses. The results allow the identification of the central components of the field, as well as its main authors, journals, institutions and countries that mediate the diffusion of knowledge, which contributes to the understanding of the constitution of the field by researchers and students.

  11. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given the many power projects that have been or are being privately developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors: product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
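The levelized-price construction used to put multi-year contracts on a common $/kWh footing is the ratio of discounted payments to discounted energy deliveries. A minimal sketch, with the discount rate and cash flows invented for illustration:

```python
def levelized_price(payments, energies, discount_rate):
    """Levelized contract price in $/kWh: present value of contract
    payments divided by present value of energy deliveries.
    Year 0 is taken as undiscounted by convention."""
    pv_pay = sum(p / (1.0 + discount_rate) ** t
                 for t, p in enumerate(payments))
    pv_kwh = sum(e / (1.0 + discount_rate) ** t
                 for t, e in enumerate(energies))
    return pv_pay / pv_kwh
```

A contract that charges a constant $0.069/kWh levelizes to exactly that price at any discount rate, while front- or back-loaded payment schedules levelize to different values, which is why the paper needs a consistent method before comparing the 26 projects.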

  12. Multivariate statistical methods a first course

    CERN Document Server

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

  13. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    Science.gov (United States)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  14. Surface Properties of TNOs: Preliminary Statistical Analysis

    Science.gov (United States)

    Antonieta Barucci, Maria; Fornasier, S.; Alvarez-Cantal, A.; de Bergh, C.; Merlin, F.; DeMeo, F.; Dumas, C.

    2009-09-01

    An overview of the surface properties based on the last results obtained during the Large Program performed at ESO-VLT (2007-2008) will be presented. Simultaneous high quality visible and near-infrared spectroscopy and photometry have been carried out on 40 objects with various dynamical properties, using FORS1 (V), ISAAC (J) and SINFONI (H+K bands) mounted respectively at UT2, UT1 and UT4 VLT-ESO telescopes (Cerro Paranal, Chile). For spectroscopy we computed the spectral slope for each object and searched for possible rotational inhomogeneities. A few objects show features in their visible spectra such as Eris, whose spectral bands are displaced with respect to pure methane-ice. We identify new faint absorption features on 10199 Chariklo and 42355 Typhon, possibly due to the presence of aqueous altered materials. The H+K band spectroscopy was performed with the new instrument SINFONI which is a 3D integral field spectrometer. While some objects show no diagnostic spectral bands, others reveal surface deposits of ices of H2O, CH3OH, CH4, and N2. To investigate the surface properties of these bodies, a radiative transfer model has been applied to interpret the entire 0.4-2.4 micron spectral region. The diversity of the spectra suggests that these objects represent a substantial range of bulk compositions. These different surface compositions can be diagnostic of original compositional diversity, interior source and/or different evolution with different physical processes affecting the surfaces. A statistical analysis is in progress to investigate the correlation of the TNOs’ surface properties with size and dynamical properties.

  15. Statistical criteria for characterizing irradiance time series.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
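
    The report's specific criteria are not reproduced in this record, but the underlying idea can be loosely illustrated: two irradiance series can share a mean yet differ sharply in step-to-step variability, and a useful comparison criterion must separate them. The Python sketch below uses invented values and hypothetical function names.

```python
# Illustrative sketch only: the report's actual criteria are not given in
# this record. Two synthetic irradiance series (W/m^2) with the same mean
# but different step-to-step variability; a good comparison criterion
# should distinguish them.

def mean_irradiance(series):
    """Average irradiance over the series."""
    return sum(series) / len(series)

def ramp_magnitudes(series):
    """Absolute one-step changes -- a crude variability measure."""
    return [abs(b - a) for a, b in zip(series, series[1:])]

clear = [100, 200, 300, 400, 500]    # smooth ramp-up, e.g. a clear morning
cloudy = [100, 450, 150, 500, 300]   # same mean, strong fluctuations

print(mean_irradiance(clear), mean_irradiance(cloudy))
print(max(ramp_magnitudes(clear)), max(ramp_magnitudes(cloudy)))
```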

  16. Statistical analyses of the color experience according to the age of the observer.

    Science.gov (United States)

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

    The psychological experience of color is an actual state of communication between the environment and the observer, and it depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory of opponent processes supposes that the cones situated in the retina of the eye are not sensitive to three chromatic domains (red, green and purple-blue) but produce a signal based on the principle of opposed pairs of colors. Support for this theory comes from the fact that certain color-vision disorders, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper examines the experience of blue and yellow tones according to the age of the observer. To test for statistically significant differences in color experience according to the background color, the following statistical tests were used: the Mann-Whitney U test, Kruskal-Wallis ANOVA and the median test. The differences were shown to be statistically significant for older observers (older than 35 years).
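
    As a minimal illustration of the first test named above (not the study's data or code), the following Python sketch computes the Mann-Whitney U statistic by direct pair counting for two invented groups of ordinal scores.

```python
# Illustrative only -- not the study's data or code. The Mann-Whitney U
# statistic compares two independent samples without assuming normality,
# which suits ordinal ratings. Scores below are hypothetical.

def mann_whitney_u(x, y):
    """U statistic for x versus y by direct pair counting (ties count 0.5)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

younger = [7, 8, 6, 9, 7]   # hypothetical scores, observers under 35
older = [5, 6, 4, 6, 5]     # hypothetical scores, observers over 35

# U ranges from 0 to len(x) * len(y); values near either extreme suggest
# a systematic difference between the groups.
print(mann_whitney_u(younger, older))
```

    In practice a library routine such as scipy.stats.mannwhitneyu would be used, since it also supplies the p-value.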

  17. Computer simulation of HTGR fuel microspheres using a Monte-Carlo statistical approach

    International Nuclear Information System (INIS)

    Hedrick, C.E.

    1976-01-01

    The concept and computational aspects of a Monte-Carlo statistical approach in relating structure of HTGR fuel microspheres to the uranium content of fuel samples have been verified. Results of the preliminary validation tests and the benefits to be derived from the program are summarized

  18. A Statistical Model for Seasonal Rainfall Forecasting over the ...

    African Journals Online (AJOL)

    In a preliminary step, in order to identify the most influential rainfall predictor, a correlation matrix and step-wise regression of 10 predictors with different lags were analysed. The influence of the southern Indian Ocean Sea Surface Temperature was identified as the most influential predictor for the highland of Eritrea.

  19. A preliminary study on the relevancy of sustainable building design ...

    African Journals Online (AJOL)

    This preliminary study aims to explore the relationship between sustainable building design paradigms and commercial property depreciation, to assist in understanding the impact of sustainable building design on commercial building value and rental depreciation. The study employs the qualitative method and analyses valuers' current ...

  20. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  1. Practical Statistics for Environmental and Biological Scientists

    CERN Document Server

    Townend, John

    2012-01-01

    All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific comp

  2. Preliminary analysis of a 1:4 scale prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Luk, V.K.; Hessheimer, M.F.

    1997-01-01

    Sandia National Laboratories is conducting a research program to investigate the integrity of nuclear containment structures. As part of the program Sandia will construct an instrumented 1:4 scale model of a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR), which will be pressure tested up to its ultimate capacity. One of the key program objectives is to develop validated methods to predict the structural performance of containment vessels when subjected to beyond design basis loadings. Analytical prediction of structural performance requires a stepwise, systematic approach that addresses all potential failure modes. The analysis effort includes two and three-dimensional nonlinear finite element analyses of the PCCV test model to evaluate its structural performance under very high internal pressurization. Such analyses have been performed using the nonlinear concrete constitutive model, ANACAP-U, in conjunction with the ABAQUS general purpose finite element code. The analysis effort is carried out in three phases: preliminary analysis; pretest prediction; and post-test data interpretation and analysis evaluation. The preliminary analysis phase serves to provide instrumentation support and identify candidate failure modes. The associated tasks include the preliminary prediction of failure pressure and probable failure locations and the development of models to be used in the detailed failure analyses. This paper describes the modeling approaches and some of the results obtained in the first phase of the analysis effort

  3. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  4. Licensing Support System: Preliminary data scope analysis

    International Nuclear Information System (INIS)

    1989-01-01

    The purpose of this analysis is to determine the content and scope of the Licensing Support System (LSS) data base. Both user needs and currently available data bases that, at least in part, address those needs have been analyzed. This analysis, together with the Preliminary Needs Analysis (DOE, 1988d) is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. These reports are preliminary. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This document provides a baseline for what is known at this time. Additional analyses, currently being conducted, will provide more precise information on the content and scope of the LSS data base. 23 refs., 4 figs., 8 tabs

  5. Preliminary results of an oilspill risk analysis for the Bombay High Region

    Digital Repository Service at National Institute of Oceanography (India)

    Mascarenhas, A.A.M.Q.; Gouveia, A.D.; Sitaraman, R.

    oil were analysed, covering estimates of the time between spill occurrence and contact with resources. The combined results yielded estimates of the overall risks associated with production within the developmental area. Preliminary results...

  6. EXPLOSION POTENTIAL ASSESSMENT OF HEAT EXCHANGER NETWORK AT THE PRELIMINARY DESIGN STAGE

    Directory of Open Access Journals (Sweden)

    MOHSIN PASHA

    2016-07-01

    The failure of Shell and Tube Heat Exchangers (STHE) is extensively observed in the chemical process industries. Such failures can cause enormous production loss and carry the potential for dangerous consequences such as explosion, fire and toxic release scenarios. There is therefore an urgent need to assess the explosion potential of shell and tube heat exchangers at the preliminary design stage. In the current work, an inherent-safety-index-based approach is used to address this issue. The Inherent Safety Index for Shell and Tube Heat Exchangers (ISISTHE) is a newly developed index for assessing the inherent safety level of an STHE at the preliminary design stage. The index is composed of preliminary design variables and is integrated with the process design simulator (Aspen HYSYS); owing to this integration, process information can easily be transferred from the simulator to an MS Excel spreadsheet. The index could potentially help the design engineer identify the worst heat exchanger in a heat exchanger network. A typical heat exchanger network of the steam reforming process is presented as a case study, and the worst heat exchanger of this network has been identified. The analysis shows that shell and tube heat exchangers with high operating pressure, corrected mean temperature difference (CMTD), and flammability and reactivity potential need to be critically analysed at the preliminary design stage.

  7. Psychophysiological deficits in young adolescents with psychosis or ADHD: Preliminary findings

    DEFF Research Database (Denmark)

    Rydkjær, Jacob; Jepsen, Jens Richardt Møllegaard; Fagerlund, Birgitte

    add valuable information on how to differentiate premature stages of early onset psychosis from ADHD. Aim: To characterize psychophysiological deficits in young adolescents with psychosis or ADHD and compare the profiles of impairments between the two groups. Materials and methods: A cohort of young...... and low intensity prepulse trials, Mismatch Negativity (MMN), Selective Attention (SA) and P50. Results: Preliminary analyses of 18 patients with psychosis and 12 patients with ADHD showed significantly less PPI in the higher intensity prepulse trials in the psychosis group than in the ADHD group....... No significant group difference was found in the lower intensity prepulse trials. Conclusion: The preliminary results indicate lower levels of PPI in adolescents with early onset psychosis than in young patients with ADHD. If these results hold in the final analyses then this knowledge may contribute to better...

  8. APPLYING SPECTROSCOPIC METHODS ON ANALYSES OF HAZARDOUS WASTE

    OpenAIRE

    Dobrinić, Julijan; Kunić, Marija; Ciganj, Zlatko

    2000-01-01

    The paper presents results of measuring the content of heavy and other metals in waste samples from the hazardous waste disposal site of Sovjak near Rijeka. The preliminary design elaboration and the choice of the waste disposal sanification technology were preceded by the sampling and physico-chemical analyses of disposed waste, enabling its categorization. The following spectroscopic methods were applied to metal content analysis: Atomic absorption spectroscopy (AAS) and plas...

  9. Statistical methods and applications from a historical perspective selected issues

    CERN Document Server

    Mignani, Stefania

    2014-01-01

    The book showcases a selection of peer-reviewed papers, the preliminary versions of which were presented at a conference held 11-13 June 2011 in Bologna and organized jointly by the Italian Statistical Society (SIS), the National Institute of Statistics (ISTAT) and the Bank of Italy. The theme of the conference was "Statistics in the 150 years of the Unification of Italy." The celebration of the anniversary of Italian unification provided the opportunity to examine and discuss the methodological aspects and applications from a historical perspective and both from a national and international point of view. The critical discussion on the issues of the past has made it possible to focus on recent advances, considering the studies of socio-economic and demographic changes in European countries.

  10. The semi-empirical low-level background statistics

    International Nuclear Information System (INIS)

    Tran Manh Toan; Nguyen Trieu Tu

    1992-01-01

    A semi-empirical statistic for low-level background is proposed. It can be applied to evaluate the sensitivity of low-background systems and to analyse the statistical error and the 'rejection' and 'accordance' criteria used in processing low-level experimental data. (author). 5 refs, 1 figs

  11. Statistical Analysis of Deccan Basalt Geochemistry: An Updated Look at Deccan Chemostratigraphy

    Science.gov (United States)

    Vanderkluysen, L.; Barber, N.; Woloszynek, S.; O'Connor, M. P.; Mittal, T.; Sealing, C. R.; Sprain, C. J.; Renne, P. R.

    2017-12-01

    The Deccan Traps are a continental Large Igneous Province covering large swaths of west-central India, with onshore erupted lava volumes that may have exceeded one million cubic kilometers. Although the total duration of magmatism is a matter of debate, recent geochronological work has demonstrated that the vast majority of volcanism occurred in a short (architecture, temporal evolution, and feeder system. However, the usefulness of the chemostratigraphy has been put into doubt when expanding it beyond the type sections of the Western Ghats, and the validity of interpreting units as true chronological markers has been questioned. The original statistical analysis focused on elements readily available via X-ray fluorescence: SiO2, Al2O3, TiO2, CaO, K2O, P2O5, Ni, Ba, Sr, Zr, and Nb. However, issues caused by variable degrees of alteration and, particularly, fractional crystallization, have not been addressed, which has limited the predictive power of the geochemical clusters as currently defined. Here, we propose a modernization of the chemostratigraphic scheme that takes into account a much greater suite of elements now commonly analyzed, thanks to advances in analytical capabilities. We present preliminary results of statistical analyses of an updated Deccan sample database, discussing random forests and classification and regression trees as the basis for a more robust chemostratigraphy of Deccan lavas.
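
    As a toy illustration of the tree-based approach mentioned above (not the authors' analysis), the Python sketch below finds the single-element threshold that best separates two hypothetical formations by Gini impurity, the split criterion underlying CART and random forests. All compositions and labels are invented.

```python
# Toy sketch of the tree-based idea (not the authors' analysis): choose the
# single-element threshold minimizing Gini impurity -- the split criterion
# underlying CART and random forests. Data are invented; labels are 0/1
# formation codes.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2.0 * p * (1.0 - p)

def best_split(values, labels):
    """Return (threshold, weighted impurity) of the best split v <= t."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

tio2 = [1.8, 2.0, 1.9, 2.9, 3.1, 3.0]   # hypothetical TiO2 wt%
formation = [0, 0, 0, 1, 1, 1]          # hypothetical formation labels

threshold, impurity = best_split(tio2, formation)
print(threshold, impurity)  # a perfect split has impurity 0.0
```

    A random forest repeats this kind of split search over many bootstrapped samples and random element subsets, which is what makes it robust to the alteration and fractionation effects discussed above.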

  12. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....

  13. Using R-Project for Free Statistical Analysis in Extension Research

    Science.gov (United States)

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  14. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found.
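
    A correlation test of the kind described can be sketched in a few lines of Python; the precipitation and capture counts below are invented for illustration and are not the WIPP data.

```python
# Illustrative sketch of the kind of correlation test described; the
# precipitation and capture numbers are invented, not the WIPP data.

import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

precip_mm = [5, 12, 3, 20, 8]       # hypothetical monthly precipitation
captures = [14, 22, 10, 35, 18]     # hypothetical small-mammal captures

print(round(pearson_r(precip_mm, captures), 3))
```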

  15. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
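
    The Monte Carlo idea behind such uncertainty analyses can be caricatured in a few lines: sample the uncertain inputs from assigned ranges, propagate each sample through the model, and inspect the spread of the output. The stand-in model, parameter names, and ranges below are invented and bear no relation to the actual WIPP codes.

```python
# Caricature of the Monte Carlo approach, not the WIPP codes: sample
# uncertain inputs from assigned ranges, propagate through a stand-in
# model, and inspect the output spread. Parameter names and ranges are
# invented.

import random

random.seed(1)  # reproducible sampling

def brine_flow(permeability, saturation):
    """Stand-in model: flow grows with permeability and saturation."""
    return 1e6 * permeability * saturation

samples = []
for _ in range(1000):
    k = random.uniform(1e-20, 1e-18)   # hypothetical anhydrite permeability
    s = random.uniform(0.0, 0.3)       # hypothetical initial liquid saturation
    samples.append(brine_flow(k, s))

samples.sort()
print(samples[50], samples[950])  # rough 5th and 95th percentiles
```

    Sensitivity analysis then asks which sampled input best explains the variation in the output, e.g. via rank correlations between the input and output samples.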

  16. Preliminary Seismic Response and Fragility Analysis for DACS Cabinet

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jinho; Kwag, Shinyoung; Lee, Jongmin; Kim, Youngki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    A DACS cabinet is installed in the main control room. The objective of this paper is to perform seismic analyses and evaluate the preliminary structural integrity and seismic capacity of the DACS cabinet. For this purpose, a 3-D finite element model of the DACS cabinet was developed and modal analyses were carried out to analyze its dynamic characteristics. Response spectrum analyses and the related safety evaluation were then performed for the DACS cabinet subject to seismic loads. Finally, the seismic margin and seismic fragility of the DACS cabinet were investigated. The seismic response and preliminary structural integrity of the DACS cabinet under self-weight and SSE loads were evaluated using a modal analysis, a response spectrum analysis, and a seismic fragility analysis. From the structural analysis results, the DACS cabinet is below the structural design limit under SSE 0.3g and can structurally withstand loads up to nearly SSE 3g, based on an evaluation of the maximum effective stresses. The HCLPF capacity for the DGRS at SSE 0.3g is 0.55g. Therefore, it is concluded that the DACS cabinet is safely designed, with no damage to its structural integrity and sufficient seismic margin expected.

  17. Preliminary Seismic Response and Fragility Analysis for DACS Cabinet

    International Nuclear Information System (INIS)

    Oh, Jinho; Kwag, Shinyoung; Lee, Jongmin; Kim, Youngki

    2013-01-01

    A DACS cabinet is installed in the main control room. The objective of this paper is to perform seismic analyses and evaluate the preliminary structural integrity and seismic capacity of the DACS cabinet. For this purpose, a 3-D finite element model of the DACS cabinet was developed and modal analyses were carried out to analyze its dynamic characteristics. Response spectrum analyses and the related safety evaluation were then performed for the DACS cabinet subject to seismic loads. Finally, the seismic margin and seismic fragility of the DACS cabinet were investigated. The seismic response and preliminary structural integrity of the DACS cabinet under self-weight and SSE loads were evaluated using a modal analysis, a response spectrum analysis, and a seismic fragility analysis. From the structural analysis results, the DACS cabinet is below the structural design limit under SSE 0.3g and can structurally withstand loads up to nearly SSE 3g, based on an evaluation of the maximum effective stresses. The HCLPF capacity for the DGRS at SSE 0.3g is 0.55g. Therefore, it is concluded that the DACS cabinet is safely designed, with no damage to its structural integrity and sufficient seismic margin expected.

  18. Design and implementation of a modular program system for the carrying-through of statistical analyses

    International Nuclear Information System (INIS)

    Beck, W.

    1984-01-01

    The complexity of computer programs for the solution of scientific and technical problems raises many questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of the output data to the input data, and the substitution of complex models by simpler ones that provide equivalent results in certain ranges. These questions have a general practical meaning, and principled answers may be found by statistical methods based on the Monte Carlo method. In this report the statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR takes into account: users with different knowledge of data processing and statistics; the variety of statistical methods and of generating and evaluating procedures; the processing of large data sets in complex structures; the coupling to other components of RSYST and to programs outside RSYST; and the requirement that the system be easily modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de

  19. Boxing and mixed martial arts: preliminary traumatic neuromechanical injury risk analyses from laboratory impact dosage data.

    Science.gov (United States)

    Bartsch, Adam J; Benzel, Edward C; Miele, Vincent J; Morr, Douglas R; Prakash, Vikas

    2012-05-01

    In spite of ample literature pointing to rotational and combined impact dosage as key contributors to head and neck injury, boxing and mixed martial arts (MMA) padding is still designed primarily to reduce linear acceleration of the cranium. The objectives of this study were to quantify preliminary linear and rotational head impact dosage for selected boxing and MMA padding in response to hook punches; to compute theoretical skull, brain, and neck injury risk metrics; and to statistically compare the protective effect of various glove and head padding conditions. An instrumented Hybrid III 50th percentile anthropomorphic test device (ATD) was struck in 54 pendulum impacts replicating hook punches at low (27-29 J) and high (54-58 J) energy. Five padding combinations were examined: unpadded (control), MMA glove-unpadded head, boxing glove-unpadded head, unpadded pendulum-boxing headgear, and boxing glove-boxing headgear. A total of 17 injury risk parameters were measured or calculated. All padding conditions reduced linear impact dosage. Other parameters significantly decreased, significantly increased, or were unaffected depending on padding condition. Of the real-world conditions (MMA glove-bare head, boxing glove-bare head, and boxing glove-headgear), the boxing glove-headgear condition showed the most meaningful reduction in most of the parameters. In equivalent impacts, the MMA glove-bare head condition induced higher rotational dosage than the boxing glove-bare head condition. Finite element analysis indicated a risk of brain strain injury in spite of the significant reduction of linear impact dosage. In the replicated hook punch impacts, all padding conditions reduced linear but not rotational impact dosage. Head and neck dosage theoretically accumulates fastest in MMA and boxing bouts without protective headgear. The boxing glove-headgear condition provided the best overall reduction in impact dosage. More work is needed to develop improved protective padding to minimize

  20. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    International Nuclear Information System (INIS)

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-01-01

    The statistics of the measured magnetic data of 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive design for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section and the quadrupole and sextupole cross sections have 180° and 120° symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements

  1. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. The purpose of this article is to examine evidence regarding whether grey literature should be included in meta-analyses, and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
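
    As a worked illustration of why excluding grey literature can inflate pooled effects, the sketch below pools effect sizes by fixed-effect inverse-variance weighting, first for "published" studies only and then with two smaller "grey" studies added. All numbers are invented for illustration; they are not data from this article.

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, se

# Hypothetical effect sizes (d) and variances: three published, two grey studies
published = ([0.45, 0.50, 0.40], [0.010, 0.015, 0.012])
grey      = ([0.30, 0.25],       [0.030, 0.040])

est_pub, _ = pooled_effect(*published)
est_all, _ = pooled_effect(published[0] + grey[0], published[1] + grey[1])
print(f"published only: {est_pub:.3f}, with grey literature: {est_all:.3f}")
```

    With the smaller, null-leaning grey studies included, the pooled estimate drops, mirroring the inflation the article describes.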

  2. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    number of zeros. QQ plots of these data show a lack of normality after contamination. Normality is improved when looking at log(CFU/cm²). Variance component analysis (VCA) and analysis of variance (ANOVA) were used to estimate the amount of variance due to each source and to determine which sources of variability were statistically significant. In general, the sampling methods interacted with the across-event variability and with the across-room variability. For this reason, it was decided to carry out the analyses individually for each sampling method. The between-event variability and between-room variability were significant for each method, except for the between-event variability for the swabs. For both the wipes and vacuums, the within-room standard deviation was much larger (26.9 for wipes and 7.086 for vacuums) than the between-event standard deviation (6.552 for wipes and 1.348 for vacuums) and the between-room standard deviation (6.783 for wipes and 1.040 for vacuums). The swabs' between-room standard deviation was 0.151, while both the within-room and between-event standard deviations were less than 0.10 (all measurements in CFU/cm²).
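
    The variance-component logic described above can be sketched with a balanced one-way layout (rooms as the grouping factor). The log-scale recoveries below are invented for illustration and are not the study's data.

```python
import statistics

# Hypothetical log10(CFU/cm2) wipe recoveries: rooms -> repeated samples per room
rooms = {
    "room_a": [2.1, 2.4, 2.0, 2.3],
    "room_b": [1.5, 1.9, 1.6, 1.8],
    "room_c": [2.8, 2.6, 2.9, 2.7],
}

# One-way ANOVA mean squares for a balanced design (n samples per room)
n = len(next(iter(rooms.values())))
room_means = {r: statistics.mean(v) for r, v in rooms.items()}
ms_within = statistics.mean(statistics.variance(v) for v in rooms.values())
ms_between = n * statistics.variance(room_means.values())

# Method-of-moments variance components
var_within = ms_within
var_between = max(0.0, (ms_between - ms_within) / n)
print(f"within-room sd={var_within**0.5:.3f}, between-room sd={var_between**0.5:.3f}")
```

    The same decomposition extends to a second nested factor (events) in the study's full analysis.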

  3. Test for Nonlinear Input Output Relations in SISO Systems by Preliminary Data Analysis

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2000-01-01

    This paper discusses and develops preliminary statistical tests for detecting nonlinearities in the deterministic part of SISO systems with noise. The most referenced method is unreliable for common noise processes, e.g. colored noise. Therefore two new methods based on superposition and sinus input...
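
    The superposition idea behind such tests can be illustrated on a toy system: a linear system's response to u1+u2 equals the sum of its responses to u1 and u2, while a nonlinear system violates this. The system and inputs below are invented for illustration and are not the paper's method.

```python
def simulate(u, nonlinear=False):
    """First-order discrete system y[k] = 0.8*y[k-1] + f(u[k]), where f is the
    identity (linear case) or a squared input (an illustrative nonlinearity)."""
    y, out = 0.0, []
    for uk in u:
        y = 0.8 * y + (uk * uk if nonlinear else uk)
        out.append(y)
    return out

u1 = [1.0, 0.0, 0.5, 1.0]
u2 = [0.5, 1.0, 0.0, 0.5]
u12 = [a + b for a, b in zip(u1, u2)]

devs = {}
for nl in (False, True):
    y_sum = [a + b for a, b in zip(simulate(u1, nl), simulate(u2, nl))]
    y_12 = simulate(u12, nl)
    devs[nl] = max(abs(a - b) for a, b in zip(y_sum, y_12))
    print(f"nonlinear={nl}: max superposition deviation = {devs[nl]:.3f}")
```

    A near-zero deviation is consistent with linearity; a large deviation flags a nonlinearity in the deterministic part.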

  4. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use the tools together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics....

  5. Cost Finding Principles and Procedures. Preliminary Field Review Edition. Technical Report 26.

    Science.gov (United States)

    Ziemer, Gordon; And Others

    This report is part of the larger Cost Finding Principles Project, designed to develop a uniform set of standards, definitions, and alternative procedures that will use accounting and statistical data to find the full cost of resources utilized in the process of producing institutional outputs. This technical report describes preliminary procedures…

  6. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of the experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and Chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ² showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
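
    A minimal sketch of the one-sample t test used for decisions like (a) above. The replicate measurements and hypothesised mean are invented for illustration, not the study's data.

```python
import math
import statistics

# Hypothetical replicate measurements of % zinc removal at the candidate optimum pH
removal = [88.2, 90.1, 87.5, 89.4, 88.8, 90.3]
claimed_mean = 90.0   # hypothesised value being tested

n = len(removal)
xbar = statistics.mean(removal)
s = statistics.stdev(removal)
t = (xbar - claimed_mean) / (s / math.sqrt(n))

# Compare |t| against the two-sided 5% critical value for n-1 = 5 df (2.571)
significant = abs(t) > 2.571
print(f"t = {t:.3f}, reject H0 at 5%: {significant}")
```

    Comparing the calculated t against the tabulated critical value is exactly the decision rule the abstract describes.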

  7. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I² statistic.

    Science.gov (United States)

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We
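
    Following the paper's analogy with meta-analysis, an I²-style statistic for the SNP-exposure associations can be computed from a Cochran-type Q over the per-variant estimates. The estimates and standard errors below are invented for illustration; this is a sketch of the general Q-based form, not the paper's exact implementation.

```python
# Hypothetical SNP-exposure estimates (gamma_hat) and their standard errors
gamma = [0.12, 0.09, 0.15, 0.07, 0.11]
se    = [0.02, 0.03, 0.02, 0.04, 0.03]

weights = [1.0 / s**2 for s in se]
gamma_bar = sum(w * g for w, g in zip(weights, gamma)) / sum(weights)

# Cochran-type Q for the SNP-exposure associations
q = sum(w * (g - gamma_bar) ** 2 for w, g in zip(weights, gamma))
df = len(gamma) - 1
i2_gx = max(0.0, (q - df) / q)   # expected relative dilution of the MR-Egger estimate
print(f"I2_GX = {i2_gx:.2f}")
```

    Values near 1 indicate little expected dilution of the MR-Egger estimate; low values flag weak-instrument bias under NOME violation.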

  8. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. Results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California, have identified a number of common socioeconomic issues and characteristics, as well as issues and characteristics that differ between the sites and the types of facilities. Based on these comparisons, implications for selection of an appropriate methodology for impact assessment and elements of impact mitigation are identified

  9. Preliminary Optical And Electric Field Pulse Statistics From Storm Overflights During The Altus Cumulus Electrification Study

    Science.gov (United States)

    Mach, D. A.; Blakeslee, R. J.; Bailey, J. C.; Farrell, W. M.; Goldberg, R. A.; Desch, M. D.; Houser, J. G.

    2003-01-01

    The Altus Cumulus Electrification Study (ACES) was conducted during August 2002 in an area near Key West, Florida. One of the goals of this uninhabited aerial vehicle (UAV) study was to collect high-resolution optical pulse and electric field data from thunderstorms. During the month-long campaign, we acquired 5294 lightning-generated optical pulses with associated electric field changes. Most of these observations were made while close to the tops of the storms. We found filtered mean and median 10-10% optical pulse widths of 875 and 830 μs, respectively, while the 50-50% mean and median optical pulse widths were 422 and 365 μs, respectively. These values are similar to previous results, as are the 10-90% mean and median rise times of 327 and 265 μs. The mean and median peak electrical-to-optical pulse delays were 209 and 145 μs, which is longer than one would expect from theoretical results. The results of the pulse analysis will contribute to further validation of the Optical Transient Detector (OTD) and the Lightning Imaging Sensor (LIS) satellites. Pre-launch estimates of the flash detection efficiency were based on a small sample of optical pulse measurements associated with fewer than 350 lightning discharges collected by NASA U-2 aircraft in the early 1980s. Preliminary analyses of the ACES measurements show that we have greatly increased the number of optical pulses available for validation of the LIS and other orbital lightning optical sensors. Since the Altus was often close to the cloud tops, many of the optical pulses are low-energy pulses. From these low-energy pulses, we can determine the fraction of optical lightning pulses below the thresholds of LIS, OTD, and any future satellite-based optical sensors such as the geostationary Lightning Mapping Sensor.

  10. Simplified methods and application to preliminary design of piping for elevated temperature service

    International Nuclear Information System (INIS)

    Severud, L.K.

    1975-01-01

    A number of simplified stress analysis methods and procedures that have been used on the FFTF project for preliminary design of piping operating at elevated temperatures are described. The rationale and considerations involved in developing the procedures and preliminary design guidelines are given. Applications of the simplified methods to a few FFTF pipelines are described, and the success of these guidelines is measured by means of comparisons to pipeline designs that have had detailed Code-type stress analyses. (U.S.)

  11. Preliminary design report: Babcock and Wilcox BR-100 100-ton rail/barge spent fuel shipping cask

    International Nuclear Information System (INIS)

    1990-02-01

    The purpose of this document is to provide information on burnup credit as applied to the preliminary design of the BR-100 shipping cask. There is a brief description of the preliminary basket design and the features used to maintain a criticality-safe system. Following the basket description is a discussion of various criticality analyses used to evaluate burnup credit. The results from these analyses are then reviewed in the perspective of fuel burnups expected to be shipped to either the final repository or a Monitored Retrievable Storage (MRS) facility. The hurdles to employing burnup credit in the certification of any cask are then outlined and reviewed. The last section gives conclusions reached as to burnup credit for the BR-100 cask, based on our analyses and experience. All information in this study refers to the cask configured to transport PWR fuel. Boiling Water Reactor (BWR) fuel satisfies the criticality requirements so that burnup credit is not needed. All calculations generated in the preparation of this report were based upon the preliminary design, which will be optimized during the final design. 8 refs., 19 figs., 16 tabs

  12. Fluctuations of Lake Orta water levels: preliminary analyses

    Directory of Open Access Journals (Sweden)

    Helmi Saidi

    2016-04-01

    While the effects of past industrial pollution on the chemistry and biology of Lake Orta have been well documented, annual and seasonal fluctuations of lake levels have not yet been studied. Considering their potential impacts on both the ecosystem and on human safety, fluctuations in lake levels are an important aspect of limnological research. In the enormous catchment of Lake Maggiore, there are many rivers and lakes, and the amount of annual precipitation is both high and concentrated in spring and autumn. This has produced major flood events, most recently in November 2014. Flood events are also frequent on Lake Orta, occurring roughly triennially since 1917. The 1926, 1951, 1976 and 2014 floods were severe, with lake levels raised from 2.30 m to 3.46 m above the hydrometric zero. The most important event occurred in 1976, with a maximum level equal to 292.31 m asl and a return period of 147 years. In 2014 the lake level reached 291.89 m asl and its return period was 54 years. In this study, we defined trends and temporal fluctuations in Lake Orta water levels from 1917 to 2014, focusing on extremes. We report both annual maximum and seasonal variations of the lake water levels over this period. Both Mann-Kendall trend tests and simple linear regression were utilized to detect monotonic trends in annual and seasonal extremes, and logistic regression was used to detect trends in the number of flood events. Lake level decreased during winter and summer seasons, and a small but statistically non-significant positive trend was found in the number of flood events over the period. We provide estimations of return period for lake levels, a metric which could be used in planning lake flood protection measures.
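
    The Mann-Kendall trend test mentioned above can be sketched in a few lines: the S statistic counts concordant minus discordant pairs, and a normal approximation gives a z-score. The annual-maximum levels below are invented for illustration, not the Lake Orta series.

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic, with the no-ties variance and normal z-score."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual-maximum lake levels (m asl)
levels = [291.2, 291.5, 291.1, 291.7, 291.4, 291.9, 291.6, 292.0]
s, z = mann_kendall(levels)
print(f"S = {s}, z = {z:.2f}  (|z| > 1.96 would indicate a trend at 5%)")
```

    For realistic series with ties, the variance term needs a tie correction; the no-ties form above is the minimal sketch.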

  13. A preliminary analysis of the reactor-based plutonium disposition alternative deployment schedules

    International Nuclear Information System (INIS)

    Zurn, R.M.

    1997-09-01

    This paper discusses the preliminary analysis of the implementation schedules of the reactor-based plutonium disposition alternatives. These schedule analyses are a part of a larger process to examine the nine decision criteria used to determine the most appropriate method of disposing of U.S. surplus weapons plutonium. The preliminary analysis indicates that the mission durations for the reactor-based alternatives range from eleven years to eighteen years and the initial mission fuel assemblies containing surplus weapons-usable plutonium could be loaded into the reactors between nine and fourteen years after the Record of Decision

  14. Statistical methods for analysing the relationship between bank profitability and liquidity

    OpenAIRE

    Boguslaw Guzik

    2006-01-01

    The article analyses the most popular methods for the empirical estimation of the relationship between bank profitability and liquidity. Because profitability depends on various factors (both economic and non-economic), a simple correlation coefficient, two-dimensional (profitability/liquidity) graphs, or models where profitability depends only on a liquidity variable do not provide good and reliable results. Quite good results can be obtained only when multifactorial profitabilit...

  15. Preliminary design analysis of hot gas ducts and an intermediate heat exchanger for the nuclear hydrogen reactor

    International Nuclear Information System (INIS)

    Song, K. N.; Kim, Y. W.

    2008-01-01

    Korea Atomic Energy Research Institute (KAERI) is in the process of developing a nuclear hydrogen system based on an indirect-cycle gas-cooled reactor that produces heat at temperatures on the order of 950 °C. Primary and secondary hot gas ducts with coaxial double tubes are key components connecting a reactor pressure vessel and an intermediate heat exchanger for the nuclear hydrogen system. In this study, preliminary design analyses on the hot gas ducts and the intermediate heat exchanger were carried out. These preliminary design activities include a preliminary design of the geometric dimensions, a preliminary strength evaluation, thermal sizing, and an appropriate material selection

  16. Licensing support system preliminary needs analysis: Volume 1

    International Nuclear Information System (INIS)

    1989-01-01

    This Preliminary Needs Analysis, together with the Preliminary Data Scope Analysis (next in this series of reports), is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This preliminary analysis of the LSS requirements has been divided into a "needs" and a "data scope" portion only for project management and scheduling reasons. The Preliminary Data Scope Analysis will address all issues concerning the content and size of the LSS data base, providing the requirements basis for data acquisition, cataloging, and storage sizing specifications. This report addresses all other requirements for the LSS. The LSS consists of both computer subsystems and non-computer archives. This study addresses only the computer subsystems, focusing on the Access Subsystems. After providing background on previous LSS-related work, this report summarizes the findings from previous examinations of needs and describes a number of other requirements that have an impact on the LSS. The results of interviews conducted for this report are then described and analyzed. The final section of the report brings all of the key findings together and describes how these needs analyses will continue to be refined and utilized in on-going design activities. 14 refs., 2 figs., 1 tab

  17. Preliminary Seismic Performance Evaluation of RPS Cabinet in a Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung; Oh, Jinho; Lee, Jongmin; Kim, Youngki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    This RPS cabinet mainly provides the operators with the physical interface to monitor and handle the RPS. The objective of this paper is to perform seismic analyses and evaluate the preliminary structural integrity and seismic capacity of the RPS cabinet. For this purpose, a 3-D finite element model of the RPS cabinet is developed and modal analyses are carried out to characterize its dynamic behaviour. Response time history analyses and the related safety evaluation are performed for the RPS cabinet subjected to seismic loads. Finally, the seismic margin and seismic fragility of the RPS cabinet are investigated. The seismic analysis and the evaluation of the preliminary structural integrity and seismic margin of the RPS cabinet under self-weight and seismic load have been carried out. For this purpose, 3-D finite element models of the RPS cabinet were developed. A modal analysis, response time history analysis, and seismic fragility analysis were then performed. The structural analysis results show that the RPS cabinet remains below the structural design limit under a PGA of 0.3g (horizontal) and 0.2g (vertical) and structurally withstands up to a PGA of 3g (horizontal) and 2g (vertical)
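
    Seismic fragility for equipment such as this cabinet is commonly expressed as a lognormal curve, P(failure | PGA) = Φ(ln(a/A_m)/β). The median capacity (taken here from the reported 3g withstand level) and the log-standard deviation β below are assumptions chosen only to illustrate the curve's shape, not values from the paper.

```python
import math

def fragility(pga, median_capacity, beta):
    """Lognormal seismic fragility: P(failure | PGA), standard normal CDF via erf."""
    x = math.log(pga / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assumed median capacity of 3.0 g and an assumed composite beta of 0.4
for pga in (0.3, 1.5, 3.0):
    print(f"PGA {pga:.1f} g -> P(failure) = {fragility(pga, 3.0, 0.4):.3f}")
```

    By construction the failure probability is 0.5 at the median capacity and falls off rapidly at design-level accelerations.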

  18. Statistical Literacy in the Data Science Workplace

    Science.gov (United States)

    Grant, Robert

    2017-01-01

    Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…

  19. Preliminary ECLSS waste water model

    Science.gov (United States)

    Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.

    1991-01-01

    A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.

  20. Preliminary Results of the Louisiana Sex Offender Treatment Program

    OpenAIRE

    Lee A. Underwood; Frances L.L. Dailey; Carrie Merino; Yolanda Crump

    2015-01-01

    The purpose of this study was to offer preliminary support for the Louisiana Sex Offender Treatment Program (LSOTP) in addressing the needs of juvenile sex offenders. Research objectives were (1) to offer statistical evidence for reductions in anxiety, depression, cognitive distortion and negative attitudes towards women comparing a group of 21 adolescents, 12 of whom received services as usual and nine of whom participated in the LSOTP. A controlled experimental evaluation design was utilize...

  1. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
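
    The one-parameter Rasch model used in these analyses gives the probability of a correct response as P(X=1 | θ, b) = exp(θ − b)/(1 + exp(θ − b)), where θ is student ability and b is item difficulty. The ability and difficulty values below are invented for illustration.

```python
import math

def rasch_p(theta, b):
    """One-parameter (Rasch) probability of a correct response:
    P(X=1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical item difficulties spanning easy to hard, for one student
difficulties = [-1.5, 0.0, 2.0]
theta = 0.5  # a student of slightly above-average ability
for b in difficulties:
    print(f"b={b:+.1f}: P(correct) = {rasch_p(theta, b):.2f}")
```

    Items spread widely in b, as in SRBCI, let the inventory discriminate across the whole novice-to-expert range.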

  2. Entropy statistics and information theory

    NARCIS (Netherlands)

    Frenken, K.; Hanusch, H.; Pyka, A.

    2007-01-01

    Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are suitable for decomposition analysis, which renders the
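
    The decomposition property mentioned above is the identity H_total = H_between + Σ_g P_g · H_within(g). A sketch with invented market shares grouped into two sectors:

```python
import math

def shannon_entropy(shares):
    """Shannon entropy (bits) of a distribution given as a list of shares."""
    return -sum(p * math.log2(p) for p in shares if p > 0)

# Hypothetical market shares grouped into two sectors
groups = {"sector_a": [0.30, 0.20], "sector_b": [0.25, 0.15, 0.10]}

total = shannon_entropy([p for g in groups.values() for p in g])

# Decomposition: total = between-group entropy + share-weighted within-group entropy
group_sums = {g: sum(v) for g, v in groups.items()}
between = shannon_entropy(group_sums.values())
within = sum(
    group_sums[g] * shannon_entropy([p / group_sums[g] for p in v])
    for g, v in groups.items()
)
print(f"total={total:.4f}, between={between:.4f}, within={within:.4f}")
```

    The between and weighted within terms sum exactly to the total, which is what makes entropy convenient for decomposition analysis.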

  3. Preliminary design analysis of the ALT-II limiter for TEXTOR

    International Nuclear Information System (INIS)

    Koski, J.A.; Boyd, R.D.; Kempka, S.M.; Romig, A.D. Jr.; Smith, M.F.; Watson, R.D.; Whitley, J.B.; Conn, R.W.; Grotz, S.P.

    1984-01-01

    Installation of a large toroidal belt pump limiter, Advanced Limiter Test II (ALT-II), on the TEXTOR tokamak at Juelich, FRG is anticipated for early 1986. This paper discusses the preliminary mechanical design and materials considerations undertaken as part of the feasibility study phase for ALT-II. Since the actively cooled limiter blade is the component in direct contact with the plasma edge, and thus subject to the severe plasma environment, most preliminary design efforts have concentrated on analysis of the blade. The screening process which led to the recommended preliminary design, consisting of a dispersion-strengthened copper or OFHC copper cover plate over an austenitic stainless steel base plate, is discussed. A 1 to 3 mm thick low-atomic-number coating consisting of a graded plasma-sprayed silicon carbide-aluminium composite is recommended, subject to further experiment and evaluation. Thermal-hydraulic and stress analyses of the limiter blade are also discussed. (orig.)

  4. Hydrogeologic characterization and evolution of the 'excavation damaged zone' by statistical analyses of pressure signals: application to galleries excavated at the clay-stone sites of Mont Terri (Ga98) and Tournemire (Ga03)

    International Nuclear Information System (INIS)

    Fatmi, H.; Ababou, R.; Matray, J.M.; Joly, C.

    2010-01-01

    Document available in extended abstract form only. This paper presents methods of statistical analysis and interpretation of hydrogeological signals in clayey formations, e.g., pore water pressure and atmospheric pressure. The purpose of these analyses is to characterize the hydraulic behaviour of this type of formation in the case of a deep repository of Mid-Level/High-Level and Long-lived radioactive wastes, and to study the evolution of the geologic formation and its EDZ (Excavation Damaged Zone) during the excavation of galleries. We focus on galleries Ga98 and Ga03 in the sites of Mont Terri (Jura, Switzerland) and Tournemire (France, Aveyron), through data collected in the BPP-1 and PH2 boreholes, respectively. The Mont Terri site, crossing the Aalenian Opalinus clay-stone, is an underground laboratory managed by an international consortium, namely the Mont Terri project (Switzerland). The Tournemire site, crossing the Toarcian clay-stone, is an underground research facility managed by IRSN (France). We have analysed pore water and atmospheric pressure signals at these sites, sometimes in correlation with other data. The methods of analysis are based on the theory of stationary random signals (correlation functions, Fourier spectra, transfer functions, envelopes), and on multi-resolution wavelet analysis (adapted to nonstationary and evolutionary signals). These methods are also combined with filtering techniques, and they can be used for single signals as well as pairs of signals (cross-analyses). The objective of this work is to exploit pressure measurements in selected boreholes from the two compacted clay sites, in order to: - evaluate phenomena affecting the measurements (earth tides, barometric pressures...); - estimate hydraulic properties (specific storage...) of the clay-stones prior to excavation works and compare them with those estimated by pulse or slug tests on shorter time scales; - analyze the effects of drift excavation on pore pressures

  5. The effects of clinical and statistical heterogeneity on the predictive values of results from meta-analyses

    NARCIS (Netherlands)

    Melsen, W G; Rovers, M M; Bonten, M J M; Bootsma, M C J|info:eu-repo/dai/nl/304830305

    Variance between studies in a meta-analysis will exist. This heterogeneity may be of clinical, methodological or statistical origin. The last of these is quantified by the I²-statistic. We investigated, using simulated studies, the accuracy of I² in the assessment of heterogeneity and the
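
    The I² statistic referred to above is conventionally computed from Cochran's Q as I² = max(0, (Q − df)/Q) × 100%. The study effects and standard errors below are invented for illustration.

```python
# Hypothetical study effect estimates and standard errors for a meta-analysis
effects = [0.42, 0.18, 0.55, 0.30, 0.61]
se      = [0.10, 0.12, 0.09, 0.15, 0.11]

weights = [1.0 / s**2 for s in se]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Cochran's Q and the I^2 heterogeneity statistic
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i2 = max(0.0, (q - df) / q) * 100.0
print(f"Q = {q:.2f}, I2 = {i2:.1f}%")
```

    I² expresses the share of total variability attributable to between-study heterogeneity rather than sampling error.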

  6. The disagreeable behaviour of the kappa statistic.

    Science.gov (United States)

    Flight, Laura; Julious, Steven A

    2015-01-01

    It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
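
    The sensitivity of kappa to the marginal totals can be seen directly: the two invented 2x2 tables below have identical observed agreement (90%) yet very different kappas.

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_obs = (a + d) / n                       # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)     # chance agreement on "yes"
    p_no = ((c + d) / n) * ((b + d) / n)      # chance agreement on "no"
    p_exp = p_yes + p_no
    return (p_obs - p_exp) / (1 - p_exp)

# Two tables with identical observed agreement (90%) but different marginals
balanced = [[45, 5], [5, 45]]
skewed = [[85, 5], [5, 5]]
print(f"balanced: {cohens_kappa(balanced):.2f}, skewed: {cohens_kappa(skewed):.2f}")
```

    The skewed marginals inflate chance agreement and depress kappa, which is why the abstract recommends also reporting prevalence- and bias-adjusted measures.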

  7. The empirical basis of substance use disorders diagnosis: research recommendations for the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (DSM-V).

    Science.gov (United States)

    Schuckit, Marc A; Saunders, John B

    2006-09-01

    This paper presents the recommendations, developed from a 3-year consultation process, for a program of research to underpin the development of diagnostic concepts and criteria in the Substance Use Disorders section of the Diagnostic and Statistical Manual of Mental Disorders (DSM) and potentially the relevant section of the next revision of the International Classification of Diseases (ICD). A preliminary list of research topics was developed at the DSM-V Launch Conference in 2004. This led to the presentation of articles on these topics at a specific Substance Use Disorders Conference in February 2005, at the end of which a preliminary list of research questions was developed. This was further refined through an iterative process involving conference participants over the following year. Research questions have been placed into four categories: (1) questions that could be addressed immediately through secondary analyses of existing data sets; (2) items likely to require position papers to propose criteria or more focused questions with a view to subsequent analyses of existing data sets; (3) issues that could be proposed for literature reviews, but with a lower probability that these might progress to a data analytic phase; and (4) suggestions or comments that might not require immediate action, but that could be considered by the DSM-V and ICD 11 revision committees as part of their deliberations. A broadly based research agenda for the development of diagnostic concepts and criteria for substance use disorders is presented.

  8. Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.

    Science.gov (United States)

    Banks, N C; Hodda, M; Singh, S K; Matveeva, E M

    2012-06-01

    Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.

  9. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult...... to keep abreast with the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  10. Usage statistics and demonstrator services

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    An understanding of the use of repositories and their contents is clearly desirable for authors and repository managers alike, as well as for those who are analysing the state of scholarly communications. A number of individual initiatives have produced statistics of various kinds for individual repositories, but the real challenge is to produce statistics that can be collected and compared transparently on a global scale. This presentation details the steps to be taken to address these issues and attain this capability.

  11. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    Science.gov (United States)

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  12. Multivariate statistical methods and data mining in particle physics (4/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  13. Multivariate statistical methods and data mining in particle physics (2/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  14. Multivariate statistical methods and data mining in particle physics (1/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  15. Preliminary result on trabecular bone score (TBS) in lumbar vertebrae with experimentally altered microarchitecture

    Directory of Open Access Journals (Sweden)

    M. Di Stefano

    2013-01-01

    Full Text Available The aim of this preliminary research is to investigate the reliability of a new qualitative parameter, called Trabecular Bone Score (TBS), recently proposed for evaluating the microarchitectural arrangement of cancellous bone in scans carried out by dual energy X-ray absorptiometry (DXA). Vertebral bodies of 15 fresh samples of lumbar spines of adult pigs were analysed both in basal conditions and with the microarchitecture of the cancellous bone altered by progressive drilling. The examined bony areas show no changes in bone mineral density (BMD), whereas TBS values decrease with increasing alteration of the vertebral microtrabecular structure. Our preliminary data seem to confirm the reliability of TBS as a qualitative parameter for evaluating microarchitectural strength in bony areas quantitatively analysed by DXA.

  16. Region-of-interest analyses of one-dimensional biomechanical trajectories: bridging 0D and 1D theory, augmenting statistical power

    Directory of Open Access Journals (Sweden)

    Todd C. Pataky

    2016-11-01

    Full Text Available One-dimensional (1D) kinematic, force, and EMG trajectories are often analyzed using zero-dimensional (0D) metrics like local extrema. Recently, whole-trajectory 1D methods have emerged in the literature as alternatives. Since 0D and 1D methods can yield qualitatively different results, the two approaches may appear to be theoretically distinct. The purposes of this paper were (a) to clarify that 0D and 1D approaches are actually just special cases of a more general region-of-interest (ROI) analysis framework, and (b) to demonstrate how ROIs can augment statistical power. We first simulated millions of smooth, random 1D datasets to validate theoretical predictions of the 0D, 1D and ROI approaches and to emphasize how ROIs provide a continuous bridge between 0D and 1D results. We then analyzed a variety of public datasets to demonstrate potential effects of ROIs on biomechanical conclusions. Results showed, first, that a priori ROI particulars can qualitatively affect the biomechanical conclusions that emerge from analyses and, second, that ROIs derived from exploratory/pilot analyses can detect smaller biomechanical effects than are detectable using full 1D methods. We recommend regarding ROIs, like data filtering particulars and Type I error rate, as parameters which can affect hypothesis testing results, and thus as sensitivity analysis tools to ensure arbitrary decisions do not influence scientific interpretations. Last, we describe open-source Python and MATLAB implementations of 1D ROI analysis for arbitrary experimental designs ranging from one-sample t tests to MANOVA.
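The 0D/1D/ROI relationship described above can be made concrete with a small illustrative sketch (this is not the authors' open-source implementation; the function names and data are ours): a one-sample t statistic is computed at each time node, then maximized over a region of interest. An ROI covering every node recovers the full 1D case, while a single node recovers a 0D metric.

```python
import math

def t_stat(samples):
    """One-sample t statistic against a zero null mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean / math.sqrt(var / n)

def max_t_in_roi(trajectories, roi):
    """Max one-sample t over the time nodes in the ROI.
    trajectories: equal-length 1D trajectories, one per subject;
    roi: iterable of node indices (all nodes -> 1D case, one node -> 0D)."""
    return max(t_stat([traj[k] for traj in trajectories]) for k in roi)
```

Shrinking the ROI to where an effect is expected concentrates the search, which is the source of the power gain the paper discusses.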

  17. Football goal distributions and extremal statistics

    Science.gov (United States)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions, which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores, and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that English top division and FA Cup matches in the seasons 1970/71-2000/01 are adequately modelled by Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
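The kind of tail comparison described above can be sketched as follows: fit a Poisson by the sample mean and compare the observed tail mass with the fitted one; heavy-tailed goal data would show a clear excess. The goal counts here are invented for illustration, not taken from the study.

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical per-match home-goal counts; lam is estimated by the sample
# mean, the maximum-likelihood fit for a Poisson model.
goals = [0, 1, 1, 2, 0, 3, 1, 2, 5, 0, 1, 2]
lam = sum(goals) / len(goals)

# Observed vs fitted tail mass P(X >= 4).
obs_tail = sum(g >= 4 for g in goals) / len(goals)
fit_tail = 1 - sum(poisson_pmf(k, lam) for k in range(4))
```

A systematic excess of `obs_tail` over `fit_tail` across many samples is the signature of the heavy tails the paper attributes to extremal statistics.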

  18. Analysing the spatial patterns of livestock anthrax in Kazakhstan in relation to environmental factors: a comparison of local (Gi*) and morphology cluster statistics

    Directory of Open Access Journals (Sweden)

    Ian T. Kracalik

    2012-11-01

    Full Text Available We compared a local clustering statistic and a cluster morphology statistic using anthrax outbreaks in large (cattle) and small (sheep and goats) domestic ruminants across Kazakhstan. The Getis-Ord (Gi*) statistic and a multidirectional optimal ecotope algorithm (AMOEBA) were compared using 1st, 2nd and 3rd order Rook contiguity matrices. Multivariate statistical tests were used to evaluate the environmental signatures between clusters and non-clusters from the AMOEBA and Gi* tests. A logistic regression was used to define a risk surface for anthrax outbreaks and to compare agreement between clustering methodologies. Tests revealed differences in the spatial distribution of clusters as well as in the total number of clusters: AMOEBA identified more large-ruminant clusters (n = 149) and fewer small-ruminant clusters (n = 9), whereas Gi* revealed fewer large-ruminant clusters (n = 122) and more small-ruminant clusters (n = 61). Significant environmental differences were found between groups using the Kruskal-Wallis and Mann-Whitney U tests. Logistic regression was used to model the presence/absence of anthrax outbreaks and define a risk surface for large ruminants to compare with the cluster analyses. The model predicted 32.2% of the landscape as high risk. Approximately 75% of AMOEBA clusters corresponded to predicted high risk, compared with ~64% of Gi* clusters. In general, AMOEBA predicted more irregularly shaped clusters of outbreaks in both livestock groups, while Gi* tended to predict larger, circular clusters. Here we provide an evaluation of both tests and a discussion of the use of each to detect environmental conditions associated with anthrax outbreak clusters in domestic livestock. These findings illustrate important differences in spatial statistical methods for defining local clusters and highlight the importance of selecting appropriate levels of data aggregation.
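For readers unfamiliar with the Gi* statistic used above, a minimal sketch follows, assuming binary contiguity weights (a simple stand-in for the Rook matrices in the study; production work would use a library such as PySAL). Positive standardized values indicate hot spots, negative values cold spots:

```python
import math

def getis_ord_gi_star(values, neighbors, i):
    """Standardized Getis-Ord Gi* for site i.
    values: attribute values for all n sites; neighbors: dict mapping a site
    index to the set of its contiguous neighbors. The focal site itself is
    included in the weighted sum (the "star" in Gi*)."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar * xbar)
    # Binary weights: 1 for the site itself and its neighbors, else 0.
    w = [1 if (j == i or j in neighbors[i]) else 0 for j in range(n)]
    sw = sum(w)
    num = sum(wj * xj for wj, xj in zip(w, values)) - xbar * sw
    den = s * math.sqrt((n * sum(wj * wj for wj in w) - sw * sw) / (n - 1))
    return num / den
```

On a chain of sites where the high values sit at one end, the statistic is positive there and negative at the far end, which is the hot/cold-spot contrast the comparison with AMOEBA rests on.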

  19. Design basis event consequence analyses for the Yucca Mountain project

    International Nuclear Information System (INIS)

    Orvis, D.D.; Haas, M.N.; Martin, J.H.

    1997-01-01

    Design basis event (DBE) definition and analysis is an ongoing and integrated activity among the design and analysis groups of the Yucca Mountain Project (YMP). DBEs are those events that potentially lead to breach of the waste package and waste form (e.g., spent fuel rods) with consequent release of radionuclides to the environment. A Preliminary Hazards Analysis (PHA) provided a systematic screening of external and internal events that are candidate DBEs to be subjected to analyses for radiological consequences. In preparation, pilot consequence analyses for the repository subsurface and surface facilities have been performed to define the methodology, data requirements, and applicable regulatory limits.

  20. Safety performance of preliminary KALIMER conceptual design

    Energy Technology Data Exchange (ETDEWEB)

    Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong [Korea Atomic Energy Research Inst., Taejon (Korea)]

    1999-07-01

    The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion, the gas expansion module, and the thermal hydraulic model for the reactor pool, as well as the results of preliminary analyses of unprotected loss of flow and loss of heat sink. (author)

  1. Safety performance of preliminary KALIMER conceptual design

    International Nuclear Information System (INIS)

    Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong

    1999-01-01

    The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion, the gas expansion module, and the thermal hydraulic model for the reactor pool, as well as the results of preliminary analyses of unprotected loss of flow and loss of heat sink. (author)

  2. Differences and discriminatory power of water polo game-related statistics in men in international championships and their relationship with the phase of the competition.

    Science.gov (United States)

    Escalante, Yolanda; Saavedra, Jose M; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Domínguez, Ana M

    2013-04-01

    The aims of this study were (a) to compare water polo game-related statistics by context (winning and losing teams) and phase (preliminary, classification, and semifinal/bronze medal/gold medal), and (b) identify characteristics that discriminate performances for each phase. The game-related statistics of the 230 men's matches played in World Championships (2007, 2009, and 2011) and European Championships (2008 and 2010) were analyzed. Differences between contexts (winning or losing teams) in each phase (preliminary, classification, and semifinal/bronze medal/gold medal) were determined using the chi-squared statistic, also calculating the effect sizes of the differences. A discriminant analysis was then performed after the sample-splitting method according to context (winning and losing teams) in each of the 3 phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables are both offensive and defensive, including action shots, sprints, goalkeeper-blocked shots, and goalkeeper-blocked action shots. However, the number of discriminatory variables decreases as the phase becomes more demanding and the teams become more equally matched. The discriminant analysis showed the game-related statistics to discriminate performance in all phases (preliminary, classificatory, and semifinal/bronze medal/gold medal phase) with high percentages (91, 90, and 73%, respectively). Again, the model selected both defensive and offensive variables.
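The chi-squared comparison of winning and losing teams described above can be illustrated with a minimal sketch; the 2x2 table and its counts are hypothetical, and phi is used as the effect-size measure for a 2x2 table:

```python
def chi_squared_2x2(table):
    """Pearson chi-squared and phi effect size for a 2x2 contingency table
    [[a, b], [c, d]], e.g. winners/losers (rows) by action outcome (columns)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    # Expected counts under independence of row and column factors.
    expected = [[row1 * col1 / n, row1 * col2 / n],
                [row2 * col1 / n, row2 * col2 / n]]
    chi2 = sum((obs - exp) ** 2 / exp
               for row_obs, row_exp in zip(table, expected)
               for obs, exp in zip(row_obs, row_exp))
    phi = (chi2 / n) ** 0.5  # effect size for a 2x2 table
    return chi2, phi
```

Reporting phi alongside chi-squared, as the study does with effect sizes, separates the strength of the winner/loser difference from the sample size.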

  3. Research Pearls: The Significance of Statistics and Perils of Pooling. Part 3: Pearls and Pitfalls of Meta-analyses and Systematic Reviews.

    Science.gov (United States)

    Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman

    2017-08-01

    Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, are at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guiding diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from single studies. Meta-analyses should be performed with homogeneous, high-methodological-quality studies (Level I or II evidence, i.e., randomized studies) to minimize confounding variable bias. When it is known that the literature is inadequate, or a recent systematic review has already been performed with a demonstration of insufficient data, then a new systematic review does not add anything meaningful to the literature. PROSPERO registration and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency of the conduct of the review permits reproducibility and improves fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided; this particularly applies to Level IV evidence, that is, noncomparative investigations.

  4. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
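Of the statistics compared above, the simplest to sketch is the nonoverlap component of Tau-U; the version below is basic Tau between phases, without Tau-U's baseline-trend correction, and the phase data are illustrative only:

```python
def tau_nonoverlap(baseline, treatment):
    """Tau (pairwise nonoverlap between phases): the proportion of
    baseline-treatment pairs that improve minus the proportion that
    deteriorate. Ranges from -1 to 1."""
    pairs = [(b, t) for b in baseline for t in treatment]
    improve = sum(t > b for b, t in pairs)
    deteriorate = sum(t < b for b, t in pairs)
    return (improve - deteriorate) / len(pairs)
```

A raw value near 1 indicates complete nonoverlap in the improving direction; dichotomizing such values, as the study notes, can change how well they agree with visual analysis.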

  5. Freshwater reservoir offsets and food crusts: Isotope, AMS, and lipid analyses of experimental cooking residues

    Science.gov (United States)

    Taché, Karine; Lovis, William A.

    2018-01-01

    Freshwater reservoir offsets (FROs) occur when AMS dates on charred, encrusted food residues on pottery predate a pot's chronological context because of the presence of ancient carbon from aquatic resources such as fish. Research over the past two decades has demonstrated that FROs vary widely within and between water bodies and between fish in those water bodies. Lipid analyses have identified aquatic biomarkers that can be extracted from cooking residues as potential evidence for FROs. However, efforts have been lacking to determine empirically how much fish with FROs needs to be cooked in a pot with other resources to produce a significant FRO on encrusted cooking residue, and what percentage of fish C in a residue is needed for the recovery of aquatic biomarkers. Here we provide preliminary assessments of both issues. Our results indicate that in historically contingent, high-alkalinity environments fish may produce a statistically significant FRO, but that biomarkers for aquatic resources may be present in the absence of a significant FRO. PMID:29694436

  6. Preliminary cost estimating for the nuclear industry

    International Nuclear Information System (INIS)

    Klumpar, I.V.; Soltz, K.M.

    1985-01-01

    The nuclear industry has higher costs for personnel, equipment, construction, and engineering than conventional industry, which means that cost estimation procedures may need adjustment. The authors account for the special technical and labor requirements of the nuclear industry in making adjustments to equipment and installation cost estimations. Using illustrative examples, they show that conventional methods of preliminary cost estimation are flexible enough for application to emerging industries if their cost structure is similar to that of the process industries. If not, modifications can provide enough engineering and cost data for a statistical analysis. 9 references, 14 figures, 4 tables

  7. Analysis of Norwegian bio energy statistics. Quality improvement proposals; Analyse av norsk bioenergistatistikk. Forslag til kvalitetsheving

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This report assesses the current model and presentation form of Norwegian bio energy statistics. Revisions and enhancements to both data collection and data presentation are proposed. In the context of market developments, for energy in general and bio energy in particular, and of government targets, good bio energy statistics form the basis for following up objectives and policy measures. (eb)

  8. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  9. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  10. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    Directory of Open Access Journals (Sweden)

    Michael Robert Cunningham

    2016-10-01

    Full Text Available The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria, and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect – contrary to their title.

  11. Monitoring of German Fertility: Estimation of Monthly and Yearly Total Fertility Rates on the Basis of Preliminary Monthly Data

    Directory of Open Access Journals (Sweden)

    Gabriele Doblhammer

    2011-02-01

    Full Text Available This paper introduces a set of methods for estimating fertility indicators in the absence of recent and short-term birth statistics. For Germany, we propose a set of straightforward methods that allow for the computation of monthly and yearly total fertility rates (mTFR) on the basis of preliminary monthly data, including a confidence interval. The method for estimating the most current fertility rates can be applied when no information on the age structure and the number of women exposed to childbearing is available. The methods introduced in this study are useful for calculating monthly birth indicators, with minimal requirements for data quality and statistical effort. In addition, we suggest an approach for projecting the yearly TFR based on preliminary monthly information up to June.
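One simple way to realize the "no age structure available" idea mentioned above is to scale a reference year's TFR by the day-adjusted ratio of a month's births to the reference year's daily average. This sketch and its parameter names are our illustrative assumption, not necessarily the paper's exact estimator:

```python
def monthly_tfr(births_month, days_in_month, births_ref_year, tfr_ref_year,
                days_ref_year=365):
    """Rough annualized monthly TFR: scale a reference year's TFR by the
    day-adjusted ratio of this month's births to the reference year's
    daily average births. Assumes the age structure and female exposure
    are roughly unchanged since the reference year."""
    daily_ref = births_ref_year / days_ref_year
    daily_month = births_month / days_in_month
    return tfr_ref_year * daily_month / daily_ref
```

Because months differ in length, the day adjustment is what keeps a 28-day February comparable with a 31-day month.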

  12. Increasing Statistical Literacy by Exploiting Lexical Ambiguity of Technical Terms

    Directory of Open Access Journals (Sweden)

    Jennifer Kaplan

    2018-01-01

    Full Text Available Instructional inattention to language poses a barrier for students in entry-level science courses, in part because students may perceive a subject as difficult solely based on a lack of understanding of the vocabulary. In addition, the technical use of terms that have different everyday meanings may cause students to misinterpret statements made by instructors, leading to an incomplete or incorrect understanding of the domain. Terms that have different technical and everyday meanings are said to have lexical ambiguity, and statistics, as a discipline, has many lexically ambiguous terms. This paper presents a cyclic process for designing activities to address lexical ambiguity in statistics. In addition, it describes three short activities designed to have a high impact on student learning associated with two different lexically ambiguous words or word pairs in statistics. Preliminary student-level data are used to assess the efficacy of the activities, and future directions for development of activities and research about lexical ambiguity in statistics in particular and STEM in general are discussed.

  13. Statistical learning and prejudice.

    Science.gov (United States)

    Madison, Guy; Ullén, Fredrik

    2012-12-01

    Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.

  14. Preliminary ATWS analysis for the IRIS PRA

    International Nuclear Information System (INIS)

    Maddalena Barra; Marco S Ghisu; David J Finnicum; Luca Oriani

    2005-01-01

    Full text of publication follows: The pressurized light water cooled, medium power (1000 MWt) IRIS (International Reactor Innovative and Secure) has been under development for four years by an international consortium of over 21 organizations from ten countries. The plant conceptual design was completed in 2001 and the preliminary design is nearing completion. The pre-application licensing process with the NRC started in October 2002. IRIS has been primarily focused on establishing a design with innovative safety characteristics. The first line of defense in IRIS is to eliminate event initiators that could potentially lead to core damage. In IRIS, this concept is implemented through the 'safety by design' approach, which makes it possible to minimize the number and complexity of the safety systems and required operator actions. The end result is a design with significantly reduced complexity and improved operability, and extensive plant simplifications to enhance construction. To support the optimization of the plant design and to confirm the effectiveness of the safety by design approach in mitigating or eliminating events, and thus providing a significant reduction in the probability of severe accidents, the PRA is being used as an integral part of the design process. A preliminary but extensive Level 1 PRA model has been developed to support the pre-application licensing of the IRIS design. As a result of the preliminary IRIS PRA, an optimization of the design from a reliability point of view was completed, and an extremely low core damage frequency (CDF) of about 1.2E-8 was assessed, confirming the impact of the safety by design approach. This first assessment is the result of a PRA model including internal initiating events. During this assessment, several assumptions were necessary to complete the CDF evaluation.
In particular Anticipated Transients Without Scram (ATWS) were not included in this initial assessment, because their contribution to core damage frequency was assumed

  15. Preliminary result and upgrade from WISPDMX Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Le Hoang; Horns, Dieter [Institut fur Experimentalphysik, Universitat Hamburg (Germany); Lobanov, Andrei [Institut fur Experimentalphysik, Universitat Hamburg (Germany); Max-Planck-Institut fur Radioastronomie, Bonn (Germany)

    2016-07-01

    The microwave cavity experiment WISPDMX is the first direct WISP (weakly interacting slim particle) dark matter search probing particle masses in the 0.8-2.0 eV range. The first stage of WISPDMX measurements has been completed at the nominal resonant frequencies of the cavity. The data acquisition and analysis chain has been upgraded to increase the sensitivity of the experiment. We report preliminary results from the cavity tuning in the second stage of WISPDMX.

  16. Preliminary pharmacological screening of Bixa orellana L. leaves.

    Science.gov (United States)

    Shilpi, Jamil Ahmad; Taufiq-Ur-Rahman, Md; Uddin, Shaikh Jamal; Alam, Md Shahanur; Sadhu, Samir Kumar; Seidel, Véronique

    2006-11-24

    Preliminary pharmacological studies were performed on the methanol extract of Bixa orellana L. (Bixaceae) leaves to investigate neuropharmacological, anticonvulsant, analgesic and antidiarrhoeal activity, and the effect on gastrointestinal motility. All studies were conducted in mice using doses of 125, 250 and 500 mg/kg of body weight. In the pentobarbitone-induced hypnosis test, the extract significantly reduced the time to onset of sleep at the 500 mg/kg dose and dose-dependently increased the total sleeping time at the 250 and 500 mg/kg doses. A statistically significant decrease in locomotor activity was observed at all doses in the open-field and hole-cross tests. In the strychnine-induced convulsion test, the extract increased the average survival time of the test animals (statistically significant at 250 and 500 mg/kg). The extract significantly and dose-dependently reduced the writhing reflex in the acetic acid-induced writhing test. Antidiarrhoeal activity was supported by a statistically significant decrease in the total number of stools (including wet stools) in the castor oil-induced diarrhoea model. A statistically significant delay in the passage of a charcoal meal was observed at 500 mg/kg in the gastrointestinal motility test. The extract was further evaluated in vitro for antioxidant and antibacterial activity. It showed radical scavenging properties in the DPPH assay (IC50 = 22.36 μg/ml) and antibacterial activity against selected causative agents of diarrhoea and dysentery, including Shigella dysenteriae.

  17. Preliminary uncertainty analysis of pre-waste-emplacement groundwater travel times for a proposed repository in basalt

    International Nuclear Information System (INIS)

    Clifton, P.M.; Arnett, R.C.

    1984-01-01

    Preliminary uncertainty analyses of pre-waste-emplacement groundwater travel times are presented for a potential high-level nuclear waste repository in the deep basalts beneath the Hanford Site, Washington State. The uncertainty analyses are carried out by means of a Monte Carlo technique, which requires the uncertain inputs to be described as either random variables or spatial stochastic processes. Pre-waste-emplacement groundwater travel times are modeled in a continuous, flat-lying basalt flow top that is assumed to overlie the repository horizon. Two-dimensional, steady state groundwater flow is assumed, and transmissivity, effective thickness, and regional hydraulic gradient are considered as uncertain inputs. Groundwater travel time distributions corresponding to three groundwater models are presented and compared. Limitations of these preliminary simulation results are discussed in detail
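    A Monte Carlo treatment of this kind can be sketched as below. The travel-time relation t = L·b·nₑ/(T·i) and all distribution parameters are illustrative assumptions for a flat-lying flow top, not the values used for the Hanford Site.

    ```python
    import math
    import random
    import statistics

    def travel_time_samples(n, length=5000.0, porosity=0.1, seed=1):
        """Monte Carlo samples of groundwater travel time (seconds).

        Treats transmissivity T as lognormal and effective thickness b
        and hydraulic gradient i as uniform, then evaluates
        t = length * b * porosity / (T * i) for each realisation.
        """
        rng = random.Random(seed)
        samples = []
        for _ in range(n):
            T = math.exp(rng.gauss(math.log(1e-4), 0.5))  # m^2/s
            b = rng.uniform(2.0, 10.0)                     # m
            i = rng.uniform(1e-4, 5e-4)                    # dimensionless
            samples.append(length * b * porosity / (T * i))
        return samples

    times = travel_time_samples(10_000)
    median_years = statistics.median(times) / (3600 * 24 * 365.25)
    ```

    Comparing the resulting travel-time distributions across alternative groundwater models, as the abstract describes, then reduces to comparing such sampled ensembles.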

  18. Preliminary verification of structure design for CN HCCB TBM with 1 × 4 configuration

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Zhou, E-mail: zhaozhou@swip.ac.cn; Zhou, Bing; Wang, Qijie; Cao, Qixiang; Feng, Kaiming; Wang, Xiaoyu; Zhang, Guoshu

    2016-02-15

    Highlights: • A new, simplified structural design scheme with 1 × 4 configuration is proposed for the CN HCCB TBM. • The detailed conceptual structural design for the 1 × 4 TBM is completed. • Preliminary hydraulic, thermo-hydraulic and structural analyses for the 1 × 4 TBM have been carried out. - Abstract: Based on the conceptual design of the CN HCCB TBM with 1 × 4 configuration, preliminary hydraulic, thermo-hydraulic and structural analyses have been carried out. The hydraulic and thermo-hydraulic analyses show that the coolant manifold system preliminarily meets the fluid design requirements and that the temperatures of the RAFM steel structural parts and of the Be and Li{sub 4}SiO{sub 4} pebble beds are within the allowable range; no zone shows a stress higher than the allowable limit in the preliminary structural analysis. These results indicate that the design of the CN HCCB TBM with 1 × 4 configuration is preliminarily reasonable.

  19. Preliminary verification of structure design for CN HCCB TBM with 1 × 4 configuration

    International Nuclear Information System (INIS)

    Zhao, Zhou; Zhou, Bing; Wang, Qijie; Cao, Qixiang; Feng, Kaiming; Wang, Xiaoyu; Zhang, Guoshu

    2016-01-01

    Highlights: • A new, simplified structural design scheme with 1 × 4 configuration is proposed for the CN HCCB TBM. • The detailed conceptual structural design for the 1 × 4 TBM is completed. • Preliminary hydraulic, thermo-hydraulic and structural analyses for the 1 × 4 TBM have been carried out. - Abstract: Based on the conceptual design of the CN HCCB TBM with 1 × 4 configuration, preliminary hydraulic, thermo-hydraulic and structural analyses have been carried out. The hydraulic and thermo-hydraulic analyses show that the coolant manifold system preliminarily meets the fluid design requirements and that the temperatures of the RAFM steel structural parts and of the Be and Li_4SiO_4 pebble beds are within the allowable range; no zone shows a stress higher than the allowable limit in the preliminary structural analysis. These results indicate that the design of the CN HCCB TBM with 1 × 4 configuration is preliminarily reasonable.

  20. Preservation of photographic and cinematographic films by gamma radiation: Preliminary analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nagai, Maria Luiza E.; Santos, Paulo S.; Otubo, Larissa; Oliveira, Maria José A.; Vasquez, Pablo A.S., E-mail: malunagai@usp.br, E-mail: pavsalva@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Brazilian weather conditions directly affect tangible materials, causing deterioration that is notably worsened by insect and fungal attack. In this sense, gamma radiation from cobalt-60 is an excellent alternative to traditional preservation processes, mainly because of its biocidal action. Radiation processing with gamma radiation for the disinfection of cultural heritage materials has been widely used around the world in recent decades. Many cultural heritage objects, especially those made of paper and wood, have been studied in scientific publications with respect to changes in mechanical, physical and chemical properties. Over the last fifteen years, the Multipurpose Gamma Irradiation Facility of the Nuclear and Energy Research Institute, located on the Sao Paulo University campus, has irradiated many collections of archived materials, books, paintings and furniture. Adequate storage of photographic and cinematographic materials is a challenge for conservators at preservation institutions. Contamination by fungi is one of the leading causes of problems in photographic and cinematographic collections. Several Sao Paulo University libraries have had their photographic and cinematographic collections affected by fungi, making it impossible to conduct research on these materials or to manipulate them, for health and safety reasons. This work presents preliminary results on the effects of ionizing radiation on photographic and cinematographic films. Selected film samples made of cellulose acetate were prepared and characterized by FTIR-ATR spectroscopy. Samples were irradiated with gamma rays at absorbed doses between 2 kGy and 50 kGy. Irradiated samples were analyzed by UV-VIS spectroscopy and electron microscopy techniques. The results show that disinfection by gamma radiation can be achieved safely by applying a disinfection dose between 6 kGy and 15 kGy, with no significant change or modification of the main properties of the constitutive materials. (author)

  1. Preservation of photographic and cinematographic films by gamma radiation: Preliminary analyses

    International Nuclear Information System (INIS)

    Nagai, Maria Luiza E.; Santos, Paulo S.; Otubo, Larissa; Oliveira, Maria José A.; Vasquez, Pablo A.S.

    2017-01-01

    Brazilian weather conditions directly affect tangible materials, causing deterioration that is notably worsened by insect and fungal attack. In this sense, gamma radiation from cobalt-60 is an excellent alternative to traditional preservation processes, mainly because of its biocidal action. Radiation processing with gamma radiation for the disinfection of cultural heritage materials has been widely used around the world in recent decades. Many cultural heritage objects, especially those made of paper and wood, have been studied in scientific publications with respect to changes in mechanical, physical and chemical properties. Over the last fifteen years, the Multipurpose Gamma Irradiation Facility of the Nuclear and Energy Research Institute, located on the Sao Paulo University campus, has irradiated many collections of archived materials, books, paintings and furniture. Adequate storage of photographic and cinematographic materials is a challenge for conservators at preservation institutions. Contamination by fungi is one of the leading causes of problems in photographic and cinematographic collections. Several Sao Paulo University libraries have had their photographic and cinematographic collections affected by fungi, making it impossible to conduct research on these materials or to manipulate them, for health and safety reasons. This work presents preliminary results on the effects of ionizing radiation on photographic and cinematographic films. Selected film samples made of cellulose acetate were prepared and characterized by FTIR-ATR spectroscopy. Samples were irradiated with gamma rays at absorbed doses between 2 kGy and 50 kGy. Irradiated samples were analyzed by UV-VIS spectroscopy and electron microscopy techniques. The results show that disinfection by gamma radiation can be achieved safely by applying a disinfection dose between 6 kGy and 15 kGy, with no significant change or modification of the main properties of the constitutive materials. (author)

  2. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs, teaching statistical and non-statistical skills for the design and analysis of project studies throughout science and industry. Topics discussed include one-factor designs, blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among others.

  3. Hydrogen Gas Retention and Release from WTP Vessels: Summary of Preliminary Studies

    Energy Technology Data Exchange (ETDEWEB)

    Gauglitz, Phillip A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bontha, Jagannadha R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daniel, Richard C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mahoney, Lenna A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rassat, Scot D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wells, Beric E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bao, Jie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Boeringa, Gregory K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Buchmiller, William C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burns, Carolyn A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chun, Jaehun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Karri, Naveen K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Li, Huidong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tran, Diana N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    The Hanford Waste Treatment and Immobilization Plant (WTP) is currently being designed and constructed to pretreat and vitrify a large portion of the waste in the 177 underground waste storage tanks at the Hanford Site. A number of technical issues related to the design of the pretreatment facility (PTF) of the WTP have been identified. These issues must be resolved prior to the U.S. Department of Energy (DOE) Office of River Protection (ORP) reaching a decision to proceed with engineering, procurement, and construction activities for the PTF. One of the issues is Technical Issue T1 - Hydrogen Gas Release from Vessels (hereafter referred to as T1). The focus of T1 is identifying controls for hydrogen release and completing any testing required to close the technical issue. In advance of selecting specific controls for hydrogen gas safety, a number of preliminary technical studies were initiated to support anticipated future testing and to improve the understanding of hydrogen gas generation, retention, and release within PTF vessels. These activities supported the development of a plan defining an overall strategy and approach for addressing T1 and achieving technical endpoints identified for T1. Preliminary studies also supported the development of a test plan for conducting testing and analysis to support closing T1. Both of these plans were developed in advance of selecting specific controls, and in the course of working on T1 it was decided that the testing and analysis identified in the test plan were not immediately needed. However, planning activities and preliminary studies led to significant technical progress in a number of areas. This report summarizes the progress to date from the preliminary technical studies. The technical results in this report should not be used for WTP design or safety and hazards analyses and technical results are marked with the following statement: “Preliminary Technical Results for Planning – Not to be used for WTP Design

  4. Prediction of Seismic Slope Displacements by Dynamic Stick-Slip Analyses

    International Nuclear Information System (INIS)

    Ausilio, Ernesto; Costanzo, Antonio; Silvestri, Francesco; Tropeano, Giuseppe

    2008-01-01

    A good working balance between simplicity and reliability in assessing seismic slope stability is represented by displacement-based methods, in which the effects of deformability and ductility can be either decoupled or coupled in the dynamic analyses. In this paper, a 1D lumped-mass ''stick-slip'' model is developed, accounting for soil heterogeneity and non-linear behaviour, with a base sliding mechanism at a potential rupture surface. The results of the preliminary calibration show good agreement with frequency-domain site response analysis under no-slip conditions. The comparison with rigid sliding block analyses and with the decoupled approach proves that the stick-slip procedure can become increasingly unconservative for soft soils and deep sliding depths.

  5. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, Frequency statistics, Kolmogorov-Smirnov non-parametric test, Student (T-test) and ANOVA means comparison tests and LSD post-hoc multiple comparison test, are discussed. (author)
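    As a minimal sketch of the ANOVA step in such a survey analysis, the one-way F statistic for comparing mean scores across groups can be computed directly; the Likert-scale data below are hypothetical, not the survey's results.

    ```python
    def one_way_anova_F(groups):
        """F statistic for a one-way ANOVA across survey groups.

        F = (between-group mean square) / (within-group mean square),
        with k - 1 and n - k degrees of freedom.
        """
        all_vals = [x for g in groups for x in g]
        grand = sum(all_vals) / len(all_vals)
        k = len(groups)
        n = len(all_vals)
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
        return (ss_between / (k - 1)) / (ss_within / (n - k))

    # Likert-scale scores (1-5) from three hypothetical departments
    F = one_way_anova_F([[4, 5, 4, 4], [3, 3, 2, 3], [4, 3, 4, 4]])
    ```

    A large F relative to the F distribution's critical value indicates that at least one group mean differs, after which a post-hoc test such as LSD identifies which pairs differ.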

  6. Decision-making in probability and statistics Chilean curriculum

    DEFF Research Database (Denmark)

    Elicer, Raimundo

    2018-01-01

    Probability and statistics have become prominent subjects in school mathematics curricula. As an exemplary case, I investigate the role of decision making in the justification for probability and statistics in the current Chilean upper secondary mathematics curriculum. For addressing this concern, I draw upon Fairclough’s model for Critical Discourse Analysis to analyse selected texts as examples of discourse practices. The texts are interconnected with politically driven ideas of stochastics “for all”, the notion of statistical literacy coined by statisticians’ communities, and schooling...

  7. Preliminary concepts for materials measurement and accounting in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.

    1978-01-01

    Preliminary concepts are presented for improved materials measurement and accounting in large critical facilities. These concepts will be developed as part of a study that will emphasize international safeguarding of critical facilities. The major safeguards problem is the timely verification of in-reactor inventory during periods of reactor operation. This will require a combination of measurement, statistical sampling, and data analysis techniques. Promising techniques include integral measurements of reactivity and other reactor parameters that are sensitive to the total fissile inventory, and nondestructive assay measurements of the fissile material in reactor fuel drawers and vault storage canisters coupled with statistical sampling plans tailored for the specific application. The effectiveness of proposed measurement and accounting strategies will be evaluated during the study

  8. Preliminary analyses of Li jet flows for the IFMIF target

    International Nuclear Information System (INIS)

    Ida, Mizuho; Kato, Yoshio; Nakamura, Hideo; Maekawa, Hiroshi

    1997-03-01

    The characteristics of a liquid lithium (Li) plane jet flowing along a concave wall were studied using a multi-dimensional numerical code, FLOW-3D, as part of the two-year conceptual design activity (CDA) of the International Fusion Materials Irradiation Facility (IFMIF), which started in February 1995. The IFMIF will provide a high-flux, high-energy (∼14 MeV) neutron irradiation field via the deuteron-Li reaction in the Li jet target for the testing and development of low-activation and damage-resistant fusion materials. The Li jet target flows at high velocity (≤ 20 m/s) in vacuum and should adequately remove the intense deuteron beam power (≤ 10 MW). Two-dimensional analyses of the thermal and hydraulic responses of the target flow, under the conditions proposed in the IFMIF-CDA, indicated sufficient temperature margins to avoid significant vaporization at the jet free surface and voiding at the peak temperature location in the jet, while maintaining flow stability. (author)

  9. A statistical evaluation of asbestos air concentrations

    International Nuclear Information System (INIS)

    Lange, J.H.

    1999-01-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm-3 of air, respectively. Summary values for area and personal samples suggest that exposures are low, with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm-3 of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are related more to the process than to individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related; that is, no association is observed between these two sampling methods when the data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure measure for asbestos abatement workers. (author)
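    The correlation check between matched area and personal samples can be sketched with a plain Pearson coefficient; the fibre concentrations below are invented for illustration, not the study's data.

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation between matched area and personal samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical matched pairs (f/cm^3): area vs personal samples
    area = [0.003, 0.005, 0.004, 0.006, 0.007]
    personal = [0.030, 0.012, 0.025, 0.020, 0.033]
    r = pearson_r(area, personal)
    ```

    A coefficient near zero (tested against its sampling distribution) is what would support the abstract's conclusion that area samples cannot stand in for personal exposure.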

  10. Validation of Refractivity Profiles Retrieved from FORMOSAT-3/COSMIC Radio Occultation Soundings: Preliminary Results of Statistical Comparisons Utilizing Balloon-Borne Observations

    Directory of Open Access Journals (Sweden)

    Hiroo Hayashi

    2009-01-01

    Full Text Available The GPS radio occultation (RO) soundings by the FORMOSAT-3/COSMIC (Taiwan's Formosa Satellite Mission #3/Constellation Observing System for Meteorology, Ionosphere and Climate) satellites launched in mid-April 2006 are compared with high-resolution balloon-borne (radiosonde and ozonesonde) observations. This paper presents preliminary results of the validation of the COSMIC RO measurements in terms of refractivity through the troposphere and lower stratosphere. Using COSMIC RO soundings within 2 hours and 300 km of sonde profiles, statistical comparisons between the collocated refractivity profiles are performed for some tropical regions (Malaysia and Western Pacific islands), where moisture-rich air is expected in the lower troposphere, and for both northern and southern polar areas with a very dry troposphere. The results of the comparisons show good agreement between COSMIC RO and sonde refractivity profiles throughout the troposphere (1 - 1.5% difference at most), with a positive bias generally becoming larger at progressively higher altitudes in the lower stratosphere (1 - 2% difference around 25 km), and a very small standard deviation (about 0.5% or less) for a few kilometers below the tropopause level. A large standard deviation of fractional differences in the lowermost troposphere, reaching as much as 3.5 - 5% at 3 km, is seen in the tropics, while a much smaller standard deviation (1 - 2% at most) is evident throughout the polar troposphere.
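    The fractional-difference statistics used in such collocation comparisons can be sketched directly; the refractivity values below are invented for illustration.

    ```python
    import math

    def fractional_stats(ro, sonde):
        """Mean bias and standard deviation (both in %) of fractional
        refractivity differences 100*(N_ro - N_sonde)/N_sonde at
        matched heights of collocated profiles."""
        fd = [100.0 * (a - b) / b for a, b in zip(ro, sonde)]
        n = len(fd)
        mean = sum(fd) / n
        std = math.sqrt(sum((x - mean) ** 2 for x in fd) / (n - 1))
        return mean, std

    # Toy example: RO refractivity slightly high relative to the sonde
    bias, spread = fractional_stats([101.0, 102.0], [100.0, 100.0])
    ```

    Computed level by level over many collocated pairs, the mean gives the bias profile and the standard deviation the scatter figures quoted in the abstract.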

  11. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  12. Preliminary Performance Assessment for Disposal of APT and CLWR/TEF Wastes at SRS

    International Nuclear Information System (INIS)

    Wilhite, E.L.

    1998-01-01

    This section provides the descriptive information for understanding the analyses presented in this preliminary performance assessment. This section addresses the approach taken in the PA, provides a general description of the Savannah River Site E-Area low-level waste facility, and discusses the performance criteria used for evaluating performance

  13. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very specific use in corporate environmental management systems. This paper addresses some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions in the external environment. The nature and extent of the practical use of quantitative techniques in corporate environmental management systems is discussed on the basis of a number of company surveys in four European countries.

  14. Selection, rejection and optimisation of pyrolytic graphite (PG) crystal analysers for use on the new IRIS graphite analyser bank

    International Nuclear Information System (INIS)

    Marshall, P.J.; Sivia, D.S.; Adams, M.A.; Telling, M.T.F.

    2000-01-01

    This report discusses design problems incurred in equipping the IRIS high-resolution inelastic spectrometer at the ISIS pulsed neutron source, UK, with a new 4212-piece pyrolytic graphite crystal analyser array. Of the 4212 graphite pieces required, approximately 2500 will be newly purchased PG crystals, with the remainder comprising the currently installed graphite analysers. The quality of the new analyser pieces with respect to manufacturing specifications is assessed, as is the optimum arrangement of new PG pieces amongst the old to avoid degrading the spectrometer's current angular resolution. Techniques employed to meet these criteria include accurate calliper measurements, FORTRAN programming and statistical analysis. (author)

  15. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  16. Determination of the minimum size of a statistical representative volume element from a fibre-reinforced composite based on point pattern statistics

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimum size of an SRVE based on a statistical analysis of the spatial statistics of the fibre packing patterns found in genuine laminates, and those generated numerically using a microstructure generator. © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved....

  17. Preliminary results of the analysis of the administered activities in diagnostic studies of nuclear medicine

    International Nuclear Information System (INIS)

    Lopez Bejerano, G.; Sed, L.J.

    2001-01-01

    The worldwide use of Nuclear Medicine diagnostic procedures, and the tendency towards its increase, implies an important exposure of the population to ionising radiation; this has motivated the IAEA, in the International Basic Safety Standards (BSS), to issue recommendations for the establishment of guidance levels for the activities administered to patients in diagnostic procedures. Taking the above into account, and considering that Cuba has 20 Nuclear Medicine departments, most of which operate equipment with more than 20 years of service (which directly influences medical exposure), a survey was designed and applied in 10 of these departments. The survey evaluates compliance with the BSS requirements and, specifically, analyses the activities administered to patients in Nuclear Medicine diagnostic procedures. In the present work, the preliminary results of the statistical analysis carried out on the activity values used in Nuclear Medicine departments are presented, and comparisons are made for a proposal of guidance levels for the national practice, which is compared with those recommended internationally. (author)

  18. Replication unreliability in psychology: elusive phenomena or elusive statistical power?

    Directory of Open Access Journals (Sweden)

    Patrizio E Tressoldi

    2012-07-01

    Full Text Available The focus of this paper is to analyse whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, the incubation effect in problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power of the typical study to detect the expected effect size is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low effect sizes. We conclude by providing some suggestions on how to increase statistical power, or to use different statistical approaches, to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with a small effect size.
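    The power argument above can be made concrete with a quick calculation: for a standardized effect size d and a given per-group sample size, the power of a two-sided, two-sample test can be approximated with the normal distribution. A minimal sketch (a normal-approximation illustration, not the paper's own computation; the effect sizes and sample sizes below are illustrative):

```python
from scipy.stats import norm

def approx_power_two_sample(d, n_per_group, alpha=0.05):
    """Normal approximation to the power of a two-sided, two-sample test
    for standardized effect size d with n_per_group subjects per arm."""
    ncp = abs(d) * (n_per_group / 2.0) ** 0.5   # noncentrality of the test statistic
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

# A small effect (d = 0.2) with 30 subjects per group is badly underpowered
# (power around 0.12), while d = 0.5 with 64 per group reaches roughly 80%.
print(approx_power_two_sample(0.2, 30))
print(approx_power_two_sample(0.5, 64))
```

This is exactly the situation described for the controversial phenomena: small true effects combined with typical sample sizes leave replications with little chance of rejecting the null even when the effect is real.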

  19. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial
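    A back-of-envelope sketch of the experiment-wise Type I error rate the review estimates: with m independent tests each at level alpha, the probability of at least one false positive is 1 - (1 - alpha)^m (the test counts below are illustrative, not taken from the reviewed papers):

```python
def experimentwise_error(alpha, m):
    """Probability of at least one Type I error across m independent tests,
    each performed at significance level alpha."""
    return 1 - (1 - alpha) ** m

# Around 15 tests at alpha = 0.05 already gives an experiment-wise rate
# close to the median of .54 reported in the review.
print(round(experimentwise_error(0.05, 15), 3))  # → 0.537
```

The same formula motivates corrections such as Bonferroni (testing each hypothesis at alpha/m), which caps the experiment-wise rate at approximately alpha.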

  20. Research design and statistical methods in Indian medical journals: a retrospective survey.

    Science.gov (United States)

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were - study design types and their frequencies, error/defects proportion in study design, statistical analyses, and implementation of CONSORT checklist in RCT (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013: the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.001), and the proportion of errors/defects in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.001). The proportion of randomized clinical trial designs has remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.001) and interpretation. Indian medical research seems to have made no major progress regarding the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of

  1. Wind Statistics from a Forested Landscape

    DEFF Research Database (Denmark)

    Arnqvist, Johan; Segalini, Antonio; Dellwik, Ebba

    2015-01-01

    An analysis and interpretation of measurements from a 138-m tall tower located in a forested landscape is presented. Measurement errors and statistical uncertainties are carefully evaluated to ensure high data quality. A 40° wide wind-direction sector is selected as the most...... representative for large-scale forest conditions, and from that sector first-, second- and third-order statistics, as well as analyses regarding the characteristic length scale, the flux-profile relationship and surface roughness are presented for a wide range of stability conditions. The results are discussed...
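    The first-, second- and third-order one-point statistics mentioned in the record reduce to sample moments of the velocity record; a minimal sketch on a synthetic wind-speed series (the mean and variance below are illustrative, not the tower measurements):

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(42)
u = 6.0 + 1.2 * rng.standard_normal(10_000)   # synthetic streamwise velocity, m/s

mean_u = u.mean()              # first order: mean wind speed
var_u = u.var(ddof=1)          # second order: velocity variance
skew_u = skew(u)               # third order: skewness of the fluctuations
ti = np.sqrt(var_u) / mean_u   # turbulence intensity, a common derived quantity

print(mean_u, var_u, skew_u, ti)
```

In practice such moments are computed per wind-direction sector and stability class, exactly as the paper describes for its 40° sector.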

  2. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
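    The grid of microprobe spot analyses can be grouped into phases with a standard clustering step; below is a minimal k-means sketch on two hypothetical clinker phases (the CaO/SiO2 compositions are illustrative assumptions, not the paper's data, and the paper's actual statistical treatment is more elaborate than plain k-means):

```python
import numpy as np

def kmeans(points, k, n_iter=50):
    """Minimal k-means with deterministic initialisation (first k points)."""
    centroids = points[:k].astype(float).copy()
    labels = np.zeros(len(points), dtype=int)
    for _ in range(n_iter):
        # distance of every spot analysis to every centroid
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(0)
# Hypothetical (CaO, SiO2) wt% spot analyses scattered around two phase compositions.
alite = np.array([71.6, 25.2]) + 0.5 * rng.standard_normal((30, 2))
belite = np.array([63.5, 31.5]) + 0.5 * rng.standard_normal((30, 2))
spots = np.vstack([alite[:1], belite[:1], alite[1:], belite[1:]])  # one seed per phase first

labels, centroids = kmeans(spots, 2)
```

Once spots are assigned to phases, phase abundance follows from cluster sizes, phase chemistry from cluster centroids, and the bulk chemistry from the overall mean, which is the "three quantities from one experiment" idea of the paper.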

  3. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    International Nuclear Information System (INIS)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef

    2014-01-01

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers

  4. How the Mastery Rubric for Statistical Literacy Can Generate Actionable Evidence about Statistical and Quantitative Learning Outcomes

    Directory of Open Access Journals (Sweden)

    Rochelle E. Tractenberg

    2016-12-01

    Full Text Available Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards “big” data: while automated analyses can exploit massive amounts of data, the interpretation—and possibly more importantly, the replication—of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient/inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool—the Mastery Rubric—is integrated with a new, developmental, model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence, and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.

  5. Preliminary analyses for HTTR's start-up physics tests by Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Nojiri, Naoki; Nakano, Masaaki; Ando, Hiroei; Fujimoto, Nozomu; Takeuchi, Mitsuo; Fujisaki, Shingo; Yamashita, Kiyonobu

    1998-08-01

    Analyses of start-up physics tests for the High Temperature Engineering Test Reactor (HTTR) have been carried out with the Monte Carlo code MVP, based on the continuous energy method. Heterogeneous core structures were modelled precisely, such as the fuel compacts, fuel rods, coolant channels, burnable poisons, control rods, control rod insertion holes, reserved shutdown pellet insertion holes, gaps between graphite blocks, etc. Such precise modelling of the core structures is difficult with diffusion calculations. From the analytical results, the following was confirmed: the first criticality will be achieved at around 16 fuel columns loaded, and the reactivity at the first criticality can be controlled by only one control rod located at the center of the core, with the other fifteen control rods fully withdrawn. The excess reactivity, reactor shutdown margin and control rod critical positions have been evaluated. These results were used for planning of the start-up physics tests. This report presents the analyses of the start-up physics tests for the HTTR by the MVP code. (author)

  6. Sunspot activity and influenza pandemics: a statistical assessment of the purported association.

    Science.gov (United States)

    Towers, S

    2017-10-01

    Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.
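    The un-binned, robustness-oriented approach advocated above can be illustrated with a permutation test: instead of relying on one arbitrary binning, the null distribution of an association statistic is built by shuffling one series. A minimal sketch on synthetic data (nothing here reproduces the sunspot or pandemic series):

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=999, seed=0):
    """Two-sided permutation test for the Pearson correlation of x and y."""
    rng = np.random.default_rng(seed)
    obs = abs(np.corrcoef(x, y)[0, 1])
    exceed = 0
    for _ in range(n_perm):
        # shuffling y destroys any true association while keeping the marginals
        if abs(np.corrcoef(x, rng.permutation(y))[0, 1]) >= obs:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)   # add-one rule avoids p = 0

rng = np.random.default_rng(1)
x = rng.standard_normal(40)
y_related = x + 0.3 * rng.standard_normal(40)   # strong true association
p = permutation_pvalue(x, y_related)
```

Because the permutation distribution is generated from the data themselves, no binning choice enters, which is one way to avoid the arbitrary-selection pitfalls the paper criticizes.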

  7. Data analysis for radiological characterisation: Geostatistical and statistical complementarity

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    Radiological characterisation may cover a large range of evaluation objectives during a decommissioning and dismantling (D and D) project: removal of doubt, delineation of contaminated materials, monitoring of the decontamination work and final survey. At each stage, collecting relevant data to be able to draw the conclusions needed is quite a big challenge. In particular two radiological characterisation stages require an advanced sampling process and data analysis, namely the initial categorization and optimisation of the materials to be removed and the final survey to demonstrate compliance with clearance levels. On the one hand the latter is widely used and well developed in national guides and norms, using random sampling designs and statistical data analysis. On the other hand a more complex evaluation methodology has to be implemented for the initial radiological characterisation, both for sampling design and for data analysis. The geostatistical framework is an efficient way to satisfy the radiological characterisation requirements providing a sound decision-making approach for the decommissioning and dismantling of nuclear premises. The relevance of the geostatistical methodology relies on the presence of a spatial continuity for radiological contamination. Thus geostatistics provides reliable methods for activity estimation, uncertainty quantification and risk analysis, leading to a sound classification of radiological waste (surfaces and volumes). This way, the radiological characterization of contaminated premises can be divided into three steps. First, the most exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive or not) surface survey of the contamination is implemented on a regular grid. Finally, in order to assess activity levels and contamination depths, destructive samples are collected at several locations within the premises (based on the surface survey results) and analysed. Combined with
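    The spatial continuity that justifies the geostatistical approach is usually quantified with an empirical semivariogram; a minimal one-dimensional sketch on synthetic "activity" values (illustrative data, not survey measurements):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Empirical semivariogram: average of 0.5*(z_i - z_j)^2 over point pairs,
    binned by separation distance (1D coordinates)."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(coords), k=1)
    h = np.abs(coords[i] - coords[j])          # pair separations
    sq = 0.5 * (values[i] - values[j]) ** 2    # semivariance contribution per pair
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for b in range(len(bin_edges) - 1):
        sel = (h >= bin_edges[b]) & (h < bin_edges[b + 1])
        if sel.any():
            gamma[b] = sq[sel].mean()
    return gamma

# Spatially continuous synthetic "activity": a smooth variation along a survey line.
x = np.arange(0.0, 20.0, 1.0)
z = np.sin(0.3 * x)
gamma = empirical_variogram(x, z, bin_edges=[0.5, 1.5, 2.5, 3.5, 4.5])
```

A semivariogram that rises with lag, as here, is the signature of spatial continuity; it is this structure that kriging-type estimators then exploit for activity estimation and uncertainty quantification.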

  8. Sensitivity analyses of fast reactor systems including thorium and uranium

    International Nuclear Information System (INIS)

    Marable, J.H.; Weisbin, C.R.

    1978-01-01

    The Cross Section Evaluation Working Group (CSEWG) has, in conjunction with the development of the fifth version of ENDF/B, assembled new evaluations for ²³²Th and ²³³U. The purpose of this paper is to describe briefly some of the more important features of these evaluations relative to ENDF/B-4, to project the change in reactor performance based upon the newer evaluated files and sensitivity coefficients for interesting design problems, and to indicate preliminary results from ongoing uncertainty analyses.
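    Sensitivity coefficients of the kind used in such studies measure the relative change of a reactor response per relative change of a nuclear data parameter; a toy finite-difference sketch with a one-group infinite-medium multiplication factor (an illustrative model only, not the CSEWG sensitivity machinery; the cross-section values are made up):

```python
def k_inf(nu_sigma_f, sigma_a):
    """One-group infinite-medium multiplication factor: k = nu*Sigma_f / Sigma_a."""
    return nu_sigma_f / sigma_a

def rel_sensitivity(f, args, i, eps=1e-6):
    """Relative sensitivity S = (x_i / f) * df/dx_i by central finite difference."""
    up = list(args); dn = list(args)
    up[i] *= 1 + eps
    dn[i] *= 1 - eps
    return (f(*up) - f(*dn)) / (2 * eps * f(*args))

args = (0.0075, 0.0060)                      # illustrative nu*Sigma_f and Sigma_a, cm^-1
S_fission = rel_sensitivity(k_inf, args, 0)  # ≈ +1: k scales linearly with fission
S_capture = rel_sensitivity(k_inf, args, 1)  # ≈ -1: k scales inversely with absorption
```

Multiplying such coefficients by the relative change between two evaluated files gives a first-order projection of the change in reactor performance, which is the role sensitivity coefficients play in the uncertainty analyses the paper describes.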

  9. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both...... comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub......-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste

  10. Barkhausen Effect and Acoustic Emission in a Metallic Glass - Preliminary Results

    International Nuclear Information System (INIS)

    Lopez Sanchez, R.; Lopez Pumarega, M.I.; Armeite, M.; Piotrkowski, R.; Ruzzante, J.E.

    2004-01-01

    Magneto-acoustic emission, i.e. Barkhausen noise (BN) combined with acoustic emission (AE), depends on the microstructure and existing residual stresses in magnetic materials. Preliminary results obtained by magnetization along two perpendicular directions of a metallic glass foil are presented. Signals were analysed with statistical, fast Fourier transform and wavelet methods. The results are part of a Joint Research Project of the Faculty of Science, Cantabria University, Spain, and the Elastic Waves Group of the National Atomic Energy Commission, Argentina

  11. Athens 1° x 2° NTMS area, Georgia and South Carolina: preliminary basic data report

    International Nuclear Information System (INIS)

    Ferguson, R.B.

    1978-01-01

    This report presents preliminary results of ground water and stream sediment reconnaissance in the National Topographic Map Series (NTMS) Athens 1° x 2° quadrangle. Stream sediment samples were collected from small streams at 1200 sites for a nominal density of one site per 18 square kilometers (seven square miles) in rural areas. Ground water samples were collected at 771 sites for a nominal density of one site per 28 square kilometers (eleven square miles). Neutron activation analysis (NAA) results are given for uranium and 16 other elements in sediments, and for uranium and 9 other elements in ground water. Field measurements and observations are reported for each site. Analytical data and field measurements are presented in tables and maps. Statistical summaries of data and a brief description of results are given. A generalized geologic map and a summary of the geology of the area are included. Key data are presented in page-sized hard copy. Supplementary data are on microfiche. Key data from ground water sites (Appendix A) include (1) water chemistry measurements (pH, conductivity, and alkalinity), (2) well depth, and (3) elemental analyses (U, Br, Cl, F, Mg, Mn, Na, and V). Supplementary data include site descriptors (well age, frequency of use of well, etc.) and analytical data for Al and Dy. Key data from stream sediment sites (Appendix B) include (1) water quality measurements (pH, conductivity, and alkalinity), and (2) important elemental analyses (U, Th, Hf, Al, Ce, Fe, Mn, Na, Sc, Ti, and V). Supplementary data from stream sediment sites include sample site descriptors (stream characteristics, vegetation, etc.), and additional elemental analyses

  12. New applications of statistical tools in plant pathology.

    Science.gov (United States)

    Garrett, K A; Madden, L V; Hughes, G; Pfender, W F

    2004-09-01

    ABSTRACT The series of papers introduced by this one address a range of statistical applications in plant pathology, including survival analysis, nonparametric analysis of disease associations, multivariate analyses, neural networks, meta-analysis, and Bayesian statistics. Here we present an overview of additional applications of statistics in plant pathology. An analysis of variance based on the assumption of normally distributed responses with equal variances has been a standard approach in biology for decades. Advances in statistical theory and computation now make it convenient to appropriately deal with discrete responses using generalized linear models, with adjustments for overdispersion as needed. New nonparametric approaches are available for analysis of ordinal data such as disease ratings. Many experiments require the use of models with fixed and random effects for data analysis. New or expanded computing packages, such as SAS PROC MIXED, coupled with extensive advances in statistical theory, allow for appropriate analyses of normally distributed data using linear mixed models, and discrete data with generalized linear mixed models. Decision theory offers a framework in plant pathology for contexts such as the decision about whether to apply or withhold a treatment. Model selection can be performed using Akaike's information criterion. Plant pathologists studying pathogens at the population level have traditionally been the main consumers of statistical approaches in plant pathology, but new technologies such as microarrays supply estimates of gene expression for thousands of genes simultaneously and present challenges for statistical analysis. Applications to the study of the landscape of the field and of the genome share the risk of pseudoreplication, the problem of determining the appropriate scale of the experimental unit and of obtaining sufficient replication at that scale.
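    Model selection with Akaike's information criterion, mentioned above, compares candidate models by penalized log-likelihood; a minimal least-squares sketch (synthetic data and a hypothetical helper function, not any method from the cited papers):

```python
import numpy as np

def aic_gaussian(y, y_hat, n_params):
    """AIC for a least-squares fit, using the Gaussian log-likelihood with the
    MLE variance; n_params counts the mean-model parameters (+1 added for sigma^2)."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    log_lik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * (n_params + 1) - 2 * log_lik

x = np.arange(10.0)
y = 1.5 * x + np.array([0.2, -0.1, 0.3, -0.2, 0.1, -0.3, 0.2, 0.0, -0.1, 0.1])

# Candidate 1: constant mean. Candidate 2: straight line.
aic_const = aic_gaussian(y, np.full_like(y, y.mean()), 1)
slope, intercept = np.polyfit(x, y, 1)
aic_line = aic_gaussian(y, slope * x + intercept, 2)
# The linear model wins (lower AIC) on this clearly linear data.
```

The penalty term 2*(n_params + 1) is what keeps AIC from always preferring the more flexible model, which is the trade-off that makes it useful for the epidemic-model and dose-response comparisons common in plant pathology.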

  13. Rayleigh to Compton ratio scatter tomography applied to breast cancer diagnosis: A preliminary computational study

    International Nuclear Information System (INIS)

    Antoniassi, M.; Conceição, A.L.C.; Poletti, M.E.

    2014-01-01

    In the present work, a tomographic technique based on the Rayleigh to Compton scattering ratio (R/C) was studied using computational simulation in order to assess its application to breast cancer diagnosis. In this preliminary study, some parameters that affect the image quality were evaluated, such as: (i) beam energy, (ii) size and glandularity of the breast, and (iii) statistical count noise. The results showed that the R/C contrast increases with increasing photon energy and decreases with increasing glandularity of the sample. The statistical noise proved to be a significant parameter, although the quality of the obtained images was acceptable for a considerable range of noise levels. The preliminary results suggest that the R/C tomographic technique has potential as a complementary tool in breast cancer diagnosis. - Highlights: ► A tomographic technique based on the Rayleigh to Compton scattering ratio is proposed in order to study breast tissues. ► The Rayleigh to Compton scattering ratio technique is compared with the conventional transmission technique. ► The influence of experimental parameters (energy, sample, detection system) is studied

  14. Statistical aspects of the cleanup of Enewetak Atoll

    International Nuclear Information System (INIS)

    Giacomini, J.J.; Miller, F.L. Jr.

    1981-01-01

    The Desert Research Institute participated in the Enewetak Atoll Radiological Cleanup by providing data-base management and statistical analysis support for the Department of Energy team. The data-base management responsibilities included both design and implementation of a system for recording (in machine-retrievable form) all radiological measurements made during the cleanup, excluding personnel dosimetry. Statistical analyses were performed throughout the cleanup and were used to guide excavation activities

  15. Robust statistics for deterministic and stochastic gravitational waves in non-Gaussian noise. II. Bayesian analyses

    International Nuclear Information System (INIS)

    Allen, Bruce; Creighton, Jolien D.E.; Flanagan, Eanna E.; Romano, Joseph D.

    2003-01-01

    In a previous paper (paper I), we derived a set of near-optimal signal detection techniques for gravitational wave detectors whose noise probability distributions contain non-Gaussian tails. The methods modify standard methods by truncating or clipping sample values which lie in those non-Gaussian tails. The methods were derived, in the frequentist framework, by minimizing false alarm probabilities at fixed false detection probability in the limit of weak signals. For stochastic signals, the resulting statistic consisted of a sum of an autocorrelation term and a cross-correlation term; it was necessary to discard 'by hand' the autocorrelation term in order to arrive at the correct, generalized cross-correlation statistic. In the present paper, we present an alternative derivation of the same signal detection techniques from within the Bayesian framework. We compute, for both deterministic and stochastic signals, the probability that a signal is present in the data, in the limit where the signal-to-noise ratio squared per frequency bin is small, where the signal is nevertheless strong enough to be detected (integrated signal-to-noise ratio large compared to 1), and where the total probability in the non-Gaussian tail part of the noise distribution is small. We show that, for each model considered, the resulting probability is to a good approximation a monotonic function of the detection statistic derived in paper I. Moreover, for stochastic signals, the new Bayesian derivation automatically eliminates the problematic autocorrelation term

  16. A statistical evaluation of asbestos air concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Lange, J.H. [Envirosafe Training and Consultants, Pittsburgh, PA (United States)

    1999-07-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm⁻³ of air, respectively. Summary values for area and personal samples suggest that exposures are low with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm⁻³ of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related, that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure for asbestos abatement workers. (author)
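    The two comparisons the study describes, association between matched samples and a difference in level, map onto a correlation and a paired test; a minimal sketch with synthetic fibre concentrations (the lognormal parameters echo the reported means but the values are illustrative, not the study's measurements):

```python
import numpy as np
from scipy.stats import pearsonr, wilcoxon

rng = np.random.default_rng(7)
n = 25
# Matched pairs: area and personal samples taken during the same work periods.
area = rng.lognormal(mean=np.log(0.005), sigma=0.5, size=n)       # f/cm^3
personal = rng.lognormal(mean=np.log(0.024), sigma=0.5, size=n)   # f/cm^3

r, p_corr = pearsonr(area, personal)        # association between the two methods
stat, p_paired = wilcoxon(personal - area)  # paired test: personal vs area levels
```

With data like these, the paired test flags the systematic personal/area level difference even when the correlation is weak, which is precisely the pattern that rules out area sampling as a surrogate for personal exposure.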

  17. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Favourable characteristics of the OSE, such as monotonicity between neighbouring points and independence from the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
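    The core idea, replacing an expensive code with a cheap surrogate fitted to a limited set of runs, can be sketched with a plain polynomial least-squares fit (a generic stand-in for illustration, not the OSE algorithm itself; the "code" and its coefficients are hypothetical):

```python
import numpy as np

# Stand-in for an expensive safety code: some smooth response of one input.
def expensive_code(x):
    return 900.0 + 80.0 * x - 15.0 * x ** 2   # hypothetical peak temperature, K

x_runs = np.linspace(0.0, 3.0, 8)             # a small, affordable set of code runs
y_runs = expensive_code(x_runs)

# Quadratic response surface fitted to the runs.
coeffs = np.polyfit(x_runs, y_runs, 2)
surrogate = np.poly1d(coeffs)

# The surrogate now predicts new points without rerunning the code,
# so thousands of statistical samples become cheap to evaluate.
x_new = 1.7
pred = surrogate(x_new)
```

Once the surrogate is validated against held-out code runs, as the OSE was against the 59 RELAP5 calculations, the statistical uncertainty analysis samples the surrogate instead of the code.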

  18. Statistical Analysis Of Reconnaissance Geochemical Data From ...

    African Journals Online (AJOL)

    , Co, Mo, Hg, Sb, Tl, Sc, Cr, Ni, La, W, V, U, Th, Bi, Sr and Ga in 56 stream sediment samples collected from Orle drainage system were subjected to univariate and multivariate statistical analyses. The univariate methods used include ...

  19. Economic Analyses of Ware Yam Production in Orlu Agricultural ...

    African Journals Online (AJOL)

    Economic Analyses of Ware Yam Production in Orlu Agricultural Zone of Imo State. ... International Journal of Agriculture and Rural Development ... statistics, gross margin analysis, marginal analysis and multiple regression analysis. Results ...

  20. JAWS data collection, analysis highlights, and microburst statistics

    Science.gov (United States)

    Mccarthy, J.; Roberts, R.; Schreiber, W.

    1983-01-01

    Organization, equipment, and the current status of the Joint Airport Weather Studies project, initiated in relation to the microburst phenomenon, are summarized. Some data collection techniques and preliminary statistics on microburst events recorded by Doppler radar are discussed as well. Radar studies show that microbursts occur much more often than expected, with the majority of the events being potentially dangerous to landing or departing aircraft. Seventy events were registered, with differential velocities ranging from 10 to 48 m/s; headwind/tailwind velocity differentials over 20 m/s are considered seriously hazardous. It is noted that a correlation between the velocity differential and incoherent radar reflectivity is yet to be established.

  1. Orthorexia nervosa in the general population: a preliminary screening using a self-administered questionnaire (ORTO-15).

    Science.gov (United States)

    Ramacciotti, C E; Perrone, P; Coli, E; Burgalassi, A; Conversano, C; Massimetti, G; Dell'Osso, L

    2011-06-01

    Orthorexia, from the Greek words orthos (straight, proper) and orexis (appetite), is a newly conceptualized disorder characterized by distorted eating habits and cognitions concerning supposedly healthy nutrition. In this article we present preliminary results of a wider research project aimed at investigating the diffusion of orthorexia in the general population and at highlighting its characteristics, particularly its relationship with Eating Disorders and Obsessive-Compulsive Disorder. One hundred and seventy-seven adult subjects from the general population were administered the ORTO-15 test, a self-administered questionnaire specifically designed to assess orthorexic symptomatology; statistical analyses were repeated twice, referring to different diagnostic thresholds (40/35). Orthorexia had a 57.6% prevalence in our sample using the 40-point threshold, with a female/male ratio of 2:1; the figure was considerably lower with the 35-point threshold (21%). The results of this study highlight the diffusion of orthorexia, which may constitute an important risk factor for mental and physical health, as well as the need for more specific diagnostic instruments to facilitate a thorough understanding of this disorder.

  2. A Controlled Randomized Preliminary Trial of a Modified Dissonance-Based Eating Disorder Intervention Program.

    Science.gov (United States)

    Green, M A; Willis, M; Fernandez-Kong, K; Reyes, S; Linkhart, R; Johnson, M; Thorne, T; Lindberg, J; Kroska, E; Woodward, H

    2017-12-01

    We conducted a controlled randomized preliminary trial of a modified dissonance-based eating disorder program (n = 24) compared to an assessment-only control condition (n = 23) via a longitudinal design (baseline, postintervention, 2-month follow-up) in a community sample of women (N = 47) with clinical (n = 22) and subclinical (n = 25) eating disorder symptoms. The traditional content of the Body Project, a dissonance-based eating disorder prevention program, was modified to include verbal, written, and behavioral exercises designed to dissuade self-objectification and maladaptive social comparison. Women with clinical and subclinical symptoms were included in the target audience to investigate both the treatment and the indicated prevention utility of the modified dissonance program. Body dissatisfaction, self-esteem, self-objectification, thin-ideal internalization, maladaptive social comparison, trait anxiety, and eating disorder symptoms were evaluated in the control and the modified dissonance condition at baseline, postintervention, and 2-month follow-up. We predicted a statistically significant 2 (condition: control, modified dissonance) x 3 (time: baseline, postintervention, 2-month follow-up) interaction in the mixed factorial multivariate analyses of variance results. Results confirmed this hypothesis. Eating disorder risk factors and symptoms decreased significantly among participants in the modified dissonance condition at postintervention and 2-month follow-up compared to baseline; symptom improvement was greater among participants in the modified compared to the control condition. A secondary analysis indicated symptom improvement did not vary as a function of symptom status (clinical, subclinical), suggesting the program is efficacious in both indicated prevention and treatment applications. Results provide preliminary support for the modified dissonance program. © 2017 Wiley Periodicals, Inc.

  3. 77 FR 2345 - Advisory Council on Transportation Statistics; Request for Nominations

    Science.gov (United States)

    2012-01-17

    ... and Innovative Technology Administration (RITA), Bureau of Transportation Statistics (BTS), DOT... BTS solicits nominations for interested persons to serve on the ACTS. The ACTS is composed of BTS... transportation statistics and analyses collected, supported, or disseminated by the BTS and DOT. DATES...

  4. Novel Application of Statistical Methods to Identify New Urinary Incontinence Risk Factors

    Directory of Open Access Journals (Sweden)

    Theophilus O. Ogunyemi

    2012-01-01

    Full Text Available Longitudinal data for studying urinary incontinence (UI) risk factors are rare. Data from one study, the hallmark Medical, Epidemiological, and Social Aspects of Aging (MESA) study, have been analyzed in the past; however, repeated measures analyses, which are crucial for analyzing longitudinal data, have not been applied. We tested a novel application of statistical methods to identify UI risk factors in older women. MESA data were collected at baseline and yearly from a sample of 1955 men and women in the community. Only women responding to the 762 baseline and 559 follow-up questions at one year in each respective survey were examined. To test their utility in mining large data sets, and as a preliminary step towards creating a predictive index for developing UI, logistic regression, generalized estimating equations (GEEs), and proportional hazards regression (PHREG) methods were applied to the existing MESA data. The GEE and PHREG combination identified 15 significant risk factors associated with developing UI, six of which, namely urinary frequency, urgency, any urine loss, urine loss after emptying, subject's anticipation, and doctor's proactivity, were found most highly significant by both methods. These six factors are potential candidates for constructing a future UI predictive index.
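    A minimal stand-in for the screening step (ordinary logistic regression fitted by Newton-Raphson, rather than the GEE/PHREG machinery used in the study) can be sketched with synthetic data:

```python
# Ordinary logistic regression via Newton-Raphson as a simplified
# stand-in for the risk-factor screen; all data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))              # three candidate risk factors
true_beta = np.array([1.5, 0.0, -1.0])   # factor 2 is irrelevant by design
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ true_beta))))

beta = np.zeros(3)
for _ in range(25):                      # Newton-Raphson for the MLE
    mu = 1 / (1 + np.exp(-(X @ beta)))
    grad = X.T @ (y - mu)
    hess = X.T @ (X * (mu * (1 - mu))[:, None])
    beta += np.linalg.solve(hess, grad)

print(np.round(beta, 2))                 # estimates should lie near true_beta
```

    Factors whose fitted coefficients are significantly non-zero would pass to the next stage of the screen.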

  5. Possession experiences in dissociative identity disorder: a preliminary study.

    Science.gov (United States)

    Ross, Colin A

    2011-01-01

    Dissociative trance disorder, which includes possession experiences, was introduced as a provisional diagnosis requiring further study in the Diagnostic and Statistical Manual of Mental Disorders (4th ed.). Consideration is now being given to including possession experiences within dissociative identity disorder (DID) in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.), which is due to be published in 2013. In order to provide empirical data relevant to the relationship between DID and possession states, I analyzed data on the prevalence of trance, possession states, sleepwalking, and paranormal experiences in 3 large samples: patients with DID from North America; psychiatric outpatients from Shanghai, China; and a general population sample from Winnipeg, Canada. Trance, sleepwalking, paranormal, and possession experiences were much more common in the DID patients than in the 2 comparison samples. The study is preliminary and exploratory in nature because the samples were not matched in any way.

  6. Crystallization and preliminary X-ray diffraction analyses of pseudechetoxin and pseudecin, two snake-venom cysteine-rich secretory proteins that target cyclic nucleotide-gated ion channels

    International Nuclear Information System (INIS)

    Suzuki, Nobuhiro; Yamazaki, Yasuo; Fujimoto, Zui; Morita, Takashi; Mizuno, Hiroshi

    2005-01-01

    Crystals of pseudechetoxin and pseudecin, potent peptidic inhibitors of cyclic nucleotide-gated ion channels, have been prepared and X-ray diffraction data have been collected to 2.25 and 1.90 Å resolution, respectively. Cyclic nucleotide-gated (CNG) ion channels play pivotal roles in sensory transduction of retinal and olfactory neurons. The elapid snake toxins pseudechetoxin (PsTx) and pseudecin (Pdc) are the only known protein blockers of CNG channels. These toxins are structurally classified as cysteine-rich secretory proteins and exhibit structural features that are quite distinct from those of other known small peptidic channel blockers. This article describes the crystallization and preliminary X-ray diffraction analyses of these toxins. Crystals of PsTx belonged to space group P2{sub 1}2{sub 1}2{sub 1}, with unit-cell parameters a = 60.30, b = 61.59, c = 251.69 Å, and diffraction data were collected to 2.25 Å resolution. Crystals of Pdc also belonged to space group P2{sub 1}2{sub 1}2{sub 1}, with similar unit-cell parameters a = 60.71, b = 61.67, c = 251.22 Å, and diffraction data were collected to 1.90 Å resolution

  7. Preliminary reports in the emergency department: is a subspecialist radiologist more accurate than a radiology resident?

    Science.gov (United States)

    Branstetter, Barton F; Morgan, Matthew B; Nesbit, Chadd E; Phillips, Jinnah A; Lionetti, David M; Chang, Paul J; Towers, Jeffrey D

    2007-02-01

    To determine whether emergency department (ED) preliminary reports rendered by subspecialist attending radiologists who are reading outside their field of expertise are more accurate than reports rendered by radiology residents, and to compare error rates between radiologists and nonradiologists in the ED setting. The study was performed at a large academic medical center with a busy ED. An electronic preliminary report generator was used in the ED to capture preliminary interpretations rendered in a clinical setting by radiology residents, junior attendings (within 2 years of taking their oral boards), senior attendings, and ED clinicians between August 1999 and November 2004. Each preliminary report was later reviewed by a final interpreting radiologist, and the preliminary interpretation was adjudicated for the presence of substantial discordances, defined as a difference in interpretation that might immediately impact the care of the patient. Of the 612,890 preliminary reports in the database, 65,780 (11%) met inclusion criteria for this study. A log-linear analysis was used to assess the effects of modality and type of author on preliminary report error rates. ED clinicians had significantly higher error rates when compared with any type of radiologist, regardless of modality. Within the radiologists, residents and junior attendings had lower error rates than did senior attendings, but the differences were not statistically significant. Subspecialized attending radiologists who interpret ED examinations outside their area of expertise have error rates similar to those of radiology residents. Nonradiologists have significantly higher error rates than radiologists and radiology residents when interpreting examinations in the ED.

  8. GRIST-2 preliminary test plan and requirements for fuel fabrication and preirradiation

    International Nuclear Information System (INIS)

    Tang, I.M.; Harmon, D.P.; Torri, A.

    1978-12-01

    The preliminary version of the GRIST-2 test plan has been developed for the planned initial 5 years (1984 to 1989) of TREAT-Upgrade in-pile tests. These tests will be employed to study the phenomenology and integral behavior of GCFR core disruptive accidents (CDAs) and to support the Final Safety Analysis Report (FSAR) CDA analyses for the demonstration plant licensing. The preliminary test plan is outlined. Test Phases I and II are for the fresh fuel (preconditioned or not) CDA behavior at the beginning-of-life (BOL) reactor state. Phase III is for the reactor state that contains irradiated fuel with a saturated content of helium and fission gas. Phase IV is for larger bundle tests and scaling effects

  9. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    Science.gov (United States)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

    Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, owing to their special features (complex spatial layouts and confined environments with large numbers of highly mobile individuals), it is difficult to quantify human contacts in such environments, and disease spreading dynamics there were less explored in previous studies. Because of the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and length of contact in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, the Weibull distribution fitted the histogram values of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and refine existing disease spreading models.
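    A Weibull fit of the kind reported above can be sketched with the classic Weibull-plot linearisation, ln(-ln(1-F)) = k ln x - k ln λ; the exposure data here are synthetic, not CityFlow output.

```python
# Weibull shape/scale estimation via the Weibull-plot linearisation
# ln(-ln(1-F)) = k*ln(x) - k*ln(lam). Exposure data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
k_true, lam_true = 1.8, 5.0
exposure = lam_true * rng.weibull(k_true, size=2000)

x = np.sort(exposure)
F = (np.arange(1, x.size + 1) - 0.5) / x.size    # midpoint plotting positions
k, intercept = np.polyfit(np.log(x), np.log(-np.log(1 - F)), 1)
lam = np.exp(-intercept / k)
print(f"shape k ~ {k:.2f}, scale lam ~ {lam:.2f}")
```

    On a Weibull plot the transformed empirical CDF is linear, so an ordinary least-squares line recovers the shape (slope) and scale (from the intercept).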

  10. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
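    The deterministic (Taylor-series) propagation described above can be sketched numerically: Var(y) ≈ Σᵢ (∂f/∂xᵢ)² Var(xᵢ), with the partial derivatives approximated by central finite differences. The model function and input variances below are illustrative assumptions, not a performance-assessment code.

```python
# First-order Taylor (delta-method) uncertainty propagation:
# Var(y) ~ sum_i (df/dx_i)^2 * Var(x_i), derivatives by central
# differences. The model and variances are illustrative assumptions.
import math

def model(x):                    # stand-in for an expensive code
    return x[0] ** 2 * math.exp(0.1 * x[1])

x0 = [3.0, 2.0]                  # nominal input values
var_x = [0.04, 0.25]             # input variances
h = 1e-5                         # finite-difference step

var_y = 0.0
for i in range(len(x0)):
    xp, xm = list(x0), list(x0)
    xp[i] += h
    xm[i] -= h
    dfdx = (model(xp) - model(xm)) / (2 * h)
    var_y += dfdx ** 2 * var_x[i]

print(f"propagated sigma_y ~ {math.sqrt(var_y):.3f}")
```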

  11. Preliminary tank characterization report for single-shell tank 241-TX-101: best-basis inventory

    International Nuclear Information System (INIS)

    Kupfer, M.J.

    1997-01-01

    This document is a preliminary Tank Characterization Report (TCR). It only contains the current best-basis inventory (Appendix D) for single-shell tank 241-TX-101. No TCRs have been previously issued for this tank, and current core sample analyses are not available. The best-basis inventory, therefore, is based on an engineering assessment of waste type, process flowsheet data, early sample data, and/or other available information. The Standard Inventories of Chemicals and Radionuclides in Hanford Site Tank Wastes describes standard methodology used to derive the tank-by-tank best-basis inventories. This preliminary TCR will be updated using this same methodology when additional data on tank contents become available

  12. Preliminary tank characterization report for single-shell tank 241-TY-102: best-basis inventory

    International Nuclear Information System (INIS)

    Place, D.E.

    1997-01-01

    This document is a preliminary Tank Characterization Report (TCR). It only contains the current best-basis inventory (Appendix D) for single-shell tank 241-TY-102. No TCRs have been previously issued for this tank, and current core sample analyses are not available. The best-basis inventory, therefore, is based on an engineering assessment of waste type, process flowsheet data, early sample data, and/or other available information. The Standard Inventories of Chemicals and Radionuclides in Hanford Site Tank Wastes describes standard methodology used to derive the tank-by-tank best-basis inventories. This preliminary TCR will be updated using this same methodology when additional data on tank contents become available

  13. Preliminary tank characterization report for single-shell tank 241-TX-113: best-basis inventory

    International Nuclear Information System (INIS)

    Place, D.E.

    1997-01-01

    This document is a preliminary Tank Characterization Report (TCR). It only contains the current best-basis inventory (Appendix D) for single-shell tank 241-TX-113. No TCRs have been previously issued for this tank, and current core sample analyses are not available. The best-basis inventory, therefore, is based on an engineering assessment of waste type, process flowsheet data, early sample data, and/or other available information. The Standard Inventories of Chemicals and Radionuclides in Hanford Site Tank Wastes describes standard methodology used to derive the tank-by-tank best-basis inventories. This preliminary TCR will be updated using this same methodology when additional data on tank contents become available

  14. Statistical evaluations concerning the failure behaviour of formed parts with superheated steam flow. Pt. 1

    International Nuclear Information System (INIS)

    Oude-Hengel, H.H.; Vorwerk, K.; Heuser, F.W.; Boesebeck, K.

    1976-01-01

    Statistical evaluations concerning the failure behaviour of formed parts with superheated-steam flow were carried out using data from VdTUEV inventory and failure statistics. Due to the great number of results, the findings will be published in two volumes. This first part will describe and classify the stock of data and will make preliminary quantitative statements on failure behaviour. More differentiated statements are made possible by including the operation time and the number of start-ups per failed part. On the basis of time-constant failure rates some materials-specific statements are given. (orig./ORU) [de

  15. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems, with particular emphasis on piping bends. The procedure was to collect from participating IAEA member countries descriptions of tests and test results for piping systems or bends (with emphasis on high-temperature inelastic tests); to compile, evaluate, and issue a selected number of these problems for analysis; and to compile and make a preliminary evaluation of the analysis results. Of the problem descriptions submitted, three were selected to be used: a 90°-elbow at 600°C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  16. Categorization of the trophic status of a hydroelectric power plant reservoir in the Brazilian Amazon by statistical analyses and fuzzy approaches.

    Science.gov (United States)

    da Costa Lobato, Tarcísio; Hauser-Davis, Rachel Ann; de Oliveira, Terezinha Ferreira; Maciel, Marinalva Cardoso; Tavares, Maria Regina Madruga; da Silveira, Antônio Morais; Saraiva, Augusto Cesar Fonseca

    2015-02-15

    The Amazon area has been increasingly suffering from anthropogenic impacts, especially due to the construction of hydroelectric power plant reservoirs. The analysis and categorization of the trophic status of these reservoirs are of interest to indicate man-made changes in the environment. In this context, the present study aimed to categorize the trophic status of a hydroelectric power plant reservoir located in the Brazilian Amazon by constructing a novel Water Quality Index (WQI) and Trophic State Index (TSI) for the reservoir using major ion concentrations and physico-chemical water parameters determined in the area, taking into account the sampling locations and the local hydrological regimes. After applying statistical analyses (factor analysis and cluster analysis) and establishing a rule base of a fuzzy system for these indicators, the results obtained by the proposed method were compared to the generally applied Carlson TSI and a modified Lamparelli TSI, specific for tropical regions. The categorization of the trophic status by the proposed fuzzy method was shown to be more reliable, since it takes into account the specificities of the study area, while the Carlson and Lamparelli TSI do not and thus tend to over- or underestimate the trophic status of these ecosystems. The statistical techniques proposed and applied in the present study are therefore relevant in environmental management and policy decision-making processes, aiding in the identification of the ecological status of water bodies. With this, it is possible to identify which factors should be further investigated and/or adjusted in order to attempt the recovery of degraded water bodies. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Correlating tephras and cryptotephras using glass compositional analyses and numerical and statistical methods: Review and evaluation

    Science.gov (United States)

    Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.

    2017-11-01

    We define tephras and cryptotephras and their components (mainly ash-sized particles of glass ± crystals in distal deposits) and summarize the basis of tephrochronology as a chronostratigraphic correlational and dating tool for palaeoenvironmental, geological, and archaeological research. We then document and appraise recent advances in analytical methods used to determine the major, minor, and trace elements of individual glass shards from tephra or cryptotephra deposits to aid their correlation and application. Protocols developed recently for the electron probe microanalysis of major elements in individual glass shards help to improve data quality and standardize reporting procedures. A narrow electron beam (diameter ∼3-5 μm) can now be used to analyze smaller glass shards than previously attainable. Reliable analyses of 'microshards' (defined here as glass shards T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain multiple compositional groups. Currently, incomplete databases are tending to limit correlation efficacy. The development of an open, online global database to facilitate progress towards integrated, high
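    The sequential Mahalanobis-distance/Hotelling T² procedure mentioned above can be sketched as follows; the "glass compositions" are synthetic stand-ins, not the Kenyan tephra data.

```python
# Mahalanobis distance between two shard populations plus a two-sample
# Hotelling T^2 test (converted to an F statistic). Compositions are
# synthetic stand-ins for major-element glass data.
import numpy as np

rng = np.random.default_rng(3)
p = 4                                                      # e.g. 4 oxide variables
A = rng.normal([63.0, 15.0, 4.0, 5.0], 0.5, size=(30, p))  # tephra A shards
B = rng.normal([63.0, 15.0, 4.0, 5.0], 0.5, size=(25, p))  # candidate correlative

nA, nB = len(A), len(B)
diff = A.mean(axis=0) - B.mean(axis=0)
Sp = ((nA - 1) * np.cov(A.T) + (nB - 1) * np.cov(B.T)) / (nA + nB - 2)
D2 = float(diff @ np.linalg.solve(Sp, diff))       # squared Mahalanobis dist.
T2 = nA * nB / (nA + nB) * D2
F = T2 * (nA + nB - p - 1) / ((nA + nB - 2) * p)   # ~ F(p, nA+nB-p-1) under H0
print(f"D^2 = {D2:.3f}, T^2 = {T2:.2f}, F = {F:.2f}")
```

    A small F (compared with the F(p, nA+nB-p-1) critical value) means the two glass populations cannot be distinguished, i.e. the tephras remain candidate correlatives.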

  18. PRELIMINARY RESULTS OF BOWL TRAPPING BEES (HYMENOPTERA, APOIDEA IN A SOUTHERN BRAZIL FOREST FRAGMENT

    Directory of Open Access Journals (Sweden)

    Rodrigo B. Gonçalves

    2013-05-01

    Full Text Available In recent years bowl traps have gained attention as a useful method for sampling bees and are now commonly used across the world for this purpose. However, specific questions about the method itself have not yet been tested in different regions of the globe. We present the preliminary results of bowl trapping in a Semideciduous Seasonal forest fragment in southern Brazil, including a test of two different bowl colors, two different habitats, and the interaction of these variables on bee species number and composition. We used blue and yellow bowls on the border and core trails of the forest fragment. On five sampling days between October and December, bowl traps captured 745 specimens of 37 morphospecies, with Halictinae bees being the richest and most abundant group. Non-parametric statistical analyses suggested that the different bowl-trap colors influenced bee richness and composition and thus that the colors should be used together for more complete sampling. The different trails influenced only composition, while the interaction with the different colors did not have a significant effect. These results, as well as the higher taxonomic composition of the inventoried bees, are similar to other studies reported in the literature.

  19. Statistical analysis on hollow and core-shell structured vanadium oxide microspheres as cathode materials for Lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Xing Liang

    2018-06-01

    Full Text Available In this data article, the statistical analyses of vanadium oxide microsphere cathode materials are presented for the research article entitled “Statistical analyses on hollow and core-shell structured vanadium oxides microspheres as cathode materials for Lithium ion batteries” (Liang et al., 2017) [1]. This article shows the statistical analyses of the N2 adsorption-desorption isotherms and the morphology of vanadium oxide microspheres as cathode materials for LIBs. Keywords: Adsorption-desorption isotherm, Pore size distribution, SEM images, TEM images

  20. Preliminary drivers associated with household food waste generation in South Africa

    CSIR Research Space (South Africa)

    Ramukhwatho, Fhumulani R

    2017-12-01

    Full Text Available in Microsoft Excel, and analysed using the chi-square statistical test and SAS statistical software. The main reasons for household food wastage were identified as buying in excess, preparation of more food than would be consumed, poor storage, poor purchase...
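    A chi-square test of the kind described (here, a goodness-of-fit test against a uniform distribution of reasons, with hypothetical counts rather than the survey's data) can be sketched as:

```python
# Chi-square goodness-of-fit test: are the stated reasons for household
# food waste equally common? Counts are hypothetical, not the survey's.
counts = {"bought too much": 48, "cooked too much": 35,
          "poor storage": 22, "poor purchase planning": 15}

n = sum(counts.values())
expected = n / len(counts)
chi2 = sum((obs - expected) ** 2 / expected for obs in counts.values())

CRIT_DF3_05 = 7.815   # tabulated chi-square critical value, df = 3, alpha = 0.05
print(f"chi2 = {chi2:.2f}; reject uniformity: {chi2 > CRIT_DF3_05}")
```

    Rejecting uniformity indicates that some reasons (e.g. over-purchasing) dominate, which is the kind of conclusion the survey analysis draws.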

  1. Preliminary design report: Prototypical Spent Fuel Consolidation Equipment Demonstration Project: Phase 1

    International Nuclear Information System (INIS)

    Blissell, W.H.; Ciez, A.P.; Mitchell, J.L.; Winkler, C.J.

    1986-12-01

    This document describes the Westinghouse Preliminary Design for the Prototypical Consolidation Demonstration Project per Department of Energy (DOE) Contract No. DE-AC07-86ID12649 and under direction of the DOE Idaho Operations Office. The preliminary design is the first step to providing the Department of Energy with a fully qualified, licensable, cost-effective spent fuel rod consolidation system. The design was developed using proven technologies and equipment to create an innovative approach to previous rod consolidation concepts. These innovations will better enable the Westinghouse system to: consolidate fuel rods in a precise, fully-controlled, accountable manner; package all rods from two PWR fuel assemblies or from four BWR fuel assemblies in one 8.5 inch square consolidated rods canister; meet all functional requirements; operate with all fuel types common to the US commercial nuclear industry with minimal tooling changeouts; and meet consolidation production process rates, while maintaining operator and public health and safety. This Preliminary Design Report provides both detailed descriptions of the equipment required to perform the rod consolidation process and the supporting analyses to validate the design

  2. Cross section analyses in MiniBooNE and SciBooNE experiments

    OpenAIRE

    Katori, Teppei

    2013-01-01

    The MiniBooNE experiment (2002-2012) and the SciBooNE experiment (2007-2008) are modern high statistics neutrino experiments, and they developed many new ideas in neutrino cross section analyses. In this note, I discuss selected topics of these analyses.

  3. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, to facilitate concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  4. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
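The measures of central tendency and dispersion this abstract outlines can be sketched with Python's standard library; the readings below are invented purely for illustration and are not drawn from any study:

```python
import statistics

# Hypothetical sample: ten systolic blood pressure readings (mmHg)
readings = [118, 122, 130, 125, 119, 141, 127, 124, 133, 121]

mean = statistics.mean(readings)      # central tendency: arithmetic mean
median = statistics.median(readings)  # central tendency: robust to outliers
sd = statistics.stdev(readings)       # dispersion: sample standard deviation

print(f"mean={mean:.1f}, median={median:.1f}, sd={sd:.2f}")
# → mean=126.0, median=124.5, sd=7.07
```

The gap between the mean (126.0) and the median (124.5) here comes from the single high reading (141), which is exactly why the article distinguishes the two measures.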

  5. Misuse of statistics in the interpretation of data on low-level radiation

    International Nuclear Information System (INIS)

    Hamilton, L.D.

    1982-01-01

Four misuses of statistics in the interpretation of data on low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to a potentially spurious increase in incidence of cancer at Rocky Flats; (3) the hazard of summary statistics based on ill-conditioned individual rates leading to a spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  6. Misuse of statistics in the interpretation of data on low-level radiation

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  7. Use Of R in Statistics Lithuania

    Directory of Open Access Journals (Sweden)

    Tomas Rudys

    2016-06-01

Full Text Available Recently R has become more and more popular among official statistics offices. It can be used not only for research purposes but also for the production of official statistics. Statistics Lithuania recently started an analysis of where R can be used and whether it could replace other statistical programming languages or systems. For this reason a working group was arranged. In this paper we present an overview of the current situation regarding the implementation of R at Statistics Lithuania, some problems we are dealing with and some future plans. At present R is used mainly for research purposes. Looking forward, a short course on basic R was prepared, and at the moment we are starting to use R for data analysis, data manipulation from Oracle databases, preparation of some reports, data editing and survey estimation. On the other hand, we found some problems working with big data sets, and also with survey sampling, as there are surveys with complex sampling designs. We are also analysing the running of R on our servers in order to be able to use more random access memory (RAM). Despite the problems, we are trying to use R in more fields in the production of official statistics.

  8. Active cooling for downhole instrumentation: Preliminary analysis and system selection

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, G.A.

    1988-03-01

A feasibility study and a series of preliminary designs and analyses were done to identify candidate processes or cycles for use in active cooling systems for downhole electronic instruments. A matrix of energy types and their possible combinations was developed and the energy conversion process for each pair was identified. The feasibility study revealed conventional as well as unconventional processes and possible refrigerants and identified parameters needing further clarification. A conceptual design or series of designs for each system was formulated and a preliminary analysis of each design was completed. The resulting coefficient of performance (COP) for each system was compared with the Carnot COP and all systems were ranked by decreasing COP. The system showing the best combination of COP, adaptability to other operating conditions, failure mode, and system serviceability was chosen for use as a downhole refrigerator. 85 refs., 48 figs., 33 tabs.
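The COP-versus-Carnot comparison described above can be sketched numerically; the temperatures and the candidate-system COP below are assumed for illustration and are not taken from the report:

```python
# Hypothetical temperatures: instrument cavity held at 150 °C in a 250 °C borehole
T_cold = 150 + 273.15  # K, refrigerated space (electronics cavity)
T_hot = 250 + 273.15   # K, heat-rejection temperature (borehole fluid)

# Ideal (Carnot) refrigeration COP: heat lifted per unit of work input
cop_carnot = T_cold / (T_hot - T_cold)

# Ranking metric: second-law efficiency of a candidate cycle with an assumed COP
cop_actual = 0.8
eta_II = cop_actual / cop_carnot
print(f"Carnot COP = {cop_carnot:.2f}, second-law efficiency = {eta_II:.1%}")
# → Carnot COP = 4.23, second-law efficiency = 18.9%
```

Because every real cycle's COP is bounded by the Carnot COP at the same temperatures, the ratio gives a temperature-independent basis for ranking dissimilar refrigeration processes.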

  9. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), χ² test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals.
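The most common tests the survey identifies are all available in SciPy; a minimal sketch with invented data (not taken from any Burns article):

```python
from scipy import stats

# Hypothetical outcome: days to wound healing in two treatment groups
group_a = [12, 14, 11, 15, 13, 16, 12, 14]
group_b = [15, 17, 16, 18, 14, 19, 17, 16]

# Parametric: Student's t-test (assumes normality and equal variances)
t_stat, p_t = stats.ttest_ind(group_a, group_b)

# Non-parametric counterpart: Mann-Whitney U (no normality assumption)
u_stat, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

# Categorical 2x2 outcome (e.g. infection yes/no): Fisher's exact test
odds_ratio, p_f = stats.fisher_exact([[3, 5], [7, 1]])

# Report exact p-values rather than just "p < 0.05", as the survey recommends
print(f"t-test p={p_t:.4f}, Mann-Whitney p={p_u:.4f}, Fisher p={p_f:.4f}")
```

Choosing between the parametric test and its non-parametric counterpart is exactly the kind of decision the authors argue burn care professionals need a sound statistical grounding for.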

  10. Statistics for X-chromosome associations.

    Science.gov (United States)

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations from assumptions, such as trait variance differences between the sexes or small sex differences in allele frequencies. © 2018 WILEY PERIODICALS, INC.
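One commonly proposed coding for X-linked genotypes under full X inactivation counts male hemizygous genotypes as {0, 2}, so that one male allele matches two female alleles in expected dosage; the sketch below (simulated data, hypothetical effect sizes, not the study's own pipeline) also includes the sex covariate the abstract finds necessary for controlling type I error:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
sex = rng.integers(0, 2, n)  # 0 = female, 1 = male

# Males are hemizygous: one allele, coded 0 or 2 under a full X-inactivation
# dosage-compensation assumption; females carry 0, 1 or 2 copies
geno = np.where(sex == 1,
                2 * rng.integers(0, 2, n),
                rng.integers(0, 3, n))

# Quantitative trait with a sex effect but no true genotype effect (null SNP)
trait = 0.5 * sex + rng.normal(size=n)

# Least-squares regression: intercept, genotype dosage, sex covariate
X = np.column_stack([np.ones(n), geno, sex])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
print("estimates (intercept, genotype, sex):", np.round(beta, 3))
```

Omitting the `sex` column can let a sex difference in trait mean masquerade as a genotype effect whenever genotype distributions differ between the sexes, which is the inflation mechanism the abstract warns about.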

  11. MELCOR 1.8.2 Analyses in Support of ITER's RPrS

    International Nuclear Information System (INIS)

    Brad J Merrill

    2008-01-01

The International Thermonuclear Experimental Reactor (ITER) Program is performing accident analyses for ITER's 'Rapport Preliminaire de Surete' (Preliminary Safety Report - RPrS) with a modified version of the MELCOR 1.8.2 code. The RPrS is an ITER safety document required in the ITER licensing process to obtain a 'Decret Autorisation de Construction' (Decree Authorizing Construction - DAC) for the ITER device. This report documents the accident analyses performed by the US with the MELCOR 1.8.2 code in support of the ITER RPrS effort. This work was funded through an ITER Task Agreement for MELCOR Quality Assurance and Safety Analyses. Under this agreement, the US was tasked with performing analyses for three accident scenarios in the ITER facility. Contained within the text of this report are discussions that identify the cause of these accidents, descriptions of how these accidents are likely to proceed, the method used to analyze the consequences of these accidents, and discussions of the transient thermal hydraulic and radiological release results for these accidents.

  12. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...... fits to the data available, especially if tail fits are used whereas the Log Normal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors...... for timber are investigated....
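A sketch of this kind of distribution comparison in SciPy, using simulated strength data (the generating parameters are assumed purely for illustration, not taken from the timber data set):

```python
import numpy as np
from scipy import stats

# Simulated "bending strength" sample (MPa) from a 3-parameter Weibull generator
rng = np.random.default_rng(1)
data = stats.weibull_min.rvs(c=3.0, loc=10.0, scale=25.0, size=500,
                             random_state=rng)

dists = {
    "Normal": (stats.norm, stats.norm.fit(data)),
    "Lognormal": (stats.lognorm, stats.lognorm.fit(data)),
    "2p-Weibull": (stats.weibull_min, stats.weibull_min.fit(data, floc=0)),
    "3p-Weibull": (stats.weibull_min, stats.weibull_min.fit(data)),
}

# Compare whole-sample fit (log-likelihood) and the lower tail, which drives
# characteristic strength values and hence partial safety factors
for name, (dist, params) in dists.items():
    ll = np.sum(dist.logpdf(data, *params))
    print(f"{name:>10}: log-lik={ll:9.1f}, "
          f"5th percentile={dist.ppf(0.05, *params):6.2f} MPa")
```

Comparing the fitted 5th percentiles directly reflects the paper's point that tail fits, not overall fits, determine the implications for structural reliability.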

  13. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Science.gov (United States)

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contains several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real data-sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
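For the exponential special case the connection the authors describe is explicit: with mean lifetime θ, the conforming rate is ηL = exp(−L/θ), and Montgomery's index CL = (μ − L)/σ reduces to 1 − L/θ, giving the one-to-one relation ηL = exp(CL − 1). A sketch with assumed values (not taken from the cited data sets):

```python
import math

theta = 50.0  # assumed mean lifetime, e.g. in thousands of hours
L = 10.0      # assumed lower specification limit

C_L = 1 - L / theta            # lifetime performance index, exponential case
eta_L = math.exp(-L / theta)   # conforming rate P(X >= L)
assert abs(eta_L - math.exp(C_L - 1)) < 1e-12  # the one-to-one relation

# Point estimate from a complete (uncensored) sample: the MLE of theta
# for the exponential distribution is the sample mean
sample = [42.1, 8.3, 95.7, 31.0, 60.2, 12.9, 77.4, 25.6]
theta_hat = sum(sample) / len(sample)
print(f"C_L = {C_L:.2f}, eta_L = {eta_L:.3f}, "
      f"C_L estimate = {1 - L / theta_hat:.3f}")
# → C_L = 0.80, eta_L = 0.819, C_L estimate = 0.773
```

Because the relation is monotone, inference about CL translates directly into inference about the conforming rate, which is the equivalence the abstract exploits.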

  14. Loss-of-coolant accident analyses of the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Chen, N.C.J.; Yoder, G.L.; Wendel, M.W.

    1991-01-01

Currently in the conceptual design stage, the Advanced Neutron Source Reactor (ANSR) will operate at a high heat flux, a high mass flux, and a high degree of coolant subcooling. Loss-of-coolant accident (LOCA) analyses using RELAP5 have been performed as part of an early evaluation of ANSR safety issues. This paper discusses the RELAP5 ANSR conceptual design system model and preliminary LOCA simulation results. Some previous studies were conducted for the preconceptual design. 12 refs., 7 figs.

  15. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…

  16. A Preliminary Randomized Controlled Trial of Multifaceted Educational Intervention for Mild Cognitive Impairment Among Elderly Malays in Kuala Lumpur

    Directory of Open Access Journals (Sweden)

    Sa’ida Munira Johari

    2014-06-01

    Conclusion: A 12-month educational intervention on nutritional, lifestyle, and cognitive exercise significantly improved nutritional status, knowledge, and attitude score. The study lacked power to demonstrate a statistically significant positive effect on cognitive functioning; thus, the preliminary findings should be confirmed in a larger trial.

  17. Factorial Structure and Preliminary Validation of the Schema Mode Inventory for Eating Disorders (SMI-ED)

    Directory of Open Access Journals (Sweden)

    Susan G. Simpson

    2018-04-01

Full Text Available Objective: The aim of this study was to examine the psychometric properties and factorial structure of the Schema Mode Inventory for Eating Disorders (SMI-ED) in a disordered eating population. Method: 573 participants with disordered eating patterns as measured by the Eating Disorder Examination Questionnaire (EDE-Q) completed the 190-item adapted version of the Schema Mode Inventory (SMI). The new SMI-ED was developed by clinicians/researchers specializing in the treatment of eating disorders, through combining items from the original SMI with a set of additional questions specifically representative of the eating disorder population. Psychometric testing included Confirmatory Factor Analysis (CFA) and internal consistency (Cronbach's α). Multivariate Analyses of Covariance (MANCOVA) was also run to test statistical differences between the EDE-Q subscales on the SMI-ED modes, while controlling for possible confounding variables. Results: Factorial analysis confirmed an acceptable 16-related-factors solution for the SMI-ED, thus providing preliminary evidence for the adequate validity of the new measure based on internal structure. Concurrent validity was also established through moderate to high correlations on the modes most relevant to eating disorders with EDE-Q subscales. This study represents the first step in creating a psychometrically sound instrument for measuring schema modes in eating disorders, and provides greater insight into the relevant schema modes within this population. Conclusion: This research represents an important preliminary step toward understanding and labeling the schema mode model for this clinical group. Findings from the psychometric evaluation of SMI-ED suggest that this is a useful tool which may further assist in the measurement and conceptualization of schema modes in this population.

  18. Factorial Structure and Preliminary Validation of the Schema Mode Inventory for Eating Disorders (SMI-ED).

    Science.gov (United States)

    Simpson, Susan G; Pietrabissa, Giada; Rossi, Alessandro; Seychell, Tahnee; Manzoni, Gian Mauro; Munro, Calum; Nesci, Julian B; Castelnuovo, Gianluca

    2018-01-01

    Objective: The aim of this study was to examine the psychometric properties and factorial structure of the Schema Mode Inventory for Eating Disorders (SMI-ED) in a disordered eating population. Method: 573 participants with disordered eating patterns as measured by the Eating Disorder Examination Questionnaire (EDE-Q) completed the 190-item adapted version of the Schema Mode Inventory (SMI). The new SMI-ED was developed by clinicians/researchers specializing in the treatment of eating disorders, through combining items from the original SMI with a set of additional questions specifically representative of the eating disorder population. Psychometric testing included Confirmatory Factor Analysis (CFA) and internal consistency (Cronbach's α). Multivariate Analyses of Covariance (MANCOVA) was also run to test statistical differences between the EDE-Q subscales on the SMI-ED modes, while controlling for possible confounding variables. Results: Factorial analysis confirmed an acceptable 16-related-factors solution for the SMI-ED, thus providing preliminary evidence for the adequate validity of the new measure based on internal structure. Concurrent validity was also established through moderate to high correlations on the modes most relevant to eating disorders with EDE-Q subscales. This study represents the first step in creating a psychometrically sound instrument for measuring schema modes in eating disorders, and provides greater insight into the relevant schema modes within this population. Conclusion: This research represents an important preliminary step toward understanding and labeling the schema mode model for this clinical group. Findings from the psychometric evaluation of SMI-ED suggest that this is a useful tool which may further assist in the measurement and conceptualization of schema modes in this population.

  19. Factorial Structure and Preliminary Validation of the Schema Mode Inventory for Eating Disorders (SMI-ED)

    Science.gov (United States)

    Simpson, Susan G.; Pietrabissa, Giada; Rossi, Alessandro; Seychell, Tahnee; Manzoni, Gian Mauro; Munro, Calum; Nesci, Julian B.; Castelnuovo, Gianluca

    2018-01-01

    Objective: The aim of this study was to examine the psychometric properties and factorial structure of the Schema Mode Inventory for Eating Disorders (SMI-ED) in a disordered eating population. Method: 573 participants with disordered eating patterns as measured by the Eating Disorder Examination Questionnaire (EDE-Q) completed the 190-item adapted version of the Schema Mode Inventory (SMI). The new SMI-ED was developed by clinicians/researchers specializing in the treatment of eating disorders, through combining items from the original SMI with a set of additional questions specifically representative of the eating disorder population. Psychometric testing included Confirmatory Factor Analysis (CFA) and internal consistency (Cronbach's α). Multivariate Analyses of Covariance (MANCOVA) was also run to test statistical differences between the EDE-Q subscales on the SMI-ED modes, while controlling for possible confounding variables. Results: Factorial analysis confirmed an acceptable 16-related-factors solution for the SMI-ED, thus providing preliminary evidence for the adequate validity of the new measure based on internal structure. Concurrent validity was also established through moderate to high correlations on the modes most relevant to eating disorders with EDE-Q subscales. This study represents the first step in creating a psychometrically sound instrument for measuring schema modes in eating disorders, and provides greater insight into the relevant schema modes within this population. Conclusion: This research represents an important preliminary step toward understanding and labeling the schema mode model for this clinical group. Findings from the psychometric evaluation of SMI-ED suggest that this is a useful tool which may further assist in the measurement and conceptualization of schema modes in this population. PMID:29740379

  20. Preliminary corrosion models for BWIP [Basalt Waste Isolation Project] canister materials

    International Nuclear Information System (INIS)

    Fish, R.L.; Anantatmula, R.P.

    1983-01-01

Waste package development for the Basalt Waste Isolation Project (BWIP) requires the generation of materials degradation data under repository-relevant conditions. These data are used to develop predictive models for the behavior of each component of the waste package. The component models are exercised in performance analyses to optimize the waste package design. This document presents all repository-relevant canister materials corrosion data that the BWIP and others have developed to date, describes the methodology used to develop preliminary corrosion models, and provides the mathematical description of the models for both low-carbon steel and Fe9Cr1Mo steel. Example environment/temperature history and model application calculations are presented to aid in understanding the models. The models are preliminary in nature and will be updated as additional corrosion data become available. 6 refs., 5 tabs.

  1. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    International Nuclear Information System (INIS)

    Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J

    2015-01-01

Numerous studies have examined students' difficulties in understanding notions related to statistical problems. Some authors observed that the presentation of distinct visual representations could increase statistical reasoning, supporting the principle of graphical facilitation. Other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with insignificant data. In this work we aim at comparing probabilistic statistical reasoning across two different formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in the verbal-numerical and graphical formats to 311 undergraduate psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Each undergraduate solved each pair of problems in the two formats, in different problem presentation orders and sequences. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of interaction between individual and task characteristics.

  2. Direct Learning of Systematics-Aware Summary Statistics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

Complex machine learning tools, such as deep neural networks and gradient boosting algorithms, are increasingly being used to construct powerful discriminative features for High Energy Physics analyses. These methods are typically trained with simulated or auxiliary data samples by optimising some classification or regression surrogate objective. The learned feature representations are then used to build a sample-based statistical model to perform inference (e.g. interval estimation or hypothesis testing) over a set of parameters of interest. However, the effectiveness of this approach can be reduced by the presence of known uncertainties that cause differences between training and experimental data, included in the statistical model via nuisance parameters. This work presents an end-to-end algorithm, which leverages existing deep learning technologies but directly aims to produce inference-optimal sample-summary statistics. By including the statistical model and a differentiable approximation of ...

  3. Preliminary Analysis of the (Process and Product) Quality of Physical Education in Flemish Secondary Schools: Implementation of IKLO

    Science.gov (United States)

    Huts, K.; Van Hoecke, J.; De Knop, P.; Theeboom, M.

    2009-01-01

    The purpose of the present study was twofold, namely implementing a multifunctional (self-) evaluation instrument for physical education in a sample of Flemish secondary schools (N=100), while simultaneously obtaining a preliminary picture of the subjects' product and process quality. Descriptive statistics revealed that P. E. teachers' engagement…

  4. Assessing antiquity and turnover of terrestrial ecosystems in eastern North America using fossil pollen data: A preliminary study

    International Nuclear Information System (INIS)

    Liu Yao; Jackson, Stephen T; Brewer, Simon; Williams, John W

    2010-01-01

    We explored formal approaches to identifying and interpreting the antiquity and turnover of terrestrial ecosystems in eastern North America using pollen records. Preliminary results of cluster analyses, receiver-operating characteristic (ROC) analyses, and likelihood estimation of ecosystem analog in a simple Bayesian model allow assessment of modern ecosystem antiquities and past ecosystem turnovers. Approaches discussed in this study thus provide a vehicle for further studies.

  5. Preliminary Monthly Climatological Summaries

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Preliminary Local Climatological Data, recorded since 1970 on Weather Bureau Form 1030 and then National Weather Service Form F-6. The preliminary climate data pages...

  6. Conjunction analysis and propositional logic in fMRI data analysis using Bayesian statistics.

    Science.gov (United States)

    Rudert, Thomas; Lohmann, Gabriele

    2008-12-01

To evaluate logical expressions over different effects in data analyses using the general linear model (GLM) and to evaluate logical expressions over different posterior probability maps (PPMs). In functional magnetic resonance imaging (fMRI) data analysis, the GLM was applied to estimate unknown regression parameters. Based on the GLM, Bayesian statistics can be used to determine the probability of conjunction, disjunction, implication, or any other arbitrary logical expression over different effects or contrasts. For second-level inferences, PPMs from individual sessions or subjects are utilized. These PPMs can be combined into a logical expression and its probability can be computed. The methods proposed in this article are applied to data from a Stroop experiment and are compared to conjunction analysis approaches for test statistics. The combination of Bayesian statistics with propositional logic provides a new approach for data analyses in fMRI. Two different methods are introduced for propositional logic: the first for analyses using the GLM and the second for common inferences about different probability maps. The methods introduced extend the idea of conjunction analysis to full propositional logic and adapt it from test statistics to Bayesian statistics. The new approaches allow inferences that are not possible with known standard methods in fMRI. (c) 2008 Wiley-Liss, Inc.
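Under an independence assumption between the two posteriors (a simplification for illustration; the paper works within the full Bayesian GLM), a logical expression over two PPMs reduces to elementary probability algebra applied per voxel:

```python
import numpy as np

# Toy posterior probability maps for two effects over four voxels (made-up values)
p1 = np.array([0.95, 0.40, 0.80, 0.10])
p2 = np.array([0.90, 0.85, 0.30, 0.20])

# Assuming the posteriors are independent:
conjunction = p1 * p2            # P(A and B)
disjunction = p1 + p2 - p1 * p2  # P(A or B)
implication = 1 - p1 + p1 * p2   # P(A -> B) = P(not A or B)

print("conjunction:", conjunction)  # e.g. voxel 0: 0.95 * 0.90 = 0.855
```

Thresholding the resulting map (e.g. at posterior probability 0.95) then yields a conjunction PPM, analogous to the conjunction maps built from test statistics but with a direct probabilistic interpretation.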

  7. National Data Center Preparedness Exercise 2015 (NPE 2015): MY-NDC Preliminary Analysis Result

    International Nuclear Information System (INIS)

    Faisal Izwan Abdul Rashid; Muhammed Zulfakar Zolkaffly

    2016-01-01

Malaysia established the CTBT National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked to perform Comprehensive Nuclear-Test-Ban Treaty (CTBT) data management as well as to provide information on Treaty-related events to Nuclear Malaysia as the CTBT National Authority. In 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015). This paper presents the MY-NDC preliminary analysis results for NPE 2015. In NPE 2015, MY-NDC performed five different analyses, namely radionuclide analysis, atmospheric transport modelling (ATM), data fusion, seismic analysis and site forensics. The preliminary findings show that the hypothetical scenario in NPE 2015 was most probably an uncontained event resulting in a high release of radionuclides to the air. (author)

  8. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow an algorithm to be built in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions. It is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have impressed with their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed through intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods. They should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a

  9. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available that would allow an algorithm to be built in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions. It is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have impressed with their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed through intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods. They should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot

  10. Preliminary Safety Analysis Report for the Tokamak Physics Experiment

    International Nuclear Information System (INIS)

    Motloch, C.G.; Bonney, R.F.; Levine, J.D.; Masson, L.S.; Commander, J.C.

    1995-04-01

    This Preliminary Safety Analysis Report (PSAR) includes an indication of the magnitude of facility hazards, the complexity of facility operations, and the stage of the facility life-cycle. It presents the results of safety analyses, safety assurance programs, identified vulnerabilities, compensatory measures, and, in general, the rationale describing why the Tokamak Physics Experiment (TPX) can be safely operated. It discusses application of the graded approach to the TPX safety analysis, including the basis for using Department of Energy (DOE) Order 5480.23 and DOE-STD-3009-94 in the development of the PSAR.

  11. The effectiveness of tipi in the treatment of hip and knee osteoarthritis: a preliminary report

    Directory of Open Access Journals (Sweden)

    Marcos Bosi Ferraz

    1991-01-01

    Osteoarthritis (OA) is a common painful inflammatory condition occurring mainly in the latter half of life. Hip and knee are the joints most often affected. Petiveria alliacea (tipi), popularly known as an anti-rheumatic medicine, has been used by OA patients to relieve pain. This one-week cross-over double-blind trial preliminarily evaluated the analgesic effect of tipi tea in 14 patients with hip and knee OA. Imperata exaltata (sape) was used as the placebo tea. The pain assessments made at baseline and before the start of the second treatment period were comparable between treatment groups. While taking either tipi or placebo tea, patients experienced a statistically significant improvement in pain on motion and pain at night. The comparison between the improvements reported while on tipi and on placebo tea, however, did not disclose any statistically significant difference. At the conclusion of the study 7 patients preferred tipi tea and 6 preferred placebo tea (NS). Two patients reported insomnia, one during placebo treatment and the other during tipi treatment. In this preliminary report both teas succeeded in the aim of relieving pain.

  12. Preliminary analyses for HTTR's start-up physics tests by Monte Carlo code MVP

    Energy Technology Data Exchange (ETDEWEB)

    Nojiri, Naoki [Science and Technology Agency, Tokyo (Japan); Nakano, Masaaki; Ando, Hiroei; Fujimoto, Nozomu; Takeuchi, Mitsuo; Fujisaki, Shingo; Yamashita, Kiyonobu

    1998-08-01

    Analyses of the start-up physics tests for the High Temperature Engineering Test Reactor (HTTR) have been carried out with the Monte Carlo code MVP, based on the continuous energy method. Heterogeneous core structures were modelled precisely, such as the fuel compacts, fuel rods, coolant channels, burnable poisons, control rods, control rod insertion holes, reserved shutdown pellet insertion holes, gaps between graphite blocks, etc. Such precise modelling of the core structures is difficult with diffusion calculations. From the analytical results, the following were confirmed: the first criticality will be achieved at around 16 fuel columns loaded, and the reactivity at the first criticality can be controlled by only one control rod located at the center of the core, with the other fifteen control rods fully withdrawn. The excess reactivity, reactor shutdown margin and control rod criticality positions have also been evaluated. These results were used for planning of the start-up physics tests. This report presents analyses of the start-up physics tests for HTTR by the MVP code. (author)

  13. Homeopathy: meta-analyses of pooled clinical data.

    Science.gov (United States)

    Hahn, Robert G

    2013-01-01

    In the first decade of the evidence-based era, which began in the mid-1990s, meta-analyses were used to scrutinize homeopathy for evidence of beneficial effects in medical conditions. In this review, meta-analyses including pooled data from placebo-controlled clinical trials of homeopathy and the aftermath in the form of debate articles were analyzed. In 1997 Klaus Linde and co-workers identified 89 clinical trials that showed an overall odds ratio of 2.45 in favor of homeopathy over placebo. There was a trend toward smaller benefit from studies of the highest quality, but the 10 trials with the highest Jadad score still showed homeopathy had a statistically significant effect. These results challenged academics to perform alternative analyses that, to demonstrate the lack of effect, relied on extensive exclusion of studies, often to the degree that conclusions were based on only 5-10% of the material, or on virtual data. The ultimate argument against homeopathy is the 'funnel plot' published by Aijing Shang's research group in 2005. However, the funnel plot is flawed when applied to a mixture of diseases, because studies with expected strong treatment effects are, for ethical reasons, powered lower than studies with expected weak or unclear treatment effects. To conclude that homeopathy lacks clinical effect, more than 90% of the available clinical trials had to be disregarded. Alternatively, flawed statistical methods had to be applied. Future meta-analyses should focus on the use of homeopathy in specific diseases or groups of diseases instead of pooling data from all clinical trials. © 2013 S. Karger GmbH, Freiburg.
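As a hedged illustration of the inverse-variance pooling such meta-analyses rely on, the sketch below combines log odds ratios from three invented trials; the counts are illustrative and are not data from the review:

```python
import math

# Hypothetical 2x2 trial counts: (events_treat, n_treat, events_ctrl, n_ctrl)
trials = [(12, 50, 6, 50), (20, 80, 11, 80), (7, 40, 3, 40)]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c                 # non-events in each arm
    log_or = math.log((a * d) / (b * c))  # log odds ratio
    var = 1/a + 1/b + 1/c + 1/d           # Woolf variance estimate
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

# Fixed-effect pooled estimate: precision-weighted mean of log odds ratios
pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled OR = {math.exp(pooled):.2f}, "
      f"95% CI = ({math.exp(pooled - 1.96*se):.2f}, {math.exp(pooled + 1.96*se):.2f})")
```

A funnel plot would then scatter each trial's log odds ratio against its precision (`1/var`), which is where the heterogeneity objection discussed above applies.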

  14. Preliminary study of soil permeability properties using principal component analysis

    Science.gov (United States)

    Yulianti, M.; Sudriani, Y.; Rustini, H. A.

    2018-02-01

    Soil permeability measurement is undoubtedly important in carrying out soil-water research such as rainfall-runoff modelling, irrigation water distribution systems, etc. It is also known that acquiring reliable soil permeability data is rather laborious, time-consuming, and costly. Therefore, it is desirable to develop a prediction model. Several studies of empirical equations for predicting permeability have been undertaken by many researchers. These studies derived their models from areas whose soil characteristics differ from Indonesian soils, which suggests the possibility that these permeability models are site-specific. The purpose of this study is to identify which soil parameters correspond most strongly to soil permeability and to propose a preliminary model for permeability prediction. Principal component analysis (PCA) was applied to 16 parameters analysed from 37 sites, consisting of 91 samples obtained from the Batanghari Watershed. Findings indicated five variables that have a strong correlation with soil permeability, and we recommend a preliminary permeability model with potential for further development.
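A minimal sketch of how PCA can screen a set of soil parameters, using synthetic data in place of the Batanghari measurements (the sample sizes match the abstract, but the variable structure is invented):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for 16 measured soil parameters over 91 samples;
# the real watershed data are not reproduced here.
n_samples, n_params = 91, 16
X = rng.normal(size=(n_samples, n_params))
X[:, 1] = 0.9 * X[:, 0] + rng.normal(scale=0.3, size=n_samples)  # one correlated pair

# PCA via SVD of the standardised data matrix
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = S**2 / np.sum(S**2)  # variance ratio per principal component
loadings = Vt                    # rows: components, columns: parameters

print("variance explained by first 5 PCs:", explained[:5].round(3))
```

Inspecting the loadings of the leading components is one way to shortlist parameters, analogous to the five-variable shortlist reported above.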

  15. Systematic reviews of anesthesiologic interventions reported as statistically significant

    DEFF Research Database (Denmark)

    Imberger, Georgina; Gluud, Christian; Boylan, John

    2015-01-01

    statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may......: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number...

  16. Understanding the statistics of small risks

    International Nuclear Information System (INIS)

    Siddall, E.

    1983-10-01

    Monte Carlo analyses are used to show what inferences can and cannot be drawn when either a very small number of accidents result from a considerable exposure or where a very small number of people, down to a single individual, are exposed to small added risks. The distinction between relative and absolute uncertainty is illustrated. No new statistical principles are involved
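The kind of inference limit described can be illustrated with a small simulation: assuming a Poisson accident process with an invented true rate, the scatter of naive rate estimates from a short exposure shows how weakly a handful of events constrains the underlying risk:

```python
import numpy as np

rng = np.random.default_rng(42)
# Illustrative assumptions: 0.5 events per unit exposure, 4 units observed
true_rate, exposure, n_sims = 0.5, 4.0, 100_000

counts = rng.poisson(true_rate * exposure, size=n_sims)
estimates = counts / exposure  # naive rate estimate from each simulated record

# With an expectation of only 2 events, individual estimates scatter widely
print("mean estimate:", estimates.mean().round(3))
print("P(zero events observed):", np.mean(counts == 0).round(3))
print("95% interval of estimates:", np.percentile(estimates, [2.5, 97.5]))
```

The estimator is unbiased on average, yet a single realisation (including observing zero accidents) says little about the rate: the relative uncertainty is large even though the absolute uncertainty is small, which is the distinction the record highlights.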

  17. Preliminary Evaluation of an Aviation Safety Thesaurus' Utility for Enhancing Automated Processing of Incident Reports

    Science.gov (United States)

    Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok

    2007-01-01

    This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improving algorithm performance is then presented.

  18. Book Trade Research and Statistics. Prices of U.S. and Foreign Published Materials; Book Title Output and Average Prices: 2000 Final and 2001 Preliminary Figures; Book Sales Statistics, 2001: AAP Preliminary Estimates; U.S. Book Exports and Imports: 2001; Number of Book Outlets in the United States and Canada; Review Media Statistics.

    Science.gov (United States)

    Sullivan, Sharon G.; Barr, Catherine; Grabois, Andrew

    2002-01-01

    Includes six articles that report on prices of U.S. and foreign published materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and review media statistics. (LRW)

  19. Vector-field statistics for the analysis of time varying clinical gait data.

    Science.gov (United States)

    Donnelly, C J; Alexander, C; Pataky, T C; Stannage, K; Reid, S; Robinson, M A

    2017-01-01

    In clinical settings, the time varying analysis of gait data relies heavily on the experience of the individual(s) assessing these biological signals. Though three-dimensional kinematics are recognised as time varying waveforms (1D), exploratory statistical analysis of these data is commonly carried out with multiple discrete or 0D dependent variables. In the absence of an a priori 0D hypothesis, clinicians are at risk of making type I and II errors in their analysis of time varying gait signatures in the event statistics are used in concert with preferred subjective clinical assessment methods. The aim of this communication was to determine whether vector field waveform statistics were capable of providing quantitative corroboration of practically significant differences in time varying gait signatures as determined by two clinically trained gait experts. The case study was a left hemiplegic Cerebral Palsy (GMFCS I) gait patient following a botulinum toxin (BoNT-A) injection to their left gastrocnemius muscle. When comparing subjective clinical gait assessments between two testers, they were in agreement with each other for 61% of the joint degrees of freedom and phases of motion analysed. Tester 1 and tester 2 were in agreement with the vector-field analysis for 78% and 53% of the kinematic variables analysed, respectively. When the subjective analyses of tester 1 and tester 2 were pooled together and then compared to the vector-field analysis, they were in agreement for 83% of the time varying kinematic variables analysed. These outcomes demonstrate that, in principle, vector-field statistics corroborates what a team of clinical gait experts would classify as practically meaningful pre- versus post time varying kinematic differences. The potential for vector-field statistics to be used as a useful clinical tool for the objective analysis of time varying clinical gait data is established.
Future research is recommended to assess the usefulness of vector-field analyses
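One common way to implement such 1D waveform hypothesis tests is a pointwise t-statistic with a permutation-derived field-wide threshold; the sketch below uses synthetic gait-like waveforms (101 time nodes, an invented late-cycle effect), not the clinical data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for time-normalised kinematic waveforms
t = np.linspace(0, 1, 101)
pre  = rng.normal(size=(10, 101)) + 5 * np.exp(-((t - 0.7) / 0.1) ** 2)
post = rng.normal(size=(10, 101))

def tstat(a, b):
    """Pointwise two-sample t-statistic along the waveform."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * a.var(0, ddof=1) + (nb - 1) * b.var(0, ddof=1)) / (na + nb - 2)
    return (a.mean(0) - b.mean(0)) / np.sqrt(pooled * (1 / na + 1 / nb))

obs = tstat(pre, post)

# The permutation distribution of the maximum |t| controls the
# field-wide (whole-waveform) false positive rate
both = np.vstack([pre, post])
maxima = []
for _ in range(500):
    idx = rng.permutation(len(both))
    maxima.append(np.abs(tstat(both[idx[:10]], both[idx[10:]])).max())
crit = np.percentile(maxima, 95)

print("nodes exceeding field-wide threshold:", int(np.sum(np.abs(obs) > crit)))
```

Nodes where |t| exceeds the threshold mark phases of the gait cycle with a field-wide significant difference, which is the quantitative corroboration the study sought.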

  20. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Science.gov (United States)

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the... on Documenting Statistical Analysis Programs and Data Files; Availability'' giving interested persons...

  1. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    Science.gov (United States)

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. 
This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be

  2. SPSS for applied sciences basic statistical testing

    CERN Document Server

    Davis, Cole

    2013-01-01

    This book offers a quick and basic guide to using SPSS and provides a general approach to solving problems using statistical tests. It is both comprehensive in terms of the tests covered and the applied settings it refers to, and yet is short and easy to understand. Whether you are a beginner or an intermediate level test user, this book will help you to analyse different types of data in applied settings. It will also give you the confidence to use other statistical software and to extend your expertise to more specific scientific settings as required.The author does not use mathematical form

  3. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  4. Statistical study of ion pitch-angle distributions

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.

    1987-01-01

    Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Re of earth provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 Re of earth. While those ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references

  5. A Raman lidar at La Reunion (20.8° S, 55.5° E) for monitoring water vapour and cirrus distributions in the subtropical upper troposphere: preliminary analyses and description of a future system

    Directory of Open Access Journals (Sweden)

    C. Hoareau

    2012-06-01

    A ground-based Rayleigh lidar has provided continuous observations of tropospheric water vapour profiles and cirrus clouds using a preliminary Raman channel setup on an existing Rayleigh lidar above La Reunion over the period 2002-2005. With this instrument, we performed a first measurement campaign of 350 independent water vapour profiles. A statistical study of the distribution of water vapour profiles is presented and some investigations concerning the calibration are discussed. An analysis of the cirrus clouds is presented and a classification has been performed, showing 3 distinct classes. Based on these results, the characteristics and design of a future lidar system, to be implemented at the new Reunion Island altitude observatory (2200 m) for long-term monitoring, are presented, and numerical simulations of system performance have been carried out to compare both instruments.

  6. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions...

  7. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
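Product (i), a processor that combines a prior (climatological) runoff distribution with a calibrated forecast, can be sketched for the simplest normal-normal conjugate case; all numbers below are illustrative assumptions, not values from the study:

```python
import math

# Illustrative climatological prior for seasonal runoff and a calibrated
# forecast with a known error spread (units: e.g. thousand acre-feet)
prior_mean, prior_sd = 500.0, 120.0
forecast, forecast_sd = 430.0, 60.0

# Normal-normal Bayesian update: precision-weighted combination
w_prior, w_fcst = 1 / prior_sd**2, 1 / forecast_sd**2
post_mean = (w_prior * prior_mean + w_fcst * forecast) / (w_prior + w_fcst)
post_sd = math.sqrt(1 / (w_prior + w_fcst))

print(f"posterior runoff: mean={post_mean:.1f}, sd={post_sd:.1f}")
```

The posterior mean lies between climatology and the forecast, pulled toward whichever is more precise, and the posterior spread is narrower than either input, which is the sense in which the processor "calibrates" the forecast for decision makers.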

  8. Preliminary safety analysis of the HTTR-IS nuclear hydrogen production system

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Ohashi, Hirofumi; Tazawa, Yujiro; Tachibana, Yukio; Sakaba, Nariaki

    2010-06-01

    Japan Atomic Energy Agency is planning to demonstrate hydrogen production by the thermochemical water-splitting IS process utilizing heat from the high-temperature gas-cooled reactor HTTR (HTTR-IS system). The previous study identified that the HTTR modification due to the coupling of the hydrogen production plant requires an additional safety review, since the scenario and quantitative values of the evaluation items would be altered from the original HTTR safety review. Hence, preliminary safety analyses were conducted using a system analysis code. Calculation results showed that evaluation items such as coolant pressure and temperatures of heat transfer tubes at the pressure boundary did not exceed allowable values. The peak fuel temperature also did not exceed its allowable value, and therefore the reactor core was not damaged and was cooled sufficiently. This report compiles the calculation conditions, event scenarios and calculation results of the preliminary safety analysis. (author)

  9. Regulatory analyses for severe accident issues: an example

    International Nuclear Information System (INIS)

    Burke, R.P.; Strip, D.R.; Aldrich, D.C.

    1984-09-01

    This report presents the results of an effort to develop a regulatory analysis methodology and presentation format to provide information for regulatory decision-making related to severe accident issues. Insights and conclusions gained from an example analysis are presented. The example analysis draws upon information generated in several previous and current NRC research programs (the Severe Accident Risk Reduction Program (SARRP), Accident Sequence Evaluation Program (ASEP), Value-Impact Handbook, Economic Risk Analyses, and studies of Vented Containment Systems and Alternative Decay Heat Removal Systems) to perform preliminary value-impact analyses on the installation of either a vented containment system or an alternative decay heat removal system at the Peach Bottom No. 2 plant. The results presented in this report are first-cut estimates, and are presented only for illustrative purposes in the context of this document. This study should serve to focus discussion on issues relating to the type of information, the appropriate level of detail, and the presentation format which would make a regulatory analysis most useful in the decision-making process

  10. Statistical evaluation of PCDD/Fs levels and profiles in milk and feedingstuffs from Campania region - Italy

    Energy Technology Data Exchange (ETDEWEB)

    Diletti, G.; Scortichini, G.; Conte, A.; Migliorati, G.; Caporale, V. [Ist. Zooprofilattico Sperimentale dell' Abruzzo e del Molise (Italy)

    2004-09-15

    PCDD/Fs levels exceeding the European Union (EU) tolerance limit were detected in milk and animal feed samples collected in the Campania region in the years 2001-2003, as reported in a previous paper. The analyses were performed on milk samples from different animal species (cow, sheep, goat and buffalo) and on animal feed samples (silage, hay, grass, cereals, premixes and mixed feeds), making it possible to assess the levels and the geographical extent of the contamination. The preliminary results of this survey had given clear indications of the dioxin contamination of feedingstuffs and their contribution to the high PCDD/Fs levels recorded in milk, but a more detailed analysis was needed in order to confirm the previous observations. The aim of this work is the evaluation of the correlation between the PCDD/Fs levels and patterns found in milk and animal feed samples through statistical analysis of the congener profiles and concentrations. Moreover, the typical congener profiles of milk samples taken in the area under investigation were compared to those obtained from samples collected in the framework of the National Residues Surveillance Plan (NRSP) in 2003. The contamination phenomenon was also studied by means of spatial correlation analysis.

  11. Long-Term Propagation Statistics and Availability Performance Assessment for Simulated Terrestrial Hybrid FSO/RF System

    Directory of Open Access Journals (Sweden)

    Fiser Ondrej

    2011-01-01

    Long-term monthly and annual statistics of the attenuation of electromagnetic waves, obtained from 6 years of measurements on a free space optical path 853 meters long with a wavelength of 850 nm and on a precisely parallel radio path with a frequency of 58 GHz, are presented. All the attenuation events observed are systematically classified according to the hydrometeor type causing the particular event. Monthly and yearly propagation statistics on the free space optical path and the radio path are obtained, and the influence of individual hydrometeors on attenuation is analysed. The obtained propagation statistics are compared to statistics calculated using ITU-R models. The calculated attenuation statistics at both 850 nm and 58 GHz underestimate the measured statistics for higher attenuation levels. The availability performance of a simulated hybrid FSO/RF system is analysed based on the measured data.
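The availability gain of a hybrid link that switches to whichever path is currently less attenuated can be sketched on simulated attenuation series; the distributions and the 20 dB margin below are assumptions for illustration, not the measured data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # simulated time samples

# Illustrative attenuation time series (dB): fog degrades the optical path
# most strongly, rain the millimetre-wave radio path
fso = rng.lognormal(mean=1.0, sigma=1.0, size=n)
rf  = rng.lognormal(mean=0.5, sigma=0.6, size=n)

margin = 20.0  # assumed link margin in dB
avail_fso = np.mean(fso < margin)
avail_rf = np.mean(rf < margin)
# Hybrid system uses whichever path currently has the lower attenuation
avail_hybrid = np.mean(np.minimum(fso, rf) < margin)

print(f"FSO {avail_fso:.4f}, RF {avail_rf:.4f}, hybrid {avail_hybrid:.4f}")
```

Because the hybrid link is down only when both paths exceed the margin simultaneously, its availability is never worse than either path alone, and the gain is largest when the two outage mechanisms (fog vs rain) are weakly correlated.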

  12. Stable isotope variations in benthic primary producers along the Bosphorus (Turkey): A preliminary study

    International Nuclear Information System (INIS)

    Calizza, Edoardo; Aktan, Yelda; Costantini, Maria Letizia; Rossi, Loreto

    2015-01-01

    Highlights: • Nitrogen pollution along the Bosphorus Strait was investigated. • C and N isotopic and elemental analyses of benthic primary producers were performed. • δ15N decreased, while δ13C and N% increased from north to south along the Strait. • Ulva lactuca was more useful than epiphytes as an indicator of nitrogen pollution. • Preliminary isotopic analyses of resident organisms are useful monitoring tools. - Abstract: The Bosphorus Strait is a dynamic and complex system. Recent evidence showed nitrogen and heavy metal concentrations to follow opposite patterns across the Strait, suggesting a complex spatial organisation of the anthropogenic disturbance in this system. Here, we provide isotopic information on the origin and transport of dissolved nitrogen along the Bosphorus. C and N isotopic and elemental analyses were performed on specimens of Ulva lactuca and associated epiphytes sampled at five locations across the Strait. Variations in C and N isotopic signatures were observed in U. lactuca, pointing to a decrease in the availability of anthropogenic dissolved organic nitrogen along a north-south direction. Conversely, epiphytes did not show isotopic or elemental patterns across the Strait. These results suggest that preliminary stable isotope surveys in extended coastal systems based on U. lactuca can represent a valuable tool to identify meaningful targets and hypotheses for pollution studies in the Mediterranean region.

  13. Serum Ionized Calcium Quantification for Staging Canine Periodontal Disease: A Preliminary Study.

    Science.gov (United States)

    Miguel Carreira, L; Daniela, Dias; Pedro, Azevedo

    2015-06-01

    Periodontal diseases (PD) are infectious, inflammatory, progressive diseases of the oral cavity affecting people and dogs. PD takes 2 forms: gingivitis and periodontitis. Diagnosing or staging PD can be achieved only with dental x-rays and periodontal probing, both of which require the use of general anesthesia in dogs. This study aimed to determine whether serum ionized calcium ([iCa(2+)]) levels can be useful in preliminary PD staging in dogs. A sample of 40 dogs (n = 40) was divided into 4 groups (n = 10 each) based on the following PD stages: G1 (gingivitis), G2 (initial periodontitis), G3 (moderate periodontitis), and G4 (severe periodontitis). The groups were then subjected to [iCa(2+)] quantification. Statistically significant differences were observed between PD stages and [iCa(2+)] for all stages except G3 and G4. Therefore, this parameter can be used as an additional tool to establish and monitor preliminary PD status. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Statistical analysis of thermal conductivity of nanofluid containing ...

    Indian Academy of Sciences (India)

    Thermal conductivity measurements of nanofluids were analysed via two-factor completely randomized design and comparison of data means is carried out with Duncan's multiple-range test. Statistical analysis of experimental data show that temperature and weight fraction have a reasonable impact on the thermal ...

  15. Inferential Statistics in "Language Teaching Research": A Review and Ways Forward

    Science.gov (United States)

    Lindstromberg, Seth

    2016-01-01

    This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…

  16. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  17. Monitoring and Evaluation; Statistical Support for Life-cycle Studies, 2003 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, John

    2003-12-01

    This report summarizes the statistical analysis and consulting activities performed under Contract No. 00004134, Project No. 199105100, funded by Bonneville Power Administration during 2003. These efforts are focused on providing real-time predictions of outmigration timing, assessment of life-history performance measures, evaluation of status and trends in recovery, and guidance on the design and analysis of Columbia Basin fish and wildlife monitoring and evaluation studies. The overall objective of the project is to provide BPA and the rest of the fisheries community with statistical guidance on the design, analysis, and interpretation of monitoring data, which will lead to improved monitoring and evaluation of salmonid mitigation programs in the Columbia/Snake River Basin. This overall goal is being accomplished by making fisheries data readily available for public scrutiny, providing statistical guidance on the design and analyses of studies through hands-on support and written documents, and providing real-time analyses of tagging results during the smolt outmigration for review by decision makers. For a decade, this project has been providing in-season projections of smolt outmigration timing to assist in spill management. As many as 50 different fish stocks at 8 different hydroprojects are tracked in real time to predict the 'percent of run to date' and the 'date to specific percentile'. The project also conducts added-value analyses of historical tagging data to understand relationships between fish responses, environmental factors, and anthropogenic effects. The statistical analysis of historical tagging data crosses agency lines in order to assimilate information on salmon population dynamics irrespective of origin. The lessons learned from past studies are used to improve the design and analyses of future monitoring and evaluation efforts. Through these efforts, the project attempts to provide the fisheries community with reliable analyses
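Retrospectively, the 'percent of run to date' statistic is a simple cumulative calculation, sketched below on invented daily counts; in-season use additionally requires forecasting the unknown season total, which is the harder modelling problem the project addresses:

```python
import numpy as np

# Hypothetical daily smolt counts at one hydroproject over a short season
counts = np.array([0, 2, 5, 12, 30, 55, 40, 22, 9, 4, 1])
cum = np.cumsum(counts)
pct_to_date = 100 * cum / cum[-1]  # 'percent of run to date'

# 'date to specific percentile': first day index by which a given
# percentage of the run has passed (here the median)
date_50 = int(np.searchsorted(pct_to_date, 50))

print("percent of run to date:", pct_to_date.round(1))
print("median passage day index:", date_50)
```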

  18. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, in such a way that annual LFS data could be enlarged by detailed information from the AES in five-year periods. This article describes the reliability of statistical data about non-formal education. This analysis is usually connected with sampling and non-sampling errors.

  19. The Chandra Source Catalog: Statistical Characterization

    Science.gov (United States)

    Primini, Francis A.; Nowak, M. A.; Houck, J. C.; Davis, J. E.; Glotfelty, K. J.; Karovska, M.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Evans, I. N.; Evans, J. D.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) will ultimately contain more than ~250,000 X-ray sources in a total area of ~1% of the entire sky, using data from ~10,000 separate ACIS and HRC observations of a multitude of different types of X-ray sources (see Evans et al., this conference). In order to maximize the scientific benefit of such a large, heterogeneous dataset, careful characterization of the statistical properties of the catalog, i.e., completeness, sensitivity, false source rate, and accuracy of source properties, is required. Our characterization efforts include both extensive simulations of blank-sky and point source datasets and detailed comparisons of CSC results with those of other X-ray and optical catalogs. We present here a summary of our characterization results for CSC Release 1 and preliminary plans for future releases. This work is supported by NASA contract NAS8-03060 (CXC).

  20. Microvariability in AGNs: study of different statistical methods - I. Observational analysis

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Carpintero, D. D.; Romero, G. E.; Combi, J. A.

    2017-05-01

    We present the results of a study of different statistical methods currently used in the literature to analyse the (micro)variability of active galactic nuclei (AGNs) from ground-based optical observations. In particular, we focus on the comparison between the results obtained by applying the so-called C and F statistics, which are based on the ratio of standard deviations and variances, respectively. The motivation for this is that the implementation of these methods leads to different and contradictory results, making the variability classification of the light curves of a certain source dependent on the statistics implemented. For this purpose, we re-analyse the results on an AGN sample observed over several sessions with the 2.15 m 'Jorge Sahade' telescope (CASLEO), San Juan, Argentina. For each AGN, we constructed the nightly differential light curves. We thus obtained a total of 78 light curves for 39 AGNs, and we then applied the statistical tests mentioned above, in order to re-classify the variability state of these light curves and in an attempt to find the suitable statistical methodology to study photometric (micro)variations. We conclude that, although the C criterion is not a proper statistical test, it could still be a suitable parameter to detect variability, and that its application allows us to obtain more reliable variability results, in contrast with the F test.
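As a concrete illustration, the C and F statistics as commonly defined in this literature (ratios of the standard deviations and of the variances of the target and comparison differential light curves) can be sketched as follows; the mock data, threshold convention, and function name are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy import stats

def c_and_f_statistics(target_dlc, comparison_dlc, alpha=0.05):
    """Compute the C and F variability statistics for one night.

    target_dlc:     differential light curve AGN - comparison star
    comparison_dlc: differential light curve comparison - control star
    Returns (C, F, f_critical); C is the ratio of sample standard
    deviations and F the corresponding ratio of variances.
    """
    sigma_t = np.std(target_dlc, ddof=1)
    sigma_c = np.std(comparison_dlc, ddof=1)
    C = sigma_t / sigma_c              # C >= 2.576 often taken as variable
    F = sigma_t**2 / sigma_c**2        # compared against an F critical value
    f_crit = stats.f.ppf(1 - alpha, len(target_dlc) - 1,
                         len(comparison_dlc) - 1)
    return C, F, f_crit

rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 0.01, 50)   # mock non-variable differential light curve
noisy = rng.normal(0.0, 0.05, 50)   # mock strongly variable light curve
C, F, f_crit = c_and_f_statistics(noisy, quiet)
print(C, F, f_crit)   # F >> f_crit flags the noisy curve as variable
```

Because F is simply C squared here, the two criteria rank light curves identically; they differ in the significance thresholds applied, which is the source of the contradictory classifications the paper examines.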

  1. Automated material accounting statistics system (AMASS)

    International Nuclear Information System (INIS)

    Messinger, M.; Lumb, R.F.; Tingey, F.H.

    1981-01-01

    In this paper the modeling and statistical analysis of measurement and process data for nuclear material accountability is readdressed under a more general framework than that provided in the literature. The result of this effort is a computer program (AMASS) which uses the algorithms and equations of this paper to accomplish the analyses indicated. The actual application of the method to process data is emphasized.

  2. A statistical analysis of electrical cerebral activity; Contribution a l'etude de l'analyse statistique de l'activite electrique cerebrale

    Energy Technology Data Exchange (ETDEWEB)

    Bassant, Marie-Helene

    1971-01-15

    The aim of this work was to study the statistical properties of the amplitude of the electroencephalographic signal. The experimental method is described (implantation of electrodes, acquisition and treatment of data). The program of the mathematical analysis is given (calculation of probability density functions, study of stationarity) and the validity of the tests discussed. The results concern ten rabbits. Strips of EEG were sampled during 40 s at very short intervals (500 μs). The probability density functions established for different brain structures (especially the dorsal hippocampus) and areas were compared during sleep, arousal and visual stimulation. Using a χ² test, it was found that the Gaussian distribution assumption was rejected in 96.7 per cent of the cases. For a given physiological state, there was no mathematical reason to reject the assumption of stationarity (in 96 per cent of the cases). (author)

  3. Statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.

    1975-10-01

    This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968, and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low levels of radioactivity is discussed. Applications of computer calculations for data processing are described.

  4. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). 
By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  5. PRELIMINARY FACTOR ANALYSIS OF VISUAL COGNITION AND MEMORY. STUDIES IN CINE-PSYCHOMETRY, FINAL REPORT, PART I.

    Science.gov (United States)

    SEIBERT, WARREN F.; AND OTHERS

    Preliminary analyses were undertaken to determine the potential contribution of motion picture films to factor analytic studies of human intellect. Of primary concern were the operations of cognition and memory, forming two of the five operation columns of Guilford's "Structure of Intellect." The core reference for the study was defined…

  6. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    Science.gov (United States)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars, by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  7. Book Trade Research and Statistics. Prices of U.S. and Foreign Published Materials; Book Title Output and Average Prices: 2001 Final and 2002 Preliminary Figures; Book Sales Statistics, 2002: AAP Preliminary Estimates; U.S. Book Exports and Imports:2002; Number of Book Outlets in the United States and Canada; Review Media Statistics.

    Science.gov (United States)

    Sullivan, Sharon G.; Grabois, Andrew; Greco, Albert N.

    2003-01-01

    Includes six reports related to book trade statistics, including prices of U.S. and foreign materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and numbers of books and other media reviewed by major reviewing publications. (LRW)

  8. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble, Renyi statistics is equivalent to Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, in the thermodynamic limit, Renyi statistics is also equivalent to Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e., the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that Renyi statistics arrives at the same thermodynamic relations as those stemming from Boltzmann-Gibbs statistics in this limit.
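The equivalence in the q → 1 limit can be made explicit with a standard form of the Renyi entropy (notation assumed here, with k the Boltzmann constant and p_i the microstate probabilities):

```latex
S^{R}_{q} \;=\; \frac{k}{1-q}\,\ln \sum_{i} p_{i}^{\,q},
\qquad
\lim_{q \to 1} S^{R}_{q} \;=\; -k \sum_{i} p_{i} \ln p_{i} \;=\; S_{\mathrm{BG}} ,
```

the limit following from L'Hôpital's rule, since both the numerator and the denominator vanish as q → 1 and the derivative of the numerator tends to k Σ_i p_i ln p_i.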

  9. Quantile regression for the statistical analysis of immunological data with many non-detects

    NARCIS (Netherlands)

    Eilers, P.H.C.; Roder, E.; Savelkoul, H.F.J.; Wijk, van R.G.

    2012-01-01

    Background Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical
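The record above is truncated, but the core idea — that quantile regression sidesteps non-detects as long as the estimated quantile lies above the detection limit — can be sketched with a pinball-loss fit; the simulated data, detection limit, and quantile level below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, x, y, tau):
    """Quantile (pinball) loss for a linear model y ~ beta0 + beta1 * x."""
    resid = y - (beta[0] + beta[1] * x)
    return np.sum(np.where(resid >= 0, tau * resid, (tau - 1) * resid))

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, n)
y_true = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, n)
lod = 0.5
y = np.maximum(y_true, lod)   # non-detects recorded at the detection limit

# Fit the 75th-percentile regression line by minimizing the pinball loss.
# As long as the conditional 0.75-quantile lies above the LOD for all x,
# the substituted non-detect values cannot bias the fit, which is exactly
# why quantile regression suits data with many non-detects.
tau = 0.75
res = minimize(pinball_loss, x0=[np.median(y), 0.0], args=(x, y, tau),
               method='Nelder-Mead',
               options={'maxiter': 5000, 'xatol': 1e-6, 'fatol': 1e-6})
intercept, slope = res.x
print(intercept, slope)   # slope recovered despite the censored values
```

An ANOVA or OLS fit on the same censored data would be biased by the substituted values; the quantile fit is not, provided the censored fraction stays below the fitted quantile.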

  10. Development of a statistical shape model of multi-organ and its performance evaluation

    International Nuclear Information System (INIS)

    Nakada, Misaki; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2010-01-01

    Existing statistical shape modeling methods for a single organ cannot take into account the correlation between neighboring organs. This study focuses on a level set distribution model and proposes two modeling methods for multiple organs that can take this correlation into account. The first method combines the level set functions of multiple organs into a vector and then analyses the distribution of these vectors over a training dataset by principal component analysis to build a statistical shape model of the multiple organs. The second method constructs a statistical shape model for each organ independently and assembles the component scores of the different organs in the training dataset into a vector; it then analyses the distribution of these vectors to build a statistical shape model of the multiple organs. This paper shows results of applying the proposed methods, trained on 15 abdominal CT volumes, to 8 unknown CT volumes. (author)
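A minimal sketch of the first method (concatenating level set vectors and applying PCA), with made-up array sizes and mock data standing in for real level set functions:

```python
import numpy as np

# Concatenate the flattened level set functions of K organs per subject
# into one long vector, then run PCA across the training set so that
# inter-organ correlation is captured in the principal modes.
# All sizes and the mock data below are invented for illustration.
rng = np.random.default_rng(0)
n_subjects, voxels_per_organ, n_organs = 15, 1000, 2
base = rng.normal(size=(n_subjects, voxels_per_organ))
organ_a = base + 0.1 * rng.normal(size=base.shape)
organ_b = 0.8 * base + 0.1 * rng.normal(size=base.shape)  # correlated with A
X = np.hstack([organ_a, organ_b])          # (subjects, K * voxels)

mean_shape = X.mean(axis=0)
Xc = X - mean_shape
# PCA via SVD of the centered data matrix
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenvalues = S**2 / (n_subjects - 1)
explained = eigenvalues / eigenvalues.sum()

# A new multi-organ shape is generated from a few component scores b;
# varying b[0] moves BOTH organs coherently, which a per-organ model
# could not do.
b = np.zeros(len(S))
b[0] = 2.0 * np.sqrt(eigenvalues[0])
new_shape = mean_shape + Vt.T @ b
print(explained[:3])
```

The second method in the abstract would instead run this PCA per organ and then apply a second PCA to the stacked per-organ component scores.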

  11. Which statistics should tropical biologists learn?

    Science.gov (United States)

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions the need for efficient, good-quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during one year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation, and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well-designed one-semester course should be enough for their basic requirements.
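Most of the 12 procedures listed are one-liners in standard scientific software; a brief sketch using SciPy, with invented "site" measurements standing in for real field data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
site_a = rng.normal(10.0, 2.0, 30)   # e.g. mock leaf sizes at site A
site_b = rng.normal(12.0, 2.0, 30)
site_c = rng.normal(11.0, 2.0, 30)

# Student's t test (two independent samples)
t, p_t = stats.ttest_ind(site_a, site_b)
# One-way ANOVA across three sites
f, p_f = stats.f_oneway(site_a, site_b, site_c)
# Mann-Whitney U (non-parametric alternative to the t test)
u, p_u = stats.mannwhitneyu(site_a, site_b)
# Kruskal-Wallis (non-parametric alternative to one-way ANOVA)
h, p_h = stats.kruskal(site_a, site_b, site_c)
# Pearson and Spearman correlation
r, p_r = stats.pearsonr(site_a, site_b)
rho, p_rho = stats.spearmanr(site_a, site_b)
# Chi-square test on a 2x2 contingency table (counts invented)
chi2, p_c, dof, expected = stats.chi2_contingency([[20, 10], [12, 18]])
print(p_t, p_f, p_u, p_h)
```

This supports the article's point: the mechanics are trivial in free software, so teaching effort is better spent on choosing the right test and interpreting it biologically.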

  12. HLA region excluded by linkage analyses of early onset periodontitis

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.; Wang, S.; Lopez, N.

    1994-09-01

    Previous studies suggested that HLA genes may influence susceptibility to early-onset periodontitis (EOP). Segregation analyses indicate that EOP may be due to a single major gene. We conducted linkage analyses to assess possible HLA effects on EOP. Fifty families with two or more close relatives affected by EOP were ascertained in Virginia and Chile. A microsatellite polymorphism within the HLA region (at the tumor necrosis factor beta locus) was typed using PCR. Linkage analyses used a dominant model most strongly supported by previous studies. Assuming locus homogeneity, our results exclude a susceptibility gene within 10 cM on either side of our marker locus. This encompasses all of the HLA region. Analyses assuming alternative models gave qualitatively similar results. Allowing for locus heterogeneity, our data still provide no support for HLA-region involvement. However, our data do not statistically exclude (LOD < -2.0) hypotheses of disease-locus heterogeneity, including models where up to half of our families could contain an EOP disease gene located in the HLA region. This is due to the limited power of even our relatively large collection of families and the inherent difficulties of mapping genes for disorders that have complex and heterogeneous etiologies. Additional statistical analyses, recruitment of families, and typing of flanking DNA markers are planned to more conclusively address these issues with respect to the HLA region and other candidate locations in the human genome. Additional results for markers covering most of the human genome will also be presented.
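The LOD < -2.0 exclusion criterion mentioned above can be illustrated with a simple two-point LOD score for fully informative meioses; this is a textbook-style sketch, not the pedigree likelihood analysis the study actually performed:

```python
import math

def lod_score(recombinants, non_recombinants, theta):
    """Two-point LOD score for fully informative meioses.

    Compares the likelihood of the observed recombinant /
    non-recombinant counts at recombination fraction theta
    against the null of free recombination (theta = 0.5).
    """
    n = recombinants + non_recombinants
    log_l_theta = (recombinants * math.log10(theta)
                   + non_recombinants * math.log10(1 - theta))
    log_l_null = n * math.log10(0.5)
    return log_l_theta - log_l_null

# 10 informative meioses, half recombinant: strong evidence AGAINST
# tight linkage (theta = 0.05); LOD below -2 is the conventional
# exclusion threshold cited in the abstract.
print(lod_score(5, 5, 0.05))    # strongly negative -> exclusion
print(lod_score(0, 10, 0.05))   # positive -> support for linkage
```

With heterogeneous etiologies, a pooled LOD near zero can mask a linked subset of families, which is why the authors cannot statistically exclude heterogeneity models.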

  13. BIPS-FS preliminary design, miscellaneous notes

    International Nuclear Information System (INIS)

    1976-01-01

    A compendium of flight system preliminary design internal memos and progress report extracts for the Brayton Isotope Power System Preliminary Design Review to be held July 20, 21, and 22, 1975 is presented. The purpose is to bring together those published items which relate only to the preliminary design of the Flight System, Task 2 of Phase I. This preliminary design effort was required to ensure that the Ground Demonstration System will represent the Flight System as closely as possible.

  14. Robustness assessments are needed to reduce bias in meta-analyses that include zero-event randomized trials

    DEFF Research Database (Denmark)

    Keus, F; Wetterslev, J; Gluud, C

    2009-01-01

    OBJECTIVES: Meta-analysis of randomized trials with binary data can use a variety of statistical methods. Zero-event trials may create analytic problems. We explored how different methods may impact inferences from meta-analyses containing zero-event trials. METHODS: Five levels of statistical methods are identified for meta-analysis with zero-event trials, leading to numerous data analyses. We used the binary outcomes from our Cochrane review of randomized trials of laparoscopic vs. small-incision cholecystectomy for patients with symptomatic cholecystolithiasis to illustrate the influence of statistical method on inference. RESULTS: In seven meta-analyses of seven outcomes from 15 trials, there were zero-event trials in 0 to 71.4% of the trials. We found inconsistency in significance in one of seven outcomes (14%; 95% confidence limit 0.4%-57.9%). There was also considerable variability…
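One common device for the zero-event problem is the continuity correction, one of several strategies such reviews compare; a hedged sketch of inverse-variance pooling of odds ratios with an assumed 0.5 correction (all trial counts invented):

```python
import math

def or_with_correction(a, b, c, d, cc=0.5):
    """Log odds ratio and its variance for one 2x2 trial
    (a,b = events/non-events in treatment; c,d = control),
    adding a continuity correction cc to every cell when any
    cell is zero. This is only one of several zero-event
    strategies, and the choice can change the pooled inference."""
    if 0 in (a, b, c, d):
        a, b, c, d = a + cc, b + cc, c + cc, d + cc
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return log_or, var

# Invented trials; the second has a zero-event treatment arm.
trials = [(3, 47, 6, 44), (0, 30, 2, 28), (5, 95, 9, 91)]
num = den = 0.0
for a, b, c, d in trials:
    log_or, var = or_with_correction(a, b, c, d)
    w = 1 / var                       # fixed-effect inverse-variance weight
    num += w * log_or
    den += w
pooled_or = math.exp(num / den)
se = math.sqrt(1 / den)
ci = (math.exp(num / den - 1.96 * se), math.exp(num / den + 1.96 * se))
print(pooled_or, ci)
```

Rerunning the same pooling with cc = 0.1, with treatment-arm-weighted corrections, or after simply dropping the zero-event trial is exactly the kind of robustness assessment the paper calls for.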

  15. 45 CFR 150.217 - Preliminary determination.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Preliminary determination. 150.217 Section 150.217... Are Failing To Substantially Enforce HIPAA Requirements § 150.217 Preliminary determination. If, at... designees). (b) Notifies the State of CMS's preliminary determination that the State has failed to...

  16. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    Science.gov (United States)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations taken between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400 Å slit-jaw images, we identified 581 EEs. We found that most EEs last less than 20 min and have a spatial scale along the slit of less than 10″, in agreement with measurements in previous work. We observed most EEs in active regions, regardless of the date of observation, but a selection bias of IRIS observations cannot be ruled out. We also present preliminary findings on optical depth effects from our statistical study.

  17. Second preliminary design of JAERI experimental fusion reactor (JXFR)

    International Nuclear Information System (INIS)

    Sako, Kiyoshi; Tone, Tatsuzo; Seki, Yasushi; Iida, Hiromasa; Yamato, Harumi

    1979-06-01

    A second preliminary design of a tokamak experimental fusion reactor to be built in the near future has been performed. The design covers the overall reactor system, including plasma characteristics, reactor structure, blanket neutronics, radiation shielding, superconducting magnets, the neutral beam injector, the electric power supply system, the fuel recirculating system, the reactor cooling and tritium recovery systems, and the maintenance scheme. Safety analyses of the reactor system have also been performed. This paper gives a brief description of the design as of January 1979. The feasibility of raising the power density has also been studied and is presented in the appendix. (author)

  18. The prevalence of asthma work relatedness: Preliminary data

    Directory of Open Access Journals (Sweden)

    Wojciech Dudek

    2015-12-01

    Objectives: About 5–10% of asthmatics do not respond well to a standard treatment plan. Occupational exposure may be one of the factors linked with treatment failure. The aim of the study was to assess the prevalence of work-related asthma (WRA) among adult asthmatics under follow-up in an outpatient allergy clinic and to create a useful tool for detecting individuals with possible WRA. Material and Methods: A preliminary 5-question questionnaire designed to recognize WRA was presented to 300 asthmatics. All patients with positive preliminary verification, along with 50 subjects from the control group, were asked to fill out a detailed questionnaire. WRA was diagnosed by a positive match for asthma symptoms in combination with workplace exposure indicated in the detailed WRA questionnaire, followed by confirmation of each WRA case by detailed exposure analysis. Results: Work-related asthma was recognized in 63 subjects (21% of the study group). The preliminary questionnaire has 76.9% sensitivity and 94% specificity in recognition of WRA. Occupational exposure to irritants is a risk factor for WRA recognition (relative risk (RR) = 2.09 (1.44–3.03)). Working in an exposure-free environment is a factor against WRA recognition (RR = 0.38 (0.24–0.61)). Among subjects with work-related asthma, an uncontrolled course of the disease is significantly more frequent (p = 0.012). Subjects with WRA report sickness absenteeism due to asthma more often than those without WRA (9.6% vs. 3.2%, respectively), but the observed differences did not reach statistical significance. Conclusions: The short 5-question questionnaire seems to be a promising tool for detecting individuals with possible work-related asthma in the outpatient setting for further evaluation and additional attention.
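The screening-performance figures can be reproduced from a 2×2 table; the cell counts below are hypothetical, chosen only to match the reported 76.9% sensitivity (10/13) and 94% specificity (47/50), and the exposure counts used for the relative risk are likewise invented:

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk (RR) of an outcome for exposed vs. unexposed subjects."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity of a screening questionnaire
    from true/false positives and negatives."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion-table counts reproducing the reported
# screening performance (10/13 = 76.9%, 47/50 = 94%).
sensitivity, specificity = sens_spec(tp=10, fn=3, tn=47, fp=3)

# Invented exposure counts illustrating an RR above 1 (a risk factor)
rr = relative_risk(exposed_cases=30, exposed_total=100,
                   unexposed_cases=10, unexposed_total=70)
print(round(sensitivity, 3), specificity, round(rr, 2))
```

A high-specificity, moderate-sensitivity questionnaire like this is typical for a first-pass screen: few false referrals for detailed workup, at the cost of missing some true WRA cases.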

  19. Quality control and conduct of genome-wide association meta-analyses

    DEFF Research Database (Denmark)

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC...

  20. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively based genetic processes, increasingly at the forefront of modern genetics. To this end, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. The sample comprised 42 students, and supplementary interviews were administered to 9 selected students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as in the genetics syllabi used by instructors, do not help the issue. The textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also true of the genetics syllabi reviewed for this study. Nonetheless

  1. Preliminary Survey on Consumption of Moringa Products for ...

    African Journals Online (AJOL)

    Descriptive statistics, binary logistic regression and non-parametric correlation analyses were employed to achieve the study's objectives. The results indicated that a fairly high proportion of the respondents (48%) had used Moringa products for their claimed nutraceutical benefits. Lack of awareness was a major barrier to ...

  2. Evaluation and application of summary statistic imputation to discover new height-associated loci.

    Science.gov (United States)

    Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán

    2018-05-01

    As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discovering the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43, and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian
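The core of summary statistics imputation is predicting the z-score of an untyped variant from the z-scores of typed neighbors via a reference LD (correlation) matrix; a toy numpy sketch, in which the function name, the ridge term, and the LD values are assumptions for illustration rather than the authors' implementation:

```python
import numpy as np

def impute_z(z_typed, C_tt, C_ut, lam=1e-3):
    """Impute z-scores of untyped variants from typed ones:
        z_hat = C_ut @ inv(C_tt + lam * I) @ z_typed
    C_tt: LD among typed variants (reference panel)
    C_ut: LD between untyped and typed variants
    lam:  small ridge term regularizing the reference LD matrix.
    Also returns an r2-style imputation-quality estimate."""
    A = C_tt + lam * np.eye(C_tt.shape[0])
    w = np.linalg.solve(A, z_typed)
    z_hat = C_ut @ w
    # variance of the untyped z explained by the typed variants
    r2 = np.einsum('ij,jk,ik->i', C_ut, np.linalg.inv(A), C_ut)
    return z_hat, r2

# Toy LD: the untyped variant is strongly correlated with the first
# typed variant, which carries a strong association signal (z = 4).
C_tt = np.array([[1.0, 0.2], [0.2, 1.0]])
C_ut = np.array([[0.9, 0.2]])
z_typed = np.array([4.0, 1.0])
z_hat, r2 = impute_z(z_typed, C_tt, C_ut)
print(z_hat, r2)   # z_hat tracks the correlated variant's z-score
```

Because this uses only summary-level z-scores and a public LD panel, it avoids reimputing individual-level genotypes, which is exactly the efficiency argument made in the abstract.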

  3. Exploratory shaft facility preliminary designs - Permian Basin

    International Nuclear Information System (INIS)

    1983-09-01

    The purpose of the Preliminary Design Report, Permian Basin, is to describe the preliminary design for an Exploratory Shaft Facility in the Permian Basin, Texas. This issue of the report describes the preliminary design for constructing the exploratory shaft using the Large Hole Drilling method of construction and outlines the estimates of probable construction cost. The report complements and summarizes the other documents that comprise the design at the preliminary stage of completion (December 1982); these include drawings, cost estimates and schedules. The preliminary design drawing package, which includes the construction schedule drawing, depicts the descriptions in this report. For reference, a list of the drawing titles and corresponding numbers is included in the Appendix. The report is divided into three principal sections: Design Basis, Facility Description, and Construction Cost Estimate. 30 references, 13 tables

  4. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  5. Statistical analysis of the potassium concentration obtained through

    International Nuclear Information System (INIS)

    Pereira, Joao Eduardo da Silva; Silva, Jose Luiz Silverio da; Pires, Carlos Alberto da Fonseca; Strieder, Adelir Jose

    2007-01-01

    The present work was developed in outcrops of the Santa Maria region, southern Brazil, Rio Grande do Sul State. Statistical evaluations were applied to different rock types. The possibility of distinguishing different geologic units, sedimentary and volcanic (acid and basic types), by means of statistical analyses of airborne gamma-ray spectrometry data, integrating potassium radiation emissions with geological and geochemical data, is discussed. The survey was carried out in 1973 by the Geological Survey of Brazil/Companhia de Pesquisas de Recursos Minerais. The Camaqua Project evaluated the behavior of potassium concentrations, generating XYZ (Geosoft 1997 format), grid, thematic map and digital thematic map files for the total area. Using this database, the integration of statistical analyses was tested in sedimentary formations belonging to the Depressao Central do Rio Grande do Sul and in volcanic rocks from the Planalto da Serra Geral at the border of the Parana Basin. A univariate statistical model was used: the mean, the standard error of the mean, and the confidence limits were estimated. Tukey's test was used to compare mean values. The results allowed criteria to be created for distinguishing geological formations based on their potassium content. The back-calibration technique was employed to transform K radiation to percentage. In this context it was possible to define characteristic values of radioactive potassium emissions and their confidence ranges in relation to geologic formations. The potassium variable, when evaluated against the geographic Universal Transverse Mercator coordinate system, showed a spatial relation following a second-order polynomial model, with a corresponding coefficient of determination. The Statistica 7.1 software (General Linear Models), produced by the Statistics Department of the Federal University of Santa Maria, Brazil, was used. (author)
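The univariate quantities named above (mean, standard error of the mean, confidence limits) are straightforward to compute; a sketch with mock potassium readings (the study used Tukey's test for comparing unit means; a two-sample t test stands in here as the simplest pairwise comparison):

```python
import numpy as np
from scipy import stats

# Mock % K readings for two geologic units; values are invented.
rng = np.random.default_rng(7)
sedimentary = rng.normal(2.5, 0.4, 40)
volcanic_acid = rng.normal(3.4, 0.5, 40)

def mean_sem_ci(x, conf=0.95):
    """Mean, standard error of the mean, and t-based confidence limits."""
    m = np.mean(x)
    sem = stats.sem(x)
    lo, hi = stats.t.interval(conf, len(x) - 1, loc=m, scale=sem)
    return m, sem, (lo, hi)

m_sed, sem_sed, ci_sed = mean_sem_ci(sedimentary)
m_vol, sem_vol, ci_vol = mean_sem_ci(volcanic_acid)
# Pairwise comparison of unit means (Tukey's test in the study)
t, p = stats.ttest_ind(sedimentary, volcanic_acid)
print(m_sed, ci_sed, m_vol, ci_vol, p)
```

Non-overlapping confidence limits of this kind are the "characteristic value" criterion the abstract describes for separating formations by potassium content.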

  6. USE OF THE SIMPLE LINEAR REGRESSION MODEL IN MACRO-ECONOMICAL ANALYSES

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2011-10-01

    Full Text Available The article presents the fundamental aspects of linear regression as a toolbox which can be used in macroeconomic analyses. The article describes the estimation of the parameters, the statistical tests used, and homoscedasticity and heteroskedasticity. The use of econometric instruments in macroeconomics is an important factor that guarantees the quality of the models, analyses, results and possible interpretations that can be drawn at this level.
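
The parameter estimation the abstract mentions can be illustrated with a minimal ordinary-least-squares sketch; the "macroeconomic" series here are synthetic and the variable names are assumptions.

```python
import numpy as np

# Minimal simple-linear-regression sketch on synthetic data.
rng = np.random.default_rng(42)
x = rng.normal(3.0, 1.0, 40)                   # e.g. investment growth, %
y = 1.5 + 0.8 * x + rng.normal(0.0, 0.3, 40)   # e.g. GDP growth, %

# OLS estimators: b1 = cov(x, y) / var(x), b0 = mean(y) - b1 * mean(x)
sxx = np.sum((x - x.mean()) ** 2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / sxx
b0 = y.mean() - b1 * x.mean()

# Standard error of the slope and the t statistic for H0: b1 = 0
resid = y - (b0 + b1 * x)
s2 = np.sum(resid ** 2) / (x.size - 2)
t_stat = b1 / np.sqrt(s2 / sxx)
print(round(b1, 2), round(t_stat, 1))
```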

  7. Reporting the results of meta-analyses: a plea for incorporating clinical relevance referring to an example.

    Science.gov (United States)

    Bartels, Ronald H M A; Donk, Roland D; Verhagen, Wim I M; Hosman, Allard J F; Verbeek, André L M

    2017-11-01

    The results of meta-analyses are frequently reported, but understanding and interpreting them is difficult for both clinicians and patients. Statistical significances are presented without referring to values that imply clinical relevance. This study aimed to use the minimal clinically important difference (MCID) to rate the clinical relevance of a meta-analysis. This study is a review of meta-analyses relating to a specific topic, the clinical results of cervical arthroplasty. The outcome measure used in the study was the MCID. We performed an extensive literature search of a series of meta-analyses evaluating a similar subject as an example. We searched PubMed and Embase through August 9, 2016, and found articles concerning meta-analyses of the clinical outcome of cervical arthroplasty compared with that of anterior cervical discectomy with fusion in cases of cervical degenerative disease. We evaluated the analyses for statistical significance and their relation to the MCID, where the MCID was defined based on results in similar patient groups and a similar disease entity reported in the literature. We identified 21 meta-analyses, only one of which referred to the MCID. However, the researchers used an inappropriate measurement scale and, therefore, an incorrect MCID. The majority of the articles we reviewed drew conclusions based on statistical differences instead of clinical relevance. We recommend introducing the concept of the MCID while reporting the results of a meta-analysis, as well as mentioning the explicit scale of the analyzed measurement. Copyright © 2017 Elsevier Inc. All rights reserved.
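
The recommended reporting practice reduces to a small check: test the pooled effect against the MCID as well as against zero. All numbers below are invented for illustration; the assumed MCID is not taken from the reviewed meta-analyses.

```python
# Hypothetical pooled mean difference with its 95% CI, and an assumed MCID
# on the same measurement scale.
effect, ci_low, ci_high = 3.0, 1.2, 4.8   # pooled difference and 95% CI
mcid = 7.5                                # assumed minimal clinically important difference

statistically_significant = not (ci_low <= 0.0 <= ci_high)  # CI excludes zero
clinically_relevant = abs(effect) >= mcid                   # effect reaches the MCID
print(statistically_significant, clinically_relevant)       # prints: True False
```

A result can thus be statistically significant yet clinically irrelevant, which is exactly the distinction the authors ask reporters of meta-analyses to make explicit.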

  8. Progressive statistics for studies in sports medicine and exercise science.

    Science.gov (United States)

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
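
Two of the recommendations above (reporting SD rather than SEM, and log-transforming to deal with nonuniform error) can be sketched on synthetic data:

```python
import numpy as np

# Synthetic, skewed (lognormal) sample standing in for a typical
# sports-science measurement; all values are illustrative only.
rng = np.random.default_rng(1)
sample = rng.lognormal(mean=2.0, sigma=0.5, size=100)

sd = sample.std(ddof=1)            # spread between subjects: report this
sem = sd / np.sqrt(sample.size)    # precision of the mean only
assert sem < sd                    # SEM is always the smaller, hence the advice

# Log transformation makes multiplicative error approximately uniform;
# effects then back-transform to percent differences.
logs = np.log(sample)
print(round(logs.mean(), 2), round(logs.std(ddof=1), 2))
```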

  9. Preliminary electrostatic and mechanical design of a SINGAP-MAMuG compatible accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Grando, L. [Consorzio RFX, Associazione EURATOM-ENEA sulla Fusione, Corso Stati Uniti 4, I-35127 Padova (Italy)], E-mail: luca.grando@igi.cnr.it; Dal Bello, S.; De Lorenzi, A. [Consorzio RFX, Associazione EURATOM-ENEA sulla Fusione, Corso Stati Uniti 4, I-35127 Padova (Italy); Pilan, N. [DIE, Universita di Padova, Via Gradenigo 6A, I-35100 Padova (Italy); Rizzolo, A.; Zaccaria, P. [Consorzio RFX, Associazione EURATOM-ENEA sulla Fusione, Corso Stati Uniti 4, I-35127 Padova (Italy)

    2009-06-15

    Each ITER NB injector shall provide 16.5 MW of auxiliary power by accelerating a deuterium beam across a voltage of -1 MV. At present two possible alternatives for the accelerator are considered: the reference design, based on the MAMuG electrostatic accelerator, where the total voltage is graded using five grids at intermediate steps of 200 kV, and the alternative concept, the SINGAP accelerator, for which the total voltage is held by one single gap. This paper focuses on a preliminary feasibility study of the integration of SINGAP accelerator grids into the support structure of a MAMuG-type accelerator; the review or design of new electrostatic shields to improve the voltage withstanding capability of the system and the preliminary design of the electrical and hydraulic connection routing from the bushing to the accelerator are also discussed. The electrostatic and mechanical analyses carried out to support the design are described in detail.

  10. Stuttering on function words in bilingual children who stutter: A preliminary study.

    Science.gov (United States)

    Gkalitsiou, Zoi; Byrd, Courtney T; Bedore, Lisa M; Taliancich-Klinger, Casey L

    2017-01-01

    Evidence suggests young monolingual children who stutter (CWS) are more disfluent on function than content words, particularly when produced in the initial utterance position. The purpose of the present preliminary study was to investigate whether young bilingual CWS present with this same pattern. The narrative and conversational samples of four bilingual Spanish- and English-speaking CWS were analysed. All four bilingual participants produced significantly more stuttering on function words compared to content words, irrespective of their position in the utterance, in their Spanish narrative and conversational speech samples. Three of the four participants also demonstrated more stuttering on function compared to content words in their narrative speech samples in English, but only one participant produced more stuttering on function than content words in her English conversational sample. These preliminary findings are discussed relative to linguistic planning and language proficiency and their potential contribution to stuttered speech.

  11. Preliminary safety evaluation, based on initial site investigation data. Planning document

    International Nuclear Information System (INIS)

    Hedin, Allan

    2002-12-01

    This report is a planning document for the preliminary safety evaluations (PSE) to be carried out at the end of the initial stage of SKB's ongoing site investigations for a deep repository for spent nuclear fuel. The main purposes of the evaluations are to determine whether earlier judgements of the suitability of the candidate area for a deep repository with respect to long-term safety hold up in the light of borehole data, and to provide feedback to continued site investigations and site-specific repository design. The preliminary safety evaluations will be carried out by a safety assessment group, based on a site model, being part of a site description, provided by a site modelling group, and a repository layout within that model suggested by a repository engineering group. The site model contains the geometric features of the site as well as properties of the host rock; several alternative interpretations of the site data will likely be suggested. The biosphere is also included in the site model. A first task for the PSE will be to compare the rock properties described in the site model to previously established criteria for a suitable host rock. This report gives an example of such a comparison. In order to provide more detailed feedback, a number of thermal, hydrological, mechanical and chemical analyses of the site will also be included in the evaluation. The selection of analyses is derived from the set of geosphere and biosphere analyses preliminarily planned for the comprehensive safety assessment named SR-SITE, which will be based on a complete site investigation. The selection is dictated primarily by the expected feedback to continued site investigations and by the availability of data after the PSE. The repository engineering group will consider several safety-related factors in suggesting a repository layout: thermal calculations will be made to determine a minimum distance between canisters avoiding canister surface temperatures above 100 deg C.

  12. Preliminary study of the thermo-hydraulic behaviour of the binary breeder reactor

    International Nuclear Information System (INIS)

    Silveira Luz, M. da; Ferreira, W.J.

    1984-06-01

    Continuing the development of the Binary Breeder Reactor, its physical configuration and the advantages of different types of spacers are analysed. In order to simulate the thermo-hydraulic behaviour and obtain data for a preliminary evaluation of the core geometry, the COBRA III C code was used to study the effects of the length and diameter of the fuel element, the coolant inlet temperature, the system pressure, the helicoidal pitch and the pitch-to-diameter ratio. (Author) [pt

  13. The practical impact of differential item functioning analyses in a health-related quality of life instrument

    DEFF Research Database (Denmark)

    Scott, Neil W; Fayers, Peter M; Aaronson, Neil K

    2009-01-01

    Differential item functioning (DIF) analyses are commonly used to evaluate health-related quality of life (HRQoL) instruments. There is, however, a lack of consensus as to how to assess the practical impact of statistically significant DIF results.

  14. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Science.gov (United States)

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The ordinal variable under this study is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both biological and statistical senses and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is a need to model variables that can only be assessed through an ordinal scale of values.
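
A minimal sketch of the ordered (cumulative) logit model, with hypothetical coefficients and cutpoints chosen only to show how a probability such as P(VDS ≥ 2) falls out of the latent-variable formulation:

```python
import numpy as np

# Ordered (cumulative) logit sketch. Coefficients and cutpoints are
# hypothetical, not fitted values from the surveys.
def ordered_logit_probs(eta, cutpoints):
    """Return P(VDS = k), k = 0..K, for linear predictor eta."""
    cuts = np.asarray(cutpoints, dtype=float)
    cdf = 1.0 / (1.0 + np.exp(-(cuts - eta)))      # P(VDS <= k)
    cdf = np.concatenate([cdf, [1.0]])
    return np.diff(np.concatenate([[0.0], cdf]))

beta_tbt, beta_size = 1.2, 0.3        # assumed effects of a TBT index and shell size
cutpoints = [-1.0, 0.5, 2.0, 3.5]     # assumed cutpoints for 5 VDS stages

eta = beta_tbt * 1.5 + beta_size * 0.8    # one hypothetical female
probs = ordered_logit_probs(eta, cutpoints)
p_vds_ge_2 = probs[2:].sum()              # OSPAR-style risk figure: P(VDS >= 2)
print(probs.round(3), round(p_vds_ge_2, 3))
```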

  15. Conceptual and statistical problems associated with the use of diversity indices in ecology.

    Science.gov (United States)

    Barrantes, Gilbert; Sandoval, Luis

    2009-09-01

    Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be robust for standardizing all samples to a common size. Even the simplest method, such as reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
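
The contrast drawn above can be made concrete: the Shannon-Wiener index collapses an abundance vector to one number, while rarefaction standardizes richness to a common sample size. The counts below are hypothetical.

```python
import math

# Hypothetical community: individuals counted per species.
counts = [50, 30, 10, 5, 3, 1, 1]
n_total = sum(counts)

# Shannon-Wiener index H' = -sum p_i ln p_i (one number; much information lost)
shannon = -sum((c / n_total) * math.log(c / n_total) for c in counts)

# Rarefaction: expected species richness in a random subsample of size n
def rarefied_richness(counts, n):
    total = sum(counts)
    return sum(1 - math.comb(total - c, n) / math.comb(total, n) for c in counts)

e_s20 = rarefied_richness(counts, 20)
print(round(shannon, 3), round(e_s20, 2))
```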

  16. Joint U.S./Russian Study on the Development of a Preliminary Cost Estimate of the SAFSTOR Decommissioning Alternative for the Leningrad Nuclear Power Plant Unit #1

    Energy Technology Data Exchange (ETDEWEB)

    SM Garrett

    1998-09-28

    The objectives of the two joint Russian/U.S. Leningrad Nuclear Power Plant (NPP) Unit #1 studies were the development of a safe, technically feasible, economically acceptable decommissioning strategy, and the preliminary cost evaluation of the developed strategy. The first study, resulting in the decommissioning strategy, was performed in 1996 and 1997. The preliminary cost estimation study, described in this report, was performed in 1997 and 1998. The decommissioning strategy study included the analyses of three basic RBMK decommissioning alternatives, refined for the Leningrad NPP Unit #1. The analyses covered the requirements for the planning and preparation phases as well as the decommissioning phases.

  17. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  18. Investigating salt frost scaling by using statistical methods

    DEFF Research Database (Denmark)

    Hasholt, Marianne Tange; Clemmensen, Line Katrine Harder

    2010-01-01

    A large data set comprising data for 118 concrete mixes on mix design, air void structure, and the outcome of freeze/thaw testing according to SS 13 72 44 has been analysed by use of statistical methods. The results show that with regard to mix composition, the most important parameter...

  19. Preliminary feasibility assessment for Earth-to-space electromagnetic (Railgun) launchers

    Science.gov (United States)

    Rice, E. E.; Miller, L. A.; Earhart, R. W.

    1982-01-01

    An Earth to space electromagnetic (railgun) launcher (ESRL) for launching material into space was studied. Potential ESRL applications were identified and initially assessed to formulate preliminary system requirements. The potential applications included nuclear waste disposal in space, Earth orbital applications, deep space probe launchers, atmospheric research, and boost of chemical rockets. The ESRL system concept consisted of two separate railgun launcher tubes (one at 20 deg from the horizontal for Earth orbital missions, the other vertical for solar system escape disposal missions) powered by a common power plant. Each 2040 m launcher tube is surrounded by 10,200 homopolar generator/inductor units to transmit the power to the walls. Projectile masses are 6500 kg for Earth orbital missions and 2055 kg for nuclear waste disposal missions. For the Earth orbital missions, the projectile requires a propulsion system, leaving an estimated payload mass of 650 kg. For the nuclear waste disposal in space mission, the high level waste mass was estimated at 250 kg. This preliminary assessment included technical, environmental, and economic analyses.

  20. Faculty Library and Its Users (Interactively and Statistically

    Directory of Open Access Journals (Sweden)

    Rozalija Marinković

    2011-12-01

    Full Text Available According to popular opinion, statistics is the sum of incorrect data. However, much can be learned and concluded from statistical reports. In our example of recording statistics in the library of the Faculty of Economics in Osijek the following records are kept: user categories, the frequency of borrowing, the structure of borrowing, use of the reading room, use of the database, etc. Based on these data and statistical processing, the following can be analysed: the total number of users, the number of users by the year of study, by the content of used professional and scientific literature, by the use of serial publications in the reading room, or by the work on computers and database usage. All these data are precious for determining the procurement policy of the library, observing frequency of use of particular categories of professional and scientific literature (textbooks, handbooks, professional and scientific monographs, databases and determining operation guidelines for library staff.

  1. Steady-state and accident analyses of PBMR with the computer code SPECTRA

    International Nuclear Information System (INIS)

    Stempniewicz, Marek M.

    2002-01-01

    The SPECTRA code is an accident analysis code developed at NRG. It is designed for thermal-hydraulic analyses of nuclear or conventional power plants. The code is capable of analysing the whole power plant, including reactor vessel, primary system, various control and safety systems, containment and reactor building. The aim of the work presented in this paper was to prepare a preliminary thermal-hydraulic model of PBMR for SPECTRA, and to perform steady-state and accident analyses. In order to assess SPECTRA's capability to model the PBMR reactors, a model of the INCOGEN system was prepared first. Steady-state and accident scenarios were analyzed for the INCOGEN configuration. Results were compared to the results obtained earlier with INAS and OCTOPUS/PANTHERMIX, and a good agreement was obtained. Accident analyses with the PBMR model also gave qualitatively good results. It is concluded that SPECTRA is a suitable tool for analyzing high-temperature reactors such as INCOGEN or PBMR (Pebble Bed Modular Reactor). Analyses of the INCOGEN and PBMR systems showed that in all analyzed cases the fuel temperatures remained within the acceptable limits; consequently there is no danger of release of radioactivity to the environment. It may be concluded that these are promising designs for future safe industrial reactors. (author)

  2. Statistical analysis of manufacturing defects on fatigue life of wind turbine casted Component

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei; Sørensen, John Dalsgaard; Mukherjee, Krishnendu

    2014-01-01

    Wind turbine components experience highly variable loads during their lifetime, and fatigue failure is a main failure mode of casted components during their design working life. The fatigue life is highly dependent on the microstructure (grain size and graphite form and size), number, type, location...... and size of defects in the casted components and is therefore rather uncertain and needs to be described by stochastic models. Uncertainties related to such defects influence prediction of the fatigue strengths and are therefore important in modelling and assessment of the reliability of wind turbine...... for the fatigue life, namely LogNormal and Weibull distributions. The statistical analyses are performed using the Maximum Likelihood Method and the statistical uncertainty is estimated. Further, stochastic models for the fatigue life obtained from the statistical analyses are used for illustration to assess...
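
A sketch of the Maximum Likelihood step for the LogNormal case, where the estimates have closed form (the mean and uncorrected standard deviation of the log lifetimes). The fatigue lives are simulated, and the design target is an assumption for illustration.

```python
import numpy as np
from math import erf, sqrt

# Simulated fatigue lives (cycles to failure); distribution parameters
# and the design target below are assumptions, not measured data.
rng = np.random.default_rng(7)
lives = rng.lognormal(mean=12.0, sigma=0.4, size=200)

# LogNormal ML estimates are closed-form: mean and (uncorrected) SD of logs
log_lives = np.log(lives)
mu_hat = log_lives.mean()
sigma_hat = log_lives.std(ddof=0)

# Example use: probability that fatigue life exceeds an assumed target exp(11.0)
z = (11.0 - mu_hat) / sigma_hat
p_survive = 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))   # 1 - Phi(z)
print(round(mu_hat, 2), round(sigma_hat, 2), round(p_survive, 3))
```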

  3. Nuclear modification factor using Tsallis non-extensive statistics

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, Sushanta; Garg, Prakhar; Kumar, Prateek; Sahoo, Raghunath [Indian Institute of Technology Indore, Discipline of Physics, School of Basic Sciences, Simrol (India); Bhattacharyya, Trambak; Cleymans, Jean [University of Cape Town, UCT-CERN Research Centre and Department of Physics, Rondebosch (South Africa)

    2016-09-15

    The nuclear modification factor is derived using Tsallis non-extensive statistics in relaxation time approximation. The variation of the nuclear modification factor with transverse momentum for different values of the non-extensive parameter, q, is also observed. The experimental data from RHIC and LHC are analysed in the framework of Tsallis non-extensive statistics in a relaxation time approximation. It is shown that the proposed approach explains the R{sub AA} of all particles over a wide range of transverse momentum but does not seem to describe the rise in R{sub AA} at very high transverse momenta. (orig.)
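
The Tsallis transverse-momentum distribution underlying this kind of analysis can be sketched as follows; the T and q values are illustrative, and the q → 1 limit is checked against the Boltzmann exponential it should recover.

```python
import numpy as np

# Tsallis transverse-momentum form,
# f(pT) = [1 + (q - 1) * pT / T] ** (-1 / (q - 1)); T and q are illustrative.
def tsallis(pt, q, T):
    return (1.0 + (q - 1.0) * pt / T) ** (-1.0 / (q - 1.0))

pt = np.linspace(0.0, 10.0, 101)   # GeV/c
T, q = 0.12, 1.10                  # assumed effective temperature and q parameter

f = tsallis(pt, q, T)

# Sanity check: as q -> 1 the Tsallis form reduces to exp(-pT/T) (Boltzmann)
boltzmann = np.exp(-pt / T)
near_limit = tsallis(pt, 1.0001, T)
print(float(np.max(np.abs(near_limit - boltzmann))))
```

The power-law tail for q > 1 is what lets this form describe hard spectra that a pure exponential cannot.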

  4. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  5. Probabilistic evaluation of scenarios in long-term safety analyses. Results of the project ISIBEL; Probabilistische Bewertung von Szenarien in Langzeitsicherheitsanalysen. Ergebnisse des Vorhabens ISIBEL

    Energy Technology Data Exchange (ETDEWEB)

    Buhmann, Dieter; Becker, Dirk-Alexander; Laggiard, Eduardo; Ruebel, Andre; Spiessl, Sabine; Wolf, Jens

    2016-07-15

    In the frame of the project ISIBEL, deterministic analyses of the radiological consequences of several possible developments of the final repository were performed (VSG: preliminary safety analysis of the Gorleben site). The report describes the probabilistic evaluation of the VSG scenarios using uncertainty and sensitivity analyses. It was shown that probabilistic analyses are important for evaluating the influence of uncertainties. The transfer of the selected scenarios into computational cases and the modeling parameters used are discussed.

  6. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
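
Inter-rater concordance of the kind reported above is quantified with Fleiss' kappa; a self-contained sketch with an invented rating matrix (5 raters, 4 diagnostic categories), not the study's data:

```python
import numpy as np

# Fleiss' kappa for multiple raters. Rows = scans, columns = diagnostic
# categories, entries = number of raters choosing that category.
def fleiss_kappa(ratings):
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[0]
    m = ratings[0].sum()                           # raters per item
    p_j = ratings.sum(axis=0) / (n_items * m)      # overall category proportions
    p_i = (np.sum(ratings ** 2, axis=1) - m) / (m * (m - 1))
    p_bar, p_e = p_i.mean(), np.sum(p_j ** 2)
    return (p_bar - p_e) / (1.0 - p_e)

ratings = np.array([
    [5, 0, 0, 0],   # all five raters agree on this scan
    [4, 1, 0, 0],
    [0, 4, 1, 0],
    [0, 0, 5, 0],
    [1, 1, 1, 2],   # poor agreement
])
kappa = fleiss_kappa(ratings)
print(round(kappa, 3))
```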

  7. Mathematical background and attitudes toward statistics in a sample of Spanish college students.

    Science.gov (United States)

    Carmona, José; Martínez, Rafael J; Sánchez, Manuel

    2005-08-01

    To examine the relation of mathematical background and initial attitudes toward statistics in Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. Analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for possible research are discussed.

  8. Chemical data and statistical analyses from a uranium hydrogeochemical survey of the Rio Ojo Caliente drainage basin, New Mexico. Part I. Water

    International Nuclear Information System (INIS)

    Wenrich-Verbeek, K.J.; Suits, V.J.

    1979-01-01

    This report presents the chemical analyses and statistical evaluation of 62 water samples collected in the north-central part of New Mexico near Rio Ojo Caliente. Both spring and surface-water samples were taken throughout the Rio Ojo Caliente drainage basin above and a few miles below the town of La Madera. A high U concentration (15 μg/l) found in the water of the Rio Ojo Caliente near La Madera, Rio Arriba County, New Mexico, during a regional sampling-technique study in August 1975 by the senior author, was investigated further in May 1976 to determine whether stream waters could be effectively used to trace the source of a U anomaly. A detailed study of the tributaries to the Rio Ojo Caliente, involving 29 samples, was conducted during a moderate discharge period, May 1976, so that small tributaries would contain water. This study isolated Canada de la Cueva as the tributary contributing the anomalous U, so that in May 1977, an extremely low discharge period due to the 1977 drought, an additional 33 samples were taken to further define the anomalous area. 6 references, 3 figures, 6 tables

  9. Family resilience – definition of construct and preliminary results of the Polish adaptation of the Family Resilience Assessment Scale (FRAS)

    Directory of Open Access Journals (Sweden)

    Natalia Nadrowska

    2017-05-01

    Full Text Available Background The article describes the construct of family resilience, with the main focus on Walsh's model. The aim of this article is to present preliminary results of the adaptation, reliability, and statistical analyses of the Family Resilience Assessment Scale (FRAS) for the Polish population. Participants and procedure Participants (n = 329, aged 18-35) completed the experimental Polish version of the FRAS (SPR – Skala Prężności Rodzinnej). In the adaptation procedure, the scale was translated into Polish and modified. The scale consists of the following subscales: Family Communication and Problem Solving, Utilizing Social and Economic Resources, Maintaining a Positive Outlook, Family Connectedness, Family Spirituality and Ability to Make Meaning of Adversity. Results The reliability of the experimental Polish version of the FRAS is satisfactory for the entire scale and five subscales; only the subscale Ability to Make Meaning of Adversity obtained a reliability of less than 0.7. Taking into account gender and the declared experience of difficult events, significant differences were observed on three subscales (Family Communication and Problem Solving, Family Connectedness, and Ability to Make Meaning of Adversity) and on the total FRAS scale. Conclusions Work on the questionnaire is still in progress and the results presented here should be considered preliminary. In the next steps, the number of men should be increased in order to perform confirmatory factor analysis. Future studies should take into account a number of factors and contexts (e.g. family structure, social and cultural context and the type of stressful event).
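
The reliability figures quoted above are typically Cronbach's alpha values; a sketch of that computation on synthetic item responses (the latent-trait setup and item count are assumptions, not the FRAS data):

```python
import numpy as np

# Cronbach's alpha (internal consistency) on synthetic responses; the
# sample size matches the study (n = 329) but everything else is invented.
def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(0.0, 1.0, (329, 1))            # shared trait per respondent
items = latent + rng.normal(0.0, 0.8, (329, 6))    # six correlated items
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```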

  10. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of the various time-windows used for the extraction of physiological features on the accuracy of such affect predictors.

  11. Analysis and classification of ECG-waves and rhythms using circular statistics and vector strength

    Directory of Open Access Journals (Sweden)

    Janßen Jan-Dirk

    2017-09-01

    Full Text Available The most common way to analyse heart rhythm is to calculate the RR-interval and the heart rate variability. For further evaluation, descriptive statistics are often used. Here we introduce a new and more natural heart rhythm analysis tool that is based on circular statistics and vector strength. Vector strength is a tool to measure the periodicity or lack of periodicity of a signal. We divide the signal into non-overlapping window segments and project the detected R-waves around the unit circle using the complex exponential function and the median RR-interval. In addition, we calculate the vector strength and apply circular statistics as well as an angular histogram to the R-wave vectors. This approach enables an intuitive visualization and analysis of rhythmicity. Our results show that ECG-waves and rhythms can be easily visualized, analysed and classified by circular statistics and vector strength.
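
    The projection described above (R-wave times mapped onto the unit circle via the complex exponential and the median RR-interval, with vector strength as the mean resultant length) can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' implementation, and the example beat times are invented:

    ```python
    import numpy as np

    def vector_strength(r_times, rr_median=None):
        """Project R-wave times onto the unit circle and return the
        vector strength (mean resultant length) plus phase angles."""
        r_times = np.asarray(r_times, dtype=float)
        if rr_median is None:
            rr_median = np.median(np.diff(r_times))  # median RR-interval as period
        # Phase of each R-wave: 2*pi * t / period, wrapped around the circle
        phases = 2 * np.pi * r_times / rr_median
        vectors = np.exp(1j * phases)                # complex exponential
        strength = np.abs(vectors.mean())            # 1.0 = perfectly periodic
        return strength, np.angle(vectors)

    # A perfectly regular rhythm yields a vector strength close to 1,
    # while irregular rhythms scatter the vectors and lower the strength.
    regular = np.arange(0, 10, 0.8)                  # R-waves every 0.8 s
    s, angles = vector_strength(regular)
    ```

    The angles returned by the sketch are what an angular histogram of the R-wave vectors would be built from.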

  12. Preliminary Concept of Operations for a Global Cylinder Identification and Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Whitaker, J. M. [ORNL; White-Horton, J. L. [ORNL; Morgan, J. B. [InSolves Associates

    2013-08-01

    This report describes a preliminary concept of operations for a Global Cylinder Identification and Monitoring System that could improve the efficiency of the International Atomic Energy Agency (IAEA) in conducting its current inspection activities and could substantially increase its ability to detect credible diversion scenarios and undeclared production pathways involving UF6 cylinders. There are concerns that a proliferant State with access to enrichment technology could obtain a cylinder containing natural or low-enriched uranium hexafluoride (UF6) and produce a significant quantity (SQ) of highly enriched uranium in as little as 30 days. The National Nuclear Security Administration (NNSA), through the Next Generation Safeguards Initiative, sponsored a multi-laboratory team to develop an integrated system that provides for detecting scenarios involving 1) diverting an entire declared cylinder for enrichment at a clandestine facility, 2) misusing a declared cylinder at a safeguarded facility, and 3) using an undeclared cylinder at a safeguarded facility. An important objective in developing this integrated system was to improve the timeliness of detecting the cylinder diversion and undeclared production scenarios. Developing this preliminary concept required in-depth analyses of current operational and safeguards practices at conversion, enrichment, and fuel fabrication facilities. The analyses evaluated the processing, movement, and storage of cylinders at the facilities; the movement of cylinders between facilities (including cylinder fabrication); and the misuse of safeguarded facilities.

  13. Predicting Geomorphic and Hydrologic Risks after Wildfire Using Harmonic and Stochastic Analyses

    Science.gov (United States)

    Mikesell, J.; Kinoshita, A. M.; Florsheim, J. L.; Chin, A.; Nourbakhshbeidokhti, S.

    2017-12-01

    Wildfire is a landscape-scale disturbance that often alters hydrological processes and sediment flux during subsequent storms. Vegetation loss from wildfire induces changes to sediment supply, such as channel erosion and sedimentation, and to streamflow magnitude, including flooding. These changes enhance downstream hazards, threatening human populations and physical aquatic habitat over various time scales. Using Williams Canyon, a basin burned by the Waldo Canyon Fire (2012), as a case study, we utilize deterministic and statistical modeling methods (Fourier series and first-order Markov chains) to assess pre- and post-fire geomorphic and hydrologic characteristics, including precipitation, enhanced vegetation index (EVI, a satellite-based proxy for vegetation biomass), streamflow, and sediment flux. Local precipitation, terrestrial Light Detection and Ranging (LiDAR) scanning, and satellite-based products are used for these time series analyses. We present a framework to assess the variability of periodic and nonperiodic climatic and multivariate trends to inform the development of a post-wildfire risk assessment methodology. To establish the extent to which a wildfire affects hydrologic and geomorphic patterns, a Fourier series was used to fit pre- and post-fire geomorphic and hydrologic characteristics to yearly temporal cycles and subcycles of 6, 4, 3, and 2.4 months. These cycles were analyzed using least-squares estimates of the harmonic coefficients, or amplitudes, of each sub-cycle's contribution to the overall behavior of the Fourier series. The stochastic variances of these characteristics were analyzed by composing first-order Markov models and through probabilistic analysis with direct likelihood estimates. Preliminary results highlight an increased dependence of monthly post-fire hydrologic characteristics on 12- and 6-month temporal cycles. This statistical and probabilistic analysis provides a basis to determine the impact of wildfires on the temporal dependence of
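
    The harmonic-fitting step described above (least-squares estimates of the amplitudes of the yearly cycle and its 6-, 4-, 3- and 2.4-month subcycles) can be illustrated with a small sketch. The monthly series below is synthetic and the function name is hypothetical; this is not the authors' code:

    ```python
    import numpy as np

    def fit_fourier(t, y, periods=(12, 6, 4, 3, 2.4)):
        """Least-squares fit of a Fourier series with the given periods
        (in months); returns each sub-cycle's amplitude and the fitted series."""
        cols = [np.ones_like(t)]                 # mean term
        for p in periods:
            w = 2 * np.pi / p
            cols += [np.cos(w * t), np.sin(w * t)]
        X = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        fitted = X @ coef
        # amplitude of each sub-cycle = sqrt(a^2 + b^2) of its cos/sin pair
        amps = {p: np.hypot(coef[1 + 2 * i], coef[2 + 2 * i])
                for i, p in enumerate(periods)}
        return amps, fitted

    t = np.arange(48.0)                          # 48 months of data
    y = 3 * np.cos(2 * np.pi * t / 12) + 1.5 * np.sin(2 * np.pi * t / 6)
    amps, fitted = fit_fourier(t, y)             # recovers the 12- and 6-month cycles
    ```

    Comparing the fitted amplitudes for pre- and post-fire windows is one way to quantify the increased dependence on the 12- and 6-month cycles noted in the abstract.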

  14. Preliminary Evaluation Of Commercial Supercapacitors For Space Applications

    Science.gov (United States)

    Gineste, Valery; Loup, Didier; Mattesco, Patrick; Neugnot, Nicolas

    2011-10-01

    Supercapacitors have been identified for years as a new technology enabling energy storage together with high-power delivery capability. A recent ESA study [1] led by Astrium has demonstrated the interest of these devices for space applications, provided that reliability and end-of-life performances are demonstrated. A realistic commercial off-the-shelf (COTS) approach (or one with limited design modifications approved by potential suppliers) has been favoured, as for batteries. This paper presents preliminary test results obtained by Astrium on COTS supercapacitors: accelerated life tests, calendar life tests and technology analyses. Based on these results, assessments and lessons learnt are drawn in view of future exhaustive supercapacitor validation and qualification.

  15. Preliminary nuclear power reactor technology qualitative assessment for Malaysia

    International Nuclear Information System (INIS)

    Shamsul Amri Sulaiman

    2011-01-01

    Since the major breakthrough of the world's first nuclear reactor on December 2, 1942, the nuclear power industry has undergone tremendous development and evolution for more than half a century. After surpassing the moratorium on nuclear power plant construction caused by the catastrophic accidents at Three Mile Island (1979) and Chernobyl (1986), nuclear energy is today back on the policy agendas of many countries, both developed and developing, signaling a nuclear revival or nuclear renaissance. The selection of a suitable nuclear power technology has thus received primary attention. This short paper attempts to draw a preliminary technology assessment for the first nuclear power reactor technology for Malaysia. The methodology employed is a qualitative analysis collating recent findings of the TNB-KEPCO preliminary feasibility study for a nuclear power program in Peninsular Malaysia and other published presentations and/or papers by multiple experts. The results suggest that the pressurized water reactor (PWR) is the prevailing technology in terms of numbers and plant performance, and while the commercialization of Generation IV reactors is remote (e.g. not until 2030), Generation III/III+ NPP models are commercially available on the market today. Five major steps involved in reactor technology selection were introduced, with a focus on the important aspects of the selection criteria. Three categories for the reactor technology selection were used for the cursory evaluation. The outcome of these analyses shall lead to deeper and full analyses of the recommended reactor technologies for a comprehensive feasibility study in the near future. Recommendations for the reactor technology option, both strategic and technical, were also provided. The paper also explores the best way to systematically select the first civilian nuclear power reactor. (Author)

  16. Statistical separability and the impossibility of the superluminal quantum communication

    International Nuclear Information System (INIS)

    Zhang Qiren

    2004-01-01

    The authors analyse the relation and the difference between the quantum correlation of two points in space and the communication between them. The statistical separability of two points in space is defined and proven. From this statistical separability, the authors prove that superluminal quantum communication between different points is impossible. To emphasize the compatibility between quantum theory and relativity, the authors write the von Neumann equation for density-operator evolution in multi-time form. (author)

  17. Polygenic scores via penalized regression on summary statistics.

    Science.gov (United States)

    Mak, Timothy Shin Heng; Porsch, Robert Milan; Choi, Shing Wan; Zhou, Xueya; Sham, Pak Chung

    2017-09-01

    Polygenic scores (PGS) summarize the genetic contribution of a person's genotype to a disease or phenotype. They can be used to group participants into different risk categories for diseases, and are also used as covariates in epidemiological analyses. A number of possible ways of calculating PGS have been proposed, and recently there has been much interest in methods that incorporate information available in published summary statistics. As there is no inherent information on linkage disequilibrium (LD) in summary statistics, a pertinent question is how we can use LD information available elsewhere to supplement such analyses. To answer this question, we propose a method for constructing PGS using summary statistics and a reference panel in a penalized regression framework, which we call lassosum. We also propose a general method for choosing the value of the tuning parameter in the absence of validation data. In our simulations, we showed that pseudovalidation often resulted in prediction accuracy comparable to using a dataset with validation phenotype, and was clearly superior to the conservative option of setting the tuning parameter of lassosum to its lowest value. We also showed that lassosum achieved better prediction accuracy than simple clumping and P-value thresholding in almost all scenarios. It was also substantially faster and more accurate than the recently proposed LDpred. © 2017 WILEY PERIODICALS, INC.
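
    As a point of reference for the baseline that lassosum is compared against, a minimal sketch of P-value thresholding is shown below: the score is the allele-count-weighted sum of GWAS effect sizes over SNPs passing a P-value cutoff. The genotype matrix and summary statistics are toy values, and lassosum itself goes further by adding LD information from a reference panel within a penalized regression:

    ```python
    import numpy as np

    def pgs_threshold(genotypes, betas, pvals, p_cut=0.05):
        """Baseline polygenic score by P-value thresholding.
        genotypes: (individuals x SNPs) allele counts in {0, 1, 2}.
        betas, pvals: per-SNP effect sizes and P-values from summary statistics."""
        keep = pvals < p_cut                      # select "significant" SNPs
        return genotypes[:, keep] @ betas[keep]   # per-individual score

    rng = np.random.default_rng(1)
    G = rng.integers(0, 3, size=(4, 6)).astype(float)   # 4 people x 6 SNPs (toy)
    beta = np.array([0.2, -0.1, 0.05, 0.3, 0.0, -0.25])
    p = np.array([0.001, 0.20, 0.04, 0.30, 0.90, 0.01])
    scores = pgs_threshold(G, beta, p)                  # one score per person
    ```

    Clumping, which the abstract pairs with thresholding, would additionally drop SNPs in LD with a more significant neighbour before summing.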

  18. Image statistics and nonlinear artifacts in composed transmission x-ray tomography

    International Nuclear Information System (INIS)

    Duerinckx, A.J.G.

    1979-01-01

    Knowledge of the image quality and image statistics in Computed Tomography (CT) images obtained with transmission x-ray CT scanners can increase the amount of clinically useful information that can be retrieved. Artifacts caused by nonlinear shadows are strongly object-dependent and are visible over large areas of the image. No simple technique exists for their complete elimination. One source of artifacts in the first-order statistics is the nonlinearities in the measured shadow or projection data used to reconstruct the image. One of the leading causes is the polychromaticity of the x-ray beam used in transmission CT scanners. Ways to improve the resulting image quality and techniques to extract additional information using dual-energy scanning are discussed. A unique formalism consisting of a vector representation of the material dependence of the photon-tissue interactions is generalized to allow an in-depth analysis. Poly-correction algorithms are compared using this analytic approach. Both quantum noise and detector electronic noise decrease the quality or information content of the first-order statistics. Preliminary results are presented using a heuristic adaptive nonlinear noise filter system for projection data. This filter system can be improved and/or modified to remove artifacts in both first- and second-order image statistics. Artifacts in the second-order image statistics arise from the contribution of quantum noise. This can be described with a nonlinear detection equivalent model, similar to the model used to study artifacts in the first-order statistics. When analyzing these artifacts in second-order statistics, one can divide them into linear artifacts, which do not present any problem of interpretation, and nonlinear artifacts, referred to as noise artifacts. A study of noise artifacts is presented together with a discussion of their relative importance in diagnostic radiology.

  19. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992

    International Nuclear Information System (INIS)

    1992-12-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume, Volume 2, contains the technical basis for the 1992 PA. Specifically, it describes the conceptual basis for consequence modeling and the PA methodology, including the selection of scenarios for analysis, the determination of scenario probabilities, and the estimation of scenario consequences using a Monte Carlo technique and a linked system of computational models. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with the long-term requirements of the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses related to the preliminary comparison with 40 CFR 191B. Volume 5 contains uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance. Finally, guidance derived from the entire 1992 PA is presented in Volume 6.

  20. Design update, thermal and fluid dynamic analyses of the EU-HCPB TBM in vertical arrangement

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Ilic, M.; Legradi, G.; Kiss, B.; Bitz, O.; Dolensky, B.; Neuberger, H.; Boccaccini, L.V.; Ihli, T.

    2009-01-01

    In the frame of the activities of the EU Breeder Blanket Programme and of the Test Blanket Working Group of ITER, the Helium Cooled Pebble Bed Test Blanket Module (HCPB TBM) is developed at Forschungszentrum Karlsruhe (FZK) to investigate DEMO-relevant concepts for blanket modules. The three main functions of a blanket module (removing heat, breeding tritium and shielding sensitive components from radiation) will be tested in ITER using a series of four TBMs, which are irradiated successively during different test campaigns. Each HCPB TBM will be installed, with a vertical orientation, into the vacuum vessel connected to one equatorial port. As the studies performed up to 2006 at FZK concerned a horizontal orientation of the HCPB TBM, a global review of the design is necessary to match the new ITER specifications. A preliminary version of the new vertical design is proposed, extrapolating the neutronic analysis performed for the horizontal HCPB TBM. An overview of the new HCPB TBM vertical design, as well as the preliminary thermal and fluid dynamic analyses performed for its validation, is presented in this paper. A critical review of the results obtained allows us, in conclusion, to prepare a plan for the future detailed analyses of the vertical HCPB TBM.

  1. Using GAIDA (Guide to AI Data Analysis) to analyze data collected from artificial insemination programmes for cattle in developing countries

    International Nuclear Information System (INIS)

    Goodger, W.J.; Clayton, M.; Bennett, T.; Eisele, C.; Garcia, M.; Perera, B.M.A.O.

    2001-01-01

    The objectives of AIDA (Artificial Insemination Database Application) and its companion GAIDA (Guide to AI Data Analysis) are to address two major problems in on-farm research on livestock production. The first is the quality of the data collected, and the second is the intellectual rigor of the analyses and their associated results when statistically testing causal hypotheses. The solution is to develop a data management system such as AIDA and an analysis system such as GAIDA to estimate parameters that explain biological mechanisms for on-farm application. The system uses epidemiological study designs in the uncontrolled research environment of the farm, uses a database manager (Microsoft Access) to handle the data management issues encountered in preparing data for analysis, and then uses a statistical program (SYSTAT) to perform preliminary analyses. These analyses enable the researcher to gain a better understanding of the biological mechanisms underlying the data contained within the AIDA database. Using GAIDA as a guide, this preliminary analysis helps to determine the strategy for further in-depth analyses. (author)

  2. Global Micro- and Macro-structural White Matter Alterations and the reward circuit in First-episode Antipsychotic-naïve Schizophrenia Patients. Preliminary Results

    DEFF Research Database (Denmark)

    Raghava, Jayachandra Mitta; Ebdrup, Bjørn Hylsebeck; Nielsen, Mette Ødegaard

    … hypothalamus separately. All tracking was performed using PROBTRACKX. Results Baseline: Voxel-wise statistics performed on the probabilistic WM fiber maps for whole brain identified deficiencies in patients, showing reduced macro-structural connectivity in: thalamus, putamen, pallidum, hippocampus, forceps … Voxel-wise statistics performed on the probabilistic WM fiber maps for whole brain showed significant interaction, indicating improvement in the connectivity within the patients in forceps major, right cingulum (hippocampus), right anterior thalamic radiation. Conclusions These preliminary results indicate…

  3. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  4. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  5. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more … statistical agencies and institutions to provide details of statistical activities … …ing several training programmes … …ful completion of Indian Statistical Service examinations, the …

  6. The Adolescent HIV Disclosure Cognition and Affect Scale: Preliminary Reliability and Validity.

    Science.gov (United States)

    Evangeli, Michael

    2017-07-01

    Globally, there are 2 million HIV-positive 10-19-year-olds. One challenge for this population is sharing their HIV status with others (onward HIV disclosure). There are no multi-item, multidimensional scales of HIV disclosure cognitions and affect for young people living with HIV. An 18-item measure of HIV disclosure cognition and affect was developed and administered to 65 adolescents living with HIV (aged 12-16 years). Data were explored using principal component analysis, and preliminary construct and criterion validity were assessed. Three factors were revealed: negative disclosure attitudes and feelings, self-efficacy, and positive disclosure attitudes and feelings. The full scale and its subscales were internally consistent. The total score showed statistically significant positive relationships with HIV disclosure in the past 6 months, HIV disclosure intention and self-perception. Preliminary evidence of the measure's good psychometric properties suggests it may be helpful in future clinical and research work. © The Author 2017. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
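
    Internal consistency of the kind reported above is conventionally estimated with Cronbach's alpha (whether this particular study used alpha is an assumption; the abstract only says the scales were internally consistent). A minimal sketch with simulated item responses rather than the study's data:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Internal-consistency estimate for a (respondents x items) matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Highly correlated items yield alpha near 1; 0.7 is a common rule of
    # thumb for acceptable reliability. Simulated 100 respondents x 4 items.
    base = np.random.default_rng(0).normal(size=100)
    consistent = np.column_stack(
        [base + 0.1 * np.random.default_rng(i).normal(size=100)
         for i in range(1, 5)])
    alpha = cronbach_alpha(consistent)
    ```

    Replacing the shared `base` signal with independent noise per item would drive alpha toward zero, illustrating why low inter-item correlation reads as poor reliability.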

  7. Production, crystallization and preliminary X-ray diffraction analysis of the allergen Can f 2 from Canis familiaris

    International Nuclear Information System (INIS)

    Madhurantakam, Chaithanya; Nilsson, Ola B.; Jönsson, Klas; Grönlund, Hans; Achour, Adnane

    2009-01-01

    The recombinant form of the allergen Can f 2 from C. familiaris was produced, isolated and crystallized in two different forms. Preliminary X-ray diffraction analyses are reported for the two crystal forms of Can f 2. The allergen Can f 2 from dog (Canis familiaris), present in saliva, dander and fur, is an important cause of allergic sensitization worldwide. Here, the production, isolation, crystallization and preliminary X-ray diffraction analysis of two crystal forms of recombinant Can f 2 are reported. The first crystal form belonged to space group C222, with unit-cell parameters a = 68.7, b = 77.3, c = 65.1 Å, and diffracted to 1.55 Å resolution, while the second crystal form belonged to space group C2, with unit-cell parameters a = 75.7, b = 48.3, c = 68.7 Å, β = 126.5°, and diffracted to 2.1 Å resolution. Preliminary data analysis indicated the presence of a single molecule in the asymmetric unit for both crystal forms.

  8. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  9. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

    Fully automated analysis programs are increasingly applied to aid the reading of regional cerebral blood flow (rCBF) SPECT studies, and they are increasingly based on comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with region as intra-subject factor, gender as inter-subject factor and age as covariate. Results Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion 3D-SSP analysis of normal rCBF variability is consistent with the literature and with other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  10. Diabetes mellitus: preliminary health-promotion activity based on ...

    African Journals Online (AJOL)

    2011-05-10

    May 10, 2011 ... mark-up language (XML). For the analysis of the logged XML, an interpreter was written in the Python programming language to convert the raw logs into tables of responses that could be analysed statistically. The pharmacy students also prepared a poster, interactive models and a bilingual English and ...

  11. Application of multivariate statistical techniques in microbial ecology.

    Science.gov (United States)

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analysing and interpreting these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
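
    Among the exploratory procedures such a review covers, unconstrained ordination by principal component analysis (PCA) is one of the simplest. A minimal sketch on an invented samples-by-taxa abundance matrix (the data and function name are illustrative, not from the review):

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """PCA via SVD of the column-centred matrix; returns sample scores
        on the first axes and the fraction of variance each axis explains."""
        Xc = X - X.mean(axis=0)                  # centre each taxon (column)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        explained = s**2 / np.sum(s**2)          # variance explained per axis
        return U[:, :n_components] * s[:n_components], explained[:n_components]

    # Toy community table: 4 samples x 3 taxa, two clearly distinct groups.
    counts = np.array([[30.,  2.,  5.],
                       [28.,  3.,  6.],
                       [ 2., 25., 20.],
                       [ 3., 27., 18.]])
    scores, explained = pca_scores(counts)       # PC1 separates the two groups
    ```

    In practice, ecological counts are usually transformed (e.g. relative abundances or a log transform) before ordination, and distance-based methods are often preferred; this sketch only shows the mechanics.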

  12. Preliminary Calculations of Bypass Flow Distribution in a Multi-Block Air Test

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Tak, Nam Il

    2011-01-01

    The development of a methodology for bypass flow assessment in a prismatic VHTR (Very High Temperature Reactor) core has been conducted at KAERI. A preliminary estimation of the variation of the local bypass flow gap size between graphite blocks in the NHDD core was carried out. With the predicted gap sizes, their influence on the bypass flow distribution and the core hot spot was assessed. Due to the complexity of the gap distributions, a system thermo-fluid analysis code is suggested as a tool for the core thermo-fluid analysis, whose model and correlations should be validated. In order to generate data for validating the bypass flow analysis model, an experimental facility for a multi-block air test was constructed at Seoul National University (SNU). This study focuses on a preliminary evaluation of the flow distribution in the test section, to understand how the flow is distributed and to help the selection of experimental cases. A commercial CFD code, ANSYS CFX, is used for the analyses.

  13. Preliminary structural assessment of DEMO vacuum vessel against a vertical displacement event

    International Nuclear Information System (INIS)

    Mozzillo, Rocco; Tarallo, Andrea; Marzullo, Domenico; Bachmann, Christian; Di Gironimo, Giuseppe; Mazzone, Giuseppe

    2016-01-01

    Highlights: • The paper focuses on a preliminary structural analysis of the current concept design of the DEMO vacuum vessel. • The vacuum vessel was checked against a VDE in combination with the weight of all components that the vessel shall bear. • Different configurations for the vacuum vessel supports are considered, showing that the best solution is a VV supported at the lower port. • The analyses evaluated the “P damage” according to the RCC-MRx code. - Abstract: This paper focuses on a preliminary structural analysis of the current concept design of the DEMO vacuum vessel (VV). The VV structure is checked against a vertical load due to a Vertical Displacement Event (VDE) in combination with the weight of all components that the main vessel shall bear. Different configurations for the supports are considered. Results show that the greatest safety margins are reached when the tokamak is supported through the lower ports rather than the equatorial ports, though all analysed configurations are compliant with RCC-MRx design rules.

  14. Preliminary structural assessment of DEMO vacuum vessel against a vertical displacement event

    Energy Technology Data Exchange (ETDEWEB)

    Mozzillo, Rocco, E-mail: rocco.mozzillo@unina.it [CREATE, University of Naples Federico II, DII, P.le Tecchio 80, 80125, Naples (Italy); Tarallo, Andrea; Marzullo, Domenico [CREATE, University of Naples Federico II, DII, P.le Tecchio 80, 80125, Naples (Italy); Bachmann, Christian [EUROfusion PMU, Boltzmannstraße 2, 85748 Garching (Germany); Di Gironimo, Giuseppe [CREATE, University of Naples Federico II, DII, P.le Tecchio 80, 80125, Naples (Italy); Mazzone, Giuseppe [Unità Tecnica Fusione - ENEA C.R. Frascati, Via E. Fermi 45, 00044 Frascati (Italy)

    2016-11-15

    Highlights: • The paper focuses on a preliminary structural analysis of the current concept design of the DEMO vacuum vessel. • The vacuum vessel was checked against a VDE in combination with the weight of all components that the vessel shall bear. • Different configurations for the vacuum vessel supports are considered, showing that the best solution is a VV supported at the lower port. • The analyses evaluated the “P damage” according to the RCC-MRx code. - Abstract: This paper focuses on a preliminary structural analysis of the current concept design of the DEMO vacuum vessel (VV). The VV structure is checked against a vertical load due to a Vertical Displacement Event (VDE) in combination with the weight of all components that the main vessel shall bear. Different configurations for the supports are considered. Results show that the greatest safety margins are reached when the tokamak is supported through the lower ports rather than the equatorial ports, though all analysed configurations are compliant with RCC-MRx design rules.

  15. Transumanesimo, enhancement ed evoluzione. Riflessioni preliminari per una ridefinizione critica

    Directory of Open Access Journals (Sweden)

    LO SAPIO, LUCA

    2016-12-01

    Full Text Available Transhumanism, enhancement and evolution. Preliminary reflections for a critical redefinition. This paper aims to examine some possible uses of evolutionary theory by transhumanist authors. In particular, it analyses the evolutionary heuristics discussed by Nick Bostrom and Anders Sandberg, the proposal of moral bioenhancement made by Julian Savulescu and Ingmar Persson, and the idea of a self-directed evolution based on the assumption of a self-guided selection of human traits. These ideas are discussed to bring out the critical elements present in some uses of a presumably Darwinian framework within the transhumanist approach.

  16. Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.

    Science.gov (United States)

    Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi

    2013-12-01

    Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Commonly, the data generated by mass spectrometry have many missing values, which result when a compound is absent from a sample or is present at a concentration below the detection limit. Several strategies are available for statistically analysing data with missing values. The accelerated failure time (AFT) model assumes all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compare the power and estimation properties of a mixture model with those of an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point-mass proportions between groups. However, the AFT model yielded biased estimates, with the bias increasing as the proportion of observations in the point mass increased, while estimates were unbiased with the mixture model except when all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and the mixture model for estimation. We demonstrate this approach through application to glycomics data from serum samples of women with ovarian cancer and matched controls.
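
The censoring-versus-absence distinction described above can be illustrated with a small simulation. The sketch below is not the authors' model; the detection limit, absence probability and log-normal parameters are all invented. It shows how the two mechanisms jointly produce missing values, and why a naive mean over detected values alone is biased:

```python
import random
import statistics

random.seed(1)

DETECTION_LIMIT = 1.0   # hypothetical instrument detection limit
P_ABSENT = 0.2          # hypothetical probability a compound is truly absent

def simulate_sample(n, mu=0.5, sigma=0.8):
    """Simulate log-normal compound abundances; some are truly absent
    (point mass), others are censored below the detection limit."""
    observed = []
    for _ in range(n):
        if random.random() < P_ABSENT:
            observed.append(None)   # absent: structural missing value
        else:
            value = random.lognormvariate(mu, sigma)
            # censored: present but below the detection limit
            observed.append(value if value >= DETECTION_LIMIT else None)
    return observed

data = simulate_sample(5000)
missing = sum(v is None for v in data)
detected = [v for v in data if v is not None]

# Averaging only the detected values overstates abundance, because both
# low (censored) and zero (absent) values have been dropped.
naive_mean = statistics.fmean(detected)
print(f"missing fraction: {missing / len(data):.2f}")
print(f"naive mean of detected values: {naive_mean:.2f}")
```

An AFT analysis would treat every `None` as censored, a mixture model would let the data apportion them between the two mechanisms.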

  17. A preliminary study for investigating idiopathic normal pressure hydrocephalus by means of statistical parameters classification of intracranial pressure recordings.

    Science.gov (United States)

    Calisto, A; Bramanti, A; Galeano, M; Angileri, F; Campobello, G; Serrano, S; Azzerboni, B

    2009-01-01

    The objective of this study is to investigate Idiopathic Normal Pressure Hydrocephalus (INPH) through a multidimensional, multiparameter analysis of statistical data obtained from accurate analysis of Intracranial Pressure (ICP) recordings. Such a study could permit the detection of new factors, correlated with therapeutic response, that validate the predictive significance of the infusion test. The algorithm developed by the authors computes 13 ICP parameter trends for each recording; 9 statistical measures are then determined from each trend. All data are transferred to the data-mining software WEKA. According to the feature-selection techniques applied, WEKA revealed that the most significant statistical parameter is the maximum of the Single-Wave-Amplitude: setting a 27 mmHg threshold leads to over 90% correct classification.

  18. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    Science.gov (United States)

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

    The study evaluated whether the renal function decline rate per year with age in adults varies between two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16,628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected over up to 2364 days (mean: 793 days). A simple linear regression model and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, i.e. slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the renal function decline rate with aging, because its estimate is highly dependent on the statistical analysis. Based on our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
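
The gap between the cross-sectional and longitudinal estimates can be reproduced in a toy simulation. All numbers below are hypothetical, chosen only to mirror the reported 1.33 vs 0.95 ml/min/year pattern via a cohort effect, and averaging per-subject OLS slopes is used as a crude stand-in for a random coefficient model:

```python
import random

random.seed(7)

TRUE_DECLINE = -0.95   # hypothetical within-subject decline (ml/min/year)

def ols_slope(xs, ys):
    """Slope of a simple least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Simulate subjects whose baseline renal function falls off more steeply
# across birth cohorts (-1.33 per year of entry age) than it declines
# within any one subject (-0.95 per year of follow-up).
subjects = []
for _ in range(2000):
    age0 = random.uniform(30, 80)
    baseline = 160 - 1.33 * age0 + random.gauss(0, 8)
    visits = [(age0 + t, baseline + TRUE_DECLINE * t + random.gauss(0, 2))
              for t in (0, 1, 2, 3)]          # four annual measurements
    subjects.append(visits)

# Cross-sectional analysis: one (entry) observation per subject.
cs_slope = ols_slope([s[0][0] for s in subjects], [s[0][1] for s in subjects])

# Longitudinal analysis: average the within-subject slopes -- a crude
# stand-in for a random coefficient (mixed) model.
lt_slope = sum(ols_slope([a for a, _ in s], [v for _, v in s])
               for s in subjects) / len(subjects)

print(f"cross-sectional slope: {cs_slope:.2f} ml/min/year")
print(f"longitudinal slope:    {lt_slope:.2f} ml/min/year")
```

The cross-sectional slope absorbs the between-cohort gradient, while the longitudinal slope recovers the within-subject decline — the qualitative pattern the abstract reports.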

  19. International Conference on Mathematical Sciences and Statistics 2013 : Selected Papers

    CERN Document Server

    Leong, Wah; Eshkuvatov, Zainidin

    2014-01-01

    This volume is devoted to the most recent discoveries in mathematics and statistics. It also serves as a platform for knowledge and information exchange between experts from the industrial and academic sectors. The book covers a wide range of topics, including mathematical analysis, probability, statistics, algebra, geometry, mathematical physics, wave propagation, stochastic processes, ordinary and partial differential equations, boundary value problems, linear operators, cybernetics, and number and function theory. It is a valuable resource for pure and applied mathematicians, statisticians, engineers and scientists.

  20. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with those of similar studies in dentistry using random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the proportion of significant results ranging from 47% to 86%. Multivariable modelling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this difference did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other dental specialties. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Preliminary environmental analysis of a geopressured-geothermal test well in Brazoria County, Texas

    Energy Technology Data Exchange (ETDEWEB)

    White, W.A.; McGraw, M.; Gustavson, T.C.; Meriwether, J.

    1977-11-16

    Preliminary environmental data, including current land use, substrate lithology, soils, natural hazards, water resources, biological assemblages, meteorological data, and regulatory considerations, have been collected and analyzed for approximately 150 km² of land near Chocolate Bayou, Brazoria County, Texas, in which a geopressured-geothermal test well is to be drilled in the fall of 1977. The study was designed to establish an environmental data base and to determine, within spatial constraints set by subsurface reservoir conditions, environmentally suitable sites for the proposed well. Preliminary analyses of the data revealed the need to focus on the following areas: potential for subsidence and fault activation; susceptibility of the test well and support facilities to fresh- and salt-water flooding; possible effects of produced saline waters on biological assemblages and groundwater resources; distribution of expansive soils; and effects of drilling and associated support activities on known archeological-cultural resources.

  2. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacological interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document was prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee, and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team, as specified in the study protocol and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described, with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with descriptions of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events, is described. The primary, secondary and tertiary outcomes are described, along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  3. Statistical model of fractures and deformations zones for Forsmark. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, Paul R. [Golder Associate Inc., Redmond, WA (United States); Olofsson, Isabelle; Hermanson, Jan [Golder Associates AB, Uppsala (Sweden)

    2005-04-01

    Compared to version 1.1, a much larger amount of data, especially from boreholes, is available. Both the one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals vary from borehole to borehole, but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is no consistent pattern of intervals of high fracture intensity at or near the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover, percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain, intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (either a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated with the presence of first and second generation minerals (epidote, calcite). No clear correlation of these fracture intensity intervals has been identified between holes. Based on these results, the fracture frequency has been calculated in each rock domain for the

  4. Statistical model of fractures and deformations zones for Forsmark. Preliminary site description Forsmark area - version 1.2

    International Nuclear Information System (INIS)

    La Pointe, Paul R.; Olofsson, Isabelle; Hermanson, Jan

    2005-04-01

    Compared to version 1.1, a much larger amount of data, especially from boreholes, is available. Both the one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals vary from borehole to borehole, but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is no consistent pattern of intervals of high fracture intensity at or near the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover, percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain, intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (either a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated with the presence of first and second generation minerals (epidote, calcite). No clear correlation of these fracture intensity intervals has been identified between holes. Based on these results, the fracture frequency has been calculated in each rock domain for the

  5. Selected problems and results of the transient event and reliability analyses for the German safety study

    International Nuclear Information System (INIS)

    Hoertner, H.

    1977-01-01

    For the investigation of the risk of nuclear power plants, loss-of-coolant accidents and transients have to be analyzed. The different functions of the engineered safety features installed to cope with transients are explained. The event tree analysis is carried out for the important transient 'loss of normal onsite power'. Preliminary results of the reliability analyses performed for the quantitative evaluation of this event tree are shown. (orig.) [de
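
As a sketch of how an event tree for such a transient is quantified (all frequencies and failure probabilities below are invented for illustration, not taken from the study), each accident sequence multiplies the initiator frequency by the success or failure probability of each safety function along its branch:

```python
# Hypothetical initiator frequency and safety-function failure probabilities.
P_INITIATOR = 0.5        # 'loss of normal onsite power' frequency (per year)
P_FAIL = {
    "reactor trip": 1e-5,
    "emergency power": 2e-3,
    "residual heat removal": 1e-3,
}

def sequence_frequency(failed):
    """Frequency of the sequence in which the named safety functions fail
    and all others succeed."""
    freq = P_INITIATOR
    for system, p in P_FAIL.items():
        freq *= p if system in failed else (1 - p)
    return freq

ok = sequence_frequency(set())                          # all functions succeed
loss_rhr = sequence_frequency({"residual heat removal"})
print(f"all systems succeed: {ok:.3e} per year")
print(f"RHR fails:           {loss_rhr:.3e} per year")
```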

  6. Securing cooperation from persons supplying statistical data.

    Science.gov (United States)

    AUBENQUE, M J; BLAIKLEY, R M; HARRIS, F F; LAL, R B; NEURDENBURG, M G; DE SHELLY HERNANDEZ, R

    1954-01-01

    Securing the co-operation of persons supplying information required for medical statistics is essentially a problem in human relations, and an understanding of the motivations, attitudes, and behaviour of the respondents is necessary. Before any new statistical survey is undertaken, it is suggested by Aubenque and Harris that a preliminary review be made so that the maximum use is made of existing information. Care should also be taken not to burden respondents with an overloaded questionnaire. Aubenque and Harris recommend simplified reporting. Complete population coverage is not necessary. Neurdenburg suggests that the co-operation and support of such organizations as medical associations and social security boards are important and that propaganda should be directed specifically to the groups whose co-operation is sought. Informal personal contacts are valuable and desirable, according to Blaikley, but may have adverse effects if the right kind of approach is not made. Financial payments as an incentive in securing co-operation are opposed by Neurdenburg, who proposes that only postage-free envelopes or similar small favours be granted. Blaikley and Harris, on the other hand, express the view that financial incentives may do much to gain the support of those required to furnish data; there are, however, other incentives, and full use should be made of the natural inclinations of respondents. Compulsion may be necessary in certain instances, but administrative rather than statutory measures should be adopted. Penalties, according to Aubenque, should be inflicted only when justified by imperative health requirements. The results of surveys should be made available as soon as possible to those who co-operated, and Aubenque and Harris point out that they should also be of practical value to the suppliers of the information. Greater co-operation can be secured from medical persons who have an understanding of the statistical principles involved; Aubenque and Neurdenburg

  7. Analyses of statistical transformations of raw data describing free proline concentration in sugar beet exposed to drought

    Directory of Open Access Journals (Sweden)

    Putnik-Delić Marina I.

    2010-01-01

    Full Text Available Eleven sugar beet genotypes were tested for their capacity to tolerate drought. Plants were grown under semi-controlled conditions in the greenhouse and watered daily. After 90 days, water deficit was imposed by the cessation of watering, while the control plants continued to be watered up to 80% of FWC. Five days later, the concentration of free proline in the leaves was determined. The analysis was done in three replications. Statistical analysis was performed using STATISTICA 9.0, Minitab 15 and R 2.11.1. Differences between genotypes were statistically processed by Duncan's test. Because of the non-normality of the data distribution and the heterogeneity of variances in the different groups, two types of transformations of the raw data were applied. For this type of data, the Johnson transformation was more appropriate for eliminating non-normality than the Box-Cox transformation. Based on both transformations, it may be concluded that in all genotypes except genotype 10, the concentration of free proline differs significantly between the treatment (drought) and the control.
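
A Box-Cox transformation of the kind compared above can be sketched in a few lines. The data here are synthetic (log-normal, standing in for right-skewed proline concentrations), and the lambda is chosen by a simple skewness-minimizing grid search rather than the maximum-likelihood estimate statistical packages such as Minitab use:

```python
import math
import random
import statistics

random.seed(3)

def boxcox(x, lam):
    """Box-Cox transform for positive x."""
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def skewness(values):
    m = statistics.fmean(values)
    s = statistics.pstdev(values)
    return sum(((v - m) / s) ** 3 for v in values) / len(values)

# Hypothetical right-skewed concentrations (log-normal-like).
raw = [random.lognormvariate(1.0, 0.6) for _ in range(300)]

# Grid search for the lambda that leaves the transformed data least skewed.
lams = [l / 10 for l in range(-20, 21)]
best = min(lams, key=lambda l: abs(skewness([boxcox(v, l) for v in raw])))

print(f"raw skewness:         {skewness(raw):.2f}")
print(f"best lambda:          {best:.1f}")
print(f"transformed skewness: {skewness([boxcox(v, best) for v in raw]):.2f}")
```

For log-normal data the selected lambda lands near 0 (the log transform), which is why heavily right-skewed data become nearly symmetric after transformation.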

  8. Statistical evaluation of cleanup: How should it be done?

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1993-02-01

    This paper discusses statistical issues that must be addressed when conducting statistical tests to evaluate whether a site has been remediated to guideline values or standards. The importance of using the Data Quality Objectives (DQO) process to plan and design the sampling plan is emphasized. Other topics discussed are: (1) accounting for the uncertainty of cleanup standards when conducting statistical tests; (2) determining the number of samples and measurements needed to attain specified DQOs; (3) considering whether the appropriate testing philosophy in a given situation is "guilty until proven innocent" or "innocent until proven guilty" when selecting a statistical test for evaluating the attainment of standards; (4) conducting tests using data sets that contain measurements reported by the laboratory as less than the minimum detectable activity; and (5) selecting statistical tests that are appropriate for risk-based or background-based standards. A recent draft report by Berger that provides guidance on sampling plans and data analyses for final status surveys at US Nuclear Regulatory Commission licensed facilities serves as a focal point for discussion.

  9. A closer look at the effect of preliminary goodness-of-fit testing for normality for the one-sample t-test.

    Science.gov (United States)

    Rochon, Justine; Kieser, Meinhard

    2011-11-01

    Student's one-sample t-test is a commonly used method for making inferences about a population mean. As advocated in textbooks and articles, the assumption of normality is often checked by a preliminary goodness-of-fit (GOF) test. In a paper recently published by Schucany and Ng, it was shown that, for the uniform distribution, screening of samples by a pretest for normality leads to a more conservative conditional Type I error rate than application of the one-sample t-test without a preliminary GOF test. In contrast, for the exponential distribution, the conditional level is even more elevated than the Type I error rate of the t-test without a pretest. We examine the reasons behind these characteristics. In a simulation study, samples drawn from the exponential, lognormal, uniform, Student's t-distribution with 2 degrees of freedom (t(2)) and the standard normal distribution that had passed normality screening, as well as the ingredients of the test statistics calculated from these samples, are investigated. For non-normal distributions, we found that preliminary testing for normality may change the distribution of the means and standard deviations of the selected samples, as well as the correlation between them (if the underlying distribution is non-symmetric), thus leading to altered distributions of the resulting test statistics. It is shown that for skewed distributions the excess in Type I error rate may be even more pronounced when testing one-sided hypotheses. ©2010 The British Psychological Society.
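
The two-stage procedure studied above can be mimicked in a short simulation. Note that the screen below is a crude symmetry check, not Shapiro-Wilk or another formal GOF test, and the distribution, sample size and thresholds are illustrative only; the point is the mechanics of conditioning the t-test's Type I error on a passed pretest:

```python
import math
import random
import statistics

random.seed(11)

T_CRIT = 2.093  # two-sided 5% critical value of Student's t, df = 19

def skewness(xs):
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def t_stat(xs, mu0):
    n = len(xs)
    return (statistics.fmean(xs) - mu0) / (statistics.stdev(xs) / math.sqrt(n))

# Two-stage procedure on uniform data (H0 is true: mean = 0.5): keep only
# samples passing the crude symmetry screen, then run the one-sample t-test
# and record the conditional rejection rate.
passed = rejected = 0
for _ in range(20000):
    sample = [random.random() for _ in range(20)]
    if abs(skewness(sample)) < 0.3:          # "normality" pretest
        passed += 1
        if abs(t_stat(sample, 0.5)) > T_CRIT:
            rejected += 1

conditional_level = rejected / passed
print(f"samples passing pretest: {passed}")
print(f"conditional Type I error: {conditional_level:.3f}")
```

Swapping the uniform generator for `random.expovariate(1.0)` (with `mu0` set to 1) is the way to explore the skewed case the abstract discusses.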

  10. 28 CFR 2.48 - Revocation: Preliminary interview.

    Science.gov (United States)

    2010-07-01

    28 CFR 2.48 (Judicial Administration, 2010-07-01 edition): Revocation: Preliminary interview. (a) Interviewing officer. A parolee who is retaken on a warrant issued by a Commissioner shall be given a preliminary interview by an official designated by the Regional...

  11. Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

    Energy Technology Data Exchange (ETDEWEB)

    Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J

    2007-10-24

    Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights, including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharide, pure protein and protein mixture samples, LDA, PLSDA and SIMCA all produced excellent classification provided a sufficient number of compound variables were calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. Decision tree analysis was the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results, we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data and can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples.
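
As a sketch of why PCA is a natural first step, the fragment below computes the first principal component of a tiny synthetic data set (three hypothetical mass channels, two groups of spectra) by power iteration on the covariance matrix. Real ToF-SIMS spectra have thousands of channels, but the mechanics are the same:

```python
import random

random.seed(5)

def first_principal_component(rows):
    """Dominant eigenvector of the covariance matrix via power iteration --
    a minimal sketch of the first step of PCA."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(a[i] * a[j] for a in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [random.gauss(0, 1) for _ in range(d)]
    for _ in range(200):                      # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical two-group spectra: intensities on 3 mass channels, where
# channel 0 separates the groups and the other two are pure noise.
data = ([[random.gauss(10, 1), random.gauss(5, 1), random.gauss(5, 1)]
         for _ in range(50)] +
        [[random.gauss(2, 1), random.gauss(5, 1), random.gauss(5, 1)]
         for _ in range(50)])

pc1 = first_principal_component(data)
print("first principal component:", [round(x, 2) for x in pc1])
```

The first component loads almost entirely on the discriminating channel, which is exactly the "molecular basis for the groupings" that the abstract says PCA exposes.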

  12. Statistical analysis of brake squeal noise

    Science.gov (United States)

    Oberst, S.; Lai, J. C. S.

    2011-06-01

    Despite substantial research efforts applied to the prediction of brake squeal noise since the early 20th century, the mechanisms behind its generation are still not fully understood. Squealing brakes are of significant concern to the automobile industry, mainly because of the costs associated with warranty claims. In order to understand the mechanisms, and thereby remedy the problems inherent in designing quieter brakes, a design-of-experiments study using a noise dynamometer was performed by a brake system manufacturer to determine the influence of geometrical parameters (namely, the number and location of slots) of brake pads on brake squeal noise. The experimental results were evaluated with a noise index and ranked for warm and cold brake stops. These data are analysed here using statistical descriptors based on population distributions, and a correlation analysis, to gain greater insight into the functional dependency between the time-averaged friction coefficient as the input and the peak sound pressure level as the output quantity. The correlation analysis between the time-averaged friction coefficient and the peak sound pressure data is performed by applying a semblance analysis and a joint recurrence quantification analysis. Linear measures are compared with (nonlinear) complexity measures based on statistics from the underlying joint recurrence plots. Results show that linear measures cannot be used to rank the noise performance of the four test pad configurations. On the other hand, the ranking of the noise performance of the test pad configurations based on the noise index agrees with that based on the nonlinear measures: the higher the nonlinearity between the time-averaged friction coefficient and the peak sound pressure, the worse the squeal. These results highlight the nonlinear character of brake squeal and indicate the potential of using nonlinear statistical analysis tools to analyse disc brake squeal.
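
The contrast between linear and nonlinearity-sensitive association measures drawn above can be illustrated with a toy example. Pearson's r is the linear measure; Spearman's rho, a rank-based measure far simpler than the semblance and recurrence techniques the paper uses, serves as the stand-in for a measure that tolerates nonlinearity. The friction/noise relationship below is entirely synthetic:

```python
import math
import random

random.seed(9)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    # Spearman's rho = Pearson correlation of the ranks.
    return pearson(ranks(xs), ranks(ys))

# Synthetic friction coefficients and peak sound pressures with a strongly
# nonlinear but monotone dependence: the linear measure understates it.
mu = [random.uniform(0.3, 0.6) for _ in range(200)]
spl = [math.exp(20 * m) + random.gauss(0, 20) for m in mu]

print(f"Pearson r:    {pearson(mu, spl):.2f}")
print(f"Spearman rho: {spearman(mu, spl):.2f}")
```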

  13. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. The safety analyses for licensing purposes are inherently deterministic. Therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, using conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best estimate codes for safety analyses in 2008, after three years of discussion reflecting recent domestic and international findings. (author)

  14. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
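
    Fleiss' kappa, the multi-rater agreement measure reported above, can be computed from a subjects-by-categories count table. The sketch below is a generic implementation with made-up counts, not data from this study.

```python
# Fleiss' kappa for m raters assigning N subjects to k categories.
# counts[i][j] = number of raters who put subject i into category j.

def fleiss_kappa(counts):
    N = len(counts)                  # number of subjects
    m = sum(counts[0])               # raters per subject (assumed constant)
    k = len(counts[0])               # number of categories
    # Per-subject observed agreement and overall category proportions
    p_i = [(sum(c * c for c in row) - m) / (m * (m - 1)) for row in counts]
    p_j = [sum(row[j] for row in counts) / (N * m) for j in range(k)]
    p_bar = sum(p_i) / N             # mean observed agreement
    p_e = sum(p * p for p in p_j)    # agreement expected by chance
    return (p_bar - p_e) / (1 - p_e)

# Perfect agreement across mixed categories yields kappa = 1.
perfect = [[3, 0], [0, 3], [3, 0]]
print(fleiss_kappa(perfect))  # prints 1.0
```

    Values above roughly 0.6 are conventionally read as "substantial" agreement, matching the interpretation used in the abstract.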

  15. Preliminary Design Through Graphs: A Tool for Automatic Layout Distribution

    Directory of Open Access Journals (Sweden)

    Carlo Biagini

    2015-02-01

    Full Text Available Diagrams are essential in the preliminary stages of design for understanding distributive aspects and assisting the decision-making process. By drawing a schematic graph, designers can visualize in a synthetic way the relationships between many aspects: functions and spaces, distribution of layouts, space adjacency, influence of traffic flows within a facility layout, and so on. This process can be automated through the use of modern Information and Communication Technology (ICT) tools that allow designers to manage a large quantity of information. The work that we will present is part of an ongoing research project into how modern parametric software influences decision-making on the basis of automatic and optimized layout distribution. The method involves two phases: the first aims to define the ontological relations between spaces, with particular reference to a specific building typology (rules of aggregation of spaces); the second entails the implementation of these rules through the use of specialist software. The generation of ontological relations begins with the collection of data from historical manuals and analyses of case studies. These analyses aim to generate a “relationship matrix” based on preferences of space adjacency. The phase of implementing the previously defined rules is based on the use of Grasshopper to analyse and visualize different layout configurations. The layout is generated by simulating a process involving the collision of spheres, which represent specific functions of the design program. The spheres are attracted or rejected as a function of the relationship matrix defined above. The layout thus obtained remains in a sort of abstract state, independent of information about the exterior form, but still provides a useful tool for the decision-making process. In addition, preliminary results gathered through the analysis of case studies will be presented. These results provide a good variety

  16. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    Science.gov (United States)

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
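
    As an illustration of the most frequently used method above, the Pearson chi-square statistic for a contingency table can be computed directly from observed and expected counts. This is a generic sketch with made-up counts, not data from the review.

```python
# Pearson chi-square statistic for an r x c contingency table.

def chi_square_statistic(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# 2x2 example: every expected count is 15, so the statistic is
# 4 * (10 - 15)^2 / 15 = 20/3.
stat = chi_square_statistic([[10, 20], [20, 10]])
```

    The statistic is then compared against a chi-square distribution with (r-1)(c-1) degrees of freedom to obtain the P-value that, as the review notes, must be reported with adequate precision.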

  17. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    Science.gov (United States)

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  18. Statistical Data Editing in Scientific Articles.

    Science.gov (United States)

    Habibzadeh, Farrokh

    2017-07-01

    Scientific journals are important scholarly forums for sharing research findings. Editors have important roles in safeguarding standards of scientific publication and should be familiar with correct presentation of results, among other core competencies. Editors do not have access to the raw data and should thus rely on clues in the submitted manuscripts. To identify probable errors, they should look for inconsistencies in presented results. Common statistical problems that can be picked up by a knowledgeable manuscript editor are discussed in this article. Manuscripts should contain a detailed section on statistical analyses of the data. Numbers should be reported with appropriate precision. The standard error of the mean (SEM) should not be reported as an index of data dispersion. Mean (standard deviation [SD]) and median (interquartile range [IQR]) should be used for description of normally and non-normally distributed data, respectively. If possible, it is better to report 95% confidence intervals (CIs) for statistics, at least for the main outcome variables. P values should be presented, and interpreted with caution, if there is a hypothesis. To advance the knowledge and skills of their members, associations of journal editors should develop training courses on basic statistics and research methodology for non-experts. This would in turn improve research reporting and safeguard the body of scientific evidence. © 2017 The Korean Academy of Medical Sciences.
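
    The reporting conventions above (mean with SD for roughly normal data, median with IQR otherwise, and SEM kept distinct from SD) can be computed with Python's standard library. This is a generic sketch with made-up data, not material from the article.

```python
import math
import statistics

data = [1, 2, 3, 4, 5, 6, 7, 8]   # illustrative measurements

mean = statistics.mean(data)       # report as mean (SD) for normal data
sd = statistics.stdev(data)        # sample standard deviation: data dispersion
sem = sd / math.sqrt(len(data))    # SEM describes the precision of the mean,
                                   # not the spread of the data
median = statistics.median(data)   # report as median (IQR) for skewed data
q1, _, q3 = statistics.quantiles(data, n=4)   # exclusive method by default
iqr = q3 - q1
```

    Because SEM shrinks with sample size while SD does not, quoting SEM as a dispersion measure understates variability, which is exactly the error the article warns against.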

  19. Mineralogical tests as a preliminary step for the metallurgical processing of Kalan ores

    International Nuclear Information System (INIS)

    Affandi, K.

    1998-01-01

    Mineralogical tests as a preliminary step for the hydrometallurgy of Kalan ores, including Eko Remaja and Rirang, have been carried out to identify the elements and minerals content which affect the metallurgical process, especially the leaching and purification of uranium. Mineralogical tests have been done by means of radioactive and radioluxugraph tests to identify radioactive minerals; thin specimen analysis and Scanning Electron Microscopy (SEM) to identify elements and morphology; EPMA for qualitative elemental analysis; X-ray Diffractometer (XRD) to identify the mineral content; and X-ray Fluorescence (XRF) and chemical analyses to determine total elements qualitatively and quantitatively. The experimental results show that the Eko Remaja ores contain uraninite and brannerite, iron and titanium oxides, and sulfide, phosphate and silicate minerals, while the Rirang ores contain uraninite, monazite and molybdenite

  20. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  1. Anti-schistosomal intervention targets identified by lifecycle transcriptomic analyses.

    Directory of Open Access Journals (Sweden)

    Jennifer M Fitzpatrick

    2009-11-01

    Full Text Available Novel methods to identify anthelmintic drug and vaccine targets are urgently needed, especially for those parasite species currently being controlled by singular, often limited strategies. A clearer understanding of the transcriptional components underpinning helminth development will enable identification of exploitable molecules essential for successful parasite/host interactions. Towards this end, we present a combinatorial, bioinformatics-led approach, employing both statistical and network analyses of transcriptomic data, for identifying new immunoprophylactic and therapeutic lead targets to combat schistosomiasis. Utilisation of a Schistosoma mansoni oligonucleotide DNA microarray consisting of 37,632 elements enabled gene expression profiling from 15 distinct parasite lifecycle stages, spanning three unique ecological niches. Statistical approaches of data analysis revealed differential expression of 973 gene products that minimally describe the three major characteristics of schistosome development: asexual processes within intermediate snail hosts, sexual maturation within definitive vertebrate hosts and sexual dimorphism amongst adult male and female worms. Furthermore, we identified a group of 338 constitutively expressed schistosome gene products (including 41 transcripts sharing no sequence similarity outside the Platyhelminthes), which are likely to be essential for schistosome lifecycle progression. While highly informative, statistics-led bioinformatics mining of the transcriptional dataset has limitations, including the inability to identify higher order relationships between differentially expressed transcripts and lifecycle stages. Network analysis, coupled to Gene Ontology enrichment investigations, facilitated a re-examination of the dataset and identified 387 clusters (containing 12,132 gene products) displaying novel examples of developmentally regulated classes (including 294 schistosomula and/or adult transcripts with no

  2. Steam Generator Group Project. Progress report on data acquisition/statistical analysis

    International Nuclear Information System (INIS)

    Doctor, P.G.; Buchanan, J.A.; McIntyre, J.M.; Hof, P.J.; Ercanbrack, S.S.

    1984-01-01

    A major task of the Steam Generator Group Project (SGGP) is to establish the reliability of the eddy current inservice inspections of PWR steam generator tubing, by comparing the eddy current data to the actual physical condition of the tubes via destructive analyses. This report describes the plans for the computer systems needed to acquire, store and analyze the diverse data to be collected during the project. The real-time acquisition of the baseline eddy current inspection data will be handled using a specially designed data acquisition computer system based on a Digital Equipment Corporation (DEC) PDP-11/44. The data will be archived in digital form for use after the project is completed. Data base management and statistical analyses will be done on a DEC VAX-11/780. Color graphics will be heavily used to summarize the data and the results of the analyses. The report describes the data that will be taken during the project and the statistical methods that will be used to analyze the data. 7 figures, 2 tables

  3. Statistics and Dynamics in the Large-scale Structure of the Universe

    International Nuclear Information System (INIS)

    Matsubara, Takahiko

    2006-01-01

    In cosmology, observations and theories are related to each other by statistics in most cases. Especially, statistical methods play central roles in analyzing fluctuations in the universe, which are seeds of the present structure of the universe. The confrontation of the statistics and dynamics is one of the key methods to unveil the structure and evolution of the universe. I will review some of the major statistical methods in cosmology, in connection with linear and nonlinear dynamics of the large-scale structure of the universe. The present status of analyses of the observational data such as the Sloan Digital Sky Survey, and the future prospects to constrain the nature of exotic components of the universe such as the dark energy will be presented

  4. On Preliminary Test Estimator for Median

    OpenAIRE

    Okazaki, Takeo; 岡崎, 威生

    1990-01-01

    The purpose of the present paper is to discuss the estimation of the median with a preliminary test. Two procedures are presented: one uses the median test and the other the Wilcoxon two-sample test for the preliminary test. Sections 3 and 4 give mathematical formulations of their properties, including mean square errors for one specified case. Section 5 discusses the optimal significance levels of the preliminary test and proposes numerical values obtained by the Monte Carlo method. In addition to mea...

  5. 47th Scientific Meeting of the Italian Statistical Society

    CERN Document Server

    Moreno, Elías; Racugno, Walter

    2016-01-01

    This book brings together selected peer-reviewed contributions from various research fields in statistics, and highlights the diverse approaches and analyses related to real-life phenomena. Major topics covered in this volume include, but are not limited to, Bayesian inference, likelihood approach, pseudo-likelihoods, regression, time series, and data analysis as well as applications in the life and social sciences. The software packages used in the papers are made available by the authors. This book is a result of the 47th Scientific Meeting of the Italian Statistical Society, held at the University of Cagliari, Italy, in 2014.

  6. The statistical chopper in the time-of-flight technique

    International Nuclear Information System (INIS)

    Albuquerque Vieira, J. de.

    1975-12-01

    A detailed study of the 'statistical' chopper and of the method of analysing the data obtained by this technique is made. The study includes the basic ideas behind correlation methods applied in time-of-flight techniques; a comparison with the conventional chopper based on an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; a presentation of related fields of work to demonstrate the potential of this method; and suggestions for future study, together with the criteria for a time-of-flight experiment using the method being studied. [pt]
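
    The correlation method behind the statistical chopper (also used for the externally applied random telegraph signal in item 1 above) can be sketched as follows: drive a linear system with a pseudo-random ±1 sequence and recover its impulse response from the input-output cross-correlation, since the input's autocorrelation approximates a delta function. This is an illustrative toy with a made-up impulse response, not the thesis code.

```python
import random

random.seed(42)

# Pseudo-random +/-1 "telegraph" input, as used in correlation methods.
n = 4000
x = [random.choice((-1.0, 1.0)) for _ in range(n)]

# Unknown linear system: y[t] = sum_k h[k] * x[t - k]  (h is made up).
h = [0.0, 1.0, 0.5, 0.25]
y = [sum(h[k] * x[t - k] for k in range(len(h)) if t - k >= 0)
     for t in range(n)]

def cross_corr(x, y, lag):
    """Sample cross-correlation of input and output at a given lag."""
    pairs = [(x[t], y[t + lag]) for t in range(n - lag)]
    return sum(a * b for a, b in pairs) / len(pairs)

# Because E[x[t] * x[t']] ~ delta(t - t'), the cross-correlation at each
# lag estimates the corresponding impulse-response coefficient.
h_est = [cross_corr(x, y, lag) for lag in range(len(h))]
```

    The estimation error shrinks roughly as 1/sqrt(n), which is why long pseudo-random sequences let the system response be extracted from inherent or injected noise.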

  7. Excel 2016 for engineering statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching engineering statistics effectively. Similar to the previously published Excel 2013 for Engineering Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and...

  8. Excel 2016 for business statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching business statistics effectively. Similar to the previously published Excel 2010 for Business Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Business Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each ch...

  9. Excel 2016 for marketing statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This is the first book to show the capabilities of Microsoft Excel in teaching marketing statistics effectively. It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical marketing problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in marketing courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Marketing Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader t...

  10. Excel 2013 for engineering statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2015-01-01

    This is the first book to show the capabilities of Microsoft Excel to teach engineering statistics effectively.  It is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems.  If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses.  Its powerful computational ability and graphical functions make learning statistics much easier than in years past.  However, Excel 2013 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs...

  11. A preliminary census of engineering activities located in Sicily (Southern Italy) which may "potentially" induce seismicity

    Science.gov (United States)

    Aloisi, Marco; Briffa, Emanuela; Cannata, Andrea; Cannavò, Flavio; Gambino, Salvatore; Maiolino, Vincenza; Maugeri, Roberto; Palano, Mimmo; Privitera, Eugenio; Scaltrito, Antonio; Spampinato, Salvatore; Ursino, Andrea; Velardita, Rosanna

    2015-04-01

    of instrumental and historical seismicity, focal mechanisms solutions, multidisciplinary stress indicators, GPS-based ground deformation field, mapped faults, etc by merging data from on-line catalogues with those reported in literature. Finally, for each individual site, we analysed: i) long-term statistic behaviour of instrumental seismicity (magnitude of completeness, seismic release above a threshold magnitude, depth distribution, focal plane solutions); ii) long-term statistic behaviour of historical seismicity (maximum magnitude estimation, recurrence time interval, etc); iii) properties and orientation of faults (length, estimated geological slip, kinematics, etc); iv) regional stress (from borehole, seismological and geological observations) and strain (from GPS-based observations) fields.

  12. An Item Fit Statistic Based on Pseudocounts from the Generalized Graded Unfolding Model: A Preliminary Report.

    Science.gov (United States)

    Roberts, James S.

    Stone and colleagues (C. Stone, R. Ankenman, S. Lane, and M. Liu, 1993; C. Stone, R. Mislevy and J. Mazzeo, 1994; C. Stone, 2000) have proposed a fit index that explicitly accounts for the measurement error inherent in an estimated theta value, here denoted χ²_i*. The elements of this statistic are natural…

  13. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, theory of solutions, statistical thermodynamics of interfaces, statistical thermodynamics of macromolecular systems and quantum statistics

  14. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  15. Review of recent ORNL specific-plant analyses

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Dickson, T.L.

    1991-01-01

    The Oak Ridge National Laboratory (ORNL) has been helping the Nuclear Regulatory Commission (NRC) develop the pressurized thermal shock (PTS) evaluation methodology since the mid-1970s. During the early 1980s, ORNL developed the integrated PTS (IPTS) methodology, which is a probabilistic approach that includes postulation of PTS transients, estimation of their frequencies, thermal/hydraulic analyses to obtain the corresponding thermal and pressure loadings on the reactor pressure vessel, and probabilistic fracture mechanics analyses. The scope of the IPTS program included development of the probabilistic fracture mechanics code OCA-P and application of the IPTS methodology to three nuclear plants in the US (Oconee I, Calvert Cliffs I, and H. B. Robinson II). The results of this effort were used to help establish the PTS Rule (10CFR50.61) and Regulatory Guide 1.154, which pertains to the PTS issue. The IPTS Program was completed in 1985, and since that time the ORNL related effort has been associated with long-term programs aimed at improving/updating the probabilistic fracture mechanics methodology and input data. In 1990, the NRC requested that ORNL review a vessel-integrity evaluation report submitted to the NRC by the Yankee Atomic Electric Co. for the Yankee Rowe reactor and that ORNL also perform an independent probabilistic fracture mechanics analysis. Details of the methodology and preliminary results are the subject of this paper/presentation

  16. Environmental restoration and statistics: Issues and needs

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1991-10-01

    Statisticians have a vital role to play in environmental restoration (ER) activities. One facet of that role is to point out where additional work is needed to develop statistical sampling plans and data analyses that meet the needs of ER. This paper is an attempt to show where statistics fits into the ER process. The statistician, as a member of the ER planning team, works collaboratively with the team to develop the site characterization sampling design, so that data of the quality and quantity required by the specified data quality objectives (DQOs) are obtained. At the same time, the statistician works with the rest of the planning team to design and implement, when appropriate, the observational approach to streamline the ER process and reduce costs. The statistician will also provide the expertise needed to select or develop appropriate tools for statistical analysis that are suited for problems that are common to waste-site data. These data problems include highly heterogeneous waste forms, large variability in concentrations over space, correlated data, data that do not have a normal (Gaussian) distribution, and measurements below detection limits. Other problems include environmental transport and risk models that yield highly uncertain predictions, and the need to effectively communicate to the public highly technical information, such as sampling plans, site characterization data, statistical analysis results, and risk estimates. Even though some statistical analysis methods are available "off the shelf" for use in ER, these problems require the development of additional statistical tools, as discussed in this paper. 29 refs
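
    One of the data problems listed above, concentration data without a normal distribution, is often handled with resampling rather than normal-theory formulas. The sketch below shows a percentile-bootstrap confidence interval for the mean of skewed data; the concentration values and parameters are illustrative, not from any waste-site dataset.

```python
import random
import statistics

random.seed(7)

# Made-up, skewed (lognormal-like) concentration data.
conc = [0.2, 0.3, 0.3, 0.4, 0.5, 0.8, 1.1, 1.9, 3.5, 7.2]

def bootstrap_ci_mean(data, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean."""
    means = sorted(
        statistics.mean(random.choices(data, k=len(data)))
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]          # 2.5th percentile
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]  # 97.5th percentile
    return lo, hi

lo, hi = bootstrap_ci_mean(conc)
```

    Unlike a normal-theory interval, the bootstrap interval is typically asymmetric for skewed data, reflecting the long upper tail common in contaminant concentrations.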

  17. Ayahuasca-assisted therapy for addiction: results from a preliminary observational study in Canada.

    Science.gov (United States)

    Thomas, Gerald; Lucas, Philippe; Capler, N Rielle; Tupper, Kenneth W; Martin, Gina

    2013-03-01

    This paper reports results from a preliminary observational study of ayahuasca-assisted treatment for problematic substance use and stress delivered in a rural First Nations community in British Columbia, Canada. The "Working with Addiction and Stress" retreats combined four days of group counselling with two expert-led ayahuasca ceremonies. This study collected pre-treatment and six-month follow-up data from 12 participants on several psychological and behavioral factors related to problematic substance use, and qualitative data assessing the personal experiences of the participants six months after the retreat. Ayahuasca-assisted therapy appears to be associated with statistically significant improvements in several factors related to problematic substance use among a rural aboriginal population. These findings suggest participants may have experienced positive psychological and behavioral changes in response to this therapeutic approach, and that more rigorous research of ayahuasca-assisted therapy for problematic substance use is warranted.

  18. Preliminary assessment of the gender aspects of disaster vulnerability and loss of human life in South Africa

    OpenAIRE

    Tandlich, Roman; Chirenda, Tatenda G; Srinivas, Chandra S S

    2013-01-01

    South Africa has reached a medium level of human development and has a heterogeneous situation with respect to disaster risk management. In this article, a preliminary assessment of the gender aspects of disaster vulnerability and fatalities is presented. The United Nations, the Health Systems Trust and Statistics South Africa were used as data sources for the following gender-segregated values: the life expectancy at birth, unemployment rates, the human development index values, the maternal...

  19. Stochastic index model for intermittent regimes: from preliminary analysis to regionalisation

    Directory of Open Access Journals (Sweden)

    M. Rianna

    2011-04-01

    In small and medium-sized basins, or in rivers characterized by intermittent discharges with low or negligible/null observed values for long periods of the year, the correct representation of the discharge regime is important for issues related to water management and to defining the amount and quality of water available for irrigation, domestic and recreational uses. In these cases a single statistical index is often not enough; it is thus necessary to introduce Flow Duration Curves (FDCs).

    The aim of this study is therefore to combine a stochastic index flow model capable of reproducing the FDC of a river's record period, regardless of the persistence and seasonality of the series, with the theory of total probability, in order to calculate how often a river is dry.

    The paper draws from preliminary analyses, including a study to estimate the correlation between discharge indicators Q95, Q50 and Q1 (discharges exceeded 95%, 50% or 1% of the time, respectively) and some fundamental characteristics of the basin, as well as to identify homogeneous regions in the target area through the study of several geo-morphological features and climatic conditions. The stochastic model was then applied in one of the homogeneous regions that includes intermittent rivers.

    Finally, the model was regionalized by means of regression analysis in order to calculate the FDC for ungauged basins; the reliability of this method was tested using jack-knife validation.
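
    The empirical quantities this record works with, the flow duration curve, the Q95/Q50/Q1 exceedance indicators, and the fraction of dry days, can be sketched as follows. This is a minimal empirical illustration on a synthetic intermittent series, not the stochastic index model or its regionalisation; the Weibull plotting position and the synthetic record are assumptions.

```python
import random

def flow_duration_curve(q):
    """Return (exceedance_probability, discharge) pairs for a discharge
    series, using the Weibull plotting position i / (n + 1)."""
    q_sorted = sorted(q, reverse=True)
    n = len(q_sorted)
    return [((i + 1) / (n + 1), qi) for i, qi in enumerate(q_sorted)]

def quantile_from_fdc(fdc, p):
    """Discharge exceeded a fraction p of the time: the smallest discharge
    whose exceedance probability is still <= p."""
    return min(qi for prob, qi in fdc if prob <= p)

# Synthetic 10-year daily record for an intermittent river:
# roughly 30% dry days interleaved with exponentially distributed pulses.
rng = random.Random(0)
series = [0.0 if rng.random() < 0.3 else rng.expovariate(1 / 4.0)
          for _ in range(3650)]

fdc = flow_duration_curve(series)
q50 = quantile_from_fdc(fdc, 0.50)
q95 = quantile_from_fdc(fdc, 0.95)
dry_fraction = sum(1 for q in series if q == 0.0) / len(series)
print(q50, q95, dry_fraction)
```

    For a series with around 30% dry days, Q95 falls on the zero-flow tail of the FDC, which is why a single index fails for intermittent regimes and why the record pairs the FDC with the probability of the river being dry.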

  20. The CSB Incident Screening Database: description, summary statistics and uses.

    Science.gov (United States)

    Gomez, Manuel R; Casper, Susan; Smith, E Allen

    2008-11-15

    This paper briefly describes the Chemical Incident Screening Database currently used by the CSB to identify and evaluate chemical incidents for possible investigations, and summarizes descriptive statistics from this database that can potentially help to estimate the number, character, and consequences of chemical incidents in the US. The report compares some of the information in the CSB database to roughly similar information available from databases operated by EPA and the Agency for Toxic Substances and Disease Registry (ATSDR), and explores the possible implications of these comparisons with regard to the scale of the chemical incident problem. Finally, the report explores in a preliminary way whether a system modeled after the existing CSB screening database could be developed to serve as a national surveillance tool for chemical incidents.