WorldWideScience

Sample records for hybridizations statistical analyses

  1. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  2. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor, which contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the framework of an IAEA-sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non-fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼ 5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. The substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX fuelled hybrid core. (author)

  3. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by the emission of radioactive substances from facilities of nuclear technology''. Its aim is to determine whether the calculation of the radiation dose ingested via food by 95% of the population, as foreseen in a provisional draft, overestimates the true exposure, and if so, by how much. The existence of this overestimation could be proven, but its magnitude could only be estimated roughly. Determining its real extent would require including the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which relationships between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.)

  4. Applied statistics a handbook of BMDP analyses

    CERN Document Server

    Snell, E J

    1987-01-01

    This handbook is the realization of a long-term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth to the point where it can serve many of the needs of accomplished statisticians, it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics: Principles and Examples by Cox and Snell appeared, we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later, discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...

  5. QUANTITATIVE IMAGING AND STATISTICAL ANALYSIS OF FLUORESCENCE IN SITU HYBRIDIZATION (FISH) OF AUREOBASIDIUM PULLULANS. (R823845)

    Science.gov (United States)

    Image and multifactorial statistical analyses were used to evaluate the intensity of fluorescence signal from cells of three strains of A. pullulans and one strain of Rhodosporidium toruloides, as an outgroup, hybridized with either a universal o...

  6. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, René Rydhof

    2007-01-01

    In this paper, hybrid logic is used to formulate a rational reconstruction of a previously published control flow analysis for the mobile ambients calculus, and we further show how a more precise flow-sensitive analysis, which takes the ordering of action sequences into account, can be formulated in a natural way. We show that hybrid logic is very well suited to express the semantic structure of the ambient calculus and how features of hybrid logic can be exploited to reduce the "administrative overhead" of the analysis specification and thus simplify it. Finally, we use HyLoTab, a fully automated...

  7. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, René Rydhof

    2007-01-01

    In this paper, hybrid logic is used to formulate a rational reconstruction of a previously published control flow analysis for the mobile ambients calculus and we further show how a more precise flow-sensitive analysis, that takes the ordering of action sequences into account, can be formulated...

  8. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because, in order to simplify the expressions and subsequent computations, not all the forces involved are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
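    The additive Holt-Winters method named above as the prediction component can be sketched as follows. The smoothing constants, the initialization scheme and the function name are illustrative assumptions, not the authors' implementation:

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=1):
    """Additive Holt-Winters: level + trend + seasonal component of period m."""
    # Initialize level, trend and seasonal indices from the first two seasons.
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t in range(m, len(y)):
        last_level = level
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s
    # h-step-ahead forecasts extrapolate the trend and reuse seasonal indices.
    n = len(y)
    return [level + (h + 1) * trend + season[(n + h) % m] for h in range(horizon)]
```

    In a hybrid propagator the series `y` would be the residuals between the analytical approximation and reference dynamics, and the forecast corrects future propagated states.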

  9. Study of statistical properties of hybrid statistic in coherent multi-detector compact binary coalescences Search

    OpenAIRE

    Haris, K; Pai, Archana

    2015-01-01

    In this article, we revisit the problem of coherent multi-detector searches for gravitational waves from compact binary coalescences involving neutron stars and black holes using advanced interferometers such as LIGO and Virgo. Based on the loss of optimal multi-detector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this ...

  10. Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA

    Science.gov (United States)

    Thorndahl, S.; Smith, J. A.; Krajewski, W. F.

    2012-04-01

    During the last two decades, the mid-western states of the United States of America have been heavily afflicted by flood-producing rainfall. Several of these storms seem to have similar hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether it is possible to derive general characteristics of the space-time structures of these heavy storms. This is important in order to understand hydrometeorological features, e.g. how storms evolve and with what frequency we can expect extreme storms to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, high-quality radar observation periods now exceeding two decades, it is possible to carry out long-term spatio-temporal statistical analyses of extremes. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures which cause floods. However, performing these statistical frequency analyses on radar observations introduces challenges, such as converting radar reflectivity observations to "true" rainfall, that do not arise in traditional analyses of rain gauge data. It is, for example, difficult to distinguish reflectivity from high-intensity rain from reflectivity from other hydrometeors such as hail, especially using the single-polarization radars employed in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded, and anomalous propagation should be corrected, in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, Z-R relationships, etc.
The present study analyzes radar rainfall observations from 1996 to 2011 based on the American NEXRAD network of radars over an area covering parts of Iowa, Wisconsin, Illinois, and

  11. Statistical analyses of conserved features of genomic islands in bacteria.

    Science.gov (United States)

    Guo, F-B; Xia, Z-K; Wei, W; Zhao, H-L

    2014-03-17

    We performed statistical analyses of five conserved features of genomic islands of bacteria. Analyses were made based on 104 known genomic islands, which were identified by comparative methods. Four of these features, sequence size, abnormal G+C content, flanking tRNA gene, and embedded mobility gene, are frequently investigated. One relatively new feature, G+C homogeneity, was also investigated. Among the 104 known genomic islands, 88.5% were found to fall in the typical length of 10-200 kb and 80.8% had G+C deviations with absolute values larger than 2%. For the 88 genomic islands whose hosts have been sequenced and annotated, 52.3% were found to have flanking tRNA genes and 64.7% had embedded mobility genes. For the homogeneity feature, 85% had an h homogeneity index less than 0.1, indicating that their G+C content is relatively uniform. Taking all five features into account, 87.5% of the 88 genomic islands had three of them. Only one genomic island had only one conserved feature and none of the genomic islands had zero features. These statistical results should help to understand the general structure of known genomic islands. We found that larger genomic islands tend to have relatively small absolute G+C deviations. For example, the absolute G+C deviations of 9 genomic islands longer than 100,000 bp were all less than 5%. This is a novel but reasonable result, given that larger genomic islands should face greater restrictions on their G+C content in order to maintain the stable G+C content of the recipient genome.
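    The G+C deviation and homogeneity features can be illustrated with a short sketch. The paper's h homogeneity index is not defined in the abstract, so a windowed standard deviation of G+C content is used here as a hypothetical stand-in:

```python
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(c) for c in "GC") / len(seq)

def gc_deviation(island_seq, host_gc_percent):
    """Signed G+C deviation of an island relative to its host genome."""
    return gc_content(island_seq) - host_gc_percent

def gc_uniformity(seq, window=5000):
    """Std dev of windowed G+C (as a fraction): 0 means perfectly uniform.
    This is a stand-in for the paper's h index, whose definition differs."""
    vals = [gc_content(seq[i:i + window])
            for i in range(0, len(seq) - window + 1, window)]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return (var ** 0.5) / 100.0
```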

  12. Statistical reliability analyses of two wood plastic composite extrusion processes

    International Nuclear Information System (INIS)

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC reliability metrics of the two extrusion lines that may be helpful to the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between the Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength: for MOE between the 10.2% and 48.0% fractiles [3,183-3,517 MPa], and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving the reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
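    A Kaplan-Meier survival curve of the kind compared above can be computed with the standard product-limit recursion; this is a generic sketch, not the authors' code:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    events[i] = 1 for an observed failure, 0 for a right-censored observation.
    Returns a list of (time, survival probability) at each failure time."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)  # failures + censored at t
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= removed
        i += removed
    return curve
```

    For strength data such as MOE and MOR, "time" is the stress level at failure, and censoring covers specimens that survived the test.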

  13. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to Electric Power Research Institute, started a one-year program to develop methodology for statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the report presented is to document the results of the investigations completed under these tasks, giving the rationale for choices of techniques and problems, and to present interim conclusions
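    The Monte Carlo approach investigated in these tasks, propagating input parameter uncertainties through a closed-form output relationship, can be sketched as follows; the model function and input distributions below are hypothetical:

```python
import random

def propagate(f, dists, n=100_000, seed=1):
    """Monte Carlo propagation of Gaussian input uncertainties through a
    closed-form model f. Returns the output mean and 95th percentile."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        xs = [rng.gauss(mu, sigma) for mu, sigma in dists]
        samples.append(f(*xs))
    samples.sort()
    mean = sum(samples) / n
    p95 = samples[int(0.95 * n)]
    return mean, p95

# Hypothetical closed-form response: a linear combination of two inputs.
mean, p95 = propagate(lambda a, b: 2 * a + b, [(1.0, 0.1), (0.0, 0.2)])
```

    The report's comparison of Monte Carlo with analytical techniques turns on exactly this trade-off: sampling is general but slow to converge, while analytical moment methods are cheap when the output relationship is simple.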

  14. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
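    For the simplest of the listed cases, a single-sample correlation, power can be approximated with the Fisher z transformation. This sketch approximates what a program like G*Power computes; it is not G*Power's actual algorithm:

```python
import math
from statistics import NormalDist

def power_correlation(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 for a sample
    correlation r with n observations, via the Fisher z transformation."""
    nd = NormalDist()
    lam = math.atanh(r) * math.sqrt(n - 3)   # noncentrality under H1
    zcrit = nd.inv_cdf(1 - alpha / 2)        # two-sided critical value
    return nd.cdf(lam - zcrit) + nd.cdf(-lam - zcrit)
```

    For example, detecting r = 0.3 with n = 100 at alpha = 0.05 gives power of roughly 0.86 under this approximation.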

  15. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, and the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies as well as proper calibration of the models to the data from training sample sets.
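    The classification properties mentioned, the capacity to identify at-risk versus healthy companies, are typically summarized by sensitivity, specificity and accuracy. A generic sketch, with the label convention 1 = at risk of bankruptcy, 0 = healthy assumed for illustration:

```python
def classification_metrics(y_true, y_pred):
    """Sensitivity (at-risk firms caught), specificity (healthy firms cleared)
    and overall accuracy for binary labels: 1 = bankrupt-risk, 0 = healthy."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "accuracy": (tp + tn) / len(y_true)}
```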

  16. A weighted U statistic for association analyses considering genetic heterogeneity.

    Science.gov (United States)

    Wei, Changshuai; Elston, Robert C; Lu, Qing

    2016-07-20

    Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most of the existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
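    The general shape of a weighted U statistic, a weighted sum of a kernel over sample pairs, can be sketched as below. HWU's actual weights derive from a heterogeneity model and its kernel from genetic similarity; neither is reproduced here, so both the kernel (a product of centered phenotypes) and the weights are illustrative assumptions:

```python
def weighted_u(phenotypes, weights):
    """Generic weighted U statistic: average over pairs of w_ij * f(y_i, y_j),
    where f is taken here to be the product of centered phenotypes."""
    n = len(phenotypes)
    mean = sum(phenotypes) / n
    y = [v - mean for v in phenotypes]
    u = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            u += weights[i][j] * y[i] * y[j]
    return u / (n * (n - 1) / 2)   # normalize by the number of pairs
```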

  17. Statistical and extra-statistical considerations in differential item functioning analyses

    Directory of Open Access Journals (Sweden)

    G. K. Huysamen

    2004-10-01

    Full Text Available This article briefly describes the main procedures for performing differential item functioning (DIF) analyses and points out some of the statistical and extra-statistical implications of these methods. Research findings on the sources of DIF, including those associated with translated tests, are reviewed. As DIF analyses take no account of correlations between a test and relevant criteria, the elimination of differentially functioning items does not necessarily improve predictive validity or reduce any predictive bias. The implications of the results of past DIF research for test development in the multilingual and multicultural South African society are considered.
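    Among the standard DIF procedures such an article typically surveys, the Mantel-Haenszel common odds ratio is a common choice (the abstract does not single out a specific method). A minimal sketch:

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio across matched score strata.
    Each stratum is (ref_correct, ref_wrong, focal_correct, focal_wrong);
    an OR far from 1 across strata flags the item for DIF."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den
```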

  18. Stillwater Hybrid Geo-Solar Power Plant Optimization Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Daniel S.; Mines, Gregory L.; Turchi, Craig S.; Zhu, Guangdong; Cohan, Sander; Angelini, Lorenzo; Bizzarri, Fabrizio; Consoli, Daniele; De Marzo, Alessio

    2015-09-02

    The Stillwater Power Plant is the first hybrid plant in the world able to bring together a medium-enthalpy geothermal unit with solar thermal and solar photovoltaic systems. Solar field and power plant models have been developed to predict the performance of the Stillwater geothermal / solar-thermal hybrid power plant. The models have been validated using operational data from the Stillwater plant. A preliminary effort to optimize performance of the Stillwater hybrid plant using optical characterization of the solar field has been completed. The Stillwater solar field optical characterization involved measurement of mirror reflectance, mirror slope error, and receiver position error. The measurements indicate that the solar field may generate 9% less energy than the design value if an appropriate tracking offset is not employed. A perfect tracking offset algorithm may be able to boost the solar field performance by about 15%. The validated Stillwater hybrid plant models were used to evaluate hybrid plant operating strategies including turbine IGV position optimization, ACC fan speed and turbine IGV position optimization, turbine inlet entropy control using optimization of multiple process variables, and mixed working fluid substitution. The hybrid plant models predict that each of these operating strategies could increase net power generation relative to the baseline Stillwater hybrid plant operations.

  19. ANALYSING SOLAR-WIND HYBRID POWER GENERATING SYSTEM

    Directory of Open Access Journals (Sweden)

    Mustafa ENGİN

    2005-02-01

    Full Text Available In this paper, a solar-wind hybrid power generating system to be used for security lighting was designed. The hybrid system was installed, and the performance values of the solar cells, wind turbine, battery bank, charge regulators and inverter were measured throughout a whole year. Using the measured values, the overall system efficiency, reliability and cost per kWh of demanded energy were calculated, and the percentage of generated energy by resource was determined. We also discuss new strategies to improve the performance of the hybrid power generating system and its cost per kWh of demanded energy.
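    The cost per kWh of demanded energy can be estimated by annualizing the capital cost with a capital recovery factor. The discount rate and cost figures below are illustrative assumptions, not values from the paper:

```python
def cost_per_kwh(capital_cost, annual_om_cost, lifetime_years,
                 annual_energy_kwh, discount_rate=0.08):
    """Simplified levelized cost of energy for a small hybrid system."""
    r = discount_rate
    # Capital recovery factor spreads the up-front investment over the lifetime.
    crf = r * (1 + r) ** lifetime_years / ((1 + r) ** lifetime_years - 1)
    annual_cost = capital_cost * crf + annual_om_cost
    return annual_cost / annual_energy_kwh

# Hypothetical system: $10,000 capital, $200/yr O&M, 20-year life, 5,000 kWh/yr.
lcoe = cost_per_kwh(10000.0, 200.0, 20, 5000.0)
```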

  20. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    Science.gov (United States)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

    Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics, expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analyses. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increments of groundwater level fluctuations exhibit a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., the fractal non-Gaussian property) of groundwater levels, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.

  1. Using statistical inference for decision making in best estimate analyses

    International Nuclear Information System (INIS)

    Sermer, P.; Weaver, K.; Hoppe, F.; Olive, C.; Quach, D.

    2008-01-01

    For broad classes of safety analysis problems, one needs to make decisions when faced with randomly varying quantities which are also subject to errors. The means for doing this involves a statistical approach which takes into account the nature of the physical problems, and the statistical constraints they impose. We describe the methodology for doing this which has been developed at Nuclear Safety Solutions, and we draw some comparisons to other methods which are commonly used in Canada and internationally. Our methodology has the advantages of being robust and accurate and compares favourably to other best estimate methods. (author)
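    Best estimate safety analyses of this kind often rely on nonparametric (Wilks-type) tolerance limits; the abstract does not name the specific method used at Nuclear Safety Solutions, so the following only illustrates the standard first-order, one-sided formula:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the sample maximum bounds the `coverage` quantile
    of the output distribution with the given confidence (first-order,
    one-sided Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n
```

    The familiar 95%/95% criterion yields n = 59 code runs, which is why that number recurs in best estimate plus uncertainty methodologies.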

  2. Hybrid and Parallel Domain-Decomposition Methods Development to Enable Monte Carlo for Reactor Analyses

    International Nuclear Information System (INIS)

    Wagner, John C.; Mosher, Scott W.; Evans, Thomas M.; Peplow, Douglas E.; Turner, John A.

    2010-01-01

    This paper describes code and methods development at the Oak Ridge National Laboratory focused on enabling high-fidelity, large-scale reactor analyses with Monte Carlo (MC). Current state-of-the-art tools and methods used to perform real commercial reactor analyses have several undesirable features, the most significant of which is the non-rigorous spatial decomposition scheme. Monte Carlo methods, which allow detailed and accurate modeling of the full geometry and are considered the gold standard for radiation transport solutions, are playing an ever-increasing role in correcting and/or verifying the deterministic, multi-level spatial decomposition methodology in current practice. However, the prohibitive computational requirements associated with obtaining fully converged, system-wide solutions restrict the role of MC to benchmarking deterministic results at a limited number of state-points for a limited number of relevant quantities. The goal of this research is to change this paradigm by enabling direct use of MC for full-core reactor analyses. The most significant of the many technical challenges that must be overcome are the slow, non-uniform convergence of system-wide MC estimates and the memory requirements associated with detailed solutions throughout a reactor (problems involving hundreds of millions of different material and tally regions due to fuel irradiation, temperature distributions, and the needs associated with multi-physics code coupling). To address these challenges, our research has focused on the development and implementation of (1) a novel hybrid deterministic/MC method for determining high-precision fluxes throughout the problem space in k-eigenvalue problems and (2) an efficient MC domain-decomposition (DD) algorithm that partitions the problem phase space onto multiple processors for massively parallel systems, with statistical uncertainty estimation. The hybrid method development is based on an extension of the FW-CADIS method, which

  3. Hybrid and parallel domain-decomposition methods development to enable Monte Carlo for reactor analyses

    International Nuclear Information System (INIS)

    Wagner, J.C.; Mosher, S.W.; Evans, T.M.; Peplow, D.E.; Turner, J.A.

    2010-01-01

    This paper describes code and methods development at the Oak Ridge National Laboratory focused on enabling high-fidelity, large-scale reactor analyses with Monte Carlo (MC). Current state-of-the-art tools and methods used to perform 'real' commercial reactor analyses have several undesirable features, the most significant of which is the non-rigorous spatial decomposition scheme. Monte Carlo methods, which allow detailed and accurate modeling of the full geometry and are considered the 'gold standard' for radiation transport solutions, are playing an ever-increasing role in correcting and/or verifying the deterministic, multi-level spatial decomposition methodology in current practice. However, the prohibitive computational requirements associated with obtaining fully converged, system-wide solutions restrict the role of MC to benchmarking deterministic results at a limited number of state-points for a limited number of relevant quantities. The goal of this research is to change this paradigm by enabling direct use of MC for full-core reactor analyses. The most significant of the many technical challenges that must be overcome are the slow, non-uniform convergence of system-wide MC estimates and the memory requirements associated with detailed solutions throughout a reactor (problems involving hundreds of millions of different material and tally regions due to fuel irradiation, temperature distributions, and the needs associated with multi-physics code coupling). To address these challenges, our research has focused on the development and implementation of (1) a novel hybrid deterministic/MC method for determining high-precision fluxes throughout the problem space in k-eigenvalue problems and (2) an efficient MC domain-decomposition (DD) algorithm that partitions the problem phase space onto multiple processors for massively parallel systems, with statistical uncertainty estimation. The hybrid method development is based on an extension of the FW-CADIS method

  4. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident

  5. Long-Term Propagation Statistics and Availability Performance Assessment for Simulated Terrestrial Hybrid FSO/RF System

    Directory of Open Access Journals (Sweden)

    Fiser Ondrej

    2011-01-01

    Full Text Available Long-term monthly and annual statistics of the attenuation of electromagnetic waves that have been obtained from 6 years of measurements on a free space optical path, 853 meters long, with a wavelength of 850 nm and on a precisely parallel radio path with a frequency of 58 GHz are presented. All the attenuation events observed are systematically classified according to the hydrometeor type causing the particular event. Monthly and yearly propagation statistics on the free space optical path and radio path are obtained. The influence of individual hydrometeors on attenuation is analysed. The obtained propagation statistics are compared to the calculated statistics using ITU-R models. The calculated attenuation statistics both at 850 nm and 58 GHz underestimate the measured statistics for higher attenuation levels. The availability performance of a simulated hybrid FSO/RF system is analysed based on the measured data.
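The availability assessment described above reduces to exceedance statistics: the fraction of time the attenuation stays below the link margin, with the hybrid system counted as up whenever either branch closes. A minimal sketch with synthetic attenuation samples (not the measured data from the study):

```python
def availability(att_db, margin_db):
    """Fraction of time samples in which the link closes (attenuation < margin)."""
    return sum(a < margin_db for a in att_db) / len(att_db)

def hybrid_availability(fso_db, rf_db, fso_margin, rf_margin):
    """A hybrid FSO/RF link is up whenever either branch closes."""
    n = len(fso_db)
    return sum(a < fso_margin or b < rf_margin for a, b in zip(fso_db, rf_db)) / n

# synthetic per-minute attenuation samples (dB): clear air, rain, fog
fso = [0.5] * 9000 + [12.0] * 600 + [25.0] * 400  # fog kills the optical path
rf = [1.0] * 9000 + [25.0] * 600 + [2.0] * 400    # rain kills the 58 GHz path
print(availability(fso, 20.0))                    # -> 0.96
print(hybrid_availability(fso, rf, 20.0, 20.0))   # -> 1.0
```

The complementary outage behaviour of the two hydrometeor types (fog on the optical path, rain on the 58 GHz path) is exactly why the simulated hybrid system outperforms either link alone.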

  6. Statistical reporting errors and collaboration on statistical analyses in psychological science

    NARCIS (Netherlands)

    Veldkamp, C.L.S.; Nuijten, M.B.; Dominguez Alvarez, L.; van Assen, M.A.L.M.; Wicherts, J.M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we

  7. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
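The automated consistency check used in the study can be sketched in a few lines: recompute the p-value implied by a reported test statistic and compare it with the reported p-value. This hypothetical helper is not the authors' actual procedure (which handled t, F and chi-square statistics with their degrees of freedom); it uses a z statistic so that only the standard library is needed:

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value implied by a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def is_inconsistent(z, reported_p, tol=0.01):
    """Flag a reported p-value that disagrees with its own test statistic."""
    return abs(two_sided_p_from_z(z) - reported_p) > tol

print(is_inconsistent(1.96, 0.05))  # -> False (implied p ~ 0.050, consistent)
print(is_inconsistent(1.50, 0.04))  # -> True (implied p ~ 0.134, would flip significance)
```

The second case illustrates the gross inconsistencies the authors found in 20% of articles: the reported p-value crosses the 0.05 threshold while the recomputed one does not.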

  8. Genome reorganization in Nicotiana asymmetric somatic hybrids analysed by in situ hybridization

    International Nuclear Information System (INIS)

    Parokonny, A.S.; Kenton, A.Y.; Gleba, Y.Y.; Bennett, M.D.

    1992-01-01

    In situ hybridization was used to examine genome reorganization in asymmetric somatic hybrids between Nicotiana plumbaginifolia and Nicotiana sylvestris obtained by fusion of gamma-irradiated protoplasts from one of the parents (donor) with non-irradiated protoplasts from the other (recipient). Probing with biotinylated total genomic DNA from either the donor or the recipient species unequivocally identified genetic material from both parents in 31 regenerant plants, each originating from a different nuclear hybrid colony. This method, termed genomic in situ hybridization (GISH), allowed intergenomic translocations containing chromosome segments from both species to be recognized in four regenerants. A probe homologous to the consensus sequence of the Arabidopsis thaliana telomeric repeat (5'-TTTAGGG-3')n, identified telomeres on all chromosomes, including 'mini-chromosomes' originating from the irradiated donor genome. Genomic in situ hybridization to plant chromosomes provides a rapid and reliable means of screening for recombinant genotypes in asymmetric somatic hybrids. Used in combination with other DNA probes, it also contributes to a greater understanding of the events responsible for genomic recovery and restabilization following genetic manipulation in vitro

  9. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do, and in applied research either prospectively to estimate what sample size…

  10. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  11. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
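The headline PCA result, a first component explaining over 40% of the variance, has a simple two-variable analogue: for two standardized variables with correlation r, the 2x2 correlation matrix has eigenvalues 1 ± |r|, so PC1 explains (1 + |r|)/2 of the variance. A minimal sketch with invented leachate-like series (not the Merrick Landfill data):

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))

def pc1_variance_fraction(a, b):
    # eigenvalues of the 2x2 correlation matrix are 1 + |r| and 1 - |r|
    return (1.0 + abs(pearson(a, b))) / 2.0

# invented series in which COD tracks ammonia closely
ammonia = [10, 12, 15, 20, 26, 30, 28, 22, 18, 14]
cod = [105, 118, 150, 205, 255, 310, 275, 225, 178, 140]
frac = pc1_variance_fraction(ammonia, cod)
print(frac > 0.4)  # -> True: PC1 dominates when parameters are strongly linked
```

With 33 parameters the same logic applies through the eigendecomposition of the full correlation matrix, which is what PCA computes.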

  12. A DNA microarray-based methylation-sensitive (MS)-AFLP hybridization method for genetic and epigenetic analyses.

    Science.gov (United States)

    Yamamoto, F; Yamamoto, M

    2004-07-01

    We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.

  13. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

Full Text Available The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been shown that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a somewhat new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Some examples of SOCR Analyses in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses.

  14. Non-Poisson counting statistics of a hybrid G-M counter dead time model

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Jae, Moosung; Gardner, Robin P.

    2007-01-01

The counting statistics of a G-M counter with a considerable dead time event rate deviates from Poisson statistics. Important characteristics such as observed counting rates as a function of true counting rates, variances and interval distributions were analyzed for three dead time models, non-paralyzable, paralyzable and hybrid, with the help of GMSIM, a Monte Carlo dead time effect simulator. The simulation results showed good agreement with the models in observed counting rates and variances. It was found through GMSIM simulations that the interval distribution for the hybrid model showed three distinctive regions: a complete cutoff region for the duration of the total dead time, a degraded exponential region, and an enhanced exponential region. By measuring the cutoff and the duration of the degraded exponential region from the pulse interval distribution, it is possible to evaluate the two dead times in the hybrid model
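A GMSIM-style event-by-event simulation of the two classical dead time models can be reproduced in a few lines (a minimal sketch, not the GMSIM code itself): events arrive as a Poisson process and each model decides which arrivals are recorded. The estimates can be checked against the textbook formulas m = n/(1 + n·tau) for the non-paralyzable model and m = n·exp(−n·tau) for the paralyzable one:

```python
import random

def observed_rate(true_rate, tau, t_total, model):
    """Monte Carlo estimate of the observed counting rate of a detector
    with dead time tau (event-by-event simulation)."""
    rng = random.Random(42)
    t = 0.0
    last_arrival = float("-inf")   # time of the previous true event
    last_recorded = float("-inf")  # time of the previous recorded event
    counts = 0
    while True:
        t += rng.expovariate(true_rate)  # Poisson arrivals of true events
        if t > t_total:
            break
        if model == "nonparalyzable" and t - last_recorded >= tau:
            counts += 1
            last_recorded = t
        elif model == "paralyzable" and t - last_arrival >= tau:
            # every arrival, recorded or not, extends the dead period
            counts += 1
        last_arrival = t
    return counts / t_total

n, tau = 1000.0, 1.0e-3  # true rate 1000 cps, dead time 1 ms
m_np = observed_rate(n, tau, 200.0, "nonparalyzable")
m_p = observed_rate(n, tau, 200.0, "paralyzable")
# analytic values: n/(1 + n*tau) = 500 and n*exp(-n*tau) ~ 368
print(round(m_np), round(m_p))
```

The hybrid model of the paper combines both behaviours in series, which is why its interval distribution shows the cutoff and degraded exponential regions described above.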

  15. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.
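The core of the hybrid idea, injecting simulated intensities as prior knowledge into the counting step, can be caricatured as a nearest-library lookup. This is a deliberately simplified sketch with an invented intensity library; the actual method embeds the simulated values in a statistical mixture-model framework:

```python
# hypothetical library of simulated scattered intensities per column thickness
simulated_intensity = {1: 0.10, 2: 0.19, 3: 0.27, 4: 0.34}

def count_atoms(measured_intensity, library):
    """Assign each atomic column the count whose simulated intensity is closest."""
    return min(library, key=lambda n: abs(library[n] - measured_intensity))

print([count_atoms(v, simulated_intensity) for v in (0.11, 0.20, 0.33)])  # -> [1, 2, 4]
```

At low electron dose the measured intensities scatter widely around the library values, which is precisely where the statistical treatment of the noise becomes essential.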

  16. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    Science.gov (United States)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  17. Scripts for TRUMP data analyses. Part II (HLA-related data): statistical analyses specific for hematopoietic stem cell transplantation.

    Science.gov (United States)

    Kanda, Junya

    2016-01-01

    The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.

  18. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  19. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a markedly lower score; overall, few studies applied appropriate statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  20. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Genomic Analyses Reveal the Influence of Geographic Origin, Migration, and Hybridization on Modern Dog Breed Development

    Directory of Open Access Journals (Sweden)

    Heidi G. Parker

    2017-04-01

Full Text Available There are nearly 400 modern domestic dog breeds, each with a unique history and genetic profile. To track the genetic signatures of breed development, we have assembled the most diverse dataset of dog breeds, reflecting their extensive phenotypic variation and heritage. Combining genetic distance, migration, and genome-wide haplotype sharing analyses, we uncover geographic patterns of development and independent origins of common traits. Our analyses reveal the hybrid history of breeds and elucidate the effects of immigration, revealing for the first time a suggestion of New World dog within some modern breeds. Finally, we used cladistics and haplotype sharing to show that some common traits have arisen more than once in the history of the dog. These analyses characterize the complexities of breed development, resolving longstanding questions regarding individual breed origination, the effect of migration on geographically distinct breeds, and, by inference, transfer of trait and disease alleles among dog breeds.

  2. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
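The time-lagged linear cross-correlation technique emphasized above can be illustrated with a minimal pure-Python sketch (synthetic series, not solar wind measurements): scan candidate lags and keep the one that maximizes the Pearson correlation between driver and response.

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def best_lag(x, y, max_lag):
    """Lag (response delay of y behind x) with the highest cross-correlation."""
    scores = {lag: pearson(x[:len(x) - lag], y[lag:]) for lag in range(max_lag + 1)}
    return max(scores, key=scores.get)

# synthetic 'driver' series and a response delayed by 3 samples
x = [math.sin(0.3 * i) for i in range(200)]
y = [0.0] * 3 + x[:-3]
print(best_lag(x, y, 10))  # -> 3
```

The recovered lag is what such studies interpret as the magnetospheric response time, which is why the averaging time scale of the data must be shorter than the physical delay being sought.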

  3. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
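The first two detection steps listed above, linear relationships via correlation coefficients and monotonic relationships via rank correlations, can be sketched as follows; the data are illustrative, not from the two-phase flow model:

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))

def ranks(v):
    # ranks 1..n (assumes no ties, as in a Latin hypercube sample)
    order = sorted(range(len(v)), key=v.__getitem__)
    r = [0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(a, b):
    return pearson(ranks(a), ranks(b))

x = [float(i) for i in range(-10, 11)]
y = [xi ** 3 for xi in x]          # monotonic but strongly non-linear
print(round(pearson(x, y), 3))     # < 1: the linear test understates the relation
print(round(spearman(x, y), 3))    # -> 1.0: the rank test captures monotonicity
```

The gap between the two statistics is exactly why the procedures escalate from linear to monotonic to distribution-based tests: each stage catches patterns the previous one misses.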

  4. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    International Nuclear Information System (INIS)

    Clerc, F; Njiki-Menga, G-H; Witschger, O

    2013-01-01

Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. These range from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the field is still searching for an appropriate and robust method. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films have been used. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading in a
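The near-field/far-field comparison under a Bayesian treatment can be sketched with conjugate Gamma posteriors for Poisson counting rates. The counts and the vague prior below are invented for illustration; the study's actual probabilistic model is more elaborate:

```python
import random

def rate_posterior_samples(counts, t_seconds, n=20000, seed=1):
    """Samples from the conjugate Gamma posterior of a Poisson rate,
    using a vague Gamma(0.5, rate -> 0) prior (Jeffreys-like)."""
    rng = random.Random(seed)
    shape, scale = 0.5 + counts, 1.0 / t_seconds
    return [rng.gammavariate(shape, scale) for _ in range(n)]

# invented counts over 10 s from the two handheld counters
near = rate_posterior_samples(520, 10.0, seed=1)   # at the source
far = rate_posterior_samples(450, 10.0, seed=2)    # far-field background
p_source_above_background = sum(a > b for a, b in zip(near, far)) / len(near)
print(p_source_above_background > 0.9)  # strong evidence of a release above background
```

Comparing the two posterior distributions, rather than two point estimates, is what lets the approach quantify how confidently the source concentration exceeds the background.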

  5. Integrated approach for fusion multi-physics coupled analyses based on hybrid CAD and mesh geometries

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Yuefeng, E-mail: yuefeng.qiu@kit.edu; Lu, Lei; Fischer, Ulrich

    2015-10-15

    Highlights: • Integrated approach for neutronics, thermal and structural analyses was developed. • MCNP5/6, TRIPOLI-4 were coupled with CFX, Fluent and ANSYS Workbench. • A novel meshing approach has been proposed for describing MC geometry. - Abstract: Coupled multi-physics analyses on fusion reactor devices require high-fidelity neutronic models, and flexible, accurate data exchanging between various calculation codes. An integrated coupling approach has been developed to enable the conversion of CAD, mesh, or hybrid geometries for Monte Carlo (MC) codes MCNP5/6, TRIPOLI-4, and translation of nuclear heating data for CFD codes Fluent, CFX and structural mechanical software ANSYS Workbench. The coupling approach has been implemented based on SALOME platform with CAD modeling, mesh generation and data visualization capabilities. A novel meshing approach has been developed for generating suitable meshes for MC geometry descriptions. The coupling approach has been concluded to be reliable and efficient after verification calculations of several application cases.

  6. Statistical analyses of the data on occupational radiation expousure at JPDR

    International Nuclear Information System (INIS)

    Kato, Shohei; Anazawa, Yutaka; Matsuno, Kenji; Furuta, Toshishiro; Akiyama, Isamu

    1980-01-01

    In the statistical analyses of the data on occupational radiation exposure at JPDR, the following statistical features were obtained. (1) The individual doses followed a log-normal distribution. (2) In the distribution of doses from one job in the controlled area, the logarithm of the mean (μ) depended on the exposure rate r (mR/h), and σ correlated with the nature of the job and was normally distributed. These relations were as follows: μ = 0.48 ln r − 0.24, σ = 1.2 ± 0.58. (3) For data containing different groups, the distribution of doses showed a polygonal line on log-normal probability paper. (4) Under dose limitation, the distribution of doses showed an asymptotic curve along the limit on log-normal probability paper. (author)
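The fitted relation in point (2) implies a predicted dose distribution for a given exposure rate. A minimal sketch, assuming doses are log-normal with the quoted μ(r) and a job-specific σ (the record does not state the dose units, and the function name is invented):

```python
import math

def jpdr_dose_distribution(rate_mr_per_h, sigma=1.2):
    """Sketch of the fitted JPDR relation: doses assumed log-normal with
    mu = 0.48*ln(rate) - 0.24 and a job-dependent sigma (mean 1.2).
    Returns the median and mean of the implied log-normal dose."""
    mu = 0.48 * math.log(rate_mr_per_h) - 0.24
    median = math.exp(mu)
    mean = math.exp(mu + sigma ** 2 / 2)  # log-normal mean exceeds median
    return median, mean
```

For a log-normal distribution the mean always exceeds the median, which is why dose distributions under σ ≈ 1.2 are strongly right-skewed.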

  7. arXiv Statistical Analyses of Higgs- and Z-Portal Dark Matter Models

    CERN Document Server

    Ellis, John; Marzola, Luca; Raidal, Martti

    2018-06-12

    We perform frequentist and Bayesian statistical analyses of Higgs- and Z-portal models of dark matter particles with spin 0, 1/2 and 1. Our analyses incorporate data from direct detection and indirect detection experiments, as well as LHC searches for monojet and monophoton events, and we also analyze the potential impacts of future direct detection experiments. We find acceptable regions of the parameter spaces for Higgs-portal models with real scalar, neutral vector, Majorana or Dirac fermion dark matter particles, and Z-portal models with Majorana or Dirac fermion dark matter particles. In many of these cases, there are interesting prospects for discovering dark matter particles in Higgs or Z decays, as well as dark matter particles weighing $\\gtrsim 100$ GeV. Negative results from planned direct detection experiments would still allow acceptable regions for Higgs- and Z-portal models with Majorana or Dirac fermion dark matter particles.

  8. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    2017-10-01

    Full Text Available Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.
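The core bookkeeping of the intervals' method (turning element-level stress values into area-percentage variables) can be sketched in a few lines. The stresses, element areas and interval count below are hypothetical; in the published method the resulting variables are then fed into multivariate analyses such as PCA.

```python
def interval_variables(stresses, areas, n_intervals=4, upper=None):
    """Sketch of the intervals' method: split the stress range into fixed
    intervals and report, for each interval, the percentage of total model
    area whose elements fall in it."""
    upper = upper if upper is not None else max(stresses)
    width = upper / n_intervals
    total = sum(areas)
    pct = [0.0] * n_intervals
    for s, a in zip(stresses, areas):
        k = min(int(s // width), n_intervals - 1)  # clamp the top value
        pct[k] += 100.0 * a / total
    return pct

# Four equal-area elements spread evenly over the stress range
shares = interval_variables([0.5, 1.5, 2.5, 3.5], [1, 1, 1, 1],
                            n_intervals=4, upper=4.0)
```

Because each variable is an area percentage, the vector sums to 100 and is insensitive to mesh density, which matches the paper's claim of mesh independence.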

  9. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Science.gov (United States)

    De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep

    2017-01-01

    Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107

  10. New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics

    International Nuclear Information System (INIS)

    Akhmatskaya, Elena; Reich, Sebastian

    2011-01-01

    We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system’s size, and make use of the multi-scale nature of complex systems. Variants of GSHMC were developed for atomistic simulation, particle simulation and statistics: GSHMC (a thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (a Metropolis-corrected dissipative particle dynamics (DPD) method), and generalized shadow Hamiltonian Monte Carlo, GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing, allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high-performance computers, and a comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of MTS methods and the non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)
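For readers unfamiliar with the baseline that GSHMC generalises, a plain hybrid/Hamiltonian Monte Carlo step for a one-dimensional standard-normal target looks like the sketch below. The step size, trajectory length and target are illustrative; GSHMC differs by using partial momentum updates and modified (shadow) Hamiltonians.

```python
import math
import random

def hmc_standard_normal(n_samples=3000, eps=0.2, n_leap=10, seed=42):
    """Plain HMC for a 1-D standard-normal target: U(q) = q^2/2, dU/dq = q."""
    rng = random.Random(seed)
    q, out = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)          # full momentum refresh (GSHMC: partial)
        qn, pn = q, p - 0.5 * eps * q    # leapfrog: initial half momentum step
        for i in range(n_leap):
            qn += eps * pn               # full position step
            pn -= (eps if i < n_leap - 1 else 0.5 * eps) * qn
        # Metropolis accept/reject on the true Hamiltonian (GSHMC: shadow)
        dh = 0.5 * (pn * pn + qn * qn) - 0.5 * (p * p + q * q)
        if rng.random() < math.exp(min(0.0, -dh)):
            q = qn
        out.append(q)
    return out

samples = hmc_standard_normal()
```

The Metropolis test on the Hamiltonian corrects the leapfrog integration error exactly; shadow-Hamiltonian variants instead accept against a modified Hamiltonian that the integrator conserves far more accurately, then reweight.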

  11. Characteristics of electrostatic solitary waves observed in the plasma sheet boundary: Statistical analyses

    Directory of Open Access Journals (Sweden)

    H. Kojima

    1999-01-01

    Full Text Available We present the characteristics of the Electrostatic Solitary Waves (ESW) observed by the Geotail spacecraft in the plasma sheet boundary layer, based on statistical analyses. We also discuss the results with reference to a model of ESW generation by electron beams proposed on the basis of computer simulations. In this generation model, the nonlinear evolution of Langmuir waves excited by electron bump-on-tail instabilities leads to the formation of isolated electrostatic potential structures corresponding to "electron holes" in phase space. The statistical analyses of the Geotail data, which we conducted under the assumption that the polarity of the ESW potentials is positive, show that most ESWs propagate in the same direction as the electron beams observed simultaneously by the plasma instrument. Further, we find that the ESW potential energy is much smaller than the background electron thermal energy, and that the ESW potential widths are typically shorter than 60 times the local electron Debye length when we assume that the ESW potentials travel at the same velocity as the electron beams. These results are consistent with the ESW generation model in which the nonlinear evolution of the electron bump-on-tail instability leads to the formation of electron holes in phase space.
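The "60 local electron Debye lengths" scale quoted above is straightforward to evaluate for given plasma parameters. A small helper using the standard definition λ_D = sqrt(ε₀ k_B T_e / (n_e e²)); the example values are illustrative, not Geotail measurements:

```python
import math

def debye_length(n_e_per_m3, t_e_eV):
    """Electron Debye length in metres. With Te in eV, kB*Te = Te_eV * e,
    so lambda_D = sqrt(eps0 * Te_eV / (n_e * e))."""
    EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
    E = 1.602176634e-19      # elementary charge, C
    return math.sqrt(EPS0 * t_e_eV / (n_e_per_m3 * E))

# Illustrative: n_e = 1 cm^-3, Te = 1 eV gives lambda_D of a few metres
lam_d = debye_length(1e6, 1.0)
```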

  12. A weighted U-statistic for genetic association analyses of sequencing data.

    Science.gov (United States)

    Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing

    2014-12-01

    With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. The association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very-low-density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
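The flavour of a weighted U-statistic can be conveyed with a toy version: average, over all subject pairs, a phenotype kernel times a weighted genotype-similarity kernel. This sketch is not the WU-SEQ implementation; the kernels, weights and data below are invented for illustration.

```python
def weighted_u(phenotypes, genotypes, weights):
    """Toy weighted U-statistic: mean over subject pairs (i, j) of a
    phenotype cross-product kernel times a variant-weighted count of
    matching genotypes. Large values suggest that phenotypically similar
    subjects are also genotypically similar."""
    n = len(phenotypes)
    total, n_pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            pheno_k = phenotypes[i] * phenotypes[j]
            geno_k = sum(w * (gi == gj) for w, gi, gj in
                         zip(weights, genotypes[i], genotypes[j]))
            total += pheno_k * geno_k
            n_pairs += 1
    return total / n_pairs

# Three subjects, two variants; weights would typically up-weight rare variants
u = weighted_u([1, -1, 1], [[0, 1], [0, 0], [0, 1]], [1.0, 2.0])
```

In practice significance is assessed against the U-statistic's null distribution (e.g. by permutation), which the nonparametric construction makes robust to heavy-tailed phenotypes.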

  13. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    Science.gov (United States)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<2 s) seismograms are strongly affected by small-scale heterogeneities; here, we analyse long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan in the period bands of 8-16 s, 4-8 s and 2-4 s, and model them using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation: it is not necessary to assume a uniform background velocity, or the body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber, as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
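The estimated spectrum is easy to evaluate directly. With ε = 0.05 and a = 3.1 km, the corner at m ≈ 1/a shows up as a factor-of-four roll-off of P(m) relative to its flat low-wavenumber level (m in km⁻¹):

```python
import math

def power_spectrum(m, eps=0.05, a=3.1):
    """Power spectral density of the random inhomogeneity quoted in the
    abstract: P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, m in 1/km, a in km."""
    return 8 * math.pi * eps ** 2 * a ** 3 / (1 + a ** 2 * m ** 2) ** 2
```

Past analyses of higher-frequency records sampled only m well above 1/a, where P(m) falls off as m⁻⁴, which is why the corner itself was not resolved.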

  14. Multivariate statistical analyses demonstrate unique host immune responses to single and dual lentiviral infection.

    Directory of Open Access Journals (Sweden)

    Sunando Roy

    2009-10-01

    Full Text Available Feline immunodeficiency virus (FIV) and human immunodeficiency virus (HIV) are recently identified lentiviruses that cause progressive immune decline and ultimately death in infected cats and humans. It is of great interest to understand how to prevent immune system collapse caused by these lentiviruses. We recently described that disease caused by a virulent FIV strain in cats can be attenuated if animals are first infected with a feline immunodeficiency virus derived from a wild cougar. The detailed temporal tracking of cat immunological parameters in response to two viral infections resulted in high-dimensional datasets containing variables that exhibit strong co-variation. Initial analyses of these complex data using univariate statistical techniques did not account for interactions among immunological response variables and therefore potentially obscured significant effects between infection state and immunological parameters. Here, we apply a suite of multivariate statistical tools, including Principal Component Analysis, MANOVA and Linear Discriminant Analysis, to temporal immunological data resulting from FIV superinfection in domestic cats. We investigated the co-variation among immunological responses, the differences in immune parameters among four groups of five cats each (uninfected, single and dual infected animals), and the "immune profiles" that discriminate among them over the first four weeks following superinfection. Dual infected cats mount an immune response by 24 days post superinfection that is characterized by elevated levels of CD8 and CD25 cells and increased expression of IL-4, IFN-γ and FAS. This profile discriminates dual infected cats from cats infected with FIV alone, which show high IL-10 and lower numbers of CD8 and CD25 cells. Multivariate statistical analyses demonstrate both the dynamic nature of the immune response to FIV single and dual infection and the development of a unique immunological profile in dual infected cats.
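Of the multivariate tools named (PCA, MANOVA, LDA), PCA is the simplest to sketch: for two correlated variables, the first principal axis follows directly from the 2×2 covariance matrix. The data below are hypothetical, not the cat immunology measurements.

```python
import math

def first_pc_2d(xs, ys):
    """Minimal two-variable PCA: returns the angle (radians) of the first
    principal component, from the closed-form eigen-decomposition of the
    2x2 sample covariance matrix [[sxx, sxy], [sxy, syy]]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return 0.5 * math.atan2(2 * sxy, sxx - syy)
```

Co-varying immune measures (e.g. CD8 and CD25 counts rising together) load onto the same component, which is how PCA exposes structure that univariate tests miss.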

  15. Statistical analyses of digital collections: Using a large corpus of systematic reviews to study non-citations

    DEFF Research Database (Denmark)

    Frandsen, Tove Faber; Nicolaisen, Jeppe

    2017-01-01

    Using statistical methods to analyse digital material makes it possible to detect patterns in big data that would otherwise go unnoticed. This paper seeks to exemplify this fact by statistically analysing a large corpus of references in systematic reviews. The aim …

  16. Systematic Mapping and Statistical Analyses of Valley Landform and Vegetation Asymmetries Across Hydroclimatic Gradients

    Science.gov (United States)

    Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.

    2015-12-01

    Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior works indicate that reduced insolation on northern (pole-facing) aspects prolongs snow pack persistence, and is associated with thicker, finer-grained soils, that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water-balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n = ~10,000). We developed a random forest based statistical model to predict valley slope asymmetry based upon numerous measures (n > 300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. When slope-related statistics are excluded, due to possible autocorrelation, valley …

  17. Development and validation of a Haemophilus influenzae supragenome hybridization (SGH) array for transcriptomic analyses.

    Directory of Open Access Journals (Sweden)

    Benjamin A Janto

    Full Text Available We previously carried out the design and testing of a custom-built Haemophilus influenzae supragenome hybridization (SGH) array that contains probe sequences for 2,890 gene clusters identified by whole-genome sequencing of 24 strains of H. influenzae. The array was originally designed as a tool to interrogate the gene content of large numbers of clinical isolates without the need for sequencing; however, the data obtained are quantitative and thus suitable for transcriptomic analyses. In the current study, RNA was extracted from H. influenzae strain CZ4126/02 (which was not included in the design of the array), converted to cDNA, and labelled and hybridized to the SGH arrays to assess the quality and reproducibility of data obtained from these custom-designed chips to serve as a tool for transcriptomics. Three types of experimental replicates were analyzed, all showing very high degrees of correlation, thus validating both the array and the methods used for RNA profiling. A custom filtering pipeline for two-condition unpaired data using five metrics was developed to minimize variability within replicates and to maximize the identification of the most significant true transcriptional differences between two samples. These methods can be extended to transcriptional analysis of other bacterial species utilizing supragenome-based arrays.
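Replicate agreement of the kind reported ("very high degrees of correlation") is typically quantified with Pearson's r on probe intensities; a minimal sketch with hypothetical log-intensities from two replicate hybridizations:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length intensity vectors,
    e.g. log-intensities from replicate array hybridizations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)
```

Values close to 1 across technical, biological and labelling replicates are what justify using such an array for quantitative transcriptomics.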

  18. Effect of moulding sand on statistically controlled hybrid rapid casting solution for zinc alloys

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rupinder [Guru Nanak Dev Engineering College, Ludhiana (India)

    2010-08-15

    The purpose of the present investigation is to study the effect of moulding sand on decreasing the shell wall thickness of mould cavities for economical and statistically controlled hybrid rapid casting solutions (a combination of three-dimensional printing and conventional sand casting) for zinc alloys. Starting from the identification of a component/benchmark, technological prototypes were produced at different shell wall thicknesses supported by three different types of sand (namely: dry, green and molasses). Prototypes prepared by the proposed process are for assembly-check purposes and not for functional validation of the parts. The study suggested that a shell wall thinner than the recommended thickness (12 mm) is more suitable for dimensional accuracy. The best dimensional accuracy was obtained at a 3 mm shell wall thickness with green sand. The process was found to be under statistical control.

  19. Statistical analyses to support guidelines for marine avian sampling. Final report

    Science.gov (United States)

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit into a general framework for avian survey design, and discuss the implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
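A Monte Carlo significance test for a hotspot of the kind described can be sketched as follows. The counts are hypothetical and the procedure is simplified: the report's actual approach first selects a species- and season-specific count distribution, whereas this sketch resamples directly from the regional counts.

```python
import random
import statistics

def hotspot_pvalue(block_counts, regional_counts, n_perm=5000, seed=1):
    """Monte Carlo p-value that a lease block's mean count is high relative
    to random same-size draws from the regional counts. The +1 terms give
    the standard conservative permutation p-value."""
    rng = random.Random(seed)
    obs = statistics.mean(block_counts)
    k = len(block_counts)
    hits = sum(statistics.mean(rng.sample(regional_counts, k)) >= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

# Hypothetical: one candidate block vs. counts pooled over the region
p = hotspot_pvalue([10, 12, 11], [0, 1, 2, 0, 1, 3, 2, 1, 10, 12, 11, 0, 1])
```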

  20. Ulex Europaeus Agglutinin-1 Is a Reliable Taste Bud Marker for In Situ Hybridization Analyses.

    Science.gov (United States)

    Yoshimoto, Joto; Okada, Shinji; Kishi, Mikiya; Misaka, Takumi

    2016-03-01

    Taste signals are received by taste buds. To better understand the taste reception system, expression patterns of taste-related molecules are determined by in situ hybridization (ISH) analyses at the histological level. Nevertheless, even though ISH is essential for determining mRNA expression, few taste bud markers can be applied together with ISH. Ulex europaeus agglutinin-1 (UEA-1) appears to be a reliable murine taste bud marker based on immunohistochemistry (IHC) analyses. However, there is no evidence as to whether UEA-1 can be used for ISH. Thus, the present study evaluated UEA-1 using various histochemical methods, especially ISH. When lectin staining was performed after ISH procedures, UEA-1 clearly labeled taste cellular membranes and distinctly indicated boundaries between taste buds and the surrounding epithelial cells. Additionally, UEA-1 was determined as a taste bud marker not only when used in single-colored ISH but also when employed with double-labeled ISH or during simultaneous detection using IHC and ISH methods. These results suggest that UEA-1 is a useful marker when conducting analyses based on ISH methods. To clarify UEA-1 staining details, multi-fluorescent IHC (together with UEA-1 staining) was examined, resulting in more than 99% of cells being labeled by UEA-1 and overlapping with KCNQ1-expressing cells. © 2016 The Histochemical Society.

  1. Quantitative Evaluation of Hybrid Aspen Xylem and Immunolabeling Patterns Using Image Analysis and Multivariate Statistics

    Directory of Open Access Journals (Sweden)

    David Sandquist

    2015-06-01

    Full Text Available A new method is presented for the quantitative evaluation of hybrid aspen genotype xylem morphology and immunolabeling micro-distribution. This method can be used as an aid in assessing differences in genotypes from classic tree breeding studies, as well as genetically engineered plants. The method is based on image analysis and multivariate statistical evaluation of light and immunofluorescence microscopy images of wood xylem cross sections. The selected immunolabeling antibodies targeted five different epitopes present in aspen xylem cell walls. Twelve down-regulated hybrid aspen genotypes were included in the method development. The 12 knock-down genotypes were selected based on pre-screening of global chemical content by pyrolysis-IR. The multivariate statistical evaluations successfully identified comparative trends for modifications in the down-regulated genotypes compared to the unmodified control, even when no definitive conclusions could be drawn from any individual studied variable alone. Of the 12 genotypes analyzed, three showed significant trends for modifications in both morphology and immunolabeling. Six genotypes showed significant trends for modifications in either morphology or immunocoverage. The remaining three genotypes did not show any significant trends for modification.

  2. Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.

    Science.gov (United States)

    Schmitt, M; Grub, J; Heib, F

    2015-06-01

    Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently, we developed two/three approaches (sigmoid fitting, and independent and dependent statistical analyses) that are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data that are independent of the skill and subjectivity of the operator, which is also urgently needed for evaluating dynamic measurements of contact angles. We show in this contribution that the slightly modified procedures are also applicable for finding specific angles in experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawal of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database

    International Nuclear Information System (INIS)

    Lam, Chio; Zhou, Wenxing

    2016-01-01

    This article reports statistical analyses of the mileage and pipe-related incident data for the onshore gas transmission pipelines in the US between 2002 and 2013, collected by the Pipeline and Hazardous Materials Safety Administration of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of them more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% are Classes 2 and 3 pipelines. It is found that third-party excavation, external corrosion, material failure and internal corrosion are the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rate of rupture equals 3.1 × 10⁻⁵ per km-year due to all failure causes combined. External corrosion is the leading cause of ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for quantitative risk assessments of such pipelines. - Highlights: • Analyze PHMSA pipeline mileage and incident data between 2002 and 2013. • Focus on gas transmission pipelines. • Leading causes of pipeline failures are identified. • Provide baseline failure statistics for risk assessments of gas transmission pipelines.
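The baseline rates quoted above are exposure-normalised counts: incidents divided by km-years. The arithmetic is trivial but worth making explicit; the incident count below is a hypothetical back-of-envelope figure chosen to be consistent with the quoted mileage, period and rate, not a number from the PHMSA database.

```python
def failure_rate_per_km_year(n_incidents, km, years):
    """Exposure-normalised failure rate: incidents per km-year."""
    return n_incidents / (km * years)

# Hypothetical: ~179 ruptures over ~480,000 km observed for 12 years
rate = failure_rate_per_km_year(179, 480_000, 12)
```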

  4. Transformation (normalization) of slope gradient and surface curvatures, automated for statistical analyses from DEMs

    Science.gov (United States)

    Csillik, O.; Evans, I. S.; Drăguţ, L.

    2015-03-01

    Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
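The automated selection the abstract describes (evaluate the slope-gradient distribution and apply the most appropriate transform) can be sketched with a small grid search over the Box-Cox λ that minimises absolute skewness. This is a simplification of the authors' procedure, and the slope values below are hypothetical gradients in degrees.

```python
import math
import statistics

def boxcox(xs, lam):
    """One-parameter Box-Cox transform (requires strictly positive data)."""
    if lam == 0:
        return [math.log(v) for v in xs]
    return [(v ** lam - 1) / lam for v in xs]

def skewness(xs):
    """Population skewness (third standardised moment)."""
    m = statistics.mean(xs)
    s = statistics.pstdev(xs)
    return sum((v - m) ** 3 for v in xs) / (len(xs) * s ** 3)

def best_lambda(xs, grid=None):
    """Grid-search the lambda in [-1, 1] that minimises |skewness| of the
    transformed data, mirroring the automated slope transformation."""
    grid = grid if grid is not None else [i / 20 for i in range(-20, 21)]
    return min(grid, key=lambda lam: abs(skewness(boxcox(xs, lam))))

slopes = [0.5, 1.2, 2.0, 3.5, 5.0, 8.0, 13.0, 21.0, 34.0]  # right-skewed
lam = best_lambda(slopes)
```

Since λ = 1 (a pure shift) is in the grid, the selected transform can never be more skewed than the raw data; the analogous curvature step would instead tune an arctangent scale toward Gaussian kurtosis.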

  5. Novel hybrid Monte Carlo/deterministic technique for shutdown dose rate analyses of fusion energy systems

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2014-01-01

    Highlights: • Develop the novel Multi-Step CADIS (MS-CADIS) hybrid Monte Carlo/deterministic method for multi-step shielding analyses. • Accurately calculate shutdown dose rates using full-scale Monte Carlo models of fusion energy systems. • Demonstrate the dramatic efficiency improvement of the MS-CADIS method for the rigorous two-step calculations of the shutdown dose rate in fusion reactors. -- Abstract: The rigorous 2-step (R2S) computational system uses three-dimensional Monte Carlo transport simulations to calculate the shutdown dose rate (SDDR) in fusion reactors. Accurate full-scale R2S calculations are impractical in fusion reactors because they require calculating space- and energy-dependent neutron fluxes everywhere inside the reactor. The use of global Monte Carlo variance reduction techniques was suggested for accelerating the R2S neutron transport calculation. However, the prohibitive computational costs of these approaches, which increase with the problem size and amount of shielding materials, inhibit their ability to accurately predict the SDDR in fusion energy systems using full-scale modeling of an entire fusion plant. This paper describes a novel hybrid Monte Carlo/deterministic methodology that uses the Consistent Adjoint Driven Importance Sampling (CADIS) method but focuses on multi-step shielding calculations. The Multi-Step CADIS (MS-CADIS) methodology speeds up the R2S neutron Monte Carlo calculation using an importance function that represents the neutron importance to the final SDDR. Using a simplified example, preliminary results showed that the use of MS-CADIS enhanced the efficiency of the neutron Monte Carlo simulation of an SDDR calculation by a factor of 550 compared to standard global variance reduction techniques, and that the efficiency enhancement compared to analog Monte Carlo is higher than a factor of 10,000.

  6. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
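A composition summary of the kind quoted above (e.g. 42 ± 5% food waste) reduces to a mean and a confidence half-width over the sub-area measurements. The ten percentages below are invented stand-ins for sorted sub-area data, not the study's raw numbers:

```python
import math
import statistics

# Hypothetical food-waste percentages (wet mass basis) for ten sub-areas.
food_pct = [44, 39, 47, 41, 38, 45, 43, 40, 46, 37]

mean = statistics.mean(food_pct)
sd = statistics.stdev(food_pct)
# Approximate 95% confidence half-width for the mean (t ~ 2.262 for df = 9).
half_width = 2.262 * sd / math.sqrt(len(food_pct))
print(f"food waste: {mean:.0f} +/- {half_width:.0f} % (wet basis)")
```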

  7. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  8. Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.

    Science.gov (United States)

    Deng, Yangqing; Pan, Wei

    2017-12-01

    There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical, and other, reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and, for these patients, it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the
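The union-intersection logic described above can be sketched in a few lines: pleiotropy requires association with every trait, so the least-significant trait drives the combined p-value. The z-scores below are toy inputs, and this sketch ignores correlation between the summary statistics, which the actual procedure must handle:

```python
import math

def norm_sf(z):
    # Upper-tail probability of the standard normal via the complementary
    # error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

def iut_pleiotropy_p(z_scores):
    """Union-intersection test from GWAS summary z-scores (toy sketch).

    Pleiotropy requires association with *every* trait, so the combined
    p-value is the largest of the per-trait two-sided p-values.
    """
    p_values = [2 * norm_sf(abs(z)) for z in z_scores]
    return max(p_values)

# A variant strongly associated with both traits vs. with one trait only.
print(iut_pleiotropy_p([4.1, 3.8]))   # small -> evidence of pleiotropy
print(iut_pleiotropy_p([4.1, 0.3]))   # large -> no pleiotropy evidence
```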

  9. Detailed statistical contact angle analyses; "slow moving" drops on inclining silicon-oxide surfaces.

    Science.gov (United States)

    Schmitt, M; Groß, K; Grub, J; Heib, F

    2015-06-01

    Contact angle determination by the sessile drop technique is essential for characterising surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore, one of the most important tasks in this area is to establish standard, reproducible, and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for analysing dynamic contact angle measurements (sessile drop) in detail, applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), but also the dependent analysis is explained in detail for the first time. These approaches lead to contact angle data and different access to specific contact angles which are independent of the skills and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, and the downhill (advancing motion) and uphill (receding motion) angles obtained by high-precision drop shape analysis are statistically analysed both independently and dependently. Due to the small distance covered, the dependent analysis is well suited to contact angle determination; the resulting values are characterised by small deviations. 
In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for drops on inclined surfaces and detailed relations concerning the reactivity of the freshly cleaned silicon wafer surface resulting in acceleration
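A minimal sketch of the sigmoid-fit idea for the tilt experiment: the contact angle transitions between two plateaus as the plate inclines, and the plateau values give the specific angles. The angle data are invented, and a crude grid search stands in for a proper nonlinear least-squares fit:

```python
import math

def sigmoid(x, theta_min, theta_max, x0, k):
    # Logistic transition between two plateau contact angles.
    return theta_min + (theta_max - theta_min) / (1 + math.exp(-k * (x - x0)))

# Hypothetical downhill contact angles (deg) vs plate inclination (deg).
incl  = [0, 5, 10, 15, 20, 25, 30, 35, 40]
angle = [60.2, 60.8, 62.0, 65.5, 72.0, 78.5, 81.9, 83.1, 83.6]

def sse(params):
    return sum((sigmoid(x, *params) - y) ** 2 for x, y in zip(incl, angle))

# Crude grid search over plausible parameters (a sketch; a real analysis
# would use a proper nonlinear least-squares fit of the same form).
best = min(
    ((tmin, tmax, x0, k)
     for tmin in (59, 60, 61)
     for tmax in (83, 84, 85)
     for x0 in (18, 20, 22)
     for k in (0.2, 0.3, 0.4)),
    key=sse,
)
print("fitted (theta_min, theta_max, x0, k):", best)
```

The fitted plateaus `theta_min` and `theta_max` correspond to the specific angles extracted independently of operator judgment.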

  10. An efficient soil water balance model based on hybrid numerical and statistical methods

    Science.gov (United States)

    Mao, Wei; Yang, Jinzhong; Zhu, Yan; Ye, Ming; Liu, Zhao; Wu, Jingwei

    2018-04-01

    Most soil water balance models only consider downward soil water movement driven by gravitational potential, and thus cannot simulate upward soil water movement driven by evapotranspiration especially in agricultural areas. In addition, the models cannot be used for simulating soil water movement in heterogeneous soils, and usually require many empirical parameters. To resolve these problems, this study derives a new one-dimensional water balance model for simulating both downward and upward soil water movement in heterogeneous unsaturated zones. The new model is based on a hybrid of numerical and statistical methods, and only requires four physical parameters. The model uses three governing equations to consider three terms that impact soil water movement, including the advective term driven by gravitational potential, the source/sink term driven by external forces (e.g., evapotranspiration), and the diffusive term driven by matric potential. The three governing equations are solved separately by using the hybrid numerical and statistical methods (e.g., linear regression method) that consider soil heterogeneity. The four soil hydraulic parameters required by the new models are as follows: saturated hydraulic conductivity, saturated water content, field capacity, and residual water content. The strength and weakness of the new model are evaluated by using two published studies, three hypothetical examples and a real-world application. The evaluation is performed by comparing the simulation results of the new model with corresponding results presented in the published studies, obtained using HYDRUS-1D and observation data. The evaluation indicates that the new model is accurate and efficient for simulating upward soil water flow in heterogeneous soils with complex boundary conditions. The new model is used for evaluating different drainage functions, and the square drainage function and the power drainage function are recommended. 
Computational efficiency of the new
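The downward (drainage) step of such a balance model can be illustrated with a tipping-bucket cascade over heterogeneous layers. Parameter names follow the abstract, but the values and the drainage rule are illustrative assumptions, and the real model also includes source/sink and diffusive steps:

```python
# Each layer: water content theta and its own field capacity fc (cm3/cm3),
# so the profile is heterogeneous.
layers = [
    {"theta": 0.30, "fc": 0.25},
    {"theta": 0.20, "fc": 0.28},
    {"theta": 0.15, "fc": 0.22},
]

def drain_step(layers, infiltration):
    """Tipping-bucket cascade: each layer fills to its own field capacity
    and passes the excess to the layer below."""
    inflow = infiltration
    for lay in layers:
        water = lay["theta"] + inflow
        inflow = max(0.0, water - lay["fc"])  # excess drains downward
        lay["theta"] = water - inflow         # i.e. min(water, fc)
    return inflow  # deep percolation below the profile

deep = drain_step(layers, 0.10)
print([round(l["theta"], 2) for l in layers], round(deep, 3))
```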

  11. Dispensing processes impact apparent biological activity as determined by computational and statistical analyses.

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.

  12. Authigenic oxide Neodymium Isotopic composition as a proxy of seawater: applying multivariate statistical analyses.

    Science.gov (United States)

    McKinley, C. C.; Scudder, R.; Thomas, D. J.

    2016-12-01

    The Neodymium Isotopic composition (Nd IC) of oxide coatings has been applied as a tracer of water mass composition and used to address fundamental questions about past ocean conditions. The leached authigenic oxide coating from marine sediment is widely assumed to reflect the dissolved trace metal composition of the bottom water interacting with sediment at the seafloor. However, recent studies have shown that readily reducible sediment components, in addition to trace metal fluxes from the pore water, are incorporated into the bottom water, influencing the trace metal composition of leached oxide coatings. This challenges the prevailing application of the authigenic oxide Nd IC as a proxy of seawater composition. Therefore, it is important to identify the component end-members that create sediments of different lithology and determine if, or how, they might contribute to the Nd IC of oxide coatings. To investigate lithologic influence on the results of sequential leaching, we selected two sites with complete bulk sediment statistical characterization. Site U1370 in the South Pacific Gyre is predominantly composed of rhyolite (~60%) and has a distinguishable (~10%) Fe-Mn oxyhydroxide component (Dunlea et al., 2015). Site 1149 near the Izu-Bonin Arc is predominantly composed of dispersed ash (~20-50%) and eolian dust from Asia (~50-80%) (Scudder et al., 2014). We perform a two-step leaching procedure: 14 mL of 0.02 M hydroxylamine hydrochloride (HH) in 20% acetic acid buffered to pH 4 for one hour, targeting metals bound to the Fe- and Mn-oxide fractions, and a second HH leach for 12 hours, designed to remove any remaining oxides from the residual component. We analyze all three resulting fractions for a large suite of major, trace, and rare earth elements; a sub-set of the samples is also analyzed for Nd IC. 
We use multivariate statistical analyses of the resulting geochemical data to identify how each component of the sediment partitions across the sequential
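Multivariate end-member identification of this kind often starts with a principal component analysis of the standardized element data. The concentrations below are invented (rows = leachate fractions, columns = four elements); the published analyses are considerably more elaborate:

```python
import numpy as np

# Hypothetical element concentrations, illustrative numbers only.
X = np.array([
    [1.2, 8.0, 3.1, 0.9],
    [1.0, 7.5, 2.9, 0.8],
    [4.8, 2.1, 0.9, 0.2],
    [5.1, 2.4, 1.1, 0.3],
    [3.0, 5.0, 2.0, 0.6],
])

# Standardize, then diagonalize the covariance matrix: a dominant PC1
# suggests a single compositional gradient (e.g. oxide-rich vs detrital
# end-members) controlling the element suite.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
scores = Z @ eigvecs[:, order]

print("variance explained by PC1:", round(float(explained[0]), 3))
```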

  13. Review of Statistical Analyses Resulting from Performance of HLDWD-DWPF-005

    International Nuclear Information System (INIS)

    Beck, R.S.

    1997-01-01

    The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for analysis of small sample inserts (references 1 & 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide analyses equivalent to the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed methods compares favorably with the sum of oxides for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will initially be required to provide an accurate analysis which will routinely meet the 95% to 105% average sum of oxides limit for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) determine the calcine/vitrification factor for radioactive feed; (2) evaluate the covariance matrix change against process operating ranges to determine optimum sample size; (3) evaluate sources of low sum of oxides; and (4) improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of
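The sum-of-oxides acceptance rule cited for PCCS reduces to a simple range check. The 94.3% figure is the Cold Chem average quoted above; the Modified Fusion value below is a made-up placeholder for illustration:

```python
# Sketch of the Product Composition Control System (PCCS) acceptance rule:
# the average sum of oxides must fall between 95% and 105%.
def sum_of_oxides_ok(averages, low=95.0, high=105.0):
    """Map each method name to whether its average sum of oxides passes."""
    return {method: low <= value <= high for method, value in averages.items()}

# 94.3 is quoted in the review; 97.8 is a hypothetical placeholder.
results = sum_of_oxides_ok({"Cold Chem": 94.3, "Modified Fusion": 97.8})
print(results)  # Cold Chem fails the 95% minimum, as noted in the review
```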

  14. Correlating tephras and cryptotephras using glass compositional analyses and numerical and statistical methods: Review and evaluation

    Science.gov (United States)

    Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.

    2017-11-01

    We define tephras and cryptotephras and their components (mainly ash-sized particles of glass ± crystals in distal deposits) and summarize the basis of tephrochronology as a chronostratigraphic correlational and dating tool for palaeoenvironmental, geological, and archaeological research. We then document and appraise recent advances in analytical methods used to determine the major, minor, and trace elements of individual glass shards from tephra or cryptotephra deposits to aid their correlation and application. Protocols developed recently for the electron probe microanalysis of major elements in individual glass shards help to improve data quality and standardize reporting procedures. A narrow electron beam (diameter ∼3-5 μm) can now be used to analyze smaller glass shards than previously attainable. Reliable analyses of 'microshards' (defined here as glass shards T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain multiple compositional groups. Currently, incomplete databases are tending to limit correlation efficacy. The development of an open, online global database to facilitate progress towards integrated, high
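The Mahalanobis-distance and two-sample Hotelling T2 comparison described above can be sketched as follows. The two-element shard compositions are invented, and a complete treatment would add the F-distributed significance test and the data transformations discussed in the review:

```python
import numpy as np

def mahalanobis_hotelling(X, Y):
    """Squared Mahalanobis distance between two group means and the
    two-sample Hotelling T2 statistic (pooled covariance)."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    nx, ny = len(X), len(Y)
    d = X.mean(axis=0) - Y.mean(axis=0)
    Sp = ((nx - 1) * np.cov(X, rowvar=False) +
          (ny - 1) * np.cov(Y, rowvar=False)) / (nx + ny - 2)
    D2 = float(d @ np.linalg.inv(Sp) @ d)
    T2 = nx * ny / (nx + ny) * D2
    return D2, T2

# Hypothetical shard analyses (SiO2, K2O in wt%) for two tephras.
tephra_a = [[74.1, 4.2], [73.8, 4.1], [74.5, 4.2], [74.0, 4.3]]
tephra_b = [[71.0, 5.0], [70.6, 5.1], [71.3, 5.1], [70.9, 5.2]]

D2, T2 = mahalanobis_hotelling(tephra_a, tephra_b)
print(f"D2 = {D2:.1f}, T2 = {T2:.1f}")  # large values -> distinct tephras
```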

  15. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

    Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality during the last years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality ranges from low to medium. Although the number of medium- and high-quality MAs appears lately to be rising, several other aspects need improvement to increase their overall quality.

  16. Interspecies introgressive hybridization in spiny frogs Quasipaa (Family Dicroglossidae) revealed by analyses on multiple mitochondrial and nuclear genes.

    Science.gov (United States)

    Zhang, Qi-Peng; Hu, Wen-Fang; Zhou, Ting-Ting; Kong, Shen-Shen; Liu, Zhi-Fang; Zheng, Rong-Quan

    2018-01-01

    Introgression may lead to discordant patterns of variation among loci and traits. For example, previous phylogeographic studies on the genus Quasipaa detected signs of genetic introgression from genetically and morphologically divergent Quasipaa shini or Quasipaa spinosa . In this study, we used mitochondrial and nuclear DNA sequence data to verify the widespread introgressive hybridization in the closely related species of the genus Quasipaa , evaluate the level of genetic diversity, and reveal the formation mechanism of introgressive hybridization. In Longsheng, Guangxi Province, signs of asymmetrical nuclear introgression were detected between Quasipaa boulengeri and Q. shini . Unidirectional mitochondrial introgression was revealed from Q. spinosa to Q. shini . By contrast, bidirectional mitochondrial gene introgression was detected between Q. spinosa and Q. shini in Lushan, Jiangxi Province. Our study also detected ancient hybridizations between a female Q. spinosa and a male Q. jiulongensis in Zhejiang Province. Analyses on mitochondrial and nuclear genes verified three candidate cryptic species in Q. spinosa , and a cryptic species may also exist in Q. boulengeri . However, no evidence of introgressive hybridization was found between Q. spinosa and Q. boulengeri . Quasipaa exilispinosa from all the sampling localities appeared to be deeply divergent from other communities. Our results suggest widespread introgressive hybridization in closely related species of Quasipaa and provide a fundamental basis for illumination of the forming mechanism of introgressive hybridization, classification of species, and biodiversity assessment in Quasipaa .

  17. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    number of zeros. QQ plots of these data show a lack of normality after contamination. Normality is improved when looking at log(CFU/cm2). Variance component analysis (VCA) and analysis of variance (ANOVA) were used to estimate the amount of variance due to each source and to determine which sources of variability were statistically significant. In general, the sampling methods interacted with the across-event variability and with the across-room variability. For this reason, it was decided to perform analyses for each sampling method individually. The between-event variability and between-room variability were significant for each method, except for the between-event variability for the swabs. For both the wipes and vacuums, the within-room standard deviation was much larger (26.9 for wipes and 7.086 for vacuums) than the between-event standard deviation (6.552 for wipes and 1.348 for vacuums) and the between-room standard deviation (6.783 for wipes and 1.040 for vacuums). The swabs' between-room standard deviation was 0.151, while both the within-room and between-event standard deviations were less than 0.10 (all measurements in CFU/cm2).

  18. Essentials of Excel, Excel VBA, SAS and Minitab for statistical and financial analyses

    CERN Document Server

    Lee, Cheng-Few; Chang, Jow-Ran; Tai, Tzu

    2016-01-01

    This introductory textbook for business statistics teaches statistical analysis and research methods via business case studies and financial data using Excel, MINITAB, and SAS. Every chapter in this textbook engages the reader with data of individual stock, stock indices, options, and futures. One studies and uses statistics to learn how to study, analyze, and understand a data set of particular interest. Some of the more popular statistical programs that have been developed to use statistical and computational methods to analyze data sets are SAS, SPSS, and MINITAB. Of those, we look at MINITAB and SAS in this textbook. One of the main reasons to use MINITAB is that it is the easiest to use among the popular statistical programs. We look at SAS because it is the leading statistical package used in industry. We also utilize the much less costly and ubiquitous Microsoft Excel to do statistical analysis, as the benefits of Excel have become widely recognized in the academic world and its analytical capabilities...

  19. Analysis of Norwegian bio energy statistics. Quality improvement proposals; Analyse av norsk bioenergistatistikk. Forslag til kvalitetsheving

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This report assesses the current model and presentation form of the Norwegian bio energy statistics. It proposes revisions and enhancements of both the collection and the presentation of the data. In the context of market developments, both for energy in general and for bio energy in particular, and of government targets, good bio energy statistics form the basis for following up the objectives and means. (eb)

  20. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Science.gov (United States)

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
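The power-simulation idea can be sketched with a Monte Carlo experiment: generate many noisy series with and without a mean shift ("intervention") and count how often a simple test detects it. The shock sizes, noise level, and z-type test below are illustrative assumptions, not the article's intervention models:

```python
import math
import random
import statistics

def detects_shift(pre, post, z_crit=1.96):
    # Two-sample z-type test for a shift in the mean between the
    # pre- and post-intervention segments.
    diff = statistics.mean(post) - statistics.mean(pre)
    se = math.sqrt(statistics.variance(pre) / len(pre) +
                   statistics.variance(post) / len(post))
    return abs(diff / se) > z_crit

def power(shock, n=30, noise=1.0, trials=2000, seed=7):
    """Fraction of simulated series in which the shift is detected."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pre = [rng.gauss(0.0, noise) for _ in range(n)]
        post = [rng.gauss(shock, noise) for _ in range(n)]
        hits += detects_shift(pre, post)
    return hits / trials

# Size (shock = 0) stays near alpha; power rises with shock magnitude.
print(power(0.0), power(0.5), power(1.0))
```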

  1. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses

    OpenAIRE

    Buttigieg, Pier Luigi; Ramette, Alban Nicolas

    2014-01-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynami...

  2. Reticulate evolution: frequent introgressive hybridization among chinese hares (genus lepus revealed by analyses of multiple mitochondrial and nuclear DNA loci

    Directory of Open Access Journals (Sweden)

    Wu Shi-Fang

    2011-07-01

    Full Text Available Abstract Background Interspecific hybridization may lead to the introgression of genes and genomes across species barriers and contribute to a reticulate evolutionary pattern and thus taxonomic uncertainties. Since several previous studies have demonstrated that introgressive hybridization has occurred among some species within Lepus, therefore it is possible that introgressive hybridization events also occur among Chinese Lepus species and contribute to the current taxonomic confusion. Results Data from four mtDNA genes, from 116 individuals, and one nuclear gene, from 119 individuals, provides the first evidence of frequent introgression events via historical and recent interspecific hybridizations among six Chinese Lepus species. Remarkably, the mtDNA of L. mandshuricus was completely replaced by mtDNA from L. timidus and L. sinensis. Analysis of the nuclear DNA sequence revealed a high proportion of heterozygous genotypes containing alleles from two divergent clades and that several haplotypes were shared among species, suggesting repeated and recent introgression. Furthermore, results from the present analyses suggest that Chinese hares belong to eight species. Conclusion This study provides a framework for understanding the patterns of speciation and the taxonomy of this clade. The existence of morphological intermediates and atypical mitochondrial gene genealogies resulting from frequent hybridization events likely contribute to the current taxonomic confusion of Chinese hares. The present study also demonstrated that nuclear gene sequence could offer a powerful complementary data set with mtDNA in tracing a complete evolutionary history of recently diverged species.

  3. A hybrid finite element - statistical energy analysis approach to robust sound transmission modeling

    Science.gov (United States)

    Reynders, Edwin; Langley, Robin S.; Dijckmans, Arne; Vermeir, Gerrit

    2014-09-01

    When considering the sound transmission through a wall in between two rooms, in an important part of the audio frequency range, the local response of the rooms is highly sensitive to uncertainty in spatial variations in geometry, material properties and boundary conditions, which have a wave scattering effect, while the local response of the wall is rather insensitive to such uncertainty. For this mid-frequency range, a computationally efficient modeling strategy is adopted that accounts for this uncertainty. The partitioning wall is modeled deterministically, e.g. with finite elements. The rooms are modeled in a very efficient, nonparametric stochastic way, as in statistical energy analysis. All components are coupled by means of a rigorous power balance. This hybrid strategy is extended so that the mean and variance of the sound transmission loss can be computed as well as the transition frequency that loosely marks the boundary between low- and high-frequency behavior of a vibro-acoustic component. The method is first validated in a simulation study, and then applied for predicting the airborne sound insulation of a series of partition walls of increasing complexity: a thin plastic plate, a wall consisting of gypsum blocks, a thicker masonry wall and a double glazing. It is found that the uncertainty caused by random scattering is important except at very high frequencies, where the modal overlap of the rooms is very high. The results are compared with laboratory measurements, and both are found to agree within the prediction uncertainty in the considered frequency range.

  4. A simple and robust statistical framework for planning, analysing and interpreting faecal egg count reduction test (FECRT) studies

    DEFF Research Database (Denmark)

    Denwood, M.J.; McKendrick, I.J.; Matthews, L.

Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power … that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple …

  5. The Relationship Between Radiative Forcing and Temperature. What Do Statistical Analyses of the Instrumental Temperature Record Measure?

    International Nuclear Information System (INIS)

    Kaufmann, R.K.; Kauppi, H.; Stock, J.H.

    2006-01-01

Comparing statistical estimates of the long-run temperature effect of doubled CO2 with those generated by climate models begs the question: is the long-run temperature effect of doubled CO2 estimated from the instrumental temperature record using statistical techniques consistent with the transient climate response, the equilibrium climate sensitivity, or the effective climate sensitivity? Here, we attempt to answer the question of what statistical analyses of the observational record measure by using these same statistical techniques to estimate the temperature effect of a doubling in the atmospheric concentration of carbon dioxide from seventeen simulations run for the Coupled Model Intercomparison Project 2 (CMIP2). The results indicate that the temperature effect estimated by the statistical methodology is consistent with the transient climate response, and that this consistency is relatively unaffected by sample size or by the increase in radiative forcing in the sample.

  6. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

We have developed an innovative hybrid problem-based learning (PBL) methodology with the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions, following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to providing tailor-made feedback for individual students, who were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students were automatically consolidated in the report of the Moodle system in real time, so the teacher could adjust the teaching schedule and the focus of the class to the learning progress of the students by analysing the automatically generated report and the log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation was significantly enhanced.

  7. Exergoeconomic and enviroeconomic analyses of hybrid double slope solar still loaded with nanofluids

    International Nuclear Information System (INIS)

    Sahota, Lovedeep; Tiwari, G.N.

    2017-01-01

Highlights: • Two systems of double slope solar still loaded with three different water-based nanofluids have been studied. • The concentration of the metallic nanoparticles and the basin fluid mass have been optimized for the annual analysis. • Based on annual performance, exergoeconomic and enviroeconomic analyses have been performed for both systems. - Abstract: In recent times, the incorporation of nanotechnology into solar distillation systems for potable water production has emerged as a new approach to harvesting solar thermal energy. In the present manuscript, the concentration of the nanoparticles and the basin fluid (basefluid/nanofluid) mass have been optimized for a hybrid solar still operating (a) without a heat exchanger (system A), and (b) with a helically coiled heat exchanger (system B). For the optimized parameters, the overall thermal energy, exergy, productivity (yield), and cost of the proposed hybrid systems loaded with water-based nanofluids have been analysed, and are found to be significantly improved by incorporating copper oxide-water nanofluid. Moreover, on the basis of overall thermal energy and exergy, the amount of carbon dioxide mitigated per annum is found to be 14.95 tonnes and 3.17 tonnes, respectively, for hybrid system A, and 24.61 tonnes and 2.36 tonnes, respectively, for hybrid system B incorporating copper oxide-water nanofluid. The annual performance of the proposed hybrid systems has been compared with that of the conventional solar still (system C).

  8. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    International Nuclear Information System (INIS)

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-01-01

The statistics of the measured magnetic data of 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive design for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section, and the quadrupole and sextupole cross sections have 180-degree and 120-degree symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.

  9. "Who Was 'Shadow'?" The Computer Knows: Applying Grammar-Program Statistics in Content Analyses to Solve Mysteries about Authorship.

    Science.gov (United States)

    Ellis, Barbara G.; Dick, Steven J.

    1996-01-01

    Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)

  10. Design and implementation of a modular program system for the carrying-through of statistical analyses

    International Nuclear Information System (INIS)

    Beck, W.

    1984-01-01

The complexity of computer programs for the solution of scientific and technical problems raises a number of questions: the strengths and weaknesses of a program, the propagation of uncertainties among the input data, the sensitivity of output data to input data, and the substitution of complex models by simpler ones that provide equivalent results within certain ranges. These questions have a general practical meaning, and principled answers may be found by statistical methods based on the Monte Carlo method. In this report the statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR accounts for users with different levels of knowledge of data processing and statistics, a variety of statistical methods and generating and evaluating procedures, the processing of large data sets in complex structures, coupling to other components of RSYST and to programs outside RSYST, and the need for the system to be easily modified and extended. Four examples are given to demonstrate the application of STAR. (orig.) [de]

  11. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
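The first two screening steps described above, linear association via correlation coefficients and monotonic association via rank correlation, can be sketched in a few lines of plain Python. The data and function names here are illustrative, not part of the original analyses:

```python
import math
from statistics import mean

def pearson(x, y):
    """Step (i): linear association via the correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(v):
    """Midranks (ties get the average rank), as used by Spearman's coefficient."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Step (ii): monotonic association via the rank correlation coefficient."""
    return pearson(ranks(x), ranks(y))
```

For a monotonic but nonlinear input-output relationship (say, output proportional to the cube of the input), the rank correlation is exactly 1 while the linear correlation stays noticeably below it, which is why the procedures are run as a sequence of increasingly general pattern detectors.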

  12. Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh

This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of the thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach … on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling …

  13. Statistical analyses of the color experience according to the age of the observer.

    Science.gov (United States)

    Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita

    2013-04-01

The psychological experience of color is a real state of communication between the environment and color, and it depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to three chromatic domains (red, green and purple-blue) separately, but produce a signal based on the principle of opposed pairs of colors. Support for this theory comes from the fact that certain disorders of color eyesight, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents the experience of blue and yellow tones according to the age of the observer. To test for statistically significant differences in color experience according to the color of the background, we use the following statistical tests: the Mann-Whitney U test, the Kruskal-Wallis ANOVA and the median test. The differences were shown to be statistically significant in observers older than 35 years.
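As a rough illustration of the kind of nonparametric comparison used in the study, a minimal Mann-Whitney U statistic with its large-sample normal approximation can be written in plain Python. The sample data and the tie convention (counting half for equal pairs) are standard but illustrative; this is not the authors' implementation:

```python
import math

def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b, counting 0.5 per tied pair."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def mann_whitney_z(a, b):
    """Normal approximation: standardize U by its null mean and variance."""
    n1, n2 = len(a), len(b)
    u = mann_whitney_u(a, b)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (u - mu) / sigma
```

Two completely separated samples give U at one extreme of its range and a large |z|, while two identical samples give U equal to its null expectation and z of zero; for small samples an exact table rather than the normal approximation would normally be consulted.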

  14. Statistical analyses of the performance of Macedonian investment and pension funds

    Directory of Open Access Journals (Sweden)

    Petar Taleski

    2015-10-01

The foundation of post-modern portfolio theory is creating a portfolio based on a desired target return. This applies in particular to the performance of investment and pension funds, which must provide a rate of return that meets payment requirements. A desired target return is the goal of an investment or pension fund, and the primary benchmark used for measuring performance and for dynamic monitoring and evaluation of the risk-return ratio of investment funds. The analysis in this paper is based on monthly returns of Macedonian investment and pension funds (June 2011 - June 2014). It utilizes basic but highly informative statistical characteristics such as skewness, kurtosis, the Jarque-Bera test, and Chebyshev's inequality. The objective of this study is to perform a thorough analysis, utilizing these and other statistical techniques (Sharpe, Sortino, omega, upside potential, Calmar, Sterling), to draw relevant conclusions regarding the risks and characteristic moments of Macedonian investment and pension funds. Pension funds are the second largest segment of the financial system and have great potential for further growth due to constant inflows from pension insurance. The importance of investment funds for the financial system of the Republic of Macedonia is still small, although open-end investment funds have been the fastest growing segment of the financial system. The statistical analysis shows that pension funds delivered a significantly positive volatility-adjusted risk premium in the analyzed period, more so than investment funds.
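Two of the risk-adjusted performance measures named above can be sketched directly from monthly returns. This is a minimal illustration assuming returns as plain decimal fractions; the zero defaults for the risk-free rate and target return are illustrative, not the values used in the study:

```python
import math
from statistics import mean, stdev

def sharpe_ratio(returns, risk_free=0.0):
    """Mean excess return per unit of total volatility (sample st. dev.)."""
    excess = [r - risk_free for r in returns]
    return mean(excess) / stdev(excess)

def sortino_ratio(returns, target=0.0):
    """Mean excess return per unit of downside deviation only:
    deviations above the target return do not count as risk."""
    downside = [min(0.0, r - target) for r in returns]
    dd = math.sqrt(sum(d * d for d in downside) / len(returns))
    if dd == 0:
        raise ValueError("no observations below the target return")
    return (mean(returns) - target) / dd
```

Because the Sortino ratio penalizes only downside deviations, a fund whose volatility comes mostly from positive months scores better on Sortino than on Sharpe, which is exactly the distinction a target-return benchmark is meant to capture.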

  15. Robust statistics for deterministic and stochastic gravitational waves in non-Gaussian noise. II. Bayesian analyses

    International Nuclear Information System (INIS)

    Allen, Bruce; Creighton, Jolien D.E.; Flanagan, Eanna E.; Romano, Joseph D.

    2003-01-01

    In a previous paper (paper I), we derived a set of near-optimal signal detection techniques for gravitational wave detectors whose noise probability distributions contain non-Gaussian tails. The methods modify standard methods by truncating or clipping sample values which lie in those non-Gaussian tails. The methods were derived, in the frequentist framework, by minimizing false alarm probabilities at fixed false detection probability in the limit of weak signals. For stochastic signals, the resulting statistic consisted of a sum of an autocorrelation term and a cross-correlation term; it was necessary to discard 'by hand' the autocorrelation term in order to arrive at the correct, generalized cross-correlation statistic. In the present paper, we present an alternative derivation of the same signal detection techniques from within the Bayesian framework. We compute, for both deterministic and stochastic signals, the probability that a signal is present in the data, in the limit where the signal-to-noise ratio squared per frequency bin is small, where the signal is nevertheless strong enough to be detected (integrated signal-to-noise ratio large compared to 1), and where the total probability in the non-Gaussian tail part of the noise distribution is small. We show that, for each model considered, the resulting probability is to a good approximation a monotonic function of the detection statistic derived in paper I. Moreover, for stochastic signals, the new Bayesian derivation automatically eliminates the problematic autocorrelation term

  16. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics, the presentation of sample data, the definition, illustration and explanation of several measures of location, and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained.

  17. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of best-estimate 3D neutronics (PANTHER), system thermal-hydraulics (RELAP5), core sub-channel thermal-hydraulics (COBRA-3C), and fuel thermal-mechanics (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and license reactor safety analysis and core reload design, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistics or bootstrap) and tool (DAKOTA) are then presented, followed by preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)

  18. The Use of Statistical Process Control Tools for Analysing Financial Statements

    Directory of Open Access Journals (Sweden)

    Niezgoda Janusz

    2017-06-01

This article presents a proposed application of one type of modified Shewhart control chart to the monitoring of changes in the aggregated level of financial ratios. The x̅ control chart has been used as the basis of the analysis; the variable examined in this chart is normally the arithmetic mean of a sample. The author proposes to substitute it with a synthetic measure determined from selected financial ratios. As these ratios are expressed in different units and have different characters, the author applies standardisation. The results of selected comparative analyses are presented for both bankrupt and non-bankrupt firms. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
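The underlying x̅-chart computation can be sketched briefly. This is a minimal illustration assuming the within-subgroup standard deviation is known; the article's synthetic-measure construction and standardisation step are not reproduced here:

```python
import math
from statistics import mean

def xbar_limits(subgroup_means, sigma, n):
    """Centre line and 3-sigma control limits for an x-bar chart, given
    the within-subgroup standard deviation sigma and subgroup size n."""
    centre = mean(subgroup_means)
    margin = 3.0 * sigma / math.sqrt(n)
    return centre - margin, centre, centre + margin

def out_of_control(subgroup_means, sigma, n):
    """Indices of subgroups whose mean falls outside the control limits."""
    lcl, _, ucl = xbar_limits(subgroup_means, sigma, n)
    return [i for i, m in enumerate(subgroup_means) if m < lcl or m > ucl]
```

In the article's adaptation, each "subgroup mean" would be replaced by the standardised synthetic measure for one reporting period, so a point outside the limits flags an unusual shift in the aggregated financial ratios rather than in a physical process.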

  19. Statistical methods for analysing the relationship between bank profitability and liquidity

    OpenAIRE

    Boguslaw Guzik

    2006-01-01

The article analyses the most popular methods for the empirical estimation of the relationship between bank profitability and liquidity. Owing to the fact that profitability depends on various factors (both economic and non-economic), a simple correlation coefficient, two-dimensional (profitability/liquidity) graphs, or models where profitability depends only on the liquidity variable do not provide good and reliable results. Quite good results can be obtained only when multifactorial profitabilit…

  20. Thermodynamic analyses of solar thermal gasification of coal for hybrid solar-fossil power and fuel production

    International Nuclear Information System (INIS)

    Ng, Yi Cheng; Lipiński, Wojciech

    2012-01-01

Thermodynamic analyses are performed for solar thermal steam and dry gasification of coal. The selected types of coal are anthracite, bituminous, lignite and peat. Two model conversion paths are considered for each combination of the gasifying agent and the coal type: production of the synthesis gas with its subsequent use in a combined cycle power plant to generate power, and production of the synthesis gas with its subsequent use to produce gasoline via the Fischer–Tropsch synthesis. Replacement of a coal-fired 35% efficient Rankine cycle power plant and a combustion-based integrated gasification combined cycle power plant by a solar-based integrated gasification combined cycle power plant leads to the reduction in specific carbon dioxide emissions by at least 47% and 27%, respectively. Replacement of a conventional gasoline production process via coal gasification and a subsequent Fischer–Tropsch synthesis with gasoline production via solar thermal coal gasification with a subsequent Fischer–Tropsch synthesis leads to the reduction in specific carbon dioxide emissions by at least 39%. -- Highlights: ► Thermodynamic analyses for steam and dry gasification of coal are presented. ► Hybrid solar-fossil paths to power and fuels are compared to those using only combustion. ► Hybrid power production can reduce specific CO2 emissions by more than 27%. ► Hybrid fuel production can reduce specific CO2 emissions by more than 39%.

  1. Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.

    Science.gov (United States)

    Banks, N C; Hodda, M; Singh, S K; Matveeva, E M

    2012-06-01

    Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.
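The constant radial rate reported above corresponds to fitting a straight line through the origin to distance-versus-time data, since distance from the site of first detection is zero at the moment of first detection. A minimal sketch (the data in the test are made up, not the study's records):

```python
def spread_rate(years_since_first, distances_km):
    """Least-squares slope of distance on time, constrained through the
    origin: rate = sum(t*d) / sum(t*t) in km per year."""
    num = sum(t * d for t, d in zip(years_since_first, distances_km))
    den = sum(t * t for t in years_since_first)
    return num / den
```

Substituting road distances for straight-line distances in `distances_km` reproduces the study's comparison of the two metrics; a constant radial rate shows up as the two slopes being similar in magnitude, with the road-distance slope slightly larger.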

  2. Statistical Analyses and Modeling of the Implementation of Agile Manufacturing Tactics in Industrial Firms

    Directory of Open Access Journals (Sweden)

    Mohammad D. AL-Tahat

    2012-01-01

This paper provides a review of and introduction to agile manufacturing. Tactics of agile manufacturing are mapped into different production areas (eight latent constructs: manufacturing equipment and technology, process technology and know-how, quality and productivity improvement, production planning and control, shop floor management, product design and development, supplier relationship management, and customer relationship management). The implementation level of agile manufacturing tactics is investigated in each area. A structural equation model is proposed and hypotheses are formulated. Feedback from 456 firms was collected using a five-point Likert-scale questionnaire. Statistical analysis was carried out using IBM SPSS and AMOS. Multicollinearity, content validity, consistency, construct validity, ANOVA and the relationships between agile components were tested. The results of this study show that agile manufacturing tactics have a positive effect on the overall agility level, a conclusion manufacturing firms can use to manage challenges when trying to be agile.

  3. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste …

  4. Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

    Energy Technology Data Exchange (ETDEWEB)

    Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J

    2007-10-24

Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes, leading to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharide, pure protein and protein mixture samples, LDA, PLSDA and SIMCA all produced excellent classification given a sufficient number of computed compound variables. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. Decision tree analysis was the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data and can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques for classifying increasingly complex biological samples.
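The PCA first step recommended above reduces each spectrum to its coordinates along a few directions of maximal variance. As a toy sketch, the leading principal component can be found with power iteration on the covariance matrix; this assumes a small, dense data matrix and is not the chemometric pipeline used in the study:

```python
def first_principal_component(rows, iters=200):
    """Leading principal component (unit vector) of a list-of-lists data
    matrix, via power iteration on the sample covariance matrix."""
    n, p = len(rows), len(rows[0])
    # centre each column
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]
    # sample covariance matrix
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    # power iteration converges to the dominant eigenvector
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```

Projecting each (centred) sample onto this vector gives the PCA scores whose groupings the study inspects before handing the reduced data to LDA or PLSDA.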

  5. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  6. Statistical analyses of variability/reproducibility of environmentally assisted cyclic crack growth rate data utilizing JAERI Material Performance Database (JMPD)

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo

    1993-05-01

Statistical analyses were conducted using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons of variability and/or reproducibility were made between data obtained by ΔK-increasing and by ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type from the viewpoint of variability and/or reproducibility of the data. This tendency was more pronounced in tests conducted in simulated LWR primary coolants than in those in air. (author)

  7. On statistical methods for analysing the geographical distribution of cancer cases near nuclear installations

    International Nuclear Information System (INIS)

    Bithell, J.F.; Stone, R.A.

    1989-01-01

This paper sets out to show that the epidemiological methods most commonly used can be improved. When analysing geographical data it is necessary to consider location. The most obvious quantification of location is ranked distance, though other measures that may be more meaningful in relation to aetiology may be substituted. A test based on distance ranks, the ''Poisson maximum test'', depends on the maximum of the observed relative risk in regions of increasing size, with the significance level adjusted for selection. Applying this test to data from Sellafield and Sizewell shows that the excess of leukaemia incidence observed at Seascale, near Sellafield, is not an artefact of data selection by region, and that the excess probably results from a genuine, if as yet unidentified, cause (there being little evidence of any other locational association once the Seascale cases have been removed). So far as Sizewell is concerned, geographical proximity to the nuclear power station does not seem particularly important. (author)

  8. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    Science.gov (United States)

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
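The censoring idea can be sketched as building a keep-mask over the volume series. This minimal illustration assumes a per-volume framewise-displacement series; the threshold and padding values are illustrative, not the ones used in the study:

```python
def censor_mask(framewise_displacement, threshold=0.5, pad=1):
    """Boolean keep-mask over fMRI volumes: drop any volume whose
    framewise displacement exceeds the threshold, plus `pad`
    neighbouring volumes on each side."""
    n = len(framewise_displacement)
    keep = [True] * n
    for i, fd in enumerate(framewise_displacement):
        if fd > threshold:
            for j in range(max(0, i - pad), min(n, i + pad + 1)):
                keep[j] = False
    return keep
```

Volumes flagged `False` are simply withheld from GLM estimation (e.g. by dropping the corresponding rows of the design matrix), rather than being modelled with nuisance regressors as in the motion-regression approach.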

  9. Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.

    Science.gov (United States)

    Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi

    2013-12-01

    Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Commonly, the data generated by mass spectrometry contain many missing values, which arise when a compound is absent from a sample or is present at a concentration below the detection limit. Several strategies are available for statistically analyzing data with missing values. The accelerated failure time (AFT) model assumes all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compare the power and estimation properties of a mixture model to those of an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point mass proportions between groups. However, the AFT model yielded biased estimates, with the bias increasing as the proportion of observations in the point mass increased, while estimates were unbiased with the mixture model unless all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and the mixture model for estimation. We demonstrated this approach through application to glycomics data of serum samples from women with ovarian cancer and matched controls.
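    The distinction between the two sources of missingness can be made concrete with a small simulation. In this illustrative sketch (the point-mass probability, the detection limit and the lognormal abundances are assumptions, not the authors' data), observed missing values arise both from true absence and from censoring below the detection limit, which is exactly the structure a mixture model allows and an AFT model collapses into censoring alone:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p_absent, lod = 10_000, 0.2, 1.0   # samples, point-mass prob., detection limit

# mixture-model view: a compound is truly absent with probability p_absent;
# otherwise its abundance is lognormal, and values below the limit of
# detection (lod) are censored, i.e. also recorded as missing
present = rng.random(n) >= p_absent
abundance = np.where(present, rng.lognormal(mean=0.5, sigma=1.0, size=n), 0.0)
observed = np.where(present & (abundance >= lod), abundance, np.nan)

missing = np.isnan(observed)
frac_missing = missing.mean()
frac_from_absence = (~present).mean()                  # point mass
frac_from_censoring = (present & (abundance < lod)).mean()  # below LOD
print(round(frac_missing, 2))
```

    In real data only `frac_missing` is observable; the mixture model estimates the split between the point mass and the censored component, whereas the AFT model attributes all of it to censoring.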

  10. Lipid-polymer hybrid nanoparticles: Development & statistical optimization of norfloxacin for topical drug delivery system

    Directory of Open Access Journals (Sweden)

    Vivek Dave

    2017-12-01

    Full Text Available Poly lactic acid is a biodegradable, biocompatible, and non-toxic polymer, widely used in many pharmaceutical preparations such as controlled release formulations, parenteral preparations, surgical treatment applications, and tissue engineering. In this study, we prepared lipid-polymer hybrid nanoparticles for topical and site-targeted delivery of Norfloxacin by the emulsification solvent evaporation (ESE) method. The design of experiments (DOE) was performed using software to optimize the result, and a surface plot was generated for comparison with the practical results. The surface morphology, particle size, zeta potential and composition of the lipid-polymer hybrid nanoparticles were characterized by SEM, TEM, AFM, and FTIR. The thermal behavior of the lipid-polymer hybrid nanoparticles was characterized by DSC and TGA. The prepared lipid-polymer hybrid nanoparticles of Norfloxacin exhibited an average particle size from 178.6 ± 3.7 nm to 220.8 ± 2.3 nm, and showed a very narrow distribution with polydispersity index ranging from 0.206 ± 0.36 to 0.383 ± 0.66. The surface charge on the lipid-polymer hybrid nanoparticles was confirmed by zeta potential measurements, which showed values from +23.4 ± 1.5 mV to +41.5 ± 3.4 mV. An antimicrobial study was performed against Staphylococcus aureus and Pseudomonas aeruginosa, and the lipid-polymer hybrid nanoparticles showed potential activity against both organisms. Lipid-polymer hybrid nanoparticles of Norfloxacin showed a cumulative drug release of 89.72% in 24 h. A stability study of the optimized formulation showed that the suitable condition for the storage of lipid-polymer hybrid nanoparticles was 4 ± 2 °C/60 ± 5% RH. These results illustrate the high potential of Norfloxacin lipid-polymer hybrid nanoparticles for use as topical antibiotic drug carriers.

  11. STATISTIC, PROBABILISTIC, CORRELATION AND SPECTRAL ANALYSES OF REGENERATIVE BRAKING CURRENT OF DC ELECTRIC ROLLING STOCK

    Directory of Open Access Journals (Sweden)

    A. V. Nikitenko

    2014-04-01

    Full Text Available Purpose. The definition and analysis of the probabilistic and spectral characteristics of the random current in the regenerative braking mode of DC electric rolling stock are presented in this paper. Methodology. The elements and methods of probability theory (particularly the theory of stationary and non-stationary processes) and methods of sampling theory are used for processing the regenerated current data arrays by PC. Findings. The regenerated current records were obtained from locomotives and trains on Ukrainian railways and from trams in Poland. It was established that the current exhibits both continuous and jump variations in time (especially in trams). For the random current in the regenerative braking mode, the mathematical expectation, dispersion and standard deviation functions were calculated. Histograms, probabilistic characteristics and correlation functions were also calculated and plotted for this current. It was established that the current in the regenerative braking mode can be considered a stationary, non-ergodic process. The spectral analysis of these records and of the “tail part” of the correlation function revealed weak periodic (low-frequency) components, known as interharmonics. Originality. Firstly, the theory of non-stationary random processes was adapted for the analysis of the recuperated current, which exhibits both continuous and jump variations in time. Secondly, the presence of interharmonics in the stochastic process of the regenerated current was identified for the first time. Finally, the patterns of temporal change of the current correlation function were determined. This makes it possible to soundly apply the correlation function method in the identification of electric traction system devices. Practical value. The results of the probabilistic and statistical analysis of the recuperated current make it possible to estimate the quality of recovered energy and the energy quality indices of electric rolling stock in the
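    The correlation-function and spectral steps can be sketched on synthetic data. This hypothetical NumPy example (the 3 Hz low-frequency "interharmonic-like" component, the sampling rate and the noise level are assumptions, not measured values from the paper) estimates the autocorrelation function via the Wiener-Khinchin relation and locates the weak periodic component in the periodogram:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 128.0, 4096                         # sampling rate (Hz), samples

# synthetic regenerated-current record: DC level plus noise plus a weak
# low-frequency component at 3 Hz, standing in for a real measurement
t = np.arange(n) / fs
current = 50 + 2.0 * np.sin(2 * np.pi * 3.0 * t) + rng.normal(0, 5, n)

x = current - current.mean()
# biased sample autocorrelation function via FFT (Wiener-Khinchin),
# zero-padded to 2n so the circular correlation equals the linear one
acf = np.fft.irfft(np.abs(np.fft.rfft(x, 2 * n)) ** 2)[:n] / n
acf /= acf[0]                               # normalize so acf[0] == 1
# periodogram estimate of the power spectral density
psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(psd)]
print(round(peak, 2))
```

    A slowly decaying oscillation in the "tail" of `acf`, and the corresponding spectral peak, are how a weak interharmonic shows up against the broadband noise.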

  12. Measurements and statistical analyses of indoor radon concentrations in Tokyo and surrounding areas

    International Nuclear Information System (INIS)

    Sugiura, Shiroharu; Suzuki, Takashi; Inokoshi, Yukio

    1995-01-01

    Since the UNSCEAR report published in 1982, radiation exposure to the respiratory tract due to radon and its progeny has been regarded as the single largest contributor to the natural radiation exposure of the general public. In Japan, measurements of radon gas concentrations in many types of buildings have been surveyed by national and private institutes. We also carried out measurements of radon gas concentrations in different types of residential buildings in Tokyo and its adjoining prefectures from October 1988 to September 1991, to evaluate the potential radiation risk to the people living there. One or two simplified passive radon monitors were set up in each of the 34 residential buildings located in the above-mentioned area for an exposure period of 3 months each. Comparing the average concentrations in buildings of different materials and structures, those in the steel-reinforced concrete buildings were always higher than those in the wooden and the prefabricated mortared buildings. The radon concentrations proved to be higher in autumn and winter, and lower in spring and summer. Radon concentrations in an underground room of a steel-reinforced concrete building showed the highest values throughout our investigation, and statistically significant seasonal variation was detected by the X-11 method developed by the U.S. Bureau of the Census. The values measured in a room on the first floor of the same building also showed seasonal variation, but the phase of the variation was different. Another multivariate analysis suggested that building material and structure are the most important factors determining the level of radon concentration, ahead of other factors such as the age of the building and the use of ventilators. (author)

  13. Computational and Statistical Analyses of Insertional Polymorphic Endogenous Retroviruses in a Non-Model Organism

    Directory of Open Access Journals (Sweden)

    Le Bao

    2014-11-01

    Full Text Available Endogenous retroviruses (ERVs are a class of transposable elements found in all vertebrate genomes that contribute substantially to genomic functional and structural diversity. A host species acquires an ERV when an exogenous retrovirus infects a germ cell of an individual and becomes part of the genome inherited by viable progeny. ERVs that colonized ancestral lineages are fixed in contemporary species. However, in some extant species, ERV colonization is ongoing, which results in variation in ERV frequency in the population. To study the consequences of ERV colonization of a host genome, methods are needed to assign each ERV to a location in a species’ genome and determine which individuals have acquired each ERV by descent. Because well annotated reference genomes are not widely available for all species, de novo clustering approaches provide an alternative to reference mapping that are insensitive to differences between query and reference and that are amenable to mobile element studies in both model and non-model organisms. However, there is substantial uncertainty in both identifying ERV genomic position and assigning each unique ERV integration site to individuals in a population. We present an analysis suitable for detecting ERV integration sites in species without the need for a reference genome. Our approach is based on improved de novo clustering methods and statistical models that take the uncertainty of assignment into account and yield a probability matrix of shared ERV integration sites among individuals. We demonstrate that polymorphic integrations of a recently identified endogenous retrovirus in deer reflect contemporary relationships among individuals and populations.

  14. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  15. Quantitative characterization of colloidal assembly of graphene oxide-silver nanoparticle hybrids using aerosol differential mobility-coupled mass analyses.

    Science.gov (United States)

    Nguyen, Thai Phuong; Chang, Wei-Chang; Lai, Yen-Chih; Hsiao, Ta-Chih; Tsai, De-Hao

    2017-10-01

    In this work, we develop an aerosol-based, time-resolved ion mobility-coupled mass characterization method to investigate the colloidal assembly of graphene oxide (GO)-silver nanoparticle (AgNP) hybrid nanostructures on a quantitative basis. Transmission electron microscopy (TEM) and zeta potential (ZP) analysis were used to provide visual information and elemental-based particle size distributions, respectively. Results clearly show the successful controlled assembly of GO-AgNP by electrostatic-directed heterogeneous aggregation between GO and bovine serum albumin (BSA)-functionalized AgNP under an acidic environment. Additionally, the physical size, mass, and conformation (i.e., number of AgNP per nanohybrid) of GO-AgNP were shown to be proportional to the number concentration ratio of AgNP to GO (R) and the selected electrical mobility diameter. An analysis of the colloidal stability of GO-AgNP indicates that the stability increased with its absolute ZP, which was dependent on R and environmental pH. The work presented here provides a proof of concept for systematically synthesizing hybrid colloidal nanomaterials through the tuning of surface chemistry in the aqueous phase, with the ability to perform quantitative characterization. Graphical Abstract Colloidal assembly of graphene oxide-silver nanoparticle hybrids characterized by aerosol differential mobility-coupled mass analyses.

  16. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  19. Analysing the Severity and Frequency of Traffic Crashes in Riyadh City Using Statistical Models

    Directory of Open Access Journals (Sweden)

    Saleh Altwaijri

    2012-12-01

    Full Text Available Traffic crashes in Riyadh city cause losses in the form of deaths, injuries and property damage, in addition to the pain and social tragedy affecting the victims' families. In 2005, a total of 47,341 injury traffic crashes occurred in Riyadh city (19% of all KSA crashes), and 9% of those crashes were severe. Road safety in Riyadh city may have been adversely affected by: high car ownership, migration of people to Riyadh city, a high number of daily trips (about 6 million), high incomes, low-cost petrol, drivers of many different nationalities, young drivers and tremendous population growth, which create a high level of mobility and transport activity in the city. The primary objective of this paper is therefore to explore the factors affecting the severity and frequency of road crashes in Riyadh city using appropriate statistical models, aiming to establish effective safety policies ready to be implemented to reduce the severity and frequency of road crashes in Riyadh city. Crash data for Riyadh city were collected from the Higher Commission for the Development of Riyadh (HCDR) for a period of five years from 1425H to 1429H (roughly corresponding to 2004-2008). Crash data were classified into three categories: fatal, serious-injury and slight-injury. Two nominal response models were developed and applied to injury-related crash data: a standard multinomial logit model (MNL) and a mixed logit model. Due to a severe underreporting problem for slight-injury crashes, binary and mixed binary logistic regression models were also estimated for two categories of severity: fatal and serious crashes. For frequency, two count models such as Negative Binomial (NB) models were employed, and the unit of analysis was 168 HAIs (wards) in Riyadh city. Ward-level crash data are disaggregated by severity of the crash (such as fatal and serious injury crashes). The results from both multinomial and binary response models are found to be fairly consistent but
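    The count-modelling idea can be illustrated independently of the crash data. A minimal NumPy sketch (simulated ward counts; the mean and dispersion values are assumptions, not estimates from the paper) generates Negative Binomial ward-level counts through the Poisson-gamma mixture that motivates NB regression, and recovers the overdispersion parameter by the method of moments:

```python
import numpy as np

rng = np.random.default_rng(3)
n_wards, mu, alpha = 168, 20.0, 0.5   # wards, mean crashes, overdispersion

# Negative Binomial counts via the Poisson-gamma mixture:
# lambda_i ~ Gamma(shape=1/alpha, scale=alpha*mu), y_i ~ Poisson(lambda_i),
# so that E[y] = mu and Var[y] = mu + alpha*mu^2 (overdispersed)
lam = rng.gamma(shape=1 / alpha, scale=alpha * mu, size=n_wards)
crashes = rng.poisson(lam)

# method-of-moments check of the NB variance function
m, v = crashes.mean(), crashes.var(ddof=1)
alpha_hat = (v - m) / m**2
print(round(m, 1), round(alpha_hat, 2))
```

    The variance exceeding the mean is the overdispersion that makes NB models preferable to Poisson models for ward-level crash frequencies; a full analysis would add ward covariates through a log link.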

  20. Scoping and sensitivity analyses for the Demonstration Tokamak Hybrid Reactor (DTHR)

    International Nuclear Information System (INIS)

    Sink, D.A.; Gibson, G.

    1979-03-01

    The results of an extensive set of parametric studies are presented which provide analytical data on the effects of various tokamak parameters on the performance and cost of the DTHR (Demonstration Tokamak Hybrid Reactor). The studies were centered on a point design which is described in detail. Variations in the device size, neutron wall loading, and plasma aspect ratio are presented, and the effects on direct hardware costs, fissile fuel production (breeding), fusion power production, electrical power consumption, and thermal power production are shown graphically. The studies considered both ignition and beam-driven operations of DTHR and yielded results based on two empirical scaling laws presently used in reactor studies. Sensitivity studies were also made for variations in the following key parameters: the plasma elongation, the minor radius, the TF coil peak field, the neutral beam injection power, and the Z/sub eff/ of the plasma.

  1. Analysis of tribological behaviour of zirconia reinforced Al-SiC hybrid composites using statistical and artificial neural network technique

    Science.gov (United States)

    Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.

    2018-05-01

    The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%), fabricated through a powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA results suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composites. A very close correlation between experimental and ANN outputs was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of the various control factors on the wear behaviour of the hybrid composites.
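    The ANOVA percentage-contribution idea can be sketched on simulated data. In this hypothetical example (the factor levels, effect sizes and noise are assumptions for illustration, not the paper's measurements), main-effect sums of squares from a full factorial design are expressed as percentages of the total variation in the response:

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical full-factorial wear data: 4 zirconia levels x 3 loads x 3 reps
zr_levels, loads, reps = np.arange(4), np.arange(3), 3
y = np.empty((4, 3, reps))
for i in zr_levels:
    for j in loads:
        # assumed main effects: zirconia reduces wear, load increases it
        y[i, j] = 10 - 1.5 * i + 0.8 * j + rng.normal(0, 0.2, reps)

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
# main-effect sums of squares from the marginal (level) means
ss_zr = (3 * reps) * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_load = (4 * reps) * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()
pct_zr = 100 * ss_zr / ss_total
pct_load = 100 * ss_load / ss_total
print(round(pct_zr), round(pct_load))
```

    The percentage contribution of each factor is just its sum of squares over the total; the remainder is attributable to interactions and experimental error.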

  2. Hybrid Task Design: Connecting Learning Opportunities Related to Critical Thinking and Statistical Thinking

    Science.gov (United States)

    Kuntze, Sebastian; Aizikovitsh-Udi, Einav; Clarke, David

    2017-01-01

    Stimulating thinking related to mathematical content is the focus of many tasks in the mathematics classroom. Beyond such content-related thinking, promoting forms of higher order thinking is among the goals of mathematics instruction as well. So-called hybrid tasks focus on combining both goals: they aim at fostering mathematical thinking and…

  3. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples
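    Procedures (i), (ii) and (v) can be sketched on a synthetic scatterplot. This illustrative NumPy example (the nonlinear test relationship and the 5x5 grid are assumptions, not the paper's two-phase flow model) computes the correlation coefficient, the rank correlation coefficient, and a chi-square statistic for deviation from randomness on gridded cell counts:

```python
import numpy as np

rng = np.random.default_rng(5)

def rank(a):
    """Ordinal ranks 1..n (ties are not expected for continuous samples)."""
    r = np.empty(len(a), int)
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

# scatterplot with a monotonic but strongly nonlinear relationship
n = 500
x = rng.random(n)
y = np.exp(5 * x) + rng.normal(0, 1, n)
rx, ry = rank(x), rank(y)

# (i) linear relationship: ordinary correlation coefficient
pearson = np.corrcoef(x, y)[0, 1]
# (ii) monotonic relationship: rank correlation coefficient
spearman = np.corrcoef(rx, ry)[0, 1]
# (v) deviation from randomness: chi-square statistic on a 5x5 grid
ix = 5 * (rx - 1) // n
iy = 5 * (ry - 1) // n
counts = np.zeros((5, 5))
np.add.at(counts, (ix, iy), 1)
expected = n / 25
chi2 = ((counts - expected) ** 2 / expected).sum()
print(round(pearson, 2), round(spearman, 2), chi2 > 50)
```

    For a monotonic but nonlinear pattern like this one, the rank correlation exceeds the ordinary correlation, which is the motivation for applying the increasingly general tests in sequence.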

  4. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO 2 -emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  5. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  6. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  7. Response to traumatic brain injury neurorehabilitation through an artificial intelligence and statistics hybrid knowledge discovery from databases methodology.

    Science.gov (United States)

    Gibert, Karina; García-Rudolph, Alejandro; García-Molina, Alberto; Roig-Rovira, Teresa; Bernabeu, Montse; Tormos, José María

    2008-01-01

    Develop a classificatory tool to identify different populations of patients with Traumatic Brain Injury based on the characteristics of deficit and response to treatment. A KDD framework was used in which, first, descriptive statistics of every variable were computed, followed by data cleaning and selection of relevant variables. The data were then mined using a generalization of Clustering based on rules (CIBR), a hybrid AI and statistics technique which combines inductive learning (AI) and clustering (statistics). A prior Knowledge Base (KB) is considered to properly bias the clustering; semantic constraints implied by the KB hold in the final clusters, guaranteeing interpretability of the results. A generalization (Exogenous Clustering based on rules, ECIBR) is presented, which allows the KB to be defined in terms of variables that will not be considered in the clustering process itself, to obtain more flexibility. Several tools, such as the class panel graph, are introduced in the methodology to assist final interpretation. A set of 5 classes was recommended by the system, and interpretation permitted the labeling of profiles. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing level of response to rehabilitation treatments. All the initially assessable patients form a single group. Severely impaired patients are subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were noticeably improved with respect to classical clustering, supporting our view that hybrid AI and statistics techniques are more powerful for KDD than pure ones.
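    The rule-constrained clustering idea can be sketched, with plain k-means standing in for the statistical step. In this hypothetical example (the toy patient data and the single knowledge-base rule are assumptions; the actual ECIBR algorithm is considerably richer), the prior rule assigns initially assessable patients to one semantic class, and only the remaining patients are subdivided by clustering, so the KB constraint holds in the final classes:

```python
import numpy as np

rng = np.random.default_rng(6)

def kmeans(X, k, iters=50, rng=rng):
    """Plain k-means, an illustrative stand-in for the clustering step."""
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return lab

# toy patient data: [initially assessable (0/1), response score]
X = np.column_stack([rng.integers(0, 2, 200).astype(float),
                     rng.normal(0, 1, 200)])

# knowledge-base rule (the AI step): initially assessable patients form
# one semantic group; only the rest are subdivided by clustering, so the
# rule-implied constraint is guaranteed to hold in the final classes
assessable = X[:, 0] == 1
labels = np.full(len(X), 0)
sub = kmeans(X[~assessable, 1:], k=2)
labels[~assessable] = 1 + sub
print(sorted(set(labels.tolist())))
```

    Biasing the partition with prior rules in this way is what guarantees that the final classes remain interpretable against the knowledge base, rather than being an unconstrained statistical artifact.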

  8. Subtractive hybridization and random arbitrarily primed PCR analyses of a benzoate-assimilating bacterium, Desulfotignum balticum.

    Science.gov (United States)

    Habe, Hiroshi; Kobuna, Akinori; Hosoda, Akifumi; Kouzuma, Atsushi; Yamane, Hisakazu; Nojiri, Hideaki; Omori, Toshio; Watanabe, Kazuya

    2008-05-01

    Subtractive hybridization (SH) and random arbitrarily primed PCR (RAP-PCR) were used to detect genes involved in anaerobic benzoate degradation by Desulfotignum balticum. Through SH, we obtained 121 DNA sequences specific for D. balticum but not for D. phosphitoxidans (a non-benzoate-assimilating species). Furthermore, RAP-PCR analysis showed that a 651-bp DNA fragment, having 55% homology with the solute-binding protein of the ABC transporter system in Methanosarcina barkeri, was expressed when D. balticum was grown on benzoate, but not on pyruvate. By shotgun sequencing of the fosmid clone (38,071 bp) containing the DNA fragment, 33 open reading frames (ORFs) and two incomplete ORFs were annotated, and several genes within this region corresponded to the DNA fragments obtained by SH. An 11.3-kb gene cluster (ORF10-17), shown by reverse transcription-PCR to be expressed in response to growth on benzoate, showed homology with the ABC transporter system and TonB-dependent receptors, both of which are presumably involved in the uptake of siderophores, heme, and vitamin B12.

  9. Performance and driveline analyses of engine capacity in range extender engine hybrid vehicle

    Science.gov (United States)

    Praptijanto, Achmad; Santoso, Widodo Budi; Nur, Arifin; Wahono, Bambang; Putrasari, Yanuandri

    2017-01-01

    In this study, the range extender engine designed should be able to meet the power needs of the power generator of a hybrid electric vehicle, a minimum of 18 kW. Using this baseline model, the following range extenders were compared: a conventional SI piston engine (Baseline, BsL) with a capacity of 1998 cm3, and efficiency-oriented SI piston engines with capacities of 999 cm3 and 499 cm3 (square gasoline engines with 86 mm bore and stroke), in terms of performance, predicted emissions, and state of charge, using engine and vehicle simulation software tools. In the AVL Boost simulation software, the range extender engine was simulated at engine loads from 1000 to 6000 rpm. The highest peak brake engine power reached 38 kW at 4500 rpm, while the highest torque, 100 Nm, was achieved at 3500 rpm. Then, using the AVL Cruise simulation software, a model of a range-extended electric vehicle in series configuration, with main components such as the internal combustion engine, generator, electric motor, and battery, was simulated over the Artemis rural road cycle. The simulation results show that the engine with a capacity of 999 cm3 gave the most economical performance with respect to emissions and the control of engine cycle parameters.

  10. Energy–exergy and economic analyses of a hybrid solar–hydrogen renewable energy system in Ankara, Turkey

    International Nuclear Information System (INIS)

    Ozden, Ender; Tari, Ilker

    2016-01-01

    Highlights: • Uninterrupted energy in an emergency blackout situation. • System modeling of a solar–hydrogen based hybrid renewable energy system. • A comprehensive thermodynamic analysis. • Levelized cost of electricity analysis for a project lifetime of 25 years. - Abstract: A hybrid (solar–hydrogen) stand-alone renewable energy system that consists of photovoltaic (PV) panels, Proton Exchange Membrane (PEM) fuel cells, PEM-based electrolyzers and hydrogen storage is investigated by developing a complete model of the system using TRNSYS. The PV panels are mounted on a tiltable platform to improve the performance of the system by monthly adjustments of the tilt angle. The total area of the PV panels is 300 m², the PEM fuel cell capacity is 5 kW, and the hydrogen storage is at 55 bar pressure with 45 m³ capacity. The main goal of this study is to verify that the system meets the electrical power demand of the emergency room without experiencing a shortage for a complete year in an emergency blackout situation. For this purpose, after modeling the system, energy and exergy analyses for the hydrogen cycle of the system for a complete year are performed, and the energy and exergy efficiencies are found to be 4.06% and 4.25%, respectively. Furthermore, an economic analysis is performed for a project lifetime of 25 years based on the Levelized Cost of Electricity (LCE), and the LCE is calculated as 0.626 $/kWh.
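
The levelized-cost calculation used above can be reproduced in outline: discounted lifetime cost divided by discounted lifetime energy output. The function below is a generic sketch; the capex, O&M, and discount-rate inputs are placeholders, not the paper's actual figures.

```python
def levelized_cost_of_electricity(capex, annual_opex, annual_kwh,
                                  years=25, discount_rate=0.05):
    """LCE = sum of discounted lifetime costs / sum of discounted energy.

    capex:       up-front capital cost (paid at year 0)
    annual_opex: recurring operation & maintenance cost per year
    annual_kwh:  electricity delivered per year
    """
    disc = [(1.0 + discount_rate) ** -t for t in range(1, years + 1)]
    total_cost = capex + annual_opex * sum(disc)
    total_energy = annual_kwh * sum(disc)
    return total_cost / total_energy
```

With a zero discount rate the formula reduces to total cost over total energy, which is a quick sanity check on any implementation.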

  11. Fluid-structure-interaction analyses of reactor vessel using improved hybrid Lagrangian Eulerian code ALICE-II

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.Y.

    1993-06-01

    This paper describes fluid-structure-interaction and structure response analyses of a reactor vessel subjected to loadings associated with postulated accidents, using the hybrid Lagrangian-Eulerian code ALICE-II. This code has been improved recently to accommodate many features associated with innovative designs of reactor vessels. Calculational capabilities have been developed to treat water in the reactor cavity outside the vessel, internal shield structures and internal thin shells. The objective of the present analyses is to study the cover response and potential for missile generation in response to a fuel-coolant interaction in the core region. Three calculations were performed using the cover weight as a parameter. To study the effect of the cavity water, vessel response calculations for both wet- and dry-cavity designs are compared. Results indicate that for all cases studied and for the design parameters assumed, the calculated cover displacements are all smaller than the bolts' ultimate displacement and no missile generation of the closure head is predicted. Also, solutions reveal that the cavity water of the wet-cavity design plays an important role in restraining the downward displacement of the bottom head. Based on these studies, the analyses predict that structural integrity is maintained throughout the postulated accident for the wet-cavity design.

  13. Preliminary analyses of neutronics schemes for three kinds of waste transmutation blankets of a fusion-fission hybrid

    International Nuclear Information System (INIS)

    Zhang Mingchun; Feng Kaiming; Li Zaixin; Zhao Fengchao

    2012-01-01

    The neutronics schemes of the helium-cooled, sodium-cooled, and FLiBe-cooled waste transmutation blankets were preliminarily calculated and analysed using the spheroidal tokamak (ST) plasma configuration. The neutronics properties of these blankets were compared and analyzed. The results show that, for the transmutation of ²³⁷Np, the FLiBe-cooled waste transmutation blanket has the most superior transmutation performance. The calculation results for the helium-cooled waste transmutation blanket show that it can run at a steady effective multiplication factor (k_eff), steady power (P), and steady tritium breeding ratio (TBR) for a long operating time (9.62 years) by changing the initial loading rate of the minor actinide (MA) ²³⁷Np. (authors)

  14. Influence of peer review on the reporting of primary outcome(s) and statistical analyses of randomised trials.

    Science.gov (United States)

    Hopewell, Sally; Witt, Claudia M; Linde, Klaus; Icke, Katja; Adedire, Olubusola; Kirtley, Shona; Altman, Douglas G

    2018-01-11

    Selective reporting of outcomes in clinical trials is a serious problem. We aimed to investigate the influence of the peer review process within biomedical journals on reporting of primary outcome(s) and statistical analyses within reports of randomised trials. Each month, PubMed (May 2014 to April 2015) was searched to identify primary reports of randomised trials published in six high-impact general and 12 high-impact specialty journals. The corresponding author of each trial was invited to complete an online survey asking authors about changes made to their manuscript as part of the peer review process. Our main outcomes were to assess: (1) the nature and extent of changes as part of the peer review process, in relation to reporting of the primary outcome(s) and/or primary statistical analysis; (2) how often authors followed these requests; and (3) whether this was related to specific journal or trial characteristics. Of 893 corresponding authors who were invited to take part in the online survey, 258 (29%) responded. The majority of trials were multicentre (n = 191; 74%); median sample size 325 (IQR 138 to 1010). The primary outcome was clearly defined in 92% (n = 238), of which the direction of treatment effect was statistically significant in 49%. The majority responded (on a 1-10 Likert scale) that they were satisfied with the overall handling (mean 8.6, SD 1.5) and quality of peer review (mean 8.5, SD 1.5) of their manuscript. Only 3% (n = 8) said that the editor or peer reviewers had asked them to change or clarify the trial's primary outcome. However, 27% (n = 69) reported they were asked to change or clarify the statistical analysis of the primary outcome; most had fulfilled the request, the main motivation being to improve the statistical methods (n = 38; 55%) or avoid rejection (n = 30; 44%). Overall, there was little association between authors being asked to make this change and the type of journal, intervention, significance of the

  15. Consumer Loyalty and Loyalty Programs: a topographic examination of the scientific literature using bibliometrics, spatial statistics and network analyses

    Directory of Open Access Journals (Sweden)

    Viviane Moura Rocha

    2015-04-01

    Full Text Available This paper presents a topographic analysis of the fields of consumer loyalty and loyalty programs, vastly studied in the last decades and still relevant in the marketing literature. After the identification of 250 scientific papers published in the last ten years in indexed journals, a subset of 76 was chosen and their 3223 references were extracted. The journals in which these papers were published, their keywords, abstracts, authors, institutions of origin and citation patterns were identified and analyzed using bibliometrics, spatial statistics techniques and network analyses. The results allow the identification of the central components of the field, as well as its main authors, journals, institutions and countries that intermediate the diffusion of knowledge, which contributes to the understanding of the constitution of the field by researchers and students.

  16. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses.

    Science.gov (United States)

    Buttigieg, Pier Luigi; Ramette, Alban

    2014-12-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.

  17. Hybrid Tasks: Promoting Statistical Thinking and Critical Thinking through the Same Mathematical Activities

    Science.gov (United States)

    Aizikovitsh-Udi, Einav; Clarke, David; Kuntze, Sebastian

    2014-01-01

    Even though statistical thinking and critical thinking appear to have strong links from a theoretical point of view, empirical research into the intersections and potential interrelatedness of these aspects of competence is scarce. Our research suggests that thinking skills in both areas may be interdependent. Given this interconnection, it should…

  18. Ghost-tree: creating hybrid-gene phylogenetic trees for diversity analyses.

    Science.gov (United States)

    Fouquier, Jennifer; Rideout, Jai Ram; Bolyen, Evan; Chase, John; Shiffer, Arron; McDonald, Daniel; Knight, Rob; Caporaso, J Gregory; Kelley, Scott T

    2016-02-24

    Fungi play critical roles in many ecosystems, cause serious diseases in plants and animals, and pose significant threats to human health and structural integrity problems in built environments. While most fungal diversity remains unknown, the development of PCR primers for the internal transcribed spacer (ITS) combined with next-generation sequencing has substantially improved our ability to profile fungal microbial diversity. Although the high sequence variability in the ITS region facilitates more accurate species identification, it also makes multiple sequence alignment and phylogenetic analysis unreliable across evolutionarily distant fungi because the sequences are hard to align accurately. To address this issue, we created ghost-tree, a bioinformatics tool that integrates sequence data from two genetic markers into a single phylogenetic tree that can be used for diversity analyses. Our approach starts with a "foundation" phylogeny based on one genetic marker whose sequences can be aligned across organisms spanning divergent taxonomic groups (e.g., fungal families). Then, "extension" phylogenies are built for more closely related organisms (e.g., fungal species or strains) using a second more rapidly evolving genetic marker. These smaller phylogenies are then grafted onto the foundation tree by mapping taxonomic names such that each corresponding foundation-tree tip would branch into its new "extension tree" child. We applied ghost-tree to graft fungal extension phylogenies derived from ITS sequences onto a foundation phylogeny derived from fungal 18S sequences. Our analysis of simulated and real fungal ITS data sets found that phylogenetic distances between fungal communities computed using ghost-tree phylogenies explained significantly more variance than non-phylogenetic distances. The phylogenetic metrics also improved our ability to distinguish small differences (effect sizes) between microbial communities, though results were similar to non
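
The grafting step the abstract describes can be illustrated with a toy tree structure. Nested dictionaries stand in for a real phylogeny library, and all taxon names below are invented examples, not data from the ghost-tree study.

```python
def graft(foundation, extensions):
    """Replace each foundation-tree tip with its extension subtree.

    Trees are nested dicts: {name: {child_name: {...}}}; a tip maps
    to an empty dict. A tip whose name matches an extension tree
    branches into that tree's children, mirroring how ghost-tree maps
    taxonomic names from the foundation phylogeny (e.g. 18S-based
    families) onto extension phylogenies (e.g. ITS-based species).
    """
    out = {}
    for name, children in foundation.items():
        if children:                      # internal node: recurse
            out[name] = graft(children, extensions)
        else:                             # tip: attach extension, if any
            out[name] = extensions.get(name, {})
    return out
```

A tip with no matching extension tree simply stays a tip, so the grafted tree always contains every foundation taxon.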

  19. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    Science.gov (United States)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  20. Hybridization between two cryptic filamentous brown seaweeds along the shore: analysing pre- and postzygotic barriers in populations of individuals with varying ploidy levels.

    Science.gov (United States)

    Montecinos, Alejandro E; Guillemin, Marie-Laure; Couceiro, Lucia; Peters, Akira F; Stoeckel, Solenn; Valero, Myriam

    2017-07-01

    We aimed to study the importance of hybridization between two cryptic species of the genus Ectocarpus, a group of filamentous algae with haploid-diploid life cycles that includes the principal genetic model organism for the brown algae. In haploid-diploid species, the genetic structure of the two phases of the life cycle can be analysed separately in natural populations. Such life cycles provide a unique opportunity to estimate the frequency of hybrid genotypes in diploid sporophytes and meiotic recombinant genotypes in haploid gametophytes, allowing the effects of reproductive barriers preventing fertilization or preventing meiosis to be untangled. The level of hybridization between E. siliculosus and E. crouaniorum was quantified along the European coast. Clonal cultures (568 diploid, 336 haploid) isolated from field samples were genotyped using cytoplasmic and nuclear markers to estimate the frequency of hybrid genotypes in diploids and recombinant haploids. We identified admixed individuals using microsatellite loci, classical assignment methods and a newly developed Bayesian method (XPloidAssignment), which allows the analysis of populations that exhibit variations in ploidy level. Over all populations, the level of hybridization was estimated at 8.7%. Hybrids were exclusively observed in sympatric populations. More than 98% of hybrids were diploids (40% of which showed signs of aneuploidy) with a high frequency of rare alleles. The near absence of haploid recombinant hybrids demonstrates that the reproductive barriers are mostly postzygotic and suggests that abnormal chromosome segregation during meiosis following hybridization of species with different genome sizes could be a major cause of interspecific incompatibility in this system. © 2017 John Wiley & Sons Ltd.

  1. Genetic basis for spontaneous hybrid genome doubling during allopolyploid speciation of common wheat shown by natural variation analyses of the paternal species.

    Directory of Open Access Journals (Sweden)

    Yoshihiro Matsuoka

    Full Text Available The complex process of allopolyploid speciation includes various mechanisms ranging from species crosses and hybrid genome doubling to genome alterations and the establishment of new allopolyploids as persisting natural entities. Currently, little is known about the genetic mechanisms that underlie hybrid genome doubling, despite the fact that natural allopolyploid formation is highly dependent on this phenomenon. We examined the genetic basis for the spontaneous genome doubling of triploid F1 hybrids between the direct ancestors of allohexaploid common wheat (Triticum aestivum L., AABBDD genome), namely Triticum turgidum L. (AABB genome) and Aegilops tauschii Coss. (DD genome). An Ae. tauschii intraspecific lineage that is closely related to the D genome of common wheat was identified by population-based analysis. Two representative accessions, one that produces a high-genome-doubling-frequency hybrid when crossed with a T. turgidum cultivar and the other that produces a low-genome-doubling-frequency hybrid with the same cultivar, were chosen from that lineage for further analyses. A series of investigations including fertility analysis, immunostaining, and quantitative trait locus (QTL) analysis showed that (1) production of functional unreduced gametes through nonreductional meiosis is an early step key to successful hybrid genome doubling, (2) first division restitution is one of the cytological mechanisms that cause meiotic nonreduction during the production of functional male unreduced gametes, and (3) six QTLs in the Ae. tauschii genome, most of which likely regulate nonreductional meiosis and its subsequent gamete production processes, are involved in hybrid genome doubling. Interlineage comparisons of Ae. tauschii's ability to cause hybrid genome doubling suggested an evolutionary model for the natural variation pattern of the trait in which non-deleterious mutations in six QTLs may have important roles.
The findings of this study demonstrated

  2. Phylogenetic analyses suggest a hybrid origin of the figs (Moraceae: Ficus) that are endemic to the Ogasawara (Bonin) Islands, Japan.

    Science.gov (United States)

    Kusumi, Junko; Azuma, Hiroshi; Tzeng, Hsy-Yu; Chou, Lien-Siang; Peng, Yan-Qiong; Nakamura, Keiko; Su, Zhi-Hui

    2012-04-01

    The Ogasawara Islands are oceanic islands and harbor a unique endemic flora. There are three fig species (Ficus boninsimae, F. nishimurae and F. iidaiana) endemic to the Ogasawara Islands, and these species have been considered to be closely related to Ficus erecta, and to have diverged within the islands. However, this hypothesis remains uncertain. To investigate this issue, we assessed the phylogenetic relationships of the Ogasawara figs and their close relatives occurring in Japan, Taiwan and South China based on six plastid genome regions, the nuclear ITS region and two nuclear genes. The plastid genome-based tree indicated a close relationship between the Ogasawara figs and F. erecta, whereas some of the nuclear gene-based trees suggested this relationship was not so close. In addition, the phylogenetic analyses of the pollinating wasps associated with these fig species, based on the nuclear 28S rRNA and mitochondrial cytB genes, suggested that the fig-pollinating wasps of F. erecta are not sister to those of the Ogasawara figs. These results suggest the occurrence of an early hybridization event(s) in the lineage leading to the Ogasawara figs. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Analyses of large quasistatic deformations of inelastic bodies by a new hybrid-stress finite element algorithm

    Science.gov (United States)

    Reed, K. W.; Atluri, S. N.

    1983-01-01

    A new hybrid-stress finite element algorithm, suitable for analyses of large, quasistatic, inelastic deformations, is presented. The algorithm is based upon a generalization of de Veubeke's complementary energy principle. The principal variables in the formulation are the nominal stress rate and spin, and the resulting finite element equations are discrete versions of the equations of compatibility and angular momentum balance. The algorithm produces true rates, i.e., time derivatives, as opposed to 'increments'. There results a complete separation of the boundary value problem (for stress rate and velocity) and the initial value problem (for total stress and deformation); hence, their numerical treatments are essentially independent. After a fairly comprehensive discussion of the numerical treatment of the boundary value problem, we launch into a detailed examination of the numerical treatment of the initial value problem, covering the topics of efficiency, stability and objectivity. The paper closes with a set of examples, finite homogeneous deformation problems, which serve to bring out important aspects of the algorithm.

  4. A New Triangular Hybrid Displacement Function Element for Static and Free Vibration Analyses of Mindlin-Reissner Plate

    Directory of Open Access Journals (Sweden)

    Jun-Bin Huang

    Full Text Available Abstract A new 3-node triangular hybrid displacement function Mindlin-Reissner plate element is developed. Firstly, the modified variational functional of complementary energy for Mindlin-Reissner plate, which is eventually expressed by a so-called displacement function F, is proposed. Secondly, the locking-free formulae of Timoshenko’s beam theory are chosen as the deflection, rotation, and shear strain along each element boundary. Thirdly, seven fundamental analytical solutions of the displacement function F are selected as the trial functions for the assumed resultant fields, so that the assumed resultant fields satisfy all governing equations in advance. Finally, the element stiffness matrix of the new element, denoted by HDF-P3-7β, is derived from the modified principle of complementary energy. Together with the diagonal inertia matrix of the 3-node triangular isoparametric element, the proposed element is also successfully generalized to the free vibration problems. Numerical results show that the proposed element exhibits overall remarkable performance in all benchmark problems, especially in the free vibration analyses.

  5. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    Science.gov (United States)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

    Crowded transportation hubs such as metro stations are thought of as ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments with large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease spreading dynamics there have been less explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and length of contact for transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can serve as a reference for epidemic studies in complex and confined transportation hubs and help refine existing disease spreading models.
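
Fitting a Weibull distribution to per-individual exposure values, as the study reports, can be sketched with a linearized-CDF (probability-plot) estimator. This is a generic numpy recipe under the assumption of a two-parameter Weibull, not the authors' actual fitting procedure.

```python
import numpy as np

def weibull_fit(samples):
    """Estimate Weibull shape k and scale lam via the linearized CDF.

    For F(x) = 1 - exp(-(x/lam)^k):
        ln(-ln(1 - F(x))) = k*ln(x) - k*ln(lam),
    so ordinary least squares on the probability plot gives both
    parameters at once.
    """
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    # median-rank plotting positions avoid F = 0 or F = 1 exactly
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    y = np.log(-np.log(1.0 - F))
    slope, intercept = np.polyfit(np.log(x), y, 1)
    k = slope
    lam = np.exp(-intercept / slope)
    return k, lam
```

On synthetic Weibull data the estimator recovers the generating shape and scale closely, which is the usual sanity check before applying it to empirical exposure histograms.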

  6. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  7. MITOCHONDRIAL DNA ANALYSES AND THE ORIGIN AND RELATIVE AGE OF PARTHENOGENETIC CNEMIDOPHORUS: PHYLOGENETIC CONSTRAINTS ON HYBRID ORIGINS.

    Science.gov (United States)

    Moritz, C; Wright, J W; Brown, W M

    1992-02-01

    Within the genus Cnemidophorus, parthenogenesis has arisen by hybridization several times. This provides the opportunity to investigate general features of hybridization events that result in the formation of parthenogenetic lineages. The relationships of mtDNA from all bisexual species of Cnemidophorus known to be parents of parthenogens were investigated to evaluate phylogenetic constraints on the hybrid origin of parthenogenesis. No phylogenetic clustering of the parental species, either maternal or paternal, was apparent. However, the combinations of bisexual species that have resulted in parthenogenetic lineages generally involve distantly related or genetically divergent species. This contrasts with the expectation if parthenogenesis in hybrids is due to the action of a single rare allele, but is consistent with the hypothesis that some minimal level of divergence is necessary to stimulate parthenogenetic reproduction in hybrids. © 1992 The Society for the Study of Evolution.

  8. Hybrid utilization of solar energy. Part 2. Performance analyses of heating system with air hybrid collector; Taiyo energy no hybrid riyo ni kansuru kenkyu. 2. Kuki shunetsu hybrid collector wo mochiita danbo system no seino hyoka

    Energy Technology Data Exchange (ETDEWEB)

    Yoshinaga, M; Okumiya, M [Nagoya University, Nagoya (Japan)

    1996-10-27

    For the effective utilization of solar energy in houses, a heating system using an air hybrid collector (capable of simultaneously performing heat collection and photovoltaic power generation) was studied. As the specimen house, a wooden house with a total floor area of 120 m² was simulated. Collected air is fanned into a crushed-stone heat accumulator (capable of storing one day's collection) or into a living room. The output of the solar cell arrays is fed to a heat pump (capable of handling a maximum hourly load of 36,327 kJ/h) via an inverter so as to drive the fan (corresponding to an average insolation on the heat collecting plate of 10.7 MJ/h·m² and a heat collecting efficiency of 40%), and any shortage in power is supplied from the grid interconnection. A hybrid collector, compared with the conventional air collector, is lower in thermal efficiency, but the merit it exhibits with respect to power generation far outweighs this demerit. When the hybrid system is in heating operation, there is an ideal cycle of heat collection, accumulation, and radiation when the load is light, but the balance between accumulation and radiation is disturbed when the load is heavy. 4 refs., 8 figs., 3 tabs.

  9. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    International Nuclear Information System (INIS)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.

    2009-01-01

    Fully automated analysis programs are increasingly used to aid in the reading of regional cerebral blood flow SPECT studies, and are increasingly based on comparison of the patient study with a normal database. In this study, we evaluated the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with region as the intra-subject factor, gender as the inter-subject factor and age as a covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and with other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  10. The effects of clinical and statistical heterogeneity on the predictive values of results from meta-analyses

    NARCIS (Netherlands)

    Melsen, W G; Rovers, M M; Bonten, M J M; Bootsma, M C J|info:eu-repo/dai/nl/304830305

    Variance between studies in a meta-analysis will exist. This heterogeneity may be of clinical, methodological or statistical origin. The last of these is quantified by the I² statistic. We investigated, using simulated studies, the accuracy of I² in the assessment of heterogeneity and the
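The I² statistic referred to above is conventionally derived from Cochran's Q under inverse-variance fixed-effect pooling; a minimal sketch (the effect estimates and variances below are illustrative, not values from the simulations in this record):

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic for a set of
    study effect estimates and their sampling variances."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    pooled = np.sum(weights * effects) / np.sum(weights) # fixed-effect pooled estimate
    q = np.sum(weights * (effects - pooled) ** 2)        # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # % of variance beyond sampling error
    return q, i2

q, i2 = i_squared([0.2, 0.5, 0.8, 0.4], [0.01, 0.02, 0.015, 0.01])
```

An I² near 0% means the observed spread is compatible with sampling error alone; values toward 100% indicate genuine between-study heterogeneity.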

  11. Giant Galápagos tortoises; molecular genetic analyses identify a trans-island hybrid in a repatriation program of an endangered taxon

    Directory of Open Access Journals (Sweden)

    Caccone Adalgisa

    2007-02-01

    Full Text Available Abstract Background Giant Galápagos tortoises on the island of Española have been the focus of an intensive captive breeding-repatriation programme for over 35 years that saved the taxon from extinction. However, analysis of 118 samples from released individuals indicated that the biased sex ratio and large variance in reproductive success among the 15 breeders had severely reduced the effective population size (Ne). Results We report here that an analysis of an additional 473 captive-bred tortoises released back to the island reveals an individual (E1465) that exhibits nuclear microsatellite alleles not found in any of the 15 breeders. Statistical analyses incorporating genotypes of 304 field-sampled individuals from all populations on the major islands indicate that E1465 is most probably a hybrid between an Española female tortoise and a male from the island of Pinzón, likely present on Española due to human transport. Conclusion Removal of E1465, as well as its father and possible (half-)siblings, is warranted to prevent further contamination within this taxon of particular conservation significance. Despite this single detected contamination, it is worth emphasizing the success of this repatriation programme, conducted over nearly 40 years and involving the release of over 2000 captive-bred tortoises that now reproduce in situ. The incorporation of molecular genetic analysis into the programme is providing guidance that will aid in monitoring the genetic integrity of this ambitious effort to restore a unique lineage of a spectacular animal.

  12. Hybridization Capture Using RAD Probes (hyRAD), a New Tool for Performing Genomic Analyses on Collection Specimens.

    Directory of Open Access Journals (Sweden)

    Tomasz Suchan

    Full Text Available In recent years, many protocols aimed at reproducibly sequencing reduced-genome subsets in non-model organisms have been published. Among them, RAD-sequencing is one of the most widely used. It relies on digesting DNA with specific restriction enzymes and performing size selection on the resulting fragments. Despite its acknowledged utility, this method is of limited use with degraded DNA samples, such as those isolated from museum specimens, as these samples are less likely to harbor fragments long enough to comprise two restriction sites, making it possible to ligate the adapter sequences (in the case of double-digest RAD) or to perform size selection of the resulting fragments (in the case of single-digest RAD). Here, we address these limitations by presenting a novel method called hybridization RAD (hyRAD). In this approach, biotinylated RAD fragments, covering a random fraction of the genome, are used as baits for capturing homologous fragments from genomic shotgun sequencing libraries. This simple and cost-effective approach allows sequencing of orthologous loci even from highly degraded DNA samples, opening new avenues of research in the field of museum genomics. Not relying on the presence of restriction sites, it improves among-sample loci coverage. In a trial study, hyRAD allowed us to obtain a large set of orthologous loci from fresh and museum samples of a non-model butterfly species, with a high proportion of single nucleotide polymorphisms present in all eight analyzed specimens, including 58-year-old museum samples. The utility of the method was further validated using 49 museum and fresh samples of a Palearctic grasshopper species for which the spatial genetic structure had previously been assessed using mtDNA amplicons. The application of the method is eventually discussed in a wider context. As it does not rely on the presence of restriction sites, it is not sensitive to among-sample loci polymorphisms in the restriction sites

  13. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. Safety analyses for licensing purposes are inherently deterministic; therefore, conservative BIC and assumptions, such as single failure, must be employed. However, the use of conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up the standard for using best estimate codes in safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)

  14. Adjusting the Adjusted X²/df Ratio Statistic for Dichotomous Item Response Theory Analyses: Does the Model Fit?

    Science.gov (United States)

    Tay, Louis; Drasgow, Fritz

    2012-01-01

    Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X²/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…

  15. Performance analyses of a hybrid geothermal–fossil power generation system using low-enthalpy geothermal resources

    International Nuclear Information System (INIS)

    Liu, Qiang; Shang, Linlin; Duan, Yuanyuan

    2016-01-01

    Highlights: • Geothermal energy is used to preheat the feedwater in a coal-fired power unit. • The performance of a hybrid geothermal–fossil power generation system is analyzed. • Models for both parallel and serial geothermal preheating schemes are presented. • Effects of geothermal source temperatures, distances and heat losses are analyzed. • Power increase of the hybrid system over an ORC and tipping distance are discussed. - Abstract: Low-enthalpy geothermal heat can be efficiently utilized for feedwater preheating in coal-fired power plants by replacing some of the high-grade steam, which can then be used to generate more power. This study analyzes a hybrid geothermal–fossil power generation system comprising a supercritical 1000 MW power unit and a geothermal feedwater preheating system. It presents models for parallel and serial geothermal preheating schemes and analyzes the thermodynamic performance of the hybrid geothermal–fossil power generation system for various geothermal resource temperatures. The models are used to analyze the effects of the temperature matching between the geothermal water and the feedwater, of the heat losses and pumping power during geothermal water transport, and of the resource distance and temperature on the power increase. The serial geothermal preheating (SGP) scheme generally generates more additional power than the parallel geothermal preheating (PGP) scheme for geothermal resource temperatures of 100–130 °C, but slightly less additional power than the PGP scheme when the feedwater is preheated to as high a temperature as possible before entering the deaerator, for geothermal resource temperatures higher than 140 °C. The additional power decreases as the geothermal source distance increases, since the pipeline pumping power increases and the geothermal water temperature decreases due to heat losses. More than 50% of the power decrease is due to geothermal

  16. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Science.gov (United States)

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and no information is available about the size of the difference between two particular values. The ordinal variable under study here is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both the biological and statistical senses and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever variables that can only be assessed on an ordinal scale need to be modelled.
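In an ordered (cumulative) logit model of this kind, category probabilities are differences of logistic CDFs evaluated at ordered cutpoints minus the linear predictor. The sketch below, with hypothetical cutpoints and a hypothetical linear predictor (not fitted values from this survey), shows how a probability such as P(VDS ≥ 2) would be derived:

```python
import math

def vds_probs(eta, cutpoints):
    """Ordered logit: P(VDS <= k) = logistic(c_k - eta), where eta is the
    latent continuous imposex-development score and c_k are ordered cutpoints."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(c - eta) for c in cutpoints] + [1.0]  # cumulative probabilities
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# hypothetical cutpoints between VDS stages 0..4, hypothetical linear
# predictor eta = beta_tbt * TBT + beta_size * shell_size
p = vds_probs(eta=0.8, cutpoints=[-1.0, 0.5, 2.0, 3.5])
p_vds_ge_2 = 1.0 - (p[0] + p[1])  # OSPAR-style risk: P(VDS >= 2)
```

Increasing the linear predictor (e.g. higher TBT exposure) shifts probability mass toward higher VDS stages, which is exactly the latent-continuum behaviour the model assumes.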

  17. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have for many years represented one of the most promising innovations in the field of aerostructures, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer today from drastic reductions of maximum allowable stress values during the design phase, as well as from costly recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. Those penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate eventual flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as the system's detection capability under this approach. It is not possible to numerically substitute the part of the experimental tests aimed at POD, where the noise in the system response is crucial. Results of the experiments are presented and analyzed in the paper.

  18. Analyses of statistical transformations of raw data describing free proline concentration in sugar beet exposed to drought

    Directory of Open Access Journals (Sweden)

    Putnik-Delić Marina I.

    2010-01-01

    Full Text Available Eleven sugar beet genotypes were tested for their capacity to tolerate drought. Plants were grown in semi-controlled conditions in a greenhouse and watered daily. After 90 days, water deficit was imposed by the cessation of watering, while the control plants continued to be watered up to 80% of FWC. Five days later, the concentration of free proline in leaves was determined. The analysis was done in three replications. Statistical analysis was performed using STATISTICA 9.0, Minitab 15, and R 2.11.1. Differences between genotypes were statistically processed by Duncan's test. Because of the non-normality of the data distribution and the heterogeneity of variances in different groups, two types of transformations of the raw data were applied. For this type of data, the Johnson transformation was more appropriate for eliminating non-normality than the Box-Cox transformation. Based on both transformations, it may be concluded that in all genotypes except genotype 10, the concentration of free proline differs significantly between the treatment (drought) and the control.
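The Box-Cox family compared above is a one-parameter power transform (the Johnson system is a richer alternative with three distribution families); a minimal sketch of the Box-Cox transform itself, applied to hypothetical proline values (not data from this study):

```python
import math

def box_cox(x, lam):
    """Box-Cox power transform of a strictly positive observation:
    (x**lam - 1) / lam for lam != 0, and log(x) in the limit lam -> 0."""
    if x <= 0:
        raise ValueError("Box-Cox requires strictly positive data")
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

proline = [1.2, 3.4, 8.9, 15.1, 40.2]  # hypothetical free-proline concentrations
transformed = [box_cox(v, lam=0.25) for v in proline]
```

In practice the exponent lambda is chosen by maximizing the profile log-likelihood (or a normality score) over a grid of candidate values, which is what statistical packages automate.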

  19. Point processes statistics of stable isotopes: analysing water uptake patterns in a mixed stand of Aleppo pine and Holm oak

    Directory of Open Access Journals (Sweden)

    Carles Comas

    2015-04-01

    Full Text Available Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H) as a hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area = 888 m2). We tested the hypothesis that the two species take up water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency depending on whether neighbouring trees belong to one species or the other. Material and methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair-correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a marked point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰) were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰), pointing to a greater contribution of deeper soil layers to water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.
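Second-order point-pattern statistics of the kind used here count neighbours as a function of distance; the pair-correlation function g(r) is the derivative-based relative of Ripley's K, g(r) = K'(r) / (2πr), and the bivariate version counts oak-pine pairs instead of same-species pairs. A minimal, edge-correction-free sketch of a K-function estimate on hypothetical stem coordinates:

```python
import math

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction): the mean number of
    further points within distance r of a typical point, divided by the
    overall intensity lambda = n / area."""
    n = len(points)
    intensity = n / area
    pairs = sum(
        1
        for i, (xi, yi) in enumerate(points)
        for j, (xj, yj) in enumerate(points)
        if i != j and math.hypot(xi - xj, yi - yj) <= r
    )
    return pairs / (intensity * n)  # = area * pairs / n**2

# under complete spatial randomness K(r) is close to pi * r**2;
# values above that suggest clustering, below it regularity
```

Production analyses (e.g. in spatstat) add edge corrections, since stems near the plot boundary have part of their neighbourhood unobserved.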

  20. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    Science.gov (United States)

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were

  1. Does bisphenol A induce superfeminization in Marisa cornuarietis? Part II: toxicity test results and requirements for statistical power analyses.

    Science.gov (United States)

    Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert

    2007-03-01

    This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.

  2. Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical approach to Mineral Image Analysis

    Science.gov (United States)

    Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino

    2018-06-01

    We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
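The calibration step described above, regressing known spot-analysis concentrations on raw X-ray counts while allowing for interdependence among elements, is at heart multiple linear regression; a minimal sketch with hypothetical counts and microprobe standards (the real tool additionally applies stoichiometric constraints and accuracy-test indexes):

```python
import numpy as np

def calibrate(counts, standards):
    """Least-squares fit of concentration ~ intercept + counts.
    Each row of `counts` holds the raw X-ray counts of the (possibly
    interdependent) element channels at one internal-standard spot."""
    X = np.column_stack([np.ones(len(counts)), counts])
    beta, *_ = np.linalg.lstsq(X, standards, rcond=None)
    return beta

# hypothetical: counts for two element channels at four spots,
# known wt% from electron microprobe spot analyses
counts = np.array([[10.0, 3.0], [20.0, 5.0], [30.0, 9.0], [40.0, 12.0]])
wt = np.array([1.1, 2.0, 3.2, 4.1])
beta = calibrate(counts, wt)
# a calibrated map applies beta pixel-by-pixel to the count channels
```

Including both element channels in one regression is what lets the fit absorb interdependence (e.g. peak overlap) between elements, rather than calibrating each channel in isolation.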

  3. Influence of Immersion Conditions on The Tensile Strength of Recycled Kevlar®/Polyester/Low-Melting-Point Polyester Nonwoven Geotextiles through Applying Statistical Analyses

    Directory of Open Access Journals (Sweden)

    Jing-Chzi Hsieh

    2016-05-01

    Full Text Available The recycled Kevlar®/polyester/low-melting-point polyester (recycled Kevlar®/PET/LPET nonwoven geotextiles are immersed in neutral, strong acid, and strong alkali solutions, respectively, at different temperatures for four months. Their tensile strength is then tested according to various immersion periods at various temperatures, in order to determine their durability to chemicals. For the purpose of analyzing the possible factors that influence mechanical properties of geotextiles under diverse environmental conditions, the experimental results and statistical analyses are incorporated in this study. Therefore, influences of the content of recycled Kevlar® fibers, implementation of thermal treatment, and immersion periods on the tensile strength of recycled Kevlar®/PET/LPET nonwoven geotextiles are examined, after which their influential levels are statistically determined by performing multiple regression analyses. According to the results, the tensile strength of nonwoven geotextiles can be enhanced by adding recycled Kevlar® fibers and thermal treatment.

  4. GISH and AFLP analyses of novel Brassica napus lines derived from one hybrid between B. napus and Orychophragmus violaceus.

    Science.gov (United States)

    Ma, Ni; Li, Zai-Yun; Cartagena, J A; Fukui, K

    2006-10-01

    New Brassica napus inbred lines with different petal colors, canola quality, and increased levels of oleic (approximately 70%, 10% higher than that of the B. napus parent) and linoleic (28%) acids have been developed in the progenies of one B. napus cv. Oro x Orychophragmus violaceus F5 hybrid plant (2n = 31). Their genetic constituents were analyzed using genomic in situ hybridization (GISH) and amplified fragment length polymorphism (AFLP). No intact chromosomes of O. violaceus origin were detected by GISH in somatic cells of ovaries and root tips (2n = 38) or in pollen mother cells (PMCs) with normal chromosome pairing (19 bivalents) and segregation (19:19), although signals of variable sizes and intensities were located mainly at terminal and centromeric parts of some mitotic chromosomes and of meiotic bivalents at diakinesis or chromosomes in anaphase I groups, and one large patch of chromatin was intensively labeled and spatially separate in some telophase I nuclei and metaphase II PMCs. AFLP analysis revealed that substantial genomic changes have occurred in these lines: O. violaceus-specific bands, bands deleted relative to 'Oro', and bands novel to both parents were detected. The possible mechanisms for these results are discussed.

  5. Feasibility Analyses of Developing Low Carbon City with Hybrid Energy Systems in China: The Case of Shenzhen

    Directory of Open Access Journals (Sweden)

    Xun Zhang

    2016-05-01

    Full Text Available As the largest carbon emission source in China, the power sector is growing rapidly owing to the country's unprecedented urbanization and industrialization processes. In order to explore a low carbon urbanization pathway by reducing carbon emissions of the power sector, the Chinese government launched an international low carbon city (ILCC) project in Shenzhen. This paper presents a feasibility analysis of the potential hybrid energy system based on local renewable energy resources and electricity demand estimates over the three planning stages of the ILCC project. Wind power, solar power, natural gas and the existing power grid are the components considered in the hybrid energy system. The simulation results indicate that the costs of energy in the three planning stages are 0.122, 0.105 and 0.141 $/kWh, respectively, if external wind farms and pumped storage hydro stations (PSHSs) are included. The optimization results reveal carbon reduction rates of 46.81%, 62.99% and 75.76% compared with the Business as Usual scenarios. The widely distributed water reservoirs in Shenzhen provide ideal conditions for constructing PSHSs, which are crucial in enhancing renewable energy utilization.

  6. Examining reproducibility in psychology : A hybrid method for combining a statistically significant original study and a replication

    NARCIS (Netherlands)

    Van Aert, R.C.M.; Van Assen, M.A.L.M.

    2018-01-01

    The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter

  7. Statistical and molecular analyses of evolutionary significance of red-green color vision and color blindness in vertebrates.

    Science.gov (United States)

    Yokoyama, Shozo; Takenaka, Naomi

    2005-04-01

    Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.

  8. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    Science.gov (United States)

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
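The F-statistic comparison mentioned above is the standard nested-model test: does adding a species (an extra peak in the c(s) distribution) reduce the residual sum of squares by more than chance alone would? A minimal sketch with illustrative numbers (not values from this study):

```python
def f_statistic(rss_simple, rss_complex, df_simple, df_complex):
    """F statistic for nested least-squares fits: the per-degree-of-freedom
    improvement of the complex model over the simple one, scaled by the
    complex model's residual variance."""
    num = (rss_simple - rss_complex) / (df_simple - df_complex)
    den = rss_complex / df_complex
    return num / den

# illustrative: removing a suspect minor peak raises the RSS from 10.0 to 12.0
f = f_statistic(rss_simple=12.0, rss_complex=10.0, df_simple=98, df_complex=96)
# compare f against the F(df_simple - df_complex, df_complex) critical value
# to decide whether the peak is statistically supported or artifactual
```

A large F means the extra peak genuinely improves the fit; an F below the critical value is the rigorous ground for excluding it as a regularization artifact.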

  9. Ultimate compression after impact load prediction in graphite/epoxy coupons using neural network and multivariate statistical analyses

    Science.gov (United States)

    Gregoire, Alexandre David

    2011-07-01

    The goal of this research was to accurately predict the ultimate compressive load of impact damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impacted damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The number of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process which was largely responsible for the accuracy of the results.

  10. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    International Nuclear Information System (INIS)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo

    2015-01-01

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultrafiltration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably, with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, feed water temperature, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.

  11. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultrafiltration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably, with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, feed water temperature, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
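
    The yardstick used above to rank the two SWRO trains is the root mean squared error of each PLS model. A minimal sketch of that comparison, with invented permeate values rather than plant data:

```python
import math

# RMSE as the model-comparison metric: the train whose model has the
# smaller RMSE is judged more predictable. All values are illustrative.
def rmse(predicted, observed):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

obs        = [100.0, 110.0, 95.0, 105.0]   # observed permeate flow rates (made up)
model_pcf  = [130.0, 80.0, 120.0, 75.0]    # larger residuals -> larger RMSE
model_sfuf = [105.0, 104.0, 99.0, 101.0]   # smaller residuals
better_is_sfuf = rmse(model_sfuf, obs) < rmse(model_pcf, obs)
```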

  12. Statistical evaluation of the performance of gridded monthly precipitation products from reanalysis data, satellite estimates, and merged analyses over China

    Science.gov (United States)

    Deng, Xueliang; Nie, Suping; Deng, Weitao; Cao, Weihua

    2018-04-01

    In this study, we compared the following four different gridded monthly precipitation products: the National Centers for Environmental Prediction version 2 (NCEP-2) reanalysis data, the satellite-based Climate Prediction Center Morphing technique (CMORPH) data, the merged satellite-gauge Global Precipitation Climatology Project (GPCP) data, and the merged satellite-gauge-model data from the Beijing Climate Center Merged Estimation of Precipitation (BMEP). We evaluated the performances of these products using monthly precipitation observations spanning the period of January 2003 to December 2013 from a dense, national, rain gauge network in China. Our assessment involved several statistical techniques, including spatial pattern, temporal variation, bias, root-mean-square error (RMSE), and correlation coefficient (CC) analysis. The results show that NCEP-2, GPCP, and BMEP generally overestimate monthly precipitation at the national scale and CMORPH underestimates it. However, all of the datasets successfully characterized the northwest to southeast increase in the monthly precipitation over China. Because they include precipitation gauge information from the Global Telecommunication System (GTS) network, GPCP and BMEP have much smaller biases, lower RMSEs, and higher CCs than NCEP-2 and CMORPH. When the seasonal and regional variations are considered, NCEP-2 has a larger error over southern China during the summer. CMORPH poorly reproduces the magnitude of the precipitation over southeastern China and the temporal correlation over western and northwestern China during all seasons. BMEP has a lower RMSE and higher CC than GPCP over eastern and southern China, where the station network is dense. In contrast, BMEP has a lower CC than GPCP over western and northwestern China, where the gauge network is relatively sparse.
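
    The headline metrics of the evaluation above, mean bias, RMSE, and the correlation coefficient (CC), can be sketched in a few lines; the gauge and product values below are invented, with the product deliberately wetter than the gauges to mimic the overestimation reported for NCEP-2, GPCP, and BMEP.

```python
import math

# Sketch of bias and Pearson CC between a gridded product and gauges.
def bias(prod, gauge):
    return sum(p - g for p, g in zip(prod, gauge)) / len(gauge)

def pearson_cc(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

gauge   = [20.0, 55.0, 130.0, 210.0, 90.0]   # monthly gauge totals, mm (made up)
product = [25.0, 60.0, 150.0, 230.0, 95.0]   # a product that overestimates
overestimates = bias(product, gauge) > 0     # positive mean bias
cc = pearson_cc(product, gauge)              # high CC despite the wet bias
```

Note how a product can be strongly correlated with the gauges (high CC) while still carrying a systematic positive bias, which is why the study reports both metrics separately.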

  13. Combining the Power of Statistical Analyses and Community Interviews to Identify Adoption Barriers for Stormwater Best-Management Practices

    Science.gov (United States)

    Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.

    2015-12-01

    Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time-consuming, relying on significant funding that a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best-management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences these attitudes and perceptions.
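
    The power-analysis logic above (the study used R) can be sketched by simulation: assume a modest improvement buried in noisy weekly measurements and count how often a two-sample z-type criterion detects it at different sample sizes. The effect size, noise level, and threshold below are assumptions for illustration, not the study's values.

```python
import random
import statistics

# Simulation-based power estimate: small effect (10% mean drop) in noisy
# weekly water-quality data. All distributional choices are hypothetical.
random.seed(42)

def detected(n, mean_before=1.0, mean_after=0.9, sd=0.4, z=1.96):
    before = [random.gauss(mean_before, sd) for _ in range(n)]
    after = [random.gauss(mean_after, sd) for _ in range(n)]
    se = ((statistics.variance(before) + statistics.variance(after)) / n) ** 0.5
    return abs(statistics.mean(before) - statistics.mean(after)) / se > z

def power(n, trials=200):
    return sum(detected(n) for _ in trials * [None]) / trials

low, high = power(20), power(500)   # power is weak at n=20, strong at n=500
```

With these assumed numbers, a few dozen weekly samples almost never detect the improvement while several hundred usually do, matching the qualitative conclusion that hundreds of measurements are needed.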

  14. Statistical properties of interval mapping methods on quantitative trait loci location: impact on QTL/eQTL analyses

    Directory of Open Access Journals (Sweden)

    Wang Xiaoqiang

    2012-04-01

    Full Text Available Abstract Background Quantitative trait loci (QTL) detection on a huge amount of phenotypes, like eQTL detection on transcriptomic data, can be dramatically impaired by the statistical properties of interval mapping methods. One of these major outcomes is the high number of QTL detected at marker locations. The present study aims at identifying and specifying the sources of this bias, in particular in the case of analysis of data from outbred populations. Analytical developments were carried out in a backcross situation in order to specify the bias and to propose an algorithm to control it. The outbred population context was studied through simulated data sets in a wide range of situations. The likelihood ratio test was first analyzed under the "one QTL" hypothesis in a backcross population. Designs of sib families were then simulated and analyzed using the QTLMap software. On the basis of the theoretical results in backcross, parameters such as the population size, the density of the genetic map, the QTL effect and the true location of the QTL were taken into account under the "no QTL" and the "one QTL" hypotheses. A combination of two non-parametric tests - the Kolmogorov-Smirnov test and the Mann-Whitney-Wilcoxon test - was used in order to identify the parameters that affected the bias and to specify how much they influenced the estimation of QTL location. Results A theoretical expression of the bias of the estimated QTL location was obtained for a backcross type population. We demonstrated a common source of bias under the "no QTL" and the "one QTL" hypotheses and qualified the possible influence of several parameters. Simulation studies confirmed that the bias exists in outbred populations under both the hypotheses of "no QTL" and "one QTL" on a linkage group. The QTL location was systematically closer to marker locations than expected, particularly in the case of low QTL effect, small population size or low density of markers.

  15. Identification of novel risk factors for community-acquired Clostridium difficile infection using spatial statistics and geographic information system analyses.

    Directory of Open Access Journals (Sweden)

    Deverick J Anderson

    Full Text Available The rate of community-acquired Clostridium difficile infection (CA-CDI) is increasing. While receipt of antibiotics remains an important risk factor for CDI, studies related to acquisition of C. difficile outside of hospitals are lacking. As a result, risk factors for exposure to C. difficile in community settings have been inadequately studied. To identify novel environmental risk factors for CA-CDI, we performed a population-based retrospective cohort study of patients with CA-CDI from 1/1/2007 through 12/31/2014 in a 10-county area in central North Carolina. 360 Census Tracts in these 10 counties were used as the demographic Geographic Information System (GIS) base-map. Longitude and latitude (X, Y) coordinates were generated from patient home addresses and overlaid to Census Tract polygons using ArcGIS; ArcView was used to assess "hot spots" or clusters of CA-CDI. We then constructed a mixed hierarchical model to identify environmental variables independently associated with increased rates of CA-CDI. A total of 1,895 unique patients met our criteria for CA-CDI. The mean patient age was 54.5 years; 62% were female and 70% were Caucasian. 402 (21%) patient addresses were located in "hot spots" or clusters of CA-CDI (p < 0.001). "Hot spot" census tracts were scattered throughout the 10 counties. After adjusting for clustering and population density, age ≥ 60 years (p = 0.03), race (p < 0.001), proximity to a livestock farm (p = 0.01), proximity to farming raw materials services (p = 0.02), and proximity to a nursing home (p = 0.04) were independently associated with increased rates of CA-CDI. Our study is the first to use spatial statistics and mixed models to identify important environmental risk factors for acquisition of C. difficile and adds to the growing evidence that farm practices may put patients at risk for important drug-resistant infections.

  16. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample.
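
    The suggested alternative, control limits derived from a several-month moving window of control-sample recoveries rather than short-term runs, can be sketched as follows; the recovery values and the 3-sigma convention are illustrative assumptions.

```python
import statistics

# Control limits from a moving window of monthly control-sample
# recoveries (percent). Values are invented for illustration.
def control_limits(recoveries, k=3.0):
    m = statistics.mean(recoveries)
    s = statistics.stdev(recoveries)        # longer-window spread, not a single run
    return m - k * s, m + k * s

window = [98.0, 102.0, 101.0, 97.0, 99.0, 103.0]   # last several months
lo_limit, hi_limit = control_limits(window)
in_control = lo_limit <= 100.5 <= hi_limit          # today's control-sample recovery
```

Because the window spans months of routine operation, the resulting limits reflect realistic long-term method performance instead of the tighter spread of a short-term run.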

  17. A multi-criteria evaluation system for marine litter pollution based on statistical analyses of OSPAR beach litter monitoring time series.

    Science.gov (United States)

    Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael

    2013-12-01

    During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
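
    The trend criterion above rests on significant rank correlations between survey time and litter abundance. A minimal sketch of Spearman's rho, with invented survey counts and ties ignored for simplicity:

```python
# Spearman's rank correlation via the classic d^2 formula (no ties).
def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

years = [2001, 2002, 2003, 2004, 2005, 2006]
items = [310, 290, 300, 260, 240, 230]      # litter items per survey stretch (made up)
rho = spearman_rho(years, items)            # strongly negative -> declining trend
```

In the evaluation system, a rho of this magnitude would count as a significant downward trend once checked against the critical value for the series length.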

  18. Energy and exergy analyses on a novel hybrid solar heating, cooling and power generation system for remote areas

    International Nuclear Information System (INIS)

    Zhai, H.; Dai, Y.J.; Wu, J.Y.; Wang, R.Z.

    2009-01-01

    In this study, a small-scale hybrid solar heating, chilling and power generation system, including a parabolic trough solar collector with cavity receiver, a helical screw expander and a silica gel-water adsorption chiller, was proposed and extensively investigated. The system has the merit of driving the power generation cycle at a lower temperature level, using solar energy more efficiently, and can provide both thermal energy and power for remote off-grid regions. A case study was carried out to evaluate the annual energy and exergy efficiency of the system under the climate of the northwestern region of China. It is found that the main energy and exergy losses both take place at the parabolic trough collector, amounting to 36.2% and 70.4%, respectively. Also found is that the studied system can have a higher solar energy conversion efficiency than the conventional solar thermal power generation system alone. The energy efficiency can be increased to 58.0% from 10.2%, and the exergy efficiency can be increased to 15.2% from 12.5%. Moreover, an economic analysis in terms of cost and payback period (PP) has been carried out. The study reveals that the PP of the proposed system is about 18 years under present energy price conditions. The sensitivity analysis shows that if the interest rate decreases to 3% or energy prices increase by 50%, the PP will be less than 10 years. (author)
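
    The payback-period sensitivity discussed above can be illustrated with a deliberately crude, undiscounted sketch: the study's actual economic analysis presumably discounts cash flows and models interest rates, and the capital-cost and annual-saving figures below are invented to reproduce the ~18-year baseline.

```python
# Undiscounted payback-period sketch (all figures hypothetical).
def payback_years(capital_cost, annual_saving):
    return capital_cost / annual_saving

base = payback_years(180_000, 10_000)                 # ~18 years at present prices
with_higher_energy_price = payback_years(180_000, 10_000 * 1.5)
```

Even this simplification shows the direction of the sensitivity: a 50% rise in the value of the energy delivered shortens the payback substantially, and discounting at a lower interest rate would shorten it further.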

  19. Statistical parametric mapping and statistical probabilistic anatomical mapping analyses of basal/acetazolamide Tc-99m ECD brain SPECT for efficacy assessment of endovascular stent placement for middle cerebral artery stenosis

    International Nuclear Information System (INIS)

    Lee, Tae-Hong; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Kyung-Pil

    2007-01-01

    Statistical parametric mapping (SPM) and statistical probabilistic anatomical mapping (SPAM) were applied to basal/acetazolamide Tc-99m ECD brain perfusion SPECT images in patients with middle cerebral artery (MCA) stenosis to assess the efficacy of endovascular stenting of the MCA. Enrolled in the study were 11 patients (8 men and 3 women, mean age 54.2 ± 6.2 years) who had undergone endovascular stent placement for MCA stenosis. Using SPM and SPAM analyses, we compared the number of significant voxels and cerebral counts in basal and acetazolamide SPECT images before and after stenting, and assessed the perfusion changes and cerebral vascular reserve index (CVRI). The numbers of hypoperfusion voxels in SPECT images were decreased from 10,083 ± 8,326 to 4,531 ± 5,091 in basal images (P = 0.0317) and from 13,398 ± 14,222 to 7,699 ± 10,199 in acetazolamide images (P = 0.0142) after MCA stenting. On SPAM analysis, the increases in cerebral counts were significant in acetazolamide images (90.9 ± 2.2 to 93.5 ± 2.3, P = 0.0098) but not in basal images (91 ± 2.7 to 92 ± 2.6, P = 0.1602). The CVRI also showed a statistically significant increase from before stenting (median 0.32; 95% CI -2.19 to 2.37) to after stenting (median 1.59; 95% CI -0.85 to 4.16; P = 0.0068). This study demonstrated the usefulness of voxel-based analysis of basal/acetazolamide brain perfusion SPECT after MCA stent placement, showing that SPM and SPAM analyses of basal/acetazolamide Tc-99m ECD brain SPECT can be used to evaluate the short-term hemodynamic efficacy of successful MCA stent placement. (orig.)
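
    The reserve concept behind the CVRI can be sketched as the percent change from basal to acetazolamide-challenge counts; the exact CVRI definition in the study may differ, and the inputs below simply reuse the mean SPAM counts quoted above.

```python
# Percent-change sketch of cerebrovascular reserve (assumed definition).
def reserve_percent(basal_count, acetazolamide_count):
    return (acetazolamide_count - basal_count) / basal_count * 100

pre_stent = reserve_percent(91.0, 90.9)    # challenge adds nothing pre-stent
post_stent = reserve_percent(92.0, 93.5)   # reserve reappears after stenting
improved = post_stent > pre_stent
```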

  20. Analysing the spatial patterns of livestock anthrax in Kazakhstan in relation to environmental factors: a comparison of local (Gi*) and morphology cluster statistics

    Directory of Open Access Journals (Sweden)

    Ian T. Kracalik

    2012-11-01

    Full Text Available We compared a local clustering and a cluster morphology statistic using anthrax outbreaks in large (cattle) and small (sheep and goats) domestic ruminants across Kazakhstan. The Getis-Ord (Gi*) statistic and a multidirectional optimal ecotope algorithm (AMOEBA) were compared using 1st, 2nd and 3rd order Rook contiguity matrices. Multivariate statistical tests were used to evaluate the environmental signatures between clusters and non-clusters from the AMOEBA and Gi* tests. A logistic regression was used to define a risk surface for anthrax outbreaks and to compare agreement between clustering methodologies. Tests revealed differences in the spatial distribution of clusters as well as in the total number of clusters: AMOEBA detected more clusters in large ruminants (n = 149) and fewer in small ruminants (n = 9). In contrast, Gi* revealed fewer large ruminant clusters (n = 122) and more small ruminant clusters (n = 61). Significant environmental differences were found between groups using the Kruskal-Wallis and Mann-Whitney U tests. Logistic regression was used to model the presence/absence of anthrax outbreaks and define a risk surface for large ruminants to compare with the cluster analyses. The model predicted 32.2% of the landscape as high risk. Approximately 75% of AMOEBA clusters corresponded to predicted high risk, compared with ~64% of Gi* clusters. In general, AMOEBA predicted more irregularly shaped clusters of outbreaks in both livestock groups, while Gi* tended to predict larger, circular clusters. Here we provide an evaluation of both tests and a discussion of the use of each to detect environmental conditions associated with anthrax outbreak clusters in domestic livestock. These findings illustrate important differences in spatial statistical methods for defining local clusters and highlight the importance of selecting appropriate levels of data aggregation.
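
    The Getis-Ord Gi* statistic used above can be sketched on a toy one-dimensional chain of districts with binary rook-style neighbour weights; the outbreak counts are invented, and the standard Gi* formula (which includes the focal site itself) is applied directly.

```python
import math

# Toy Gi* hot-spot statistic with binary weights on a 1-D chain.
def gi_star(x, i, neighbours):
    n = len(x)
    mean = sum(x) / n
    s = math.sqrt(sum(v * v for v in x) / n - mean * mean)   # population SD
    js = neighbours[i] + [i]                 # Gi* includes the site itself
    w = len(js)                              # binary weights: sum = sum of squares
    num = sum(x[j] for j in js) - mean * w
    den = s * math.sqrt((n * w - w * w) / (n - 1))
    return num / den                          # approximately a z-score

counts = [0, 1, 0, 9, 8, 10, 0, 1, 0]        # outbreaks per district (made up)
nbrs = {i: [j for j in (i - 1, i + 1) if 0 <= j < len(counts)]
        for i in range(len(counts))}
z = gi_star(counts, 4, nbrs)                 # district 4 sits inside the cluster
hotspot = z > 1.96                           # significant local clustering of high values
```

Gi* evaluates each location's neighbourhood sum against the global mean, which is why it tends to flag compact, roughly circular high-value clusters, in contrast to AMOEBA's irregular ecotopes.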

  1. Inferring the origin of rare fruit distillates from compositional data using multivariate statistical analyses and the identification of new flavour constituents.

    Science.gov (United States)

    Mihajilov-Krstev, Tatjana M; Denić, Marija S; Zlatković, Bojan K; Stankov-Jovanović, Vesna P; Mitić, Violeta D; Stojanović, Gordana S; Radulović, Niko S

    2015-04-01

    In Serbia, delicatessen fruit alcoholic drinks are produced from autochthonous fruit-bearing species such as cornelian cherry, blackberry, elderberry, wild strawberry, European wild apple, European blueberry and blackthorn fruits. There are no chemical data on many of these and herein we analysed volatile minor constituents of these rare fruit distillates. Our second goal was to determine possible chemical markers of these distillates through a statistical/multivariate treatment of the herein obtained and previously reported data. Detailed chemical analyses revealed a complex volatile profile of all studied fruit distillates with 371 identified compounds. A number of constituents were recognised as marker compounds for a particular distillate. Moreover, 33 of them represent newly detected flavour constituents in alcoholic beverages or, in general, in foodstuffs. With the aid of multivariate analyses, these volatile profiles were successfully exploited to infer the origin of raw materials used in the production of these spirits. It was also shown that all fruit distillates possessed weak antimicrobial properties. It seems that the aroma of these highly esteemed wild-fruit spirits depends on the subtle balance of various minor volatile compounds, whereby some of them are specific to a certain type of fruit distillate and enable their mutual distinction. © 2014 Society of Chemical Industry.

  2. Analyses on interaction of internal and external surface cracks in a pressurized cylinder by hybrid boundary element method

    International Nuclear Information System (INIS)

    Chai Guozhong; Fang Zhimin; Jiang Xianfeng; Li Gan

    2004-01-01

    This paper presents a comprehensive range of analyses on the interaction of two identical semi-elliptical surface cracks at the internal and external surfaces of a pressurized cylinder. The considered ratios of the crack depth to crack length are b/a = 0.25, 0.5, 0.75 and 1.0; the ratios of the crack depth to wall thickness of the cylinder are 2b/t = 0.2, 0.4, 0.6, 0.7 and 0.8. Forty crack configurations are analyzed and the stress intensity factors along the crack front are presented. The numerical results show that for 2b/t < 0.7, the interaction leads to a decrease in the stress intensity factors for both internal and external surface cracks, compared with a single internal or external surface crack. Thus, for fracture analysis of a practical pressurized cylinder with two identical semi-elliptical surface cracks at its internal and external surfaces, a conservative result is obtained by ignoring the interaction.

  3. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I² statistic.

    Science.gov (United States)

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects.
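
    The I²GX idea can be sketched with the standard I² heterogeneity formula applied to the SNP-exposure association estimates; the per-SNP estimates and standard errors below are invented, and the interpretation (I² near 1 implies little expected dilution of the MR-Egger estimate) follows the abstract.

```python
# I^2-style statistic from variant-exposure estimates (illustrative data).
def i_squared(betas, ses):
    w = [1 / s ** 2 for s in ses]
    bw = sum(wi * b for wi, b in zip(w, betas)) / sum(w)    # IVW mean
    q = sum(wi * (b - bw) ** 2 for wi, b in zip(w, betas))  # Cochran's Q
    df = len(betas) - 1
    return max(0.0, (q - df) / q)

beta_gx = [0.10, 0.22, 0.15, 0.30, 0.08, 0.25]   # SNP-exposure estimates (made up)
se_gx   = [0.02, 0.02, 0.02, 0.02, 0.02, 0.02]   # precise estimates -> high I^2
i2 = i_squared(beta_gx, se_gx)
# i2 close to 1: NOME approximately holds, little dilution expected;
# low i2 would signal attenuation of the MR-Egger estimate towards the null.
```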

  4. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    Directory of Open Access Journals (Sweden)

    Michael Robert Cunningham

    2016-10-01

    Full Text Available The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria, and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect, contrary to their title.
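
    For readers unfamiliar with the PET procedure criticized above, its mechanics can be sketched as a weighted regression of effect size on standard error, whose intercept estimates the effect as SE approaches zero; the study effects below are invented, shaped so noisier studies show larger effects (the small-study pattern PET is meant to correct).

```python
# Toy PET sketch: weighted (1/SE^2) least squares of effect size on SE.
def pet_intercept(effects, ses):
    w = [1 / s ** 2 for s in ses]
    sw = sum(w)
    mx = sum(wi * s for wi, s in zip(w, ses)) / sw
    my = sum(wi * e for wi, e in zip(w, effects)) / sw
    sxx = sum(wi * (s - mx) ** 2 for wi, s in zip(w, ses))
    sxy = sum(wi * (s - mx) * (e - my) for wi, s, e in zip(w, ses, effects))
    slope = sxy / sxx
    return my - slope * mx                  # estimated effect at SE -> 0

effects = [0.60, 0.45, 0.30, 0.20]          # bigger effects in noisier studies (made up)
ses     = [0.30, 0.22, 0.12, 0.05]
adjusted = pet_intercept(effects, ses)      # pulled well below the naive mean
naive = sum(effects) / len(effects)
```

The sketch shows only the mechanics: the abstract's point is that applying this untested adjustment, and interpreting it as the "true" effect, is itself a questionable analytic choice.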

  5. Research Pearls: The Significance of Statistics and Perils of Pooling. Part 3: Pearls and Pitfalls of Meta-analyses and Systematic Reviews.

    Science.gov (United States)

    Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman

    2017-08-01

    Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, are at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guiding diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from single studies. Meta-analyses should be performed with methodologically high-quality, homogeneous (Level I or II evidence) randomized studies, to minimize confounding variable bias. When it is known that the literature is inadequate, or a recent systematic review has already been performed with a demonstration of insufficient data, then a new systematic review does not add anything meaningful to the literature. PROSPERO registration and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency of the conduct of the review permits reproducibility and improves fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided. This particularly applies to Level IV evidence, that is, noncomparative investigations.
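
    The quantitative core of the pooling step described above can be sketched as fixed-effect inverse-variance weighting, one common way meta-analyses "quantitatively combine data from single studies" (the per-study effects below are invented, and real meta-analyses must also consider random-effects models and heterogeneity):

```python
import math

# Fixed-effect inverse-variance pooling of per-study effect sizes.
def pooled_effect(effects, ses):
    w = [1 / s ** 2 for s in ses]                       # precision weights
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1 / sum(w))                          # pooled standard error
    return est, se

effects = [0.40, 0.25, 0.35]      # per-study standardized mean differences (made up)
ses     = [0.10, 0.20, 0.15]
est, se = pooled_effect(effects, ses)
more_precise = se < min(ses)      # pooling tightens the fixed-effect SE
```

This also illustrates why pooling dissimilar studies is dangerous: the formula happily averages whatever it is given, so the gain in precision is only meaningful when the studies estimate the same underlying effect.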

  6. Categorization of the trophic status of a hydroelectric power plant reservoir in the Brazilian Amazon by statistical analyses and fuzzy approaches.

    Science.gov (United States)

    da Costa Lobato, Tarcísio; Hauser-Davis, Rachel Ann; de Oliveira, Terezinha Ferreira; Maciel, Marinalva Cardoso; Tavares, Maria Regina Madruga; da Silveira, Antônio Morais; Saraiva, Augusto Cesar Fonseca

    2015-02-15

    The Amazon area has been increasingly suffering from anthropogenic impacts, especially due to the construction of hydroelectric power plant reservoirs. The analysis and categorization of the trophic status of these reservoirs are of interest to indicate man-made changes in the environment. In this context, the present study aimed to categorize the trophic status of a hydroelectric power plant reservoir located in the Brazilian Amazon by constructing a novel Water Quality Index (WQI) and Trophic State Index (TSI) for the reservoir using major ion concentrations and physico-chemical water parameters determined in the area, taking into account the sampling locations and the local hydrological regimes. After applying statistical analyses (factor analysis and cluster analysis) and establishing a rule base of a fuzzy system for these indicators, the results obtained by the proposed method were compared to the generally applied Carlson TSI and a modified Lamparelli TSI, specific for tropical regions. The categorization of the trophic status by the proposed fuzzy method was shown to be more reliable, since it takes into account the specificities of the study area, while the Carlson and Lamparelli TSI do not and thus tend to over- or underestimate the trophic status of these ecosystems. The statistical techniques proposed and applied in the present study are therefore relevant to environmental management and policy decision-making processes, aiding in the identification of the ecological status of water bodies. With this, it is possible to identify which factors should be further investigated and/or adjusted in order to attempt the recovery of degraded water bodies. Copyright © 2014 Elsevier B.V. All rights reserved.
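
    The fuzzy step described above can be illustrated with triangular membership functions that turn a crisp indicator into degrees of membership in trophic classes; the indicator (chlorophyll-a) and all breakpoints below are invented, not the study's calibrated rule base.

```python
# Toy fuzzy classification sketch: triangular memberships over one indicator.
def triangular(x, a, b, c):
    """Membership rising from a to b, falling from b to c, zero outside."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(chl_a):
    degrees = {
        "oligotrophic": triangular(chl_a, -1.0, 0.0, 6.0),
        "mesotrophic":  triangular(chl_a, 2.0, 8.0, 14.0),
        "eutrophic":    triangular(chl_a, 10.0, 20.0, 40.0),
    }
    return max(degrees, key=degrees.get), degrees

label, degrees = classify(7.0)   # partial membership, dominated by mesotrophic
```

Unlike a crisp threshold index, the fuzzy output retains the partial memberships, which is what lets a locally calibrated rule base grade borderline samples instead of forcing them into a single class.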

  7. Region-of-interest analyses of one-dimensional biomechanical trajectories: bridging 0D and 1D theory, augmenting statistical power

    Directory of Open Access Journals (Sweden)

    Todd C. Pataky

    2016-11-01

    Full Text Available One-dimensional (1D) kinematic, force, and EMG trajectories are often analyzed using zero-dimensional (0D) metrics like local extrema. Recently whole-trajectory 1D methods have emerged in the literature as alternatives. Since 0D and 1D methods can yield qualitatively different results, the two approaches may appear to be theoretically distinct. The purposes of this paper were (a) to clarify that 0D and 1D approaches are actually just special cases of a more general region-of-interest (ROI) analysis framework, and (b) to demonstrate how ROIs can augment statistical power. We first simulated millions of smooth, random 1D datasets to validate theoretical predictions of the 0D, 1D and ROI approaches and to emphasize how ROIs provide a continuous bridge between 0D and 1D results. We then analyzed a variety of public datasets to demonstrate potential effects of ROIs on biomechanical conclusions. Results showed, first, that a priori ROI particulars can qualitatively affect the biomechanical conclusions that emerge from analyses and, second, that ROIs derived from exploratory/pilot analyses can detect smaller biomechanical effects than are detectable using full 1D methods. We recommend regarding ROIs, like data filtering particulars and Type I error rate, as parameters which can affect hypothesis testing results, and thus as sensitivity analysis tools to ensure arbitrary decisions do not influence scientific interpretations. Last, we describe open-source Python and MATLAB implementations of 1D ROI analysis for arbitrary experimental designs ranging from one-sample t tests to MANOVA.
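The core idea of ROI analysis can be sketched in plain NumPy: compute a node-wise test statistic along the 1D trajectory, then restrict the search to an a priori region rather than the full domain. This is a simplified illustration with simulated data, not the paper's spm1d implementation; the group offset, node count, and ROI bounds are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
Q = 101          # number of 1D nodes (e.g., % of stance phase)
nA, nB = 12, 12  # subjects per group

def smooth(n):
    """Smooth random 1D trajectories (white noise convolved with a Hann kernel)."""
    x = rng.standard_normal((n, Q))
    k = np.hanning(15); k /= k.sum()
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, x)

A, B = smooth(nA), smooth(nB)
B[:, 60:81] += 1.0   # hypothetical group effect confined to nodes 60-80

# Node-wise two-sample t statistic (pooled variance).
sp = np.sqrt(((nA - 1) * A.var(0, ddof=1) + (nB - 1) * B.var(0, ddof=1)) / (nA + nB - 2))
t = (B.mean(0) - A.mean(0)) / (sp * np.sqrt(1.0 / nA + 1.0 / nB))

roi = slice(60, 81)            # a priori region of interest
t_max_roi = np.abs(t[roi]).max()
t_max_full = np.abs(t).max()   # a full-1D analysis searches every node
```

Because the ROI spans fewer nodes than the full trajectory, the critical threshold needed to control the Type I error rate is lower, which is the power gain the paper quantifies.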

  8. The use of mass spectrometry for analysing metabolite biomarkers in epidemiology: methodological and statistical considerations for application to large numbers of biological samples.

    Science.gov (United States)

    Lind, Mads V; Savolainen, Otto I; Ross, Alastair B

    2016-08-01

    Data quality is critical for epidemiology, and as scientific understanding expands, the range of data available for epidemiological studies and the types of tools used for measurement have also expanded. It is essential for the epidemiologist to have a grasp of the issues involved with different measurement tools. One tool that is increasingly being used for measuring biomarkers in epidemiological cohorts is mass spectrometry (MS), because of the high specificity and sensitivity of MS-based methods and the expanding range of biomarkers that can be measured. Further, the ability of MS to quantify many biomarkers simultaneously is advantageous compared to single-biomarker methods. However, as with all methods used to measure biomarkers, there are a number of pitfalls to consider which may have an impact on results when used in epidemiology. In this review we discuss the use of MS for biomarker analyses, focusing on metabolites and their application and potential issues related to large-scale epidemiology studies, the use of MS "omics" approaches for biomarker discovery and how MS-based results can be used for increasing biological knowledge gained from epidemiological studies. Better understanding of the possibilities and possible problems related to MS-based measurements will help the epidemiologist in their discussions with analytical chemists and lead to the use of the most appropriate statistical tools for these data.

  9. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
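The record's key move is that clustering spot analyses yields phase composition (cluster means), phase abundance (cluster fractions), and bulk chemistry (abundance-weighted means) from one grid of measurements. A minimal sketch of that logic with plain k-means on synthetic two-phase data is below; the oxide values, phase count, and clustering algorithm are illustrative assumptions, not the paper's statistical tools.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic microprobe grid: 300 spot analyses, two oxide columns (CaO, SiO2, wt%),
# drawn from two hypothetical phases in a ~70/30 mixture.
n = 300
is_phase1 = rng.random(n) < 0.7
spots = np.where(is_phase1[:, None],
                 rng.normal([71.0, 25.0], 0.8, (n, 2)),   # phase 1 composition
                 rng.normal([63.0, 31.0], 0.8, (n, 2)))   # phase 2 composition

def kmeans(X, k, iters=50):
    """Tiny k-means; deterministic init at the extremes of the first column."""
    centers = X[[X[:, 0].argmin(), X[:, 0].argmax()]]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        lab = d.argmin(1)
        centers = np.array([X[lab == j].mean(0) for j in range(k)])
    return lab, centers

lab, centers = kmeans(spots, 2)
abundance = np.bincount(lab, minlength=2) / n   # phase proportions
bulk = (abundance[:, None] * centers).sum(0)    # abundance-weighted bulk chemistry
```

By construction the abundance-weighted cluster means reproduce the overall mean of the grid, which is why one experiment can deliver all three quantities at once.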

  10. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    International Nuclear Information System (INIS)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef

    2014-01-01

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers

  11. Chemical data and statistical analyses from a uranium hydrogeochemical survey of the Rio Ojo Caliente drainage basin, New Mexico. Part I. Water

    International Nuclear Information System (INIS)

    Wenrich-Verbeek, K.J.; Suits, V.J.

    1979-01-01

    This report presents the chemical analyses and statistical evaluation of 62 water samples collected in the north-central part of New Mexico near Rio Ojo Caliente. Both spring and surface-water samples were taken throughout the Rio Ojo Caliente drainage basin above and a few miles below the town of La Madera. A high U concentration (15 μg/l) found in the water of the Rio Ojo Caliente near La Madera, Rio Arriba County, New Mexico, during a regional sampling-technique study in August 1975 by the senior author, was investigated further in May 1976 to determine whether stream waters could be effectively used to trace the source of a U anomaly. A detailed study of the tributaries to the Rio Ojo Caliente, involving 29 samples, was conducted during a moderate discharge period, May 1976, so that small tributaries would contain water. This study isolated Canada de la Cueva as the tributary contributing the anomalous U, so that in May 1977, an extremely low discharge period due to the 1977 drought, an additional 33 samples were taken to further define the anomalous area. 6 references, 3 figures, 6 tables

  12. A statistical analysis of electrical cerebral activity; Contribution a l'etude de l'analyse statistique de l'activite electrique cerebrale

    Energy Technology Data Exchange (ETDEWEB)

    Bassant, Marie-Helene

    1971-01-15

    The aim of this work was to study the statistical properties of the amplitude of the electroencephalographic signal. The experimental method is described (implantation of electrodes, acquisition and treatment of data). The program of the mathematical analysis is given (calculation of probability density functions, study of stationarity) and the validity of the tests discussed. The results concern ten rabbits. Sequences of EEG lasting 40 s were sampled at very short intervals (500 μs). The probability density functions established for different brain structures (especially the dorsal hippocampus) and areas were compared during sleep, arousal and visual stimulation. Using a χ² test, it was found that the Gaussian distribution assumption was rejected in 96.7 per cent of the cases. For a given physiological state, there was no mathematical reason to reject the assumption of stationarity (in 96 per cent of the cases). (author)
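The χ² goodness-of-fit test against a fitted Gaussian, as used on the EEG amplitudes above, can be sketched as follows. This is a generic illustration on synthetic data (bin count, sample sizes, and seed are arbitrary), not the thesis's program.

```python
import math
import numpy as np

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def chi2_normality(samples, n_bins=20):
    """Pearson chi-square statistic of binned amplitudes against a fitted Gaussian."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    edges = np.linspace(samples.min(), samples.max(), n_bins + 1)
    observed, _ = np.histogram(samples, bins=edges)
    cdf = np.array([normal_cdf(e, mu, sigma) for e in edges])
    expected = len(samples) * np.diff(cdf)
    mask = expected > 5                    # usual rule of thumb for bin validity
    stat = ((observed[mask] - expected[mask]) ** 2 / expected[mask]).sum()
    dof = int(mask.sum()) - 1 - 2          # bins - 1 - two fitted parameters
    return stat, dof

rng = np.random.default_rng(42)
stat_gauss, dof = chi2_normality(rng.standard_normal(20000))   # should fit well
stat_skew, _ = chi2_normality(rng.exponential(1.0, 20000))     # should fit badly
```

Comparing the statistic to a χ² distribution with `dof` degrees of freedom gives the rejection decision reported (96.7% of cases non-Gaussian in the study).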

  13. A three-stage hybrid model for regionalization, trends and sensitivity analyses of temperature anomalies in China from 1966 to 2015

    Science.gov (United States)

    Wu, Feifei; Yang, XiaoHua; Shen, Zhenyao

    2018-06-01

    Temperature anomalies have received increasing attention due to their potentially severe impacts on ecosystems, economy and human health. To facilitate objective regionalization and examine regional temperature anomalies, a three-stage hybrid model with stages of regionalization, trends and sensitivity analyses was developed. Annual mean and extreme temperatures were analyzed using the daily data collected from 537 stations in China from 1966 to 2015, including the annual mean, minimum and maximum temperatures (Tm, TNm and TXm) as well as the extreme minimum and maximum temperatures (TNe and TXe). The results showed the following: (1) subregions with coherent temperature changes were identified using the rotated empirical orthogonal function analysis and K-means clustering algorithm. The numbers of subregions were 6, 7, 8, 9 and 8 for Tm, TNm, TXm, TNe and TXe, respectively. (2) Significant increases in temperature were observed in most regions of China from 1966 to 2015, although warming slowed down over the last decade. This warming primarily featured a remarkable increase in its minimum temperature. For Tm and TNm, 95% of the stations showed a significant upward trend at the 99% confidence level. TNe increased the fastest, at a rate of 0.56 °C/decade, whereas 21% of the stations in TXe showed a downward trend. (3) The mean temperatures (Tm, TNm and TXm) in the high-latitude regions increased more quickly than those in the low-latitude regions. The maximum temperature increased significantly at high elevations, whereas the minimum temperature increased greatly at middle-low elevations. The most pronounced warming occurred in eastern China in TNe and northwestern China in TXe, with mean elevations of 51 m and 2098 m, respectively. A cooling trend in TXe was observed at the northwestern end of China. The warming rate in TNe varied the most among the subregions (0.63 °C/decade).
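Warming rates like the 0.56 °C/decade reported for TNe come from fitting a linear trend to an annual series and rescaling the slope to decades. A minimal sketch on a synthetic station series (trend magnitude and noise level invented to resemble the record's numbers) follows:

```python
import numpy as np

def trend_per_decade(years, temps):
    """Least-squares linear trend of an annual series, reported in degC/decade."""
    slope, _ = np.polyfit(years, temps, 1)
    return slope * 10.0

years = np.arange(1966, 2016)   # the study period, 50 annual values
rng = np.random.default_rng(7)
# Synthetic TNe-like series: 0.056 degC/yr trend plus interannual noise.
temps = 0.056 * (years - 1966) + rng.normal(0.0, 0.3, years.size)
rate = trend_per_decade(years, temps)
```

In the paper this slope estimation is one stage of the three-stage model, applied per station and then summarized within the REOF/K-means subregions.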

  14. Unique honey bee (Apis mellifera) hive component-based communities as detected by a hybrid of phospholipid fatty-acid and fatty-acid methyl ester analyses.

    Directory of Open Access Journals (Sweden)

    Kirk J Grubbs

    Full Text Available Microbial communities (microbiomes) are associated with almost all metazoans, including the honey bee Apis mellifera. Honey bees are social insects, maintaining complex hive systems composed of a variety of integral components including bees, comb, propolis, honey, and stored pollen. Given that the different components within hives can be physically separated and are nutritionally variable, we hypothesize that unique microbial communities may occur within the different microenvironments of honey bee colonies. To explore this hypothesis and to provide further insights into the microbiome of honey bees, we use a hybrid of fatty acid methyl ester (FAME) and phospholipid-derived fatty acid (PLFA) analysis to produce broad, lipid-based microbial community profiles of stored pollen, adults, pupae, honey, empty comb, and propolis for 11 honey bee hives. Averaging component lipid profiles by hive, we show that, in decreasing order, lipid markers representing fungi, Gram-negative bacteria, and Gram-positive bacteria have the highest relative abundances within honey bee colonies. Our lipid profiles reveal the presence of viable microbial communities in each of the six hive components sampled, with overall microbial community richness varying from lowest to highest in honey, comb, pupae, pollen, adults and propolis, respectively. Finally, microbial community lipid profiles were more similar when compared by component than by hive, location, or sampling year. Specifically, we found that individual hive components typically exhibited several dominant lipids and that these dominant lipids differ between components. Principal component and two-way clustering analyses both support significant grouping of lipids by hive component. Our findings indicate that in addition to the microbial communities present in individual workers, honey bee hives have resident microbial communities associated with different colony components.
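The grouping of lipid profiles by hive component rests on principal component analysis, which can be sketched via SVD of the centered profile matrix. The marker counts and abundance values below are hypothetical stand-ins for the FAME/PLFA profiles, chosen only to show how PC1 separates two components.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lipid-marker relative abundances: rows are samples from two
# hive components (11 hives each), columns are 6 fatty-acid markers.
pollen = rng.normal([30, 5, 10, 20, 25, 10], 2.0, (11, 6))
honey  = rng.normal([10, 25, 5, 30, 10, 20], 2.0, (11, 6))
X = np.vstack([pollen, honey])

# PCA by singular value decomposition of the centered matrix.
Xc = X - X.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # sample coordinates on the PCs
explained = S**2 / (S**2).sum()         # variance fraction per component

# With distinct marker profiles, PC1 should separate pollen from honey samples.
pc1_pollen, pc1_honey = scores[:11, 0], scores[11:, 0]
```

Plotting `scores[:, 0]` against `scores[:, 1]` and coloring by component is the standard way such grouping is visualized.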

  15. Unique honey bee (Apis mellifera) hive component-based communities as detected by a hybrid of phospholipid fatty-acid and fatty-acid methyl ester analyses.

    Science.gov (United States)

    Grubbs, Kirk J; Scott, Jarrod J; Budsberg, Kevin J; Read, Harry; Balser, Teri C; Currie, Cameron R

    2015-01-01

    Microbial communities (microbiomes) are associated with almost all metazoans, including the honey bee Apis mellifera. Honey bees are social insects, maintaining complex hive systems composed of a variety of integral components including bees, comb, propolis, honey, and stored pollen. Given that the different components within hives can be physically separated and are nutritionally variable, we hypothesize that unique microbial communities may occur within the different microenvironments of honey bee colonies. To explore this hypothesis and to provide further insights into the microbiome of honey bees, we use a hybrid of fatty acid methyl ester (FAME) and phospholipid-derived fatty acid (PLFA) analysis to produce broad, lipid-based microbial community profiles of stored pollen, adults, pupae, honey, empty comb, and propolis for 11 honey bee hives. Averaging component lipid profiles by hive, we show that, in decreasing order, lipid markers representing fungi, Gram-negative bacteria, and Gram-positive bacteria have the highest relative abundances within honey bee colonies. Our lipid profiles reveal the presence of viable microbial communities in each of the six hive components sampled, with overall microbial community richness varying from lowest to highest in honey, comb, pupae, pollen, adults and propolis, respectively. Finally, microbial community lipid profiles were more similar when compared by component than by hive, location, or sampling year. Specifically, we found that individual hive components typically exhibited several dominant lipids and that these dominant lipids differ between components. Principal component and two-way clustering analyses both support significant grouping of lipids by hive component. Our findings indicate that in addition to the microbial communities present in individual workers, honey bee hives have resident microbial communities associated with different colony components.

  16. First study of correlation between oleic acid content and SAD gene polymorphism in olive oil samples through statistical and bayesian modeling analyses.

    Science.gov (United States)

    Ben Ayed, Rayda; Ennouri, Karim; Ercişli, Sezai; Ben Hlima, Hajer; Hanana, Mohsen; Smaoui, Slim; Rebai, Ahmed; Moreau, Fabienne

    2018-04-10

    Virgin olive oil is appreciated for its particular aroma and taste and is recognized worldwide for its nutritional value and health benefits. The olive oil contains a vast range of healthy compounds such as monounsaturated free fatty acids, especially, oleic acid. The SAD.1 polymorphism localized in the Stearoyl-acyl carrier protein desaturase gene (SAD) was genotyped and showed that it is associated with the oleic acid composition of olive oil samples. However, the effect of polymorphisms in fatty acid-related genes on olive oil monounsaturated and saturated fatty acids distribution in the Tunisian olive oil varieties is not understood. Seventeen Tunisian olive-tree varieties were selected for fatty acid content analysis by gas chromatography. The association of SAD.1 genotypes with the fatty acids composition was studied by statistical and Bayesian modeling analyses. Fatty acid content analysis showed interestingly that some Tunisian virgin olive oil varieties could be classified as functional foods and nutraceuticals due to their particular richness in oleic acid. In fact, the TT-SAD.1 genotype was found to be associated with a higher proportion of mono-unsaturated fatty acids (MUFA), mainly oleic acid (C18:1) (r = -0.79). The SAD.1 association with the oleic acid composition of olive oil was identified among the studied varieties. This correlation fluctuated between the studied varieties, which might elucidate variability in lipidic composition among them, thereby reflecting genetic diversity through differences in gene expression and biochemical pathways. The SAD locus would represent an excellent marker for identifying interesting lipidic compositions amongst virgin olive oils.

  17. Voxel-based statistical analysis of cerebral blood flow using Tc-99m ECD brain SPECT in patients with traumatic brain injury: group and individual analyses.

    Science.gov (United States)

    Shin, Yong Beom; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Jae Heung; Yeom, Seok-Ran

    2006-06-01

    Statistical parametric mapping (SPM) was applied to brain perfusion single photon emission computed tomography (SPECT) images in patients with traumatic brain injury (TBI) to investigate regional cerebral abnormalities compared to age-matched normal controls. Thirteen patients with TBI who underwent brain perfusion SPECT were included in this study (10 males, three females; mean age 39.8 +/- 18.2 years, range 21-74). SPM2 software implemented in MATLAB 5.3 was used for spatial pre-processing and analysis and to determine the quantitative differences between TBI patients and age-matched normal controls. Three large voxel clusters of significantly decreased cerebral blood perfusion were found in patients with TBI. The largest cluster comprised the areas including the medial frontal gyrus (voxel number 3642, peak Z-value = 4.31, 4.27, p = 0.000) in both hemispheres. The second largest clusters were areas including the cingulate gyrus and anterior cingulate gyrus of the left hemisphere (voxel number 381, peak Z-value = 3.67, 3.62, p = 0.000). Other clusters were the parahippocampal gyrus (voxel number 173, peak Z-value = 3.40, p = 0.000) and hippocampus (voxel number 173, peak Z-value = 3.23, p = 0.001) in the left hemisphere. The false discovery rate (FDR) was less than 0.04. From this study, group and individual analyses of SPM2 could clearly identify the perfusion abnormalities of brain SPECT in patients with TBI. Group analysis of SPM2 showed a hypoperfusion pattern in the areas including the medial frontal gyrus of both hemispheres and the cingulate gyrus, anterior cingulate gyrus, parahippocampal gyrus and hippocampus in the left hemisphere compared to age-matched normal controls. Also, the left parahippocampal gyrus and left hippocampus were additional hypoperfusion areas. However, these findings deserve further investigation on a larger number of patients to be performed to allow a better validation of objective SPM analysis in patients with TBI.
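The "false discovery rate (FDR) was less than 0.04" statement refers to multiple-comparison control across voxels. The standard Benjamini-Hochberg step-up procedure behind FDR thresholding can be sketched as follows; the p-values are invented for illustration, and this is the generic procedure, not SPM2's internal implementation.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of rejected hypotheses under BH FDR control at level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m          # step-up thresholds q*i/m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()            # largest i with p_(i) <= q*i/m
        reject[order[: k + 1]] = True             # reject all smaller p-values too
    return reject

# Hypothetical voxel-cluster p-values.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.50, 0.74]
rejected = benjamini_hochberg(pvals, q=0.05)
```

Note that BH rejects every hypothesis up to the largest rank satisfying the threshold, even if some intermediate sorted p-value exceeds its own threshold.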

  18. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  19. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  20. A Personalized Rolling Optimal Charging Schedule for Plug-In Hybrid Electric Vehicle Based on Statistical Energy Demand Analysis and Heuristic Algorithm

    Directory of Open Access Journals (Sweden)

    Fanrong Kong

    2017-09-01

    Full Text Available To alleviate the emission of greenhouse gas and the dependence on fossil fuel, Plug-in Hybrid Electrical Vehicles (PHEVs) have gained an increasing popularity in recent decades. Due to the fluctuating electricity prices in the power market, a charging schedule is very influential to driving cost. Although the next-day electricity prices can be obtained in a day-ahead power market, a driving plan is not easily made in advance. PHEV owners can input a next-day plan into a charging system (e.g., aggregators) a day ahead, but doing so every day is a tedious task, and the driving plan may not be very accurate. To address this problem, in this paper, we analyze energy demands according to a PHEV owner's historical driving records and build a personalized statistical driving model. Based on the model and the electricity spot prices, a rolling optimization strategy is proposed to help make a charging decision in the current time slot. On one hand, by employing a heuristic algorithm, the schedule is made according to the situations in the following time slots. On the other hand, after the current time slot, the schedule is remade according to the next tens of time slots. Hence, the schedule is made by a dynamic rolling optimization, but it only decides the charging decision in the current time slot. In this way, the fluctuation of electricity prices and the driving routine are both involved in the scheduling. Moreover, it is not necessary for PHEV owners to input a day-ahead driving plan. Simulation results demonstrate that the proposed method is feasible to help owners save charging costs and also meet driving requirements.
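The rolling idea, committing only the current slot's decision and re-planning at every step, can be illustrated with a simple greedy rule: charge now only if the current price ranks among the cheapest slots still needed within the look-ahead window. The prices, window length, and greedy rule are all hypothetical; the paper's actual heuristic is not specified in this record.

```python
def rolling_charge(prices, slots_needed, horizon=8):
    """Greedy rolling heuristic: each slot, commit a charge/no-charge decision
    using only the current look-ahead window, then re-evaluate next slot."""
    decisions = []
    remaining = slots_needed
    for t in range(len(prices)):
        window = prices[t : t + horizon]
        charge = remaining > 0 and (
            sorted(window).index(prices[t]) < remaining  # among cheapest needed
            or remaining >= len(window)                  # no room left to wait
        )
        decisions.append(charge)
        remaining -= charge
    return decisions

prices = [30, 22, 18, 25, 40, 15, 35, 20]   # hypothetical spot prices per slot
plan = rolling_charge(prices, slots_needed=3, horizon=5)
cost = sum(p for p, c in zip(prices, plan) if c)
```

With these numbers the heuristic charges in the three locally cheapest slots it can see (prices 22, 18 and 15), illustrating how limited look-ahead trades a little optimality for not needing a full day-ahead plan.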

  1. Energy, exergy and sustainability analyses of hybrid renewable energy based hydrogen and electricity production and storage systems: Modeling and case study

    International Nuclear Information System (INIS)

    Caliskan, Hakan; Dincer, Ibrahim; Hepbasli, Arif

    2013-01-01

    In this study, hybrid renewable energy based hydrogen and electricity production and storage systems are conceptually modeled and analyzed in detail through energy, exergy and sustainability approaches. Several subsystems, namely a hybrid geothermal energy-wind turbine-solar photovoltaic (PV) panel, inverter, electrolyzer, hydrogen storage system, Proton Exchange Membrane Fuel Cell (PEMFC), battery and loading system are considered. Also, a case study based on a hybrid wind–solar renewable energy system is conducted and its results are presented. In addition, the dead state temperatures are considered as 0 °C, 10 °C, 20 °C and 30 °C, while the environment temperature is 30 °C. The maximum efficiencies of the wind turbine, solar PV panel, electrolyzer and PEMFC are calculated as 26.15%, 9.06%, 53.55%, and 33.06% through energy analysis, and 71.70%, 9.74%, 53.60%, and 33.02% through exergy analysis, respectively. Also, the overall exergy efficiency, ranging from 5.838% to 5.865%, is directly proportional to the dead state temperature and becomes higher than the corresponding energy efficiency of 3.44% for the entire system. -- Highlights: ► Developing a three-hybrid renewable energy (geothermal–wind–solar)-based system. ► Undertaking a parametric study at various dead state temperatures. ► Investigating the effect of dead state temperatures on exergy efficiency
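The contrast between energy and exergy efficiency hinges on the dead-state (Carnot) factor that converts heat into work potential. A minimal sketch for a heat-driven subsystem is below; the power and temperature values are hypothetical and do not reproduce the paper's system model.

```python
def exergy_factor(T_source_K, T0_K):
    """Carnot (dead-state) factor: the work potential per unit of heat."""
    return 1.0 - T0_K / T_source_K

def efficiencies(W_out_kW, Q_in_kW, T_source_K, T0_K):
    """First-law (energy) vs second-law (exergy) efficiency of a heat-to-work step."""
    eta_energy = W_out_kW / Q_in_kW
    eta_exergy = W_out_kW / (Q_in_kW * exergy_factor(T_source_K, T0_K))
    return eta_energy, eta_exergy

# Hypothetical geothermal stream: 100 kW of heat at 423 K, dead state 303 K (30 degC).
eta_en, eta_ex = efficiencies(10.0, 100.0, 423.0, 303.0)
```

Because only a fraction of the input heat is work potential, the exergy efficiency exceeds the energy efficiency here, the same qualitative pattern the paper reports (e.g., 71.70% exergetic vs 26.15% energetic for the wind turbine), and it shifts as the dead-state temperature is varied.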

  2. Para-allopatry in hybridizing fire-bellied toads (Bombina bombina and B. variegata): Inference from transcriptome-wide coalescence analyses

    Czech Academy of Sciences Publication Activity Database

    Nürnberger, Beate; Fijarczyk, A.; Lohse, K.; Szymura, J. M.; Blaxter, M. L.

    2016-01-01

    Roč. 70, č. 8 (2016), s. 1803-1818 ISSN 0014-3820 Institutional support: RVO:68081766 Keywords : ecological speciation * genome-wide coalescence * hybrid zone * introgression * RNA-seq. Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 4.201, year: 2016

  3. Statistical properties of coastal long waves analysed through sea-level time-gradient functions: exemplary analysis of the Siracusa, Italy, tide-gauge data

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2016-01-01

    reconstructed sea level (RSL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral fast Fourier transform (FFT) analysis and also through a statistical analysis, showing that they can be characterised by probability distribution functions (PDFs) such as the Student's t distribution (IS and RSL) and the beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.
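The FFT step of such an analysis amounts to computing a power spectrum of the sea-level series and locating dominant oscillation periods. A sketch on a synthetic record is below; the one-minute sampling rate, 20-minute oscillation and noise level are assumptions for illustration, not the Siracusa data.

```python
import numpy as np

fs = 1.0 / 60.0                 # assumed one sea-level sample per minute (Hz)
n = 4000
t = np.arange(n) / fs           # time in seconds
f_wave = 1.0 / (20 * 60)        # hypothetical 20-minute harbour oscillation
rng = np.random.default_rng(5)
level = 0.3 * np.sin(2 * np.pi * f_wave * t) + 0.05 * rng.standard_normal(n)

# One-sided power spectrum via the real FFT.
spectrum = np.abs(np.fft.rfft(level)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
dominant = freqs[1:][spectrum[1:].argmax()]   # skip the DC bin
```

Peaks in `spectrum` at tidal or seiche frequencies are what the traditional spectral analysis in the record would report before turning to the PDF characterisation.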

  4. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  5. Meta-regression analyses to explain statistical heterogeneity in a systematic review of strategies for guideline implementation in primary health care.

    Directory of Open Access Journals (Sweden)

    Susanne Unverzagt

    Full Text Available This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention, definition of the primary outcome, unit of randomization, duration of follow-up and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval (CI) 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (30%; 95% CI -2 to 71% and 31%; 95% CI 9 to 57%, respectively) compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence.
Especially the cooperation of different health
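Meta-regression relates study-level effect sizes to moderators using inverse-variance weights. A minimal fixed-effect sketch is below; the paper used random-effects models (which add a between-study variance tau^2 to each weight), and the effect sizes, variances, and binary "nurse involvement" moderator here are hypothetical.

```python
import numpy as np

def wls_meta_regression(effects, variances, covariate):
    """Inverse-variance weighted least squares of study effects on one moderator.
    Fixed-effect flavour; a random-effects version adds tau^2 to each variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    X = np.column_stack([np.ones(len(effects)), covariate])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.asarray(effects, dtype=float))
    return beta   # [intercept, slope]

# Hypothetical trials: log effect estimates, their variances, and a 0/1 moderator
# coding whether nurses were involved in the implementation.
effects   = np.array([0.10, 0.15, 0.42, 0.55, 0.08, 0.48])
variances = np.array([0.02, 0.03, 0.02, 0.04, 0.01, 0.03])
nurses    = np.array([0, 0, 1, 1, 0, 1])
intercept, slope = wls_meta_regression(effects, variances, nurses)
```

With a binary moderator the slope is simply the difference between the weighted mean effects of the two trial subgroups, which is how a modifier "explains" heterogeneity.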

  6. A Personalized Rolling Optimal Charging Schedule for Plug-In Hybrid Electric Vehicle Based on Statistical Energy Demand Analysis and Heuristic Algorithm

    DEFF Research Database (Denmark)

    Kong, Fanrong; Jiang, Jianhui; Ding, Zhigang

    2017-01-01

    To alleviate the emission of greenhouse gas and the dependence on fossil fuel, Plug-in Hybrid Electrical Vehicles (PHEVs) have gained an increasing popularity in current decades. Due to the fluctuating electricity prices in the power market, a charging schedule is very influential to driving cost...

  7. Analysing mass balance of viruses in a coagulation-ceramic microfiltration hybrid system by a combination of the polymerase chain reaction (PCR) method and the plaque forming units (PFU) method.

    Science.gov (United States)

    Matsushita, T; Matsui, Y; Shirasaki, N

    2006-01-01

    Virus removal experiments using river water spiked with bacteriophages were conducted with an in-line coagulation-ceramic microfiltration hybrid system to investigate the effects of filtration flux (62.5 and 125 L/(m2 x h)) and type of virus (Qbeta and MS2) on virus removal. In addition, the mass balance of viruses through the hybrid system was analysed by quantifying the infectious and inactive viruses by a combination of the polymerase chain reaction (PCR) method and the plaque forming units (PFU) method. Even when the system was operated at high filtration flux (125 L/(m2 x h)), high virus removal (> 6 log) with a short coagulation time (2.4 s) was successfully achieved by dosing polyaluminium chloride (PACl) at more than 1.08 mg-Al/L. Removal performances differed between Qbeta and MS2, although their diameters are almost the same: greater virus removal was achieved for MS2 at a PACl dose of 0.54 mg-Al/L, and for Qbeta at PACl doses of more than 1.08 mg-Al/L. The combination of the PCR and PFU methods revealed that two phenomena, adsorption to/entrapment in aluminium floc and the virucidal activity of PACl, partially account for the high virus removal in the coagulation-MF hybrid system.
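The mass-balance logic rests on log10 reduction values: PCR counts total virus particles (removed or not), while PFU counts only infectious ones, so the gap between the two reductions attributes part of the overall loss to inactivation rather than physical removal. A sketch with hypothetical counts:

```python
import math

def log10_reduction(c_in, c_out):
    """Log removal value between influent and effluent concentrations."""
    return math.log10(c_in / c_out)

# Hypothetical concentrations for one run:
# PCR quantifies total particles; PFU quantifies only the infectious fraction.
total_in, total_out = 1.0e8, 1.0e2            # particles/mL (PCR)
infectious_in, infectious_out = 1.0e8, 1.0e1  # PFU/mL

removal_physical = log10_reduction(total_in, total_out)          # floc removal
removal_overall  = log10_reduction(infectious_in, infectious_out)
inactivation = removal_overall - removal_physical                # virucidal part
```

Here 6 of the 7 log overall reduction would be ascribed to adsorption/entrapment in the floc and 1 log to inactivation, mirroring the two phenomena the study separates.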

  8. Discovery and characterisation of dietary patterns in two Nordic countries. Using non-supervised and supervised multivariate statistical techniques to analyse dietary survey data

    DEFF Research Database (Denmark)

    Edberg, Anna; Freyhult, Eva; Sand, Salomon

    - and inter-national data excerpts. For example, major PCA loadings helped deciphering both shared and disparate features, relating to food groups, across Danish and Swedish preschool consumers. Data interrogation, reliant on the above-mentioned composite techniques, disclosed one outlier dietary prototype...... prototype with the latter property was identified also in the Danish data material, but without low consumption of Vegetables or Fruit & berries. The second MDA-type of data interrogation involved Supervised Learning, also known as Predictive Modelling. These exercises involved the Random Forest (RF...... not elaborated on in-depth, output from several analyses suggests a preference for energy-based consumption data for Cluster Analysis and Predictive Modelling, over those appearing as weight....

  9. Application of multivariate statistical analyses in the interpretation of geochemical behaviour of uranium in phosphatic rocks in the Red Sea, Nile Valley and Western Desert, Egypt

    International Nuclear Information System (INIS)

    El-Arabi, A.M.Abd El-Gabar M.; Khalifa, Ibrahim H.

    2002-01-01

    Factor and cluster analyses as well as the Pearson correlation coefficient have been applied to geochemical data obtained from phosphorite and phosphatic rocks of the Duwi Formation exposed at the Red Sea coast, Nile Valley and Western Desert. Sixty-six samples from a total of 71 collected samples were analysed for SiO2, TiO2, Al2O3, Fe2O3, CaO, MgO, Na2O, K2O, P2O5, Sr, U and Pb by XRF, and their mineral constituents were determined by XRD techniques. In addition, the natural radioactivity of the phosphatic samples due to their uranium, thorium and potassium contents was measured by gamma-spectrometry. The uranium content in the phosphate rocks with P2O5 > 15% (average of 106.6 ppm) is higher than in rocks with lower P2O5 content; uranium varies with P2O5 and CaO, whereas it is not related to changes in SiO2, TiO2, Al2O3, Fe2O3, MgO, Na2O and K2O concentrations. Factor analysis and the Pearson correlation coefficient revealed that uranium behaves geochemically in different ways in the phosphatic sediments and phosphorites of the Red Sea, Nile Valley and Western Desert. In the Red Sea and Western Desert phosphorites, uranium occurs mainly in the oxidized U6+ state, where it seems to be fixed by the phosphate ion, forming secondary uranium phosphate minerals such as phosphuranylite. In the Nile Valley phosphorites, ionic substitution of Ca2+ by U4+ is the main factor controlling the concentration of uranium in phosphate rocks. Moreover, fixation of U6+ by the phosphate ion and adsorption of uranium onto phosphate minerals play subordinate roles.
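
    The Pearson screening step described above can be sketched as follows; the P2O5 and U values are illustrative stand-ins, not the study's measurements:

```python
import numpy as np

# Hypothetical P2O5 (wt.%) and U (ppm) values for a handful of samples,
# illustrating the positive U-P2O5 association reported in the study.
p2o5 = np.array([28.1, 24.6, 30.2, 18.7, 26.4, 31.0])
u_ppm = np.array([110.0, 95.0, 122.0, 60.0, 101.0, 125.0])

r = np.corrcoef(p2o5, u_ppm)[0, 1]   # Pearson correlation coefficient
print(f"r(U, P2O5) = {r:.2f}")
```

    A coefficient near +1 is what motivates interpreting uranium as hosted by the phosphate phase rather than by the silicate or oxide fractions.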

  10. Overview and statistical failure analyses of the electrical insulation system for the SSC long dipole magnets from an industrialization point of view

    International Nuclear Information System (INIS)

    Roach, J.F.

    1992-01-01

    The electrical insulation system of the SSC long dipole magnets is reviewed and potential dielectric failure modes are discussed. Electrical insulation fabrication and assembly issues with respect to rate-production manufacturability are addressed. The automation required for rate assembly of electrical insulation components will require critical online visual and dielectric screening tests to ensure production quality. Storage and assembly areas must be designed to prevent foreign particles from becoming entrapped in the insulation during critical coil winding, molding, and collaring operations. All hand-assembly procedures involving dielectrics must be performed with rigorous attention to their impact on insulation integrity. Individual dipole magnets must have a sufficiently low probability of electrical insulation failure under all normal and fault-mode voltage conditions such that the series of magnets in the SSC rings has an acceptable Mean Time Between Failure (MTBF) with respect to dielectric failure events. Statistical models appropriate for large electrical-system breakdown-failure analysis are applied to the SSC magnet rings. The MTBF of the SSC system is related to the failure database for individual dipole magnet samples.

  11. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    Science.gov (United States)

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
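
    The BED computations such a toolkit performs rest on the linear-quadratic model. A minimal sketch of the per-phase formula and the simple summed multi-phase route, with hypothetical fractionation parameters (this does not reproduce the toolkit's "true" multi-phase formula):

```python
def bed(n: int, d: float, alpha_beta: float) -> float:
    """Linear-quadratic biologically effective dose for n fractions of
    dose d (Gy): BED = n * d * (1 + d / (alpha/beta))."""
    return n * d * (1.0 + d / alpha_beta)

def multi_phase_bed(phases, alpha_beta: float) -> float:
    """Total BED of a multi-phase course obtained by summing phase BEDs
    (one simple route; the toolkit distinguishes true and approximate
    multi-phase formulas, which this sketch does not reproduce)."""
    return sum(bed(n, d, alpha_beta) for n, d in phases)

# Hypothetical primary (25 x 2 Gy) plus boost (5 x 2 Gy), alpha/beta = 10 Gy:
total = multi_phase_bed([(25, 2.0), (5, 2.0)], 10.0)
print(f"Total BED = {total:.1f} Gy")
```

    In the toolkit this computation is carried out voxel-wise over the dose matrices, not on a single scalar dose.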

  12. Statistical analyses of in-situ and soil-sample measurements for radionuclides in surface soil near the 116-K-2 trench

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Klover, W.J.

    1988-09-01

    Radiation detection surveys are used at the US Department of Energy's Hanford Reservation near Richland, Washington, to determine areas that need posting as radiation zones or to measure dose rates in the field. The relationship between measurements made by sodium iodide (NaI) detectors mounted on the mobile Road Monitor vehicle and those made by hand-held GM P-11 probes and Micro-R meters is of particular interest because the Road Monitor can survey land areas in much less time than hand-held detectors. Statistical regression methods are used here to develop simple equations to predict GM P-11 probe gross gamma count-per-minute (cpm) and Micro-R meter μR/h measurements on the basis of NaI gross gamma count-per-second (cps) measurements obtained using the Road Monitor. These equations were estimated using data collected near the 116-K-2 Trench in the 100-K area on the Hanford Reservation. Equations are also obtained for estimating upper and lower limits within which the GM P-11 or Micro-R meter measurement corresponding to a given NaI Road Monitor measurement at a new location is expected to fall with high probability. An equation and limits for predicting GM P-11 measurements on the basis of Micro-R meter measurements are also estimated. Also, we estimate an equation that may be useful for approximating the 90Sr measurement of a surface soil sample on the basis of a spectroscopy measurement for 137Cs on that sample. 3 refs., 16 figs., 44 tabs
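
    The "upper and lower limits within which a new measurement is expected to fall" are classical OLS prediction intervals. A minimal sketch with made-up paired readings (not the 116-K-2 data):

```python
import numpy as np
from scipy import stats

def predict_with_interval(x, y, x0, level=0.95):
    """Fit y = a + b*x by least squares and return the prediction at x0
    together with a (level) prediction interval for a new observation."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    b, a = np.polyfit(x, y, 1)                      # slope, intercept
    resid = y - (a + b * x)
    s = np.sqrt(np.sum(resid ** 2) / (n - 2))       # residual std. error
    sxx = np.sum((x - x.mean()) ** 2)
    se = s * np.sqrt(1.0 + 1.0 / n + (x0 - x.mean()) ** 2 / sxx)
    t = stats.t.ppf(0.5 + level / 2.0, df=n - 2)
    y0 = a + b * x0
    return y0, y0 - t * se, y0 + t * se

# Hypothetical paired readings: NaI Road Monitor (cps) vs. GM P-11 probe (cpm).
nai_cps = [100, 200, 300, 400, 500]
gm_cpm = [950, 2040, 2980, 4010, 5050]
pred, low, high = predict_with_interval(nai_cps, gm_cpm, x0=350)
print(f"predicted GM P-11: {pred:.0f} cpm ({low:.0f}, {high:.0f})")
```

    The `1 + 1/n + ...` term is what widens the interval for a single new observation relative to a confidence interval on the regression line itself.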

  13. Mirror fusion--fission hybrids

    International Nuclear Information System (INIS)

    Lee, J.D.

    1978-01-01

    The fusion-fission concept and the mirror fusion-fission hybrid program are outlined. Magnetic mirror fusion drivers and blankets for hybrid reactors are discussed. Results of system analyses are presented and a reference design is described

  14. Error correction and statistical analyses for intra-host comparisons of feline immunodeficiency virus diversity from high-throughput sequencing data.

    Science.gov (United States)

    Liu, Yang; Chiaromonte, Francesca; Ross, Howard; Malhotra, Raunaq; Elleder, Daniel; Poss, Mary

    2015-06-30

    in G to A substitutions, but found no evidence for this host defense strategy. Our error correction approach for minor allele frequencies (more sensitive and computationally efficient than other algorithms) and our statistical treatment of variation (ANOVA) were critical for effective use of high-throughput sequencing data in understanding viral diversity. We found that co-infection with PLV shifts FIV diversity from bone marrow to lymph node and spleen.

  15. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  16. Analyses of Genotypes and Phenotypes of Ten Chinese Patients with Wolf-Hirschhorn Syndrome by Multiplex Ligation-dependent Probe Amplification and Array Comparative Genomic Hybridization.

    Science.gov (United States)

    Yang, Wen-Xu; Pan, Hong; Li, Lin; Wu, Hai-Rong; Wang, Song-Tao; Bao, Xin-Hua; Jiang, Yu-Wu; Qi, Yu

    2016-03-20

    Wolf-Hirschhorn syndrome (WHS) is a contiguous gene syndrome that is typically caused by a deletion of the distal portion of the short arm of chromosome 4. However, there are few reports about the features of Chinese WHS patients. This study aimed to characterize the clinical and molecular cytogenetic features of Chinese WHS patients using the combination of multiplex ligation-dependent probe amplification (MLPA) and array comparative genomic hybridization (array CGH). Clinical information was collected from ten patients with WHS. Genomic DNA was extracted from the peripheral blood of the patients. The deletions were analyzed by MLPA and array CGH. All patients exhibited the core clinical symptoms of WHS, including severe growth delay, a Greek warrior helmet facial appearance, differing degrees of intellectual disability, and epilepsy or electroencephalogram anomalies. The 4p deletions ranged from 2.62 Mb to 17.25 Mb in size and included LETM1, WHSC1, and FGFR3. The combined use of MLPA and array CGH is an effective and specific means to diagnose WHS and allows for the precise identification of the breakpoints and sizes of deletions. The deletion of genes in the WHS candidate region is closely correlated with the core WHS phenotype.

  17. W-CDMA Uplink Capacity and Interference Statistics of Long Groove-Shaped Road Microcells Using a Hybrid Propagation Model

    Directory of Open Access Journals (Sweden)

    L. de Haro-Ariet

    2003-09-01

    Full Text Available The uplink capacity and the interference statistics of the sectors of a long groove-shaped road W-CDMA microcell are studied. A model of 9 microcells in a groove-shaped road is used to analyze the uplink. A hybrid model for the propagation is used in the analysis. The capacity and the interference statistics of the cell are studied for different sector ranges, different specific attenuation factors, different antenna side-lobe levels and different bend losses.

  18. Analysis of the effects of two group-formation modalities in a hybrid distance-learning programme

    Directory of Open Access Journals (Sweden)

    Christian Depover

    2004-01-01

    Full Text Available This study examines the effects of two group-formation modalities (spontaneous versus contrasted) in a hybrid distance-learning programme for university students. The learning scenarios implemented in this experiment rely on concept maps as a support for collaborative work. The observed results revealed no difference in the conceptual density of the maps or in the number of units of meaning produced. However, units of meaning concerning planning activities and metacognitive comments were markedly more numerous in the groups formed by contrasted pairing. A positive correlation between the intensity of interactions within the forum and the conceptual density of the maps produced was also found for groups formed by contrasted pairing. As for spontaneous pairing, our results indicate that the pairs formed behaved more homogeneously, proved more collaborative, and devoted less effort to planning the group work.

  19. Analysis of the thermal runaway of an untempered hybrid chemical system

    OpenAIRE

    Véchot , Luc; Bigot , Jean-Pierre; Minko , Wilfried ,; Kazmierczak , Marc; Vicot , Patricia

    2009-01-01

    National audience; This work addresses the blow-down (thermal runaway in the presence of a safety vent) of an untempered chemical system (30% CHP) exposed to fire, using a 0.1 L scale model. Analysis of the post-decomposition data showed that the vapour present is mainly a reaction product. All blow-down experiments exhibited two pressure peaks, whatever the A/V ratio, which is typical of untempered systems. We ...

  20. Hydrogeologic characterization and evolution of the 'excavation damaged zone' by statistical analyses of pressure signals: application to galleries excavated at the clay-stone sites of Mont Terri (Ga98) and Tournemire (Ga03)

    International Nuclear Information System (INIS)

    Fatmi, H.; Ababou, R.; Matray, J.M.; Joly, C.

    2010-01-01

    Document available in extended abstract form only. This paper presents methods of statistical analysis and interpretation of hydrogeological signals in clayey formations, e.g., pore water pressure and atmospheric pressure. The purpose of these analyses is to characterize the hydraulic behaviour of this type of formation in the case of a deep repository of Mid-Level/High-Level and Long-lived radioactive wastes, and to study the evolution of the geologic formation and its EDZ (Excavation Damaged Zone) during the excavation of galleries. We focus on galleries Ga98 and Ga03 at the sites of Mont Terri (Jura, Switzerland) and Tournemire (Aveyron, France), through data collected in the BPP-1 and PH2 boreholes, respectively. The Mont Terri site, crossing the Aalenian Opalinus clay-stone, is an underground laboratory managed by an international consortium, namely the Mont Terri project (Switzerland). The Tournemire site, crossing the Toarcian clay-stone, is an underground research facility managed by IRSN (France). We have analysed pore water and atmospheric pressure signals at these sites, sometimes in correlation with other data. The methods of analysis are based on the theory of stationary random signals (correlation functions, Fourier spectra, transfer functions, envelopes), and on multi-resolution wavelet analysis (adapted to nonstationary and evolutionary signals). These methods are also combined with filtering techniques, and they can be used for single signals as well as pairs of signals (cross-analyses). The objective of this work is to exploit pressure measurements in selected boreholes from the two compacted clay sites, in order to: evaluate phenomena affecting the measurements (earth tides, barometric pressures..); estimate hydraulic properties (specific storage..) of the clay-stones prior to excavation works and compare them with those estimated by pulse or slug tests on shorter time scales; and analyze the effects of drift excavation on pore pressures.
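
    One standard product of such pressure-signal analyses is the barometric efficiency of the formation, i.e. the fraction of a barometric load transmitted to pore pressure. A minimal sketch on synthetic series (illustrative only; not Mont Terri or Tournemire data):

```python
import numpy as np

def barometric_efficiency(pore, baro):
    """Estimate barometric efficiency as the zero-intercept OLS slope of
    first-differenced pore pressure on first-differenced barometric pressure."""
    dp = np.diff(np.asarray(pore, float))
    db = np.diff(np.asarray(baro, float))
    return float(np.sum(dp * db) / np.sum(db * db))

# Synthetic hourly series: pore pressure follows 60% of the barometric signal.
rng = np.random.default_rng(1)
baro = 101.3 + np.cumsum(rng.normal(scale=0.05, size=500))   # kPa
pore = 250.0 + 0.6 * (baro - 101.3) + rng.normal(scale=0.002, size=500)
be = barometric_efficiency(pore, baro)
print(f"estimated barometric efficiency: {be:.2f}")
```

    Differencing removes slow trends (e.g. drainage toward the gallery) so that only the short-period barometric response enters the slope estimate.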

  1. Analysis and Modeling for Short- to Medium-Term Load Forecasting Using a Hybrid Manifold Learning Principal Component Model and Comparison with Classical Statistical Models (SARIMAX, Exponential Smoothing) and Artificial Intelligence Models (ANN, SVM): The Case of the Greek Electricity Market

    Directory of Open Access Journals (Sweden)

    George P. Papaioannou

    2016-08-01

    Full Text Available In this work we propose a new hybrid model, a combination of the manifold learning Principal Components (PC) technique and traditional multiple regression (PC-regression), for short- and medium-term forecasting of daily, aggregated, day-ahead, electricity system-wide load in the Greek Electricity Market for the period 2004-2014. PC-regression is shown to effectively capture the intraday, intraweek and annual patterns of load. We compare our model with a number of classical statistical approaches (Holt-Winters exponential smoothing and its generalizations, the Error-Trend-Seasonal (ETS) models, and the Seasonal Autoregressive Integrated Moving Average with eXogenous variables (SARIMAX) model), as well as with the more sophisticated artificial intelligence models, Artificial Neural Networks (ANN) and Support Vector Machines (SVM). Using a number of criteria for measuring the quality of the generated in- and out-of-sample forecasts, we conclude that the forecasts of our hybrid model outperform those generated by the other models, with the SARIMAX model being the next best performing approach, giving comparable results. Our approach contributes to studies aimed at providing more accurate and reliable load forecasting, a prerequisite for efficient management of modern power systems.
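
    The PC-regression idea (project predictors onto leading principal components, then regress in the reduced space) can be sketched as follows; the factor-structured data are synthetic stand-ins for the load and exogenous features, not the Greek market data:

```python
import numpy as np

def pc_regression(X, y, n_components=2):
    """Fit OLS on the leading principal-component scores of X and return
    a prediction function (a bare-bones stand-in for PC-regression)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    V = Vt[:n_components].T                      # principal directions
    Z = (X - mean) @ V                           # component scores
    A = np.column_stack([np.ones(len(Z)), Z])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(X_new):
        Z_new = (X_new - mean) @ V
        return np.column_stack([np.ones(len(Z_new)), Z_new]) @ coef

    return predict

# Synthetic data with a 2-factor structure (stand-in for daily load features):
rng = np.random.default_rng(0)
factors = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
X = factors @ loadings + 0.05 * rng.normal(size=(200, 6))
y = 2.0 * factors[:, 0] - 1.0 * factors[:, 1] + 0.05 * rng.normal(size=200)

predict = pc_regression(X, y, n_components=2)
```

    Truncating to a few components stabilizes the regression when the predictors are highly collinear, which is typical of lagged load series.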

  2. Analysis and Experimental Implementation of a Heuristic Strategy for Onboard Energy Management of a Hybrid Solar Vehicle

    Directory of Open Access Journals (Sweden)

    Coraggio G.

    2013-05-01

    Full Text Available This paper focuses on the simulation analysis and the experimental implementation of a Rule-Based (RB) control strategy for on-board energy management of a Hybrid Solar Vehicle (HSV), consisting of a series hybrid electric vehicle assisted by photovoltaic panels. The RB strategy consists of two tasks: one external, which determines the final battery State of Charge (SOC) to be reached at the end of the driving schedule to allow full exploitation of solar energy during the parking phase; the other internal, whose aim is to define the optimal Electric Generator (ICE-EG) power trajectory and the SOC oscillation around the final value. This control strategy has been implemented in a real-time NI® cRIO control unit, thus allowing experimental tests for energy management validation to be performed on a real HSV prototype developed at the University of Salerno.

  3. Spatial and statistical analysis of the Iron Age in France. The example of 'BaseFer'

    Directory of Open Access Journals (Sweden)

    Olivier Buchsenschutz

    2009-05-01

    Full Text Available The development of Geographical Information Systems (GIS) allows information in archaeological databases to be georeferenced. It is thus possible to obtain distribution maps which can then be interpreted using statistical and spatial analyses. Maps and statistics highlight the state of research, the condition of sites, and, beyond that, historical and cultural phenomena. Through a research programme on the Iron Age in France (Basefer), a global database was established for the entire country. This article puts forward some analyses of the general descriptive criteria represented in a corpus of 11,000 sites (departments along the Mediterranean Sea coast are excluded from this test). The control and development of finer descriptors will be undertaken by an enlarged team, before the data are networked.

  4. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.

  5. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory applied to liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of macromolecular systems; and quantum statistics.

  6. Statistical sampling approaches for soil monitoring

    NARCIS (Netherlands)

    Brus, D.J.

    2014-01-01

    This paper describes three statistical sampling approaches for regional soil monitoring, a design-based, a model-based and a hybrid approach. In the model-based approach a space-time model is exploited to predict global statistical parameters of interest such as the space-time mean. In the hybrid
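
    In the design-based approach mentioned above, inference follows from the sampling design alone; under simple random sampling the regional mean is estimated by the sample mean with standard error s/sqrt(n). A minimal sketch with hypothetical soil measurements:

```python
import math

def srs_mean_estimate(values):
    """Design-based estimate under simple random sampling: the sample
    mean and its estimated standard error s / sqrt(n)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(var / n)

# Hypothetical soil organic carbon contents (g/kg) at n = 6 random locations:
soc = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2]
mean, se = srs_mean_estimate(soc)
print(f"mean = {mean:.2f} g/kg, SE = {se:.2f}")
```

    A model-based approach would instead fit a space-time model and predict the mean from it; the hybrid approach described in the paper combines elements of both.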

  7. Field errors in hybrid insertion devices

    International Nuclear Information System (INIS)

    Schlueter, R.D.

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed

  8. Field errors in hybrid insertion devices

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, R.D. [Lawrence Berkeley Lab., CA (United States)

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  9. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  10. Design, Synthesis, and Analysis of Minor Groove Binder Pyrrolepolyamide-2′-Deoxyguanosine Hybrids

    Directory of Open Access Journals (Sweden)

    Etsuko Kawashima

    2010-01-01

    Full Text Available Pyrrolepolyamide-2′-deoxyguanosine hybrids (Hybrid 2 and Hybrid 3), incorporating the 3-aminopropionyl or 3-aminopropyl linker, were designed and synthesized on the basis of previously reported results for a pyrrolepolyamide-adenosine hybrid (Hybrid 1). Evaluation of the DNA-binding sequence selectivity of the pyrrolepolyamide-2′-deoxyguanosine hybrids was performed by CD spectral and Tm analyses. It was shown that Hybrid 3 possessed greater binding specificity than distamycin A, Hybrid 1 and Hybrid 2.

  11. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  12. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  13. Analyses of the use of natural gas in solar power plants (CSP) hybridization in the Sao Francisco Basin (BA); Analise do uso de gas natural na hibridizacao de plantas termosolares (CSP) na Bacia do Sao Francisco (BA)

    Energy Technology Data Exchange (ETDEWEB)

    Malagueta, Diego Cunha; Penafiel, Rafael Andres Soria; Szklo, Alexandre Salem; Dutra, Ricardo M.; Schaeffer, Roberto [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), RJ (Brazil)

    2012-07-01

    This study assessed the feasibility of Concentrated Solar Power (CSP) plants in Northeast Brazil. It focused on parabolic trough solar power plants, the most mature CSP technology, and evaluated plants rated at 100 MWe with dry cooling systems (due to the low water availability in the Northeast), with and without hybridization based on natural gas (degree of hybridization varying from 25 to 75%). Hence, the capacity factor of the simulated plants ranged between 23 and 98%, according to the degree of hybridization and the choice of thermodynamic cycle for the natural-gas-fuelled thermal system: Rankine or combined cycle. The CSP plants were simulated at Bom Jesus da Lapa, in the semi-arid region of Bahia. Given the prospects for natural gas resources in the Sao Francisco Basin, different scenarios for gas prices were tested. Moreover, two scenarios were tested for the cost of the CSP plants, one based on the current financial environment and the other based on incentive policies, such as fiscal incentives and loans. Findings show that while the levelized cost of electricity (LCOE) of solar-only plants hovered around 520 R$/MWh, hybrid plants may reach 140 to 190 R$/MWh. This study therefore proposed incentive policies to promote increasing investment in hybrid CSP plants. (author)
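
    The levelized costs quoted above follow the usual LCOE definition: discounted lifetime costs divided by discounted lifetime generation. A minimal sketch with entirely hypothetical plant numbers (not the study's inputs):

```python
def lcoe(capex, annual_opex, annual_energy_mwh, discount_rate, years):
    """Levelized cost of electricity = (capex + discounted O&M) /
    discounted lifetime energy, in cost units per MWh."""
    disc = sum((1.0 + discount_rate) ** -t for t in range(1, years + 1))
    total_cost = capex + annual_opex * disc
    total_energy = annual_energy_mwh * disc
    return total_cost / total_energy

# Hypothetical 100 MWe plant at a 23% capacity factor over 25 years:
energy = 100 * 8760 * 0.23               # ~201,480 MWh per year
value = lcoe(capex=2.0e9, annual_opex=4.0e7, annual_energy_mwh=energy,
             discount_rate=0.08, years=25)
print(f"LCOE = {value:.0f} R$/MWh")
```

    Hybridization lowers LCOE mainly by raising the capacity factor (the denominator) against a largely fixed capital cost.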

  14. Usage Statistics

    Science.gov (United States)

    MedlinePlus Statistics (https://medlineplus.gov/usestatistics.html): quarterly user statistics, reporting page views and unique visitors by quarter, beginning Oct-Dec-98 ...

  15. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  16. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection Statistics: wwwstats access-log output, with duplicate or extraneous accesses excluded; note that this under-represents the bytes requested. Starting date for the following statistics ...

  17. Distinguishing between incomplete lineage sorting and genomic introgressions: complete fixation of allospecific mitochondrial DNA in a sexually reproducing fish (Cobitis; Teleostei), despite clonal reproduction of hybrids.

    Directory of Open Access Journals (Sweden)

    Lukas Choleva

    Full Text Available Distinguishing between hybrid introgression and incomplete lineage sorting as causes of incongruence among gene trees requires the application of statistical approaches that are based on biologically relevant models. Such study is especially challenging in hybrid systems, where the usual vectors mediating interspecific gene transfers--hybrids with Mendelian heredity--are absent or unknown. Here we study a complex of hybridizing species, which are known to produce clonal hybrids, to discover how one of the species, Cobitis tanaitica, has achieved a pattern of mito-nuclear mosaic genome over its whole geographic range. We applied three distinct methods, including one using solely the information on gene tree topologies, and found that the contrasting mito-nuclear signal might not have resulted from the retention of ancestral polymorphism. Instead, we found two signs of hybridization events related to C. tanaitica: one concerning nuclear gene flow and the other suggesting mitochondrial capture. Interestingly, the clonal inheritance (gynogenesis) of contemporary hybrids prevents genomic introgressions, and non-clonal hybrids are either absent or too rare to be detected among European Cobitis. Our analyses therefore suggest that introgressive hybridizations are rather old episodes, mediated by previously existing hybrids whose inheritance was not entirely clonal. The Cobitis complex thus supports the view that the type of resulting hybrids depends on the level of genomic divergence between sexual species.

  18. Age and gender effects on normal regional cerebral blood flow studied using two different voxel-based statistical analyses

    Energy Technology Data Exchange (ETDEWEB)

    Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T. [Universite Catholique de Louvain, Service de Medecine Nucleaire, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); Van Laere, K. [Leuven Univ. Hospital, Nuclear Medicine Div. (Belgium); Jamart, J. [Universite Catholique de Louvain, Dept. de Biostatistiques, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); D' Asseler, Y. [Ghent Univ., Medical Signal and Image Processing Dept. (MEDISIP), Faculty of applied sciences (Belgium); Minoshima, S. [Washington Univ., Dept. of Radiology, Seattle (United States)

    2009-10-15

    Fully automated analysis programs are increasingly applied to aid the reading of regional cerebral blood flow SPECT studies. They are typically based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of three-dimensional stereotactic surface projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine {sup 99m}Tc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with region as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age by gender interaction (p < 0.01) was only found in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)

  19. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  20. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and their application to solving a variety of such problems, covering both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  1. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
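
    The harmonic Poisson process described above can be sketched numerically. A minimal simulation, assuming a truncated window [a, b] of the positive half-line (the full harmonic intensity c/x is not integrable at either end of the half-line): the point count is Poisson with mean c·ln(b/a), and conditioned on the count the points are i.i.d. with density proportional to 1/x, which is easy to sample by inversion.

    ```python
    import math
    import random

    def harmonic_poisson(c, a, b, rng=random):
        """One realization of a Poisson process with intensity c/x on [a, b].

        The mean number of points is c * ln(b/a); conditioned on the count,
        points are i.i.d. with density proportional to 1/x, sampled via the
        inverse-CDF map x = a * (b/a)**U with U ~ Uniform(0, 1).
        """
        mean_count = c * math.log(b / a)
        # Poisson count via inversion of the CDF (fine for moderate means)
        n, p, u = 0, math.exp(-mean_count), rng.random()
        s = p
        while u > s:
            n += 1
            p *= mean_count / n
            s += p
        return sorted(a * (b / a) ** rng.random() for _ in range(n))

    random.seed(1)
    points = harmonic_poisson(c=5.0, a=1.0, b=100.0)
    print(len(points), "points; all within [1, 100]:",
          all(1.0 <= x <= 100.0 for x in points))
    ```

    The scale invariance noted in the abstract is visible here: the count on [a, b] depends only on the ratio b/a, so rescaling the window does not change the law of the number of points.
    
    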

  2. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  3. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  4. Histoplasmosis Statistics

    Science.gov (United States)


  5. The plus-hybrid effect on the grain yield of two ZP maize hybrids

    Directory of Open Access Journals (Sweden)

    Božinović Sofija

    2010-01-01

    Full Text Available The combined effect of cytoplasmic male sterility and xenia on maize hybrid traits is referred to as the plus-hybrid effect. The two studied ZP hybrids responded differently to this effect for grain yield. All plus-hybrid combinations of the first hybrid yielded more than their fertile counterparts, though not significantly, while only one combination of the second hybrid responded positively, also without statistical significance. The observed effect appears to depend mostly on the genotype of the female component.

  6. Analysis of Non-binary Hybrid LDPC Codes

    OpenAIRE

    Sassatelli, Lucile; Declercq, David

    2008-01-01

    In this paper, we analyse asymptotically a new class of LDPC codes called Non-binary Hybrid LDPC codes, which has been recently introduced. We use density evolution techniques to derive a stability condition for hybrid LDPC codes, and prove their threshold behavior. We study this stability condition to conclude on asymptotic advantages of hybrid LDPC codes compared to their non-hybrid counterparts.

  7. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  8. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  9. Lehrer in der Bundesrepublik Deutschland. Eine Kritische Analyse Statistischer Daten uber das Lehrpersonal an Allgemeinbildenden Schulen. (Education in the Federal Republic of Germany. A Statistical Study of Teachers in Schools of General Education.)

    Science.gov (United States)

    Kohler, Helmut

    The purpose of this study was to analyze the available statistics concerning teachers in schools of general education in the Federal Republic of Germany. An analysis of the demographic structure of the pool of full-time teachers showed that in 1971 30 percent of the teachers were under age 30, and 50 percent were under age 35. It was expected that…

  10. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
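
    The measures of location and spread discussed above are available directly in Python's standard library; a minimal illustration on made-up data:

    ```python
    import statistics

    data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 7.3, 5.0]

    # Measures of location
    mean = statistics.mean(data)
    median = statistics.median(data)

    # Measures of spread (sample variance and standard deviation, n-1 denominator)
    variance = statistics.variance(data)
    sd = statistics.stdev(data)

    print(f"mean={mean:.3f} median={median:.3f} var={variance:.3f} sd={sd:.3f}")
    ```

    The median is often preferred to the mean as a location measure when the data are skewed, for the same reason the chapter discusses transformations: a single large value shifts the mean but barely moves the median.
    
    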

  11. Hybrid reactors

    International Nuclear Information System (INIS)

    Moir, R.W.

    1980-01-01

    The rationale for hybrid fusion-fission reactors is the production of fissile fuel for fission reactors. A new class of reactor, the fission-suppressed hybrid, promises unusually good safety features as well as the ability to support 25 light-water reactors of the same nuclear power rating, or even more high-conversion-ratio reactors such as the heavy-water type. One 4000-MW nuclear hybrid can produce 7200 kg of ²³³U per year. To obtain good economics, injector efficiency times plasma gain (ηᵢQ) should be greater than 2, the wall load should be greater than 1 MW·m⁻², and the hybrid should cost less than 6 times the cost of a light-water reactor. Introduction rates for the fission-suppressed hybrid are usually rapid.

  12. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distributions types have been investigated: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...
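
    As a sketch of the kind of distribution fitting listed above, a 2-parameter Weibull can be fitted by matching the sample mean and coefficient of variation (a method-of-moments sketch, not necessarily the estimation method used in the study); the strength values below are made up, not the actual timber measurements:

    ```python
    import math
    import statistics

    def weibull_fit_moments(data):
        """Fit a 2-parameter Weibull by matching mean and coefficient of variation.

        Solves Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1 = CV**2 for the shape k
        by bisection (CV decreases as k grows), then sets the scale from the mean.
        """
        m, s = statistics.mean(data), statistics.stdev(data)
        cv2 = (s / m) ** 2

        def cv2_of(k):
            g1 = math.gamma(1 + 1 / k)
            return math.gamma(1 + 2 / k) / g1 ** 2 - 1

        lo, hi = 0.1, 50.0
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if cv2_of(mid) > cv2:   # CV too large -> need a larger shape
                lo = mid
            else:
                hi = mid
        k = 0.5 * (lo + hi)
        scale = m / math.gamma(1 + 1 / k)
        return k, scale

    # Hypothetical strength measurements (MPa); not the study's data.
    strengths = [28.1, 31.4, 25.9, 35.2, 30.0, 27.5, 33.8, 29.2, 26.7, 32.1]
    k, scale = weibull_fit_moments(strengths)
    print(f"shape k = {k:.2f}, scale = {scale:.2f}")
    ```

    For strength data the lower tail matters most (design values are low percentiles), which is why the study also considers the 3-parameter Weibull with a location threshold.
    
    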

  13. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  14. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  15. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  16. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  17. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  18. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  19. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  20. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of an infinite sampling often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  1. Gonorrhea Statistics

    Science.gov (United States)


  2. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  3. Notices about using elementary statistics in psychology

    OpenAIRE

    松田, 文子; 三宅, 幹子; 橋本, 優花里; 山崎, 理央; 森田, 愛子; 小嶋, 佳子

    2003-01-01

    Improper uses of elementary statistics that were often observed in beginners' manuscripts and papers were collected and better ways were suggested. This paper consists of three parts: About descriptive statistics, multivariate analyses, and statistical tests.

  4. Does environmental data collection need statistics?

    NARCIS (Netherlands)

    Pulles, M.P.J.

    1998-01-01

    The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental

  5. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).

  6. Pedestrian and motorists' actions at pedestrian hybrid beacon sites: findings from a pilot study.

    Science.gov (United States)

    Pulugurtha, Srinivas S; Self, Debbie R

    2015-01-01

    This paper focuses on an analysis of pedestrian and motorists' actions at sites with pedestrian hybrid beacons and assesses their effectiveness in improving the safety of pedestrians. Descriptive and statistical analyses (one-tailed two-sample t-test and two-proportion z-test) were conducted using field data collected during morning and evening peak hours at three study sites in the city of Charlotte, NC, before and after the installation of pedestrian hybrid beacons. Further, an analysis was conducted to assess the change in pedestrian and motorists' actions over time (before the installation; 1 month, 3 months, 6 months, and 12 months after the installation). Results showed an increase in average traffic speed at one of the pedestrian hybrid beacon sites, while no specific trends were observed at the other two. Decreases in the number of motorists not yielding to pedestrians, pedestrians trapped in the middle of the street, and pedestrian-vehicle conflicts were observed at all three sites. The installation of pedestrian hybrid beacons did not have a negative effect on pedestrian actions at two of the three sites. Improvements appear to be relatively more consistent 3 months after the installation of the pedestrian hybrid beacon.
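
    The two-proportion z-test mentioned above can be sketched with the standard library alone; the counts below are hypothetical illustrations, not the Charlotte field data:

    ```python
    import math

    def two_proportion_z(x1, n1, x2, n2):
        """Pooled two-proportion z-test, e.g. motorists yielding before vs. after.

        Returns the z statistic and the two-sided p-value computed from the
        normal CDF via math.erf.
        """
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)                      # pooled proportion
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_two_sided

    # Hypothetical counts: motorists yielding to pedestrians before vs. after
    z, p = two_proportion_z(x1=45, n1=120, x2=78, n2=130)
    print(f"z = {z:.2f}, two-sided p = {p:.4f}")
    ```

    A one-tailed p-value, as used in the study's speed comparison, would simply omit the factor of 2 and test in the hypothesized direction.
    
    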

  7. Hybrid composites

    CSIR Research Space (South Africa)

    Jacob John, Maya

    2009-04-01

    Full Text Available mixed short sisal/glass hybrid fibre reinforced low density polyethylene composites were investigated by Kalaprasad et al. [25]. Chemical surface modifications such as alkali, acetic anhydride, stearic acid, permanganate, maleic anhydride, silane...

  8. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    Science.gov (United States)

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    Microsponges drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA), based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analyses, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as a CQA was described in the design space using a design of experiments - one-factor response surface method. The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
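
    A one-factor response surface of the kind mentioned above amounts to a quadratic least-squares fit of the response (particle size) against the factor (e.g. rotation speed). A minimal sketch with made-up numbers, not the study's actual data or DoE software:

    ```python
    def quadratic_fit(x, y):
        """Least-squares fit of y = b0 + b1*x + b2*x**2 (one-factor response surface).

        Builds the 3x3 normal equations and solves them by Gaussian elimination
        with partial pivoting; returns [b0, b1, b2].
        """
        s = [sum(xi ** k for xi in x) for k in range(5)]           # sums of x^0..x^4
        v = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
        A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
        for col in range(3):                                       # forward elimination
            piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            v[col], v[piv] = v[piv], v[col]
            for r in range(col + 1, 3):
                f = A[r][col] / A[col][col]
                for c in range(col, 3):
                    A[r][c] -= f * A[col][c]
                v[r] -= f * v[col]
        b = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):                                        # back substitution
            b[r] = (v[r] - sum(A[r][c] * b[c] for c in range(r + 1, 3))) / A[r][r]
        return b

    # Hypothetical DoE runs: rotation speed (x1000 rpm) vs. particle size (um)
    speed = [0.5, 1.0, 1.5, 2.0, 2.5]
    size = [42.0, 30.5, 25.1, 24.8, 29.9]
    b0, b1, b2 = quadratic_fit(speed, size)
    print(f"size = {b0:.2f} + {b1:.2f}*speed + {b2:.2f}*speed^2")
    ```

    The fitted equation plays the role of the "mathematical models and equations" in the abstract: with a positive b2 the curve has a minimum at speed = -b1/(2*b2), which is the kind of optimum a design-space analysis reads off.
    
    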

  9. Hybrid intermediaries

    OpenAIRE

    Cetorelli, Nicola

    2014-01-01

    I introduce the concept of hybrid intermediaries: financial conglomerates that control a multiplicity of entity types active in the "assembly line" process of modern financial intermediation, a system that has become known as shadow banking. The complex bank holding companies of today are the best example of hybrid intermediaries, but I argue that financial firms from the "nonbank" space can just as easily evolve into conglomerates with similar organizational structure, thus acquiring the cap...

  10. Statistical learning and prejudice.

    Science.gov (United States)

    Madison, Guy; Ullén, Fredrik

    2012-12-01

    Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.

  11. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  12. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  13. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar

  14. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
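
    The three gas statistics covered above differ only by a ±1 term in the mean occupation number; a small sketch in the dimensionless variable x = (E − μ)/k_BT:

    ```python
    import math

    def occupancy(x, kind):
        """Mean occupation number as a function of x = (E - mu) / (k_B * T).

        kind: 'MB' (Maxwell-Boltzmann), 'FD' (Fermi-Dirac), 'BE' (Bose-Einstein).
        """
        if kind == "MB":
            return math.exp(-x)
        if kind == "FD":
            return 1.0 / (math.exp(x) + 1.0)
        if kind == "BE":
            return 1.0 / (math.exp(x) - 1.0)   # requires x > 0
        raise ValueError(kind)

    # In the high-energy / dilute limit (x >> 1) all three statistics coincide,
    # which is why classical Maxwell-Boltzmann results emerge from either quantum case.
    for kind in ("MB", "FD", "BE"):
        print(kind, occupancy(10.0, kind))
    ```

    At x = 0 the Fermi-Dirac occupation is exactly 1/2 (the chemical potential sits at half filling), while the Bose-Einstein occupation diverges, the germ of Bose-Einstein condensation.
    
    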

  15. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  16. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  17. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  18. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  19. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  20. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.

  1. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  2. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  3. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
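
    As a concrete illustration of one of the voting rules mentioned above, here is a minimal Borda-count sketch (illustrative only; the ballots and alternative names are invented, not taken from the paper):

```python
from collections import defaultdict

# Minimal Borda-count sketch. Each voter ranks alternatives best-first;
# an alternative receives (m - 1 - position) points per ballot, where m
# is the number of alternatives, and the highest total wins.
def borda_winner(ballots):
    scores = defaultdict(int)
    for ranking in ballots:
        m = len(ranking)
        for position, alt in enumerate(ranking):
            scores[alt] += m - 1 - position
    return max(scores, key=scores.get), dict(scores)

# Hypothetical ballots for three alternatives.
ballots = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["b", "a", "c"],
]
winner, scores = borda_winner(ballots)
print(winner, scores)
```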

  4. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  5. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

    This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat

  6. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.

  7. Statistical analyses for the purpose of an early detection of global and regional climate change due to the anthropogenic greenhouse effect; Statistische Analysen zur Frueherkennung globaler und regionaler Klimaaenderungen aufgrund des anthropogenen Treibhauseffektes

    Energy Technology Data Exchange (ETDEWEB)

    Grieser, J.; Staeger, T.; Schoenwiese, C.D.

    2000-03-01

    The report answers the question of where, why and how different climate variables have changed within the last 100 years. The analysed variables are observed time series of temperature (mean, maximum, minimum), precipitation, air pressure, and water vapour pressure at monthly resolution. The time series are given both as station data and as grid box data. Two kinds of time-series analysis are performed. The first is applied to find significant changes in the mean and variance of the time series; this also reveals changes in the annual cycle and in the frequency of extreme events. The second approach is used to detect significant spatio-temporal patterns in the variations of climate variables, which are most likely driven by known natural and anthropogenic climate forcings. Furthermore, an estimation of climate noise makes it possible to indicate regions where certain climate variables have changed significantly due to the enhanced anthropogenic greenhouse effect. (orig.)

  8. MEVSİMSEL DÜZELTMEDE KULLANILAN İSTATİSTİKİ YÖNTEMLER ÜZERİNE BİR İNCELEME-AN ANALYSE ON STATISTICAL METHODS WHICH ARE USED FOR SEASONAL ADJUSTMENT

    Directory of Open Access Journals (Sweden)

    Handan YOLSAL

    2012-06-01

    Full Text Available This paper introduces the most commonly applied seasonal adjustment programs developed by official statistical agencies for time series. These programs fall into two main groups. One is the CENSUS II X-11 family, first developed by the NBER, which uses moving-average filters and includes the X-11 ARIMA and X-12 ARIMA techniques. The other is the TRAMO/SEATS program, a model-based approach developed by the Bank of Spain. The seasonal decomposition procedures of these techniques, the special effects they handle (such as trading-day and calendar effects), their advantages and disadvantages, and their forecasting performance are discussed in this paper.

  9. Gamma-ray astronomy from the ground and the space: first analyses of the HESS-II hybrid array and search for blazar candidates among the unidentified Fermi-LAT sources

    International Nuclear Information System (INIS)

    Lefaucheur, Julien

    2015-01-01

    This manuscript is about high-energy gamma-ray astronomy (between 30 GeV and 300 GeV) with the Fermi-LAT satellite and very-high-energy gamma-ray astronomy (above ∼100 GeV) with the H.E.S.S. experiment. The second phase of the H.E.S.S. experiment began in July 2012 with the inauguration of a fifth, 28 m-diameter telescope added to the initial array of four 12 m-diameter imaging atmospheric Cherenkov telescopes. In the first part of this thesis, we present the development of an analysis in hybrid mode, based on a multivariate method, dedicated to detecting and studying sources with different spectral shapes, together with the first analysis results on real data. The second part is dedicated to the search for blazar candidates among the unidentified Fermi-LAT sources of the 2FGL catalog. A first development is based on a multivariate approach using discriminant parameters built from the 2FGL catalog parameters. A second development uses the WISE satellite catalog and a non-parametric technique to find the blazar-like infrared counterparts of the unidentified sources of the 2FGL catalog. (author)

  10. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (the so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  11. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  12. Near-term hybrid vehicle program, phase 1. Appendix D: Sensitivity analysis report

    Science.gov (United States)

    1979-01-01

    Parametric analyses using a hybrid vehicle synthesis and economics program (HYVELD) are described, investigating the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.
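
    A back-of-envelope sketch of the kind of ownership-cost arithmetic described above, using the report's nominal figures; the annual mileage and the simple straight-line amortization are assumptions for illustration, not values from the report:

```python
# Amortize the hybrid's initial-cost premium over its lifetime mileage,
# expressed in cents per mile so it can be set against the report's
# projected 0.5-1.0 cents/mi ownership-cost advantage.
extra_initial_cost = 1500.0   # upper end of the $1200-$1500 premium
lifetime_years = 12           # extended hybrid lifetime from the report
miles_per_year = 10_000       # assumed annual mileage (not from the report)

premium_cents_per_mile = (
    extra_initial_cost / (lifetime_years * miles_per_year) * 100
)
print(round(premium_cents_per_mile, 2))
```

    Under these assumptions the premium amortizes to 1.25 cents/mi, which shows why the projected fuel and maintenance savings, not the purchase price, drive the ownership-cost comparison.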

  13. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  14. Hybrid stars

    Indian Academy of Sciences (India)

    Hybrid stars. AsHOK GOYAL. Department of Physics and Astrophysics, University of Delhi, Delhi 110 007, India. Abstract. Recently there have been important developments in the determination of neutron ... number and the electric charge. ... available to the system to rearrange concentration of charges for a given fraction of.

  15. A Prototype Regional GSI-based EnKF-Variational Hybrid Data Assimilation System for the Rapid Refresh Forecasting System: Dual-Resolution Implementation and Testing Results

    Science.gov (United States)

    Pan, Yujie; Xue, Ming; Zhu, Kefeng; Wang, Mingjun

    2018-05-01

    A dual-resolution (DR) version of a regional ensemble Kalman filter (EnKF)-3D ensemble variational (3DEnVar) coupled hybrid data assimilation system is implemented as a prototype for the operational Rapid Refresh forecasting system. The DR 3DEnVar system combines a high-resolution (HR) deterministic background forecast with lower-resolution (LR) EnKF ensemble perturbations used for flow-dependent background error covariance to produce an HR analysis. The computational cost is substantially reduced by running the ensemble forecasts and EnKF analyses at LR. The DR 3DEnVar system is tested with 3-h cycles over a 9-day period using a 40/~13-km grid spacing combination. The HR forecasts from the DR hybrid analyses are compared with forecasts launched from HR Gridpoint Statistical Interpolation (GSI) 3D variational (3DVar) analyses, and single LR hybrid analyses interpolated to the HR grid. With the DR 3DEnVar system, a 90% weight for the ensemble covariance yields the lowest forecast errors and the DR hybrid system clearly outperforms the HR GSI 3DVar. Humidity and wind forecasts are also better than those launched from interpolated LR hybrid analyses, but the temperature forecasts are slightly worse. The humidity forecasts are improved most. For precipitation forecasts, the DR 3DEnVar always outperforms HR GSI 3DVar. It also outperforms the LR 3DEnVar, except for the initial forecast period and lower thresholds.
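
    The hybrid covariance idea described above can be sketched schematically. This is not the GSI implementation, only an illustration of blending a static (3DVar-like) covariance with a flow-dependent ensemble covariance at the 90% ensemble weight mentioned in the abstract; the state size, ensemble size, and identity static covariance are invented for the example:

```python
import numpy as np

# Schematic hybrid background-error covariance:
#   B_hybrid = (1 - beta) * B_static + beta * B_ens
rng = np.random.default_rng(1)
n_state, n_ens = 6, 20

B_static = np.eye(n_state)                # assumed static covariance
ens = rng.normal(size=(n_ens, n_state))   # ensemble of LR perturbations
perts = ens - ens.mean(axis=0)
B_ens = perts.T @ perts / (n_ens - 1)     # sample ensemble covariance

beta = 0.9                                # ensemble-covariance weight
B_hybrid = (1.0 - beta) * B_static + beta * B_ens
print(B_hybrid.shape)
```

    Because the static term keeps the blend full-rank, the hybrid covariance remains positive definite even when the ensemble is small relative to the state dimension.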

  16. Hybridization analysis of P2 phage and of a defective prophage of Escherichia Coli B by the density gradient centrifugation method; Analyse de l'hybridation du phage P2 et d'un prophage defectifs d'Escherichia Coli B, par la methode de centrifugation en gradient de densite

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Denise [Commissariat a l' energie atomique et aux energies alternatives - CEA, C. E. N. de Saclay, Service de Biologie (France)

    1960-07-01

    The P2 Hydis phage, produced by P2 phage multiplication in E. coli B, shows a higher density than its P2 parent. This density increase is the same for all P2 Hydis phages coming from a large number of distinct hybridizations, and is close to 0.002 g.cm{sup -3}. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 946-948, session of 1 February 1960.

  17. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  18. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  19. Inference in hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Langseth, Helge; Nielsen, Thomas D.; Rumi, Rafael; Salmeron, Antonio

    2009-01-01

    Since the 1980s, Bayesian networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (the so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  20. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  1. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
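
    A minimal sketch of linear least squares in matrix notation with a variance-covariance matrix, in the spirit of the treatment described above; the data, the straight-line model, and the common measurement uncertainty are invented for illustration:

```python
import numpy as np

# Weighted linear least squares for y = a + b*x with independent
# Gaussian noise of known standard deviation sigma.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sigma = 0.2                                # assumed measurement uncertainty

X = np.column_stack([np.ones_like(x), x])  # design matrix
W = np.eye(len(x)) / sigma**2              # weight matrix
cov = np.linalg.inv(X.T @ W @ X)           # variance-covariance of (a, b)
beta = cov @ X.T @ W @ y                   # best-fit parameters

a, b = beta
print(a, b)                   # intercept and slope
print(np.sqrt(np.diag(cov)))  # parameter standard errors
```

    The diagonal of `cov` gives the parameter variances and the off-diagonal term their covariance, which is what propagation-of-error calculations need when the fitted parameters are correlated.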

  2. SWORDS: A statistical tool for analysing large DNA sequences

    Indian Academy of Sciences (India)

    Unknown

    These techniques are based on frequency distributions of DNA words in a large sequence, and have been packaged into a software called SWORDS. Using sequences available in ... tions with the cellular processes like recombination, replication .... in DNA sequences using certain specific probability laws. (Pevzner et al ...

  3. Statistical methods for analysing responses of wildlife to human disturbance.

    Science.gov (United States)

    Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom

    2006-01-01

    1. Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To illustrate these methods, we studied the...

  4. Statistical analyses of local transport coefficients in Ohmic ASDEX discharges

    International Nuclear Information System (INIS)

    Simmet, E.; Stroth, U.; Wagner, F.; Fahrbach, H.U.; Herrmann, W.; Kardaun, O.J.W.F.; Mayer, H.M.

    1991-01-01

    Tokamak energy transport is still an unsolved problem. Many theoretical models have been developed which try to explain the anomalously high energy-transport coefficients. Up to now these models have been applied to global plasma parameters. A comparison of transport coefficients with the global confinement time is only conclusive if the transport is dominated by one process across the plasma diameter. This, however, is not the case in most Ohmic confinement regimes, where at least three different transport mechanisms play an important role. Sawtooth activity leads to an increase in energy transport in the plasma centre. In the intermediate region turbulent transport is expected; candidates here are drift waves and resistive fluid turbulence. At the edge, ballooning modes or rippling modes could dominate the transport. For the intermediate region, one can deduce theoretical scaling laws for τ_E from turbulence theories. Predicted scalings reproduce the experimentally found density dependence of τ_E in the linear Ohmic confinement regime (LOC) and the saturated regime (SOC), but they do not show the correct dependence on the isotope mass. The relevance of these transport theories can only be tested by comparing them to experimental local transport coefficients. To this purpose we have performed transport calculations on more than a hundred Ohmic ASDEX discharges. By Principal Component Analysis we determine the dimensionless components which dominate the transport coefficients, and we compare the results to the predictions of various theories. (author) 6 refs., 2 figs., 1 tab

  5. Statistical considerations for grain-size analyses of tills

    Science.gov (United States)

    Jacobs, A.M.

    1971-01-01

    Relative percentages of sand, silt, and clay from samples of the same till unit are not identical because of different lithologies in the source areas, sorting in transport, random variation, and experimental error. Random variation and experimental error can be isolated from the other two as follows. For each particle-size class of each till unit, a standard population is determined by using a normally distributed, representative group of data. New measurements are compared with the standard population and, if they compare satisfactorily, the experimental error is not significant and random variation is within the expected range for the population. The outcome of the comparison depends on numerical criteria derived from a graphical method rather than on a more commonly used one-way analysis of variance with two treatments. If the number of samples and the standard deviation of the standard population are substituted in a t-test equation, a family of hyperbolas is generated, each of which corresponds to a specific number of subsamples taken from each new sample. The axes of the graphs of the hyperbolas are the standard deviation of new measurements (horizontal axis) and the difference between the means of the new measurements and the standard population (vertical axis). The area between the two branches of each hyperbola corresponds to a satisfactory comparison between the new measurements and the standard population. Measurements from a new sample can be tested by plotting their standard deviation vs. difference in means on axes containing a hyperbola corresponding to the specific number of subsamples used. If the point lies between the branches of the hyperbola, the measurements are considered reliable. But if the point lies outside this region, the measurements are repeated. Because the critical segment of the hyperbola is approximately a straight line parallel to the horizontal axis, the test is simplified to a comparison between the means of the standard population and the means of the subsample. The minimum number of subsamples required to prove significant variation between samples caused by different lithologies in the source areas and sorting in transport can be determined directly from the graphical method. The minimum number of subsamples required is the maximum number to be run for economy of effort. © 1971 Plenum Publishing Corporation.
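
    The simplified comparison of means at the end of the abstract can be sketched as follows; all numbers are hypothetical, and the critical value is the standard two-sided 5% t for 3 degrees of freedom:

```python
import math
import statistics

# Hypothetical "standard population" for one particle-size class (percent
# sand of a till unit) and a new sample's subsample measurements.
standard_mean = 42.0                    # from the standard population
standard_sd = 3.0                       # its standard deviation
subsamples = [40.5, 44.0, 41.2, 43.1]   # new measurements (n = 4)

n = len(subsamples)
new_mean = statistics.mean(subsamples)

# t-statistic comparing the new sample's mean against the standard
# population, as in the abstract's simplified comparison of means.
t = (new_mean - standard_mean) / (standard_sd / math.sqrt(n))

# A point "between the branches of the hyperbola" corresponds to |t|
# below the critical value; 3.18 is the two-sided 5% t for 3 d.f.
reliable = abs(t) < 3.18
print(new_mean, t, reliable)
```

    If `reliable` were False, the abstract's procedure would call for repeating the measurements rather than accepting the new sample.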

  6. Practical Statistics for Particle Physics Analyses: Likelihoods (1/4)

    CERN Multimedia

    CERN. Geneva; Lyons, Louis

    2016-01-01

    This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, which will have exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercises sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN Web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. The requirement to use the SWAN service is to have a CERN account and to have also access to Cernbox, the shared storage service at CERN. New users of cernbox are invited to activate beforehand cernbox by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....

  7. Late neolithic pottery standardization: Application of statistical analyses

    Directory of Open Access Journals (Sweden)

    Vuković Jasna

    2011-01-01

    Full Text Available This paper defines the notion of standardization, presents the methodological approach to analysis, points to the problems and limitations arising in the examination of materials from archaeological excavations, and presents the results of the analysis of coefficients of variation of metric parameters of the Late Neolithic vessels recovered at the sites of Vinča and Motel Slatina. [Projekat Ministarstva nauke Republike Srbije, br. 177012: Society, the spiritual and material culture and communications in prehistory and early history of the Balkans

  8. Statistical and regression analyses of detected extrasolar systems

    Czech Academy of Sciences Publication Activity Database

    Pintr, Pavel; Peřinová, V.; Lukš, A.; Pathak, A.

    2013-01-01

    Roč. 75, č. 1 (2013), s. 37-45 ISSN 0032-0633 Institutional support: RVO:61389021 Keywords : Exoplanets * Kepler candidates * Regression analysis Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 1.630, year: 2013 http://www.sciencedirect.com/science/article/pii/S0032063312003066

  9. The Pace of Hybrid Incompatibility Evolution in House Mice.

    Science.gov (United States)

    Wang, Richard J; White, Michael A; Payseur, Bret A

    2015-09-01

    Hybrids between species are often sterile or inviable. This form of reproductive isolation is thought to evolve via the accumulation of mutations that interact to reduce fitness when combined in hybrids. Mathematical formulations of this "Dobzhansky-Muller model" predict an accelerating buildup of hybrid incompatibilities with divergence time (the "snowball effect"). Although the Dobzhansky-Muller model is widely accepted, the snowball effect has only been tested in two species groups. We evaluated evidence for the snowball effect in the evolution of hybrid male sterility among subspecies of house mice, a recently diverged group that shows partial reproductive isolation. We compared the history of subspecies divergence with patterns of quantitative trait loci (QTL) detected in F2 intercrosses between two pairs of subspecies (Mus musculus domesticus with M. m. musculus and M. m. domesticus with M. m. castaneus). We used a recently developed phylogenetic comparative method to statistically measure the fit of these data to the snowball prediction. To apply this method, QTL were partitioned as either shared or unshared in the two crosses. A heuristic partitioning based on the overlap of QTL confidence intervals produced unambiguous support for the snowball effect. An alternative approach combining data among crosses favored the snowball effect for the autosomes, but a linear accumulation of incompatibilities for the X chromosome. Reasoning that the X chromosome analyses are complicated by low mapping resolution, we conclude that hybrid male sterility loci have snowballed in house mice. Our study illustrates the power of comparative genetic mapping for understanding mechanisms of speciation. Copyright © 2015 by the Genetics Society of America.
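
    The snowball prediction tested above can be illustrated with a toy calculation (synthetic numbers, not the mouse QTL data): if substitutions accumulate roughly linearly with divergence time, the number of potentially incompatible between-locus pairs grows roughly quadratically, because each new substitution can conflict with all earlier substitutions in the other lineage:

```python
import numpy as np

# Toy Dobzhansky-Muller "snowball": substitutions linear in divergence
# time t, candidate incompatible pairs ~ quadratic in t.
t = np.arange(0, 11)             # arbitrary divergence-time units
rate = 3                         # assumed substitutions per time unit
subs = rate * t                  # substitutions, linear in t
pairs = subs * (subs - 1) // 2   # candidate incompatible pairs

# The per-step growth of the pair count accelerates (the snowball),
# unlike the constant increments of a linear accumulation.
increments = np.diff(pairs)
print(pairs[-1], increments)
```

    This accelerating increment is what distinguishes the snowball from the linear accumulation considered as the alternative for the X chromosome in the abstract.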

  10. The Need for Speed in Rodent Locomotion Analyses

    Science.gov (United States)

    Batka, Richard J.; Brown, Todd J.; Mcmillan, Kathryn P.; Meadows, Rena M.; Jones, Kathryn J.; Haulcomb, Melissa M.

    2016-01-01

    Locomotion analysis is now widely used across many animal species to understand the motor defects in disease, functional recovery following neural injury, and the effectiveness of various treatments. More recently, rodent locomotion analysis has become an increasingly popular method in a diverse range of research. Speed is an inseparable aspect of locomotion that is still not fully understood, and its effects are often not properly incorporated while analyzing data. In this hybrid manuscript, we accomplish three things: (1) review the interaction between speed and locomotion variables in rodent studies, (2) comprehensively analyze the relationship between speed and 162 locomotion variables in a group of 16 wild-type mice using the CatWalk gait analysis system, and (3) develop and test a statistical method in which locomotion variables are analyzed and reported in the context of speed. Notable results include the following: (1) over 90% of variables reported by CatWalk were dependent on speed, with an average R2 value of 0.624, (2) most variables were related to speed in a nonlinear manner, (3) current methods of controlling for speed are insufficient, and (4) the linear mixed model is an appropriate and effective statistical method for locomotion analyses that is inclusive of speed-dependent relationships. Given the pervasive dependency of locomotion variables on speed, we maintain that valid conclusions from locomotion analyses cannot be made unless they are analyzed and reported within the context of speed. PMID:24890845
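
    The paper's central point is that most gait variables covary with speed. A toy sketch of the underlying check, with invented numbers (the study itself fits linear mixed models across 162 variables), computes the R² of a single gait variable regressed on speed:

```python
# Toy sketch: how much variance in a gait variable does speed explain?
# All measurements below are invented for illustration.

def r_squared(x, y):
    """R^2 of a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    rss = sum((yi - my - slope * (xi - mx)) ** 2 for xi, yi in zip(x, y))
    return 1 - rss / syy

speed  = [10, 15, 20, 25, 30, 35]        # cm/s, hypothetical
stride = [4.1, 4.8, 5.6, 6.1, 6.9, 7.4]  # stride length in cm, hypothetical

print(round(r_squared(speed, stride), 3))
```

    A high R² like this, repeated across most variables, is what motivates the paper's insistence on speed-aware analysis; the mixed-model machinery additionally handles repeated measures per animal.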

  11. Multiple and asymmetrical origin of polyploid dog rose hybrids (Rosa L. sect. Caninae (DC.) Ser.) involving unreduced gametes.

    Science.gov (United States)

    Herklotz, V; Ritz, C M

    2017-08-01

    Polyploidy and hybridization are important factors for generating diversity in plants. The species-rich dog roses (Rosa sect. Caninae) originated by allopolyploidy and are characterized by unbalanced meiosis producing polyploid egg cells (usually 4x) and haploid sperm cells (1x). In extant natural stands species hybridize spontaneously, but the extent of natural hybridization is unknown. The aim of the study was to document the frequency of reciprocal hybridization between the subsections Rubigineae and Caninae with special reference to the contribution of unreduced egg cells (5x) producing 6x offspring after fertilization with reduced (1x) sperm cells. We tested whether hybrids arose by independent multiple events or via a single or few incidences followed by a subsequent spread of hybrids. Population genetics of 45 mixed stands of dog roses across central and south-eastern Europe were analysed using microsatellite markers and flow cytometry. Hybrids were recognized by the presence of diagnostic alleles and multivariate statistics were used to display the relationships between parental species and hybrids. Among plants classified to subsect. Rubigineae, 32% hybridogenic individuals were detected but only 8% hybrids were found in plants assigned to subsect. Caninae. This bias between reciprocal crossings was accompanied by a higher ploidy level in Rubigineae hybrids, which originated more frequently by unreduced egg cells. Genetic patterns of hybrids were strongly geographically structured, supporting their independent origin. The biased crossing barriers between subsections are explained by the facilitated production of unreduced gametes in subsect. Rubigineae. Unreduced egg cells probably provide the highly homologous chromosome sets required for correct chromosome pairing in hybrids. Furthermore, the higher frequency of Rubigineae hybrids is probably influenced by abundance effects because the plants of subsect. Caninae are much more abundant
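
    The classification step described, recognizing hybrids by diagnostic alleles, can be sketched as simple set logic. The allele names below are invented placeholders, not the study's actual microsatellite markers:

```python
# Toy sketch: flag putative hybrids by the joint presence of marker
# alleles diagnostic for each parental subsection (allele IDs invented).
RUBIGINEAE_ALLELES = {"A101", "B205"}   # hypothetical diagnostic alleles
CANINAE_ALLELES    = {"A117", "B230"}

def classify(genotype):
    """genotype: set of microsatellite alleles observed in one plant."""
    has_rub = bool(genotype & RUBIGINEAE_ALLELES)
    has_can = bool(genotype & CANINAE_ALLELES)
    if has_rub and has_can:
        return "hybrid"
    return "Rubigineae" if has_rub else "Caninae" if has_can else "unassigned"

print(classify({"A101", "B230", "C999"}))  # carries alleles from both sides
```

    In the study this allele evidence is combined with flow-cytometric ploidy measurements, since 6x offspring point specifically to unreduced (5x) egg cells.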

  12. Hybrid Qualifications

    DEFF Research Database (Denmark)

    Against the background of increasing qualification needs there is a growing awareness of the challenge to widen participation in processes of skill formation and competence development. At the same time, the issue of permeability between vocational education and training (VET) and general education...... has turned out as a major focus of European education and training policies and certainly is a crucial principle underlying the European Qualifications Framework (EQF). In this context, «hybrid qualifications» (HQ) may be seen as an interesting approach to tackle these challenges as they serve «two...

  13. Hybrid Gear

    Science.gov (United States)

    Handschuh, Robert F. (Inventor); Roberts, Gary D. (Inventor)

    2016-01-01

    A hybrid gear consisting of a metallic outer rim with gear teeth and a metallic hub in combination with a composite lay-up between the shaft interface (hub) and gear tooth rim is described. The composite lay-up lightens the gear member while having similar torque carrying capability, and it attenuates the impact-loading-driven noise/vibration that is typical in gear systems. The gear has the same operational capability with respect to shaft speed, torque, and temperature as an all-metallic gear as used in aerospace gear design.

  14. Reverse hybrid total hip arthroplasty

    DEFF Research Database (Denmark)

    Wangen, Helge; Havelin, Leif I.; Fenstad, Anne M

    2017-01-01

    Background and purpose - The use of a cemented cup together with an uncemented stem in total hip arthroplasty (THA) has become popular in Norway and Sweden during the last decade. The results of this prosthetic concept, reverse hybrid THA, have been sparsely described. The Nordic Arthroplasty....... Patients and methods - From the NARA, we extracted data on reverse hybrid THAs from January 1, 2000 until December 31, 2013. 38,415 such hips were studied and compared with cemented THAs. The Kaplan-Meier method and Cox regression analyses were used to estimate the prosthesis survival and the relative risk...

  15. Craniomandibular form and body size variation of first generation mouse hybrids: A model for hominin hybridization.

    Science.gov (United States)

    Warren, Kerryn A; Ritzman, Terrence B; Humphreys, Robyn A; Percival, Christopher J; Hallgrímsson, Benedikt; Ackermann, Rebecca Rogers

    2018-03-01

    Hybridization occurs in a number of mammalian lineages, including among primate taxa. Analyses of ancient genomes have shown that hybridization between our lineage and other archaic hominins in Eurasia occurred numerous times in the past. However, we still have limited empirical data on what a hybrid skeleton looks like, or how to spot patterns of hybridization among fossils for which there are no genetic data. Here we use experimental mouse models to supplement previous studies of primates. We characterize size and shape variation in the cranium and mandible of three wild-derived inbred mouse strains and their first generation (F1) hybrids. The three parent taxa in our analysis represent lineages that diverged over approximately the same period as the human/Neanderthal/Denisovan lineages and their hybrids are variably successful in the wild. Comparisons of body size, as quantified by long bone measurements, are also presented to determine whether the identified phenotypic effects of hybridization are localized to the cranium or represent overall body size changes. The results indicate that hybrid cranial and mandibular sizes, as well as limb length, exceed that of the parent taxa in all cases. All three F1 hybrid crosses display similar patterns of size and form variation. These results are generally consistent with earlier studies on primates and other mammals, suggesting that the effects of hybridization may be similar across very different scenarios of hybridization, including different levels of hybrid fitness. This paper serves to supplement previous studies aimed at identifying F1 hybrids in the fossil record and to introduce further research that will explore hybrid morphologies using mice as a proxy for better understanding hybridization in the hominin fossil record. Copyright © 2017 Elsevier Ltd. All rights reserved.
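
    The key size comparison, F1 hybrids exceeding both parents (transgressive size), reduces to a comparison of group means. A sketch with invented measurements, not the study's data:

```python
# Sketch (invented measurements): is F1 hybrid size transgressive,
# i.e. does the F1 mean exceed BOTH parent strain means?

def transgressive(f1, parent_a, parent_b):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(f1) > max(mean(parent_a), mean(parent_b))

parent_a = [21.0, 21.4, 20.8]   # e.g. cranial centroid size, strain A
parent_b = [22.1, 21.9, 22.3]   # strain B
f1       = [23.0, 23.4, 22.8]   # F1 cross between the two strains

print(transgressive(f1, parent_a, parent_b))
```

    In practice this comparison would be backed by a significance test and repeated across each cross and each skeletal measurement, as in the paper.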

  16. Intuitionistic hybrid logic

    DEFF Research Database (Denmark)

    Braüner, Torben

    2011-01-01

    Intuitionistic hybrid logic is hybrid modal logic over an intuitionistic logic basis instead of a classical logical basis. In this short paper we introduce intuitionistic hybrid logic and we give a survey of work in the area.

  17. Improved signal processing approaches in an offline simulation of a hybrid brain–computer interface

    Science.gov (United States)

    Brunner, Clemens; Allison, Brendan Z.; Krusienski, Dean J.; Kaiser, Vera; Müller-Putz, Gernot R.; Pfurtscheller, Gert; Neuper, Christa

    2012-01-01

    In a conventional brain–computer interface (BCI) system, users perform mental tasks that yield specific patterns of brain activity. A pattern recognition system determines which brain activity pattern a user is producing and thereby infers the user’s mental task, allowing users to send messages or commands through brain activity alone. Unfortunately, despite extensive research to improve classification accuracy, BCIs almost always exhibit errors, which are sometimes so severe that effective communication is impossible. We recently introduced a new idea to improve accuracy, especially for users with poor performance. In an offline simulation of a “hybrid” BCI, subjects performed two mental tasks independently and then simultaneously. This hybrid BCI could use two different types of brain signals common in BCIs – event-related desynchronization (ERD) and steady-state evoked potentials (SSEPs). This study suggested that such a hybrid BCI is feasible. Here, we re-analyzed the data from our initial study. We explored eight different signal processing methods that aimed to improve classification and further assess both the causes and the extent of the benefits of the hybrid condition. Most analyses showed that the improved methods described here yielded a statistically significant improvement over our initial study. Some of these improvements could be relevant to conventional BCIs as well. Moreover, the number of illiterates could be reduced with the hybrid condition. Results are also discussed in terms of dual task interference and relevance to protocol design in hybrid BCIs. PMID:20153371
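
    One simple way to combine the two signal types in a hybrid BCI is to average the class posteriors from the two classifiers. This is only an illustrative fusion rule under invented probabilities, not necessarily one of the eight methods evaluated in the paper:

```python
# Illustrative fusion rule for a hybrid BCI: average the posterior
# probabilities from an ERD classifier and an SSEP classifier, then
# pick the class with the highest fused probability.

def fuse(p_erd, p_ssep):
    """Each argument maps class -> probability. Returns the fused decision."""
    classes = p_erd.keys() & p_ssep.keys()
    fused = {c: 0.5 * (p_erd[c] + p_ssep[c]) for c in classes}
    return max(fused, key=fused.get)

p_erd  = {"left": 0.55, "right": 0.45}   # weak ERD evidence (invented)
p_ssep = {"left": 0.30, "right": 0.70}   # stronger SSEP evidence (invented)
print(fuse(p_erd, p_ssep))
```

    The appeal of fusion is visible even in this toy case: a confident SSEP classifier can rescue a near-chance ERD decision, which is why a hybrid condition can help users with poor single-signal performance.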

  18. Doing statistical mediation and moderation

    CERN Document Server

    Jose, Paul E

    2013-01-01

    Written in a friendly, conversational style, this book offers a hands-on approach to statistical mediation and moderation for both beginning researchers and those familiar with modeling. Starting with a gentle review of regression-based analysis, Paul Jose covers basic mediation and moderation techniques before moving on to advanced topics in multilevel modeling, structural equation modeling, and hybrid combinations, such as moderated mediation. User-friendly features include numerous graphs and carefully worked-through examples; ""Helpful Suggestions"" about procedures and pitfalls; ""Knowled

  19. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

    Statistics are widely employed within archaeological research. This is becoming increasingly so as user-friendly statistical packages make increasingly sophisticated analyses available to non-statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of the underlying assumptions, there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology, and here this is illustrated with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab
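
    For readers unfamiliar with 'date pooling': the standard procedure (after Ward & Wilson) takes an error-weighted mean of radiocarbon dates, and its validity rests on a chi-square consistency test showing the dates plausibly estimate the same event, which is exactly the assumption that is easy to violate. A sketch with invented dates:

```python
# Sketch of radiocarbon date pooling and the consistency check that
# should precede it. Dates and errors below are invented.

def pool_dates(ages, errors):
    """Error-weighted mean, pooled error, and chi-square test statistic."""
    w = [1.0 / e ** 2 for e in errors]
    pooled = sum(wi * a for wi, a in zip(w, ages)) / sum(w)
    pooled_err = (1.0 / sum(w)) ** 0.5
    # t is compared to chi-square with len(ages)-1 degrees of freedom;
    # pooling is only defensible if t does not exceed the critical value.
    t = sum(wi * (a - pooled) ** 2 for wi, a in zip(w, ages))
    return pooled, pooled_err, t

ages, errors = [750.0, 720.0, 680.0], [40.0, 35.0, 45.0]  # years BP
pooled, err, t = pool_dates(ages, errors)
print(round(pooled), round(err), round(t, 2))
```

    The misuse criticized in the article amounts to pooling dates without (or despite) this consistency check, averaging dates that do not belong to a single event.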

  20. Hybridized Tetraquarks

    CERN Document Server

    Esposito, A.; Polosa, A.D.

    2016-01-01

    We propose a new interpretation of the neutral and charged X, Z exotic hadron resonances. Hybridized-tetraquarks are neither purely compact tetraquark states nor bound or loosely bound molecules. The latter would require a negative or zero binding energy, whose counterpart in h-tetraquarks is a positive quantity. The formation mechanism of this new class of hadrons is inspired by that of Feshbach metastable states in atomic physics. The recent claim of an exotic resonance in the Bs π± channel by the D0 collaboration and the negative result presented subsequently by the LHCb collaboration are understood in this scheme, together with a considerable portion of available data on X, Z particles. Considerations on a state with the same quantum numbers as the X(5568) are also made.

  1. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' educational programs. Known as difficult to understand and hard to learn, these contents are feared by students. Those who do not aspire to a research career at a university will quickly forget the drilled content. Furthermore, because it does not seem applicable to work with patients and other target groups at first glance, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education only makes sense as a way of commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To that end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to expose one's own professional work to arbitrariness.

  2. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that most students (55%) had very little to no background in statistics. Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most widely used genetics textbooks, as well as in the genetics syllabi used by instructors, do not help the issue. The textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also true of the genetics syllabi reviewed for this study. Nonetheless

  3. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  4. Continuity controlled Hybrid Automata

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    We investigate the connections between the process algebra for hybrid systems of Bergstra and Middelburg and the formalism of hybrid automata of Henzinger et al. We give interpretations of hybrid automata in the process algebra for hybrid systems and compare them with the standard interpretation

  8. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found

  9. A Hybrid Approach to Protect Palmprint Templates

    Directory of Open Access Journals (Sweden)

    Hailun Liu

    2014-01-01

    Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms. However, existing template protection algorithms cannot satisfy all these requirements well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on all three criteria. A heterogeneous space is designed for properly combining random projection and fuzzy vault in the hybrid scheme. A new chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach.
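
    The random-projection half of such a scheme can be sketched in a few lines: projecting the feature vector through a key-dependent random matrix makes the stored template changeable (revocable by issuing a new key). A toy, low-dimensional illustration, not the paper's actual algorithm:

```python
# Toy sketch of template protection by random projection: the seed acts
# as a user-specific key, so a compromised template can be revoked by
# re-enrolling with a new seed. Real palmprint features are far larger.
import random

def random_projection(features, out_dim, seed):
    """Project a feature vector through a seeded Gaussian random matrix."""
    rng = random.Random(seed)
    return [
        sum(rng.gauss(0, 1) * f for f in features) / out_dim ** 0.5
        for _ in range(out_dim)
    ]

v = [0.2, 0.7, 0.1, 0.9]                      # invented feature vector
template = random_projection(v, out_dim=2, seed=42)
revoked  = random_projection(v, out_dim=2, seed=43)
print(template != revoked)  # a new seed yields a new, unlinkable template
```

    The fuzzy-vault half, binding a secret to the projected features among chaff points, is omitted here; the sketch shows only why changeability comes essentially for free from the projection step.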

  10. Event tree analysis for the system of hybrid reactor

    International Nuclear Information System (INIS)

    Yang Yongwei; Qiu Lijian

    1993-01-01

    The application of probabilistic risk assessment to a fusion-fission hybrid reactor is introduced. A hybrid reactor system has been analysed using event trees. Based on the conceptual design of the Hefei Fusion-fission Experimental Hybrid Breeding Reactor, the probabilities of the event tree sequences induced by 4 typical initiating events were calculated. The results showed that the conceptual design is safe and reasonable. Through this work, the safety characteristics of the hybrid reactor system have been understood more deeply, and some suggestions valuable to the safety design of hybrid reactors have been proposed
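
    The core arithmetic of an event-tree analysis is multiplying the initiating-event frequency by the branch probabilities along each path. A sketch with invented numbers, not the Hefei reactor data:

```python
# Sketch of an event-tree sequence probability: initiating-event
# frequency times the branch probabilities along one path.
# All numbers are invented for illustration.

def sequence_probability(initiator_freq, branches):
    """branches: list of (P(system succeeds), did it succeed on this path)."""
    p = initiator_freq
    for success_prob, took_success in branches:
        p *= success_prob if took_success else (1.0 - success_prob)
    return p

# Hypothetical initiating event at 1e-2 per year, followed by two safety
# systems that BOTH fail along this particular sequence:
p = sequence_probability(1e-2, [(0.99, False), (0.95, False)])
print(p)  # frequency of the double-failure sequence
```

    Summing such products over all sequences that end in an undesired state gives the overall frequency used to judge whether a design is acceptably safe.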

  11. Corporate Hybrid Bonds

    OpenAIRE

    Ahlberg, Johan; Jansson, Anton

    2016-01-01

    Hybrid securities do not constitute a new phenomenon in the Swedish capital markets. Most commonly, hybrids issued by Swedish real estate companies in recent years are preference shares. Corporate hybrid bonds on the other hand may be considered as somewhat of a new-born child in the family of hybrid instruments. These do, as all other hybrid securities, share some equity-like and some debt-like characteristics. Nevertheless, since 2013 the interest for the instrument has grown rapidly and ha...

  12. MQSA National Statistics

    Science.gov (United States)

    ... Standards Act and Program MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  13. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  14. Hybrid XRF

    International Nuclear Information System (INIS)

    Heckel, J.

    2002-01-01

    Full text: In the last 10 years significant innovations in EDXRF, e.g. total reflection XRF or polarized beam XRF, were utilized in different industrial applications. The decrease of background within the spectra was the goal of these developments. Excellent detection limits and sensitivities demonstrate the success of these new techniques. Nevertheless, further improvements are possible by using Si drift detectors. These detectors allow the processing of input count rates up to 10^6 cps, in comparison to 10^5 cps for Si(Li) detectors. New excitation optics are necessary to produce such count rates. One possibility is the use of doubly curved crystals between tube and sample. These crystals enable the reflection of the primary beam within the given solid angle (0.4π) of an end window tube to the sample. Using such brightness optics, excellent sensitivities, mainly for light elements, are achievable. The combination of a BRAGG crystal as a wavelength dispersive component and a solid state detector as an energy dispersive component creates a new technique: hybrid XRF. Copyright (2002) Australian X-ray Analytical Association Inc.

  15. Assessment on Hybrid E-Learning Instrument

    OpenAIRE

    Intan Farahana Kamsin; Rosseni Din

    2015-01-01

    This study aims to improve Hybrid e-Learning 9.3. A total of 233 students of International Islamic University Malaysia, Gombak, who have experience in hybrid teaching and learning, were involved as respondents. The Rasch Measurement Model was used for this study. Validity analyses were conducted on (i) the compatibility of the items, (ii) the mapping of items and respondents, (iii) the scaling of instruments, and (iv) the unidimensionality of items. The findings of the study show that (i) the items developed cor

  16. Hybrid mimics and hybrid vigor in Arabidopsis

    Science.gov (United States)

    Wang, Li; Greaves, Ian K.; Groszmann, Michael; Wu, Li Min; Dennis, Elizabeth S.; Peacock, W. James

    2015-01-01

    F1 hybrids can outperform their parents in yield and vegetative biomass, features of hybrid vigor that form the basis of the hybrid seed industry. The yield advantage of the F1 is lost in the F2 and subsequent generations. In Arabidopsis, from F2 plants that have a F1-like phenotype, we have by recurrent selection produced pure breeding F5/F6 lines, hybrid mimics, in which the characteristics of the F1 hybrid are stabilized. These hybrid mimic lines, like the F1 hybrid, have larger leaves than the parent plant, and the leaves have increased photosynthetic cell numbers, and in some lines, increased size of cells, suggesting an increased supply of photosynthate. A comparison of the differentially expressed genes in the F1 hybrid with those of eight hybrid mimic lines identified metabolic pathways altered in both; these pathways include down-regulation of defense response pathways and altered abiotic response pathways. F6 hybrid mimic lines are mostly homozygous at each locus in the genome and yet retain the large F1-like phenotype. Many alleles in the F6 plants, when they are homozygous, have expression levels different to the level in the parent. We consider this altered expression to be a consequence of transregulation of genes from one parent by genes from the other parent. Transregulation could also arise from epigenetic modifications in the F1. The pure breeding hybrid mimics have been valuable in probing the mechanisms of hybrid vigor and may also prove to be useful hybrid vigor equivalents in agriculture. PMID:26283378

  17. Entropy statistics and information theory

    NARCIS (Netherlands)

    Frenken, K.; Hanusch, H.; Pyka, A.

    2007-01-01

    Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are suitable to decomposition analysis, which renders the
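
    The decomposition property referred to at the end can be shown in a few lines: the Shannon entropy of a partitioned distribution equals the between-group entropy plus the mass-weighted within-group entropies. The market shares below are invented:

```python
# Sketch of entropy decomposition: H(total) = H(between groups)
# + sum over groups of (group mass * H(within group)).
# Market-share numbers are invented for illustration.
from math import log2

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

shares = {"firmA": 0.3, "firmB": 0.2, "firmC": 0.4, "firmD": 0.1}
groups = {"domestic": ["firmA", "firmB"], "foreign": ["firmC", "firmD"]}

total = H(list(shares.values()))
group_mass = {g: sum(shares[f] for f in fs) for g, fs in groups.items()}
between = H(list(group_mass.values()))
within = sum(
    m * H([shares[f] / m for f in groups[g]])
    for g, m in group_mass.items()
)
print(round(total, 6) == round(between + within, 6))
```

    This exact additivity is what makes entropy statistics convenient for decomposition analysis, e.g. splitting total market variety into between-country and within-country components.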

  18. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
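
    The limit relationship the abstract relies on, Renyi entropy reducing to the Boltzmann-Gibbs (Shannon) entropy as the order approaches 1, is easy to verify numerically:

```python
# Renyi entropy of order alpha, and a numerical check that it approaches
# the Shannon (Boltzmann-Gibbs) entropy as alpha -> 1.
from math import log

def renyi(p, alpha):
    """Renyi entropy: log(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    return log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon(p):
    return -sum(pi * log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
print(abs(renyi(p, 1.0001) - shannon(p)) < 1e-3)
```

    For the uniform distribution the Renyi entropy is independent of alpha, which is a convenient sanity check on the formula.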

  19. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  20. Computational hybrid anthropometric paediatric phantom library for internal radiation dosimetry

    Science.gov (United States)

    Xie, Tianwu; Kuster, Niels; Zaidi, Habib

    2017-04-01

    Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT’IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for the newborn, 1, 2, 5, 10 and 15 years-old children, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.

  1. ADAPTING HYBRID MACHINE TRANSLATION TECHNIQUES FOR CROSS-LANGUAGE TEXT RETRIEVAL SYSTEM

    Directory of Open Access Journals (Sweden)

    P. ISWARYA

    2017-03-01

    This research work aims at developing a Tamil to English cross-language text retrieval system using a hybrid machine translation approach. The hybrid machine translation system is a combination of rule-based and statistical approaches. In an existing word-by-word translation system there are many issues, among them ambiguity, out-of-vocabulary words, word inflections, and improper sentence structure. To handle these issues, the proposed architecture is designed in such a way that it contains an improved part-of-speech tagger, a machine-learning-based morphological analyser, a collocation-based word sense disambiguation procedure, a semantic dictionary, tense markers with gerund ending rules, and a two-pass transliteration algorithm. From the experimental results it is clear that the proposed Tamil query based translation system achieves significantly better translation quality than the existing system, reaching 95.88% of monolingual performance.
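
    The hybrid idea, deterministic rules first with a statistical fallback, can be sketched as a toy pipeline. The tokens below are placeholders, not real Tamil data or the paper's actual components:

```python
# Toy sketch of a rule-based + statistical hybrid: try the rule
# dictionary first, else take the most probable phrase-table candidate,
# else pass the token through (where transliteration would apply).
# All entries are invented placeholders.
RULES = {"<greeting>": "hello"}                          # rule-based part
PHRASE_TABLE = {"<w1>": [("world", 0.8), ("earth", 0.2)]}  # statistical part

def translate(token):
    if token in RULES:                  # deterministic rule wins
        return RULES[token]
    if token in PHRASE_TABLE:           # else best statistical candidate
        return max(PHRASE_TABLE[token], key=lambda c: c[1])[0]
    return token                        # OOV: pass through / transliterate

print(" ".join(translate(t) for t in ["<greeting>", "<w1>"]))
```

    The real system layers POS tagging, morphology, and word sense disambiguation before this choice, but the fallback ordering captures what makes the approach "hybrid".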

  2. Stationary magnetic shear reversal during Lower Hybrid experiments in Tore Supra

    International Nuclear Information System (INIS)

    Litaudon, X.; Arslanbekov, R.; Hoang, G.T.; Joffrin, E.; Kazarian-Vibert, F.; Moreau, D.; Peysson, Y.; Bibet, P.

    1996-01-01

    Stable and stationary states with hollow current density profiles have been achieved with Lower Hybrid Current Drive (LHCD) during Lower Hybrid (LH) wave accessibility experiments. By analysing the bounded propagation domain in phase space which naturally limits the central penetration and absorption of the waves, off-axis LH power deposition has been realized in a reproducible manner. The resulting current density profile modifications have led to a global confinement enhancement attributed to the formation of an internal 'transport barrier' in the central reversed shear region where the electron thermal diffusivity is reduced to its neoclassical collisional level. The multiple-pass LH wave propagation in the weak Landau damping and reversed magnetic shear regime is also investigated in the framework of a statistical theory and the experimental validation of this theory is discussed. (author)

  3. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision, and of statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before the statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs.
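The control-chart idea described above can be made concrete with a minimal sketch. The baseline bias measurements, the three-sigma rule, and the inspected values below are all invented for illustration, not taken from the report:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Shewhart-style control limits from baseline measurements.

    Returns (lower, center, upper) as mean +/- k standard deviations.
    """
    center = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return center - k * sd, center, center + k * sd

def out_of_control(values, limits):
    """Indices of values falling outside the control limits."""
    lo, _, hi = limits
    return [i for i, v in enumerate(values) if v < lo or v > hi]

# Hypothetical bias measurements from a period of known good performance.
baseline = [0.1, -0.2, 0.05, 0.0, -0.1, 0.15, -0.05, 0.1]
limits = control_limits(baseline)
flagged = out_of_control([0.02, 1.5, -0.03], limits)
```

A visual chart would plot the values against the horizontal limit lines; the statistical test here simply automates the same out-of-limits check.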

  4. Truths, lies, and statistics.

    Science.gov (United States)

    Thiese, Matthew S; Walker, Skyler; Lindsey, Jenna

    2017-10-01

    Distribution of valuable research discoveries is needed for the continual advancement of patient care. Publication of, and subsequent reliance on, false study results would be detrimental to patient care. Unfortunately, research misconduct may originate from many sources. While there is evidence of ongoing research misconduct in all its forms, it is challenging to identify its actual occurrence, especially in clinical trials. Research misconduct is difficult to measure, and there are few studies reporting the prevalence or underlying causes of research misconduct among biomedical researchers. Reported prevalence estimates, probably underestimates, range from 0.3% to 4.9%. There have been efforts to measure the prevalence of research misconduct; however, the relatively few published studies are not freely comparable because of varying characterizations of research misconduct and the methods used for data collection. There are some signs that may point to an increased possibility of research misconduct, but there remains a need for continued self-policing by biomedical researchers. Existing resources assist in ensuring appropriate statistical methods and preventing other types of research fraud. These include the "Statistical Analyses and Methods in the Published Literature" (SAMPL) guidelines, which help scientists determine the appropriate way to report various statistical methods; the "Strengthening Analytical Thinking for Observational Studies" (STRATOS) initiative, which emphasizes the execution and interpretation of results; and the Committee on Publication Ethics (COPE), created in 1997 to deliver guidance on publication ethics. COPE has a sequence of views and strategies grounded in the values of honesty and accuracy.

  5. Isotopic safeguards statistics

    International Nuclear Information System (INIS)

    Timmerman, C.L.; Stewart, K.B.

    1978-06-01

    The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are paired comparison and regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories; paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model (y − y₀ = β(x − x₀)). The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models: (1) the usual least-squares approach, in which x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and considers both x and y to be measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated is the ratio Pu/U versus ²³⁵D, in which ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent. Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach, by simple comparison of the given regression variances with the random variance from the paired comparison results.
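The two fitting methods can be contrasted in a short sketch. The data points are invented, and `delta` (the ratio of the y-error variance to the x-error variance, which in the safeguards setting would come from the paired-comparison variance estimates) is assumed known:

```python
import math

def ols_fit(x, y):
    """Usual least squares for the intercept model y = a + b*x (x error-free)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return my - b * mx, b

def deming_fit(x, y, delta=1.0):
    """Deming's approach: both x and y carry measurement error.

    delta = var(y errors) / var(x errors), assumed known here; the paper
    estimates such variances from paired-comparison results.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    b = (syy - delta * sxx
         + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return my - b * mx, b

# Hypothetical (exposure, isotopic-function) pairs lying near y = 2x.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
a_ols, b_ols = ols_fit(x, y)
a_dem, b_dem = deming_fit(x, y)
```

Because measurement error in x attenuates the least-squares slope, the Deming slope comes out slightly steeper on the same data.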

  6. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  7. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  8. Hybrid mesons with auxiliary fields

    International Nuclear Information System (INIS)

    Buisseret, F.; Mathieu, V.

    2006-01-01

    Hybrid mesons are exotic mesons in which the color field is not in the ground state. Their understanding deserves interest from a theoretical point of view, because it is intimately related to nonperturbative aspects of QCD. Moreover, it seems that some recently detected particles, such as the π₁(1600) and the Y(4260), are serious hybrid candidates. In this work, we investigate the description of such exotic hadrons by applying the auxiliary fields technique (also known as the einbein field method) to the widely used spinless Salpeter Hamiltonian with an appropriate linear confinement. Instead of the usual numerical resolution, this technique allows one to find simplified analytical mass spectra and wave functions of the Hamiltonian, which still lead to reliable qualitative predictions. We analyse and compare two different descriptions of hybrid mesons, namely a two-body qq̄ system with an excited flux tube and a three-body qq̄g system. We also compute the masses of the 1⁻⁺ hybrids. Our results are shown to be in satisfactory agreement with lattice QCD and other effective models. (orig.)

  9. Remote-sensing image encryption in hybrid domains

    Science.gov (United States)

    Zhang, Xiaoqiang; Zhu, Guiliang; Ma, Shilong

    2012-04-01

    Remote-sensing technology plays an important role in military and industrial fields. Remote-sensing images are the main means of acquiring information from satellites and often contain confidential information. To securely transmit and store remote-sensing images, we propose a new image encryption algorithm in hybrid domains. This algorithm makes full use of the advantages of image encryption in both the spatial domain and the transform domain. First, the low-pass subband coefficients of the image's DWT (discrete wavelet transform) decomposition are sorted by a PWLCM (piecewise linear chaotic map) system in the transform domain. Second, the image after IDWT (inverse discrete wavelet transform) reconstruction is diffused with a 2D (two-dimensional) Logistic map and an XOR operation in the spatial domain. The experimental results and algorithm analyses show that the new algorithm possesses a large key space and can resist brute-force, statistical and differential attacks. Meanwhile, the proposed algorithm has the encryption efficiency required in practice.
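The spatial-domain diffusion step can be illustrated with a deliberately simplified sketch: a 1D logistic map stands in for the paper's 2D Logistic map and PWLCM system, and the key parameters (`x0`, `r`) are invented for illustration:

```python
def logistic_keystream(x0, r, n):
    """n pseudo-random bytes from iterating the logistic map x -> r*x*(1-x)."""
    x, stream = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_diffuse(data, x0=0.3456, r=3.99):
    """XOR every byte with the chaotic keystream; applying it twice decrypts."""
    keystream = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, keystream))

plain = bytes(range(16))       # stand-in for a row of image pixels
cipher = xor_diffuse(plain)
```

The key-sensitivity of the chaotic map is what makes brute-force attacks expensive: a tiny change in `x0` or `r` yields an entirely different keystream.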

  10. Hybrid Management in Hospitals

    DEFF Research Database (Denmark)

    Byrkjeflot, Haldor; Jespersen, Peter Kragh

    2010-01-01

    The article contains a literature-based study of forms of management in hospitals, in which clinical (health-professional) management and general management are mixed into hybrid forms of management.

  11. Hydraulic Hybrid Vehicles

    Science.gov (United States)

    EPA and the United Parcel Service (UPS) have developed a hydraulic hybrid delivery vehicle to explore and demonstrate the environmental benefits of the hydraulic hybrid for urban pick-up and delivery fleets.

  12. Mesoscale hybrid calibration artifact

    Science.gov (United States)

    Tran, Hy D.; Claudet, Andre A.; Oliver, Andrew D.

    2010-09-07

    A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and the method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.

  13. Scalar field dark matter in hybrid approach

    NARCIS (Netherlands)

    Friedrich, Pavel; Prokopec, Tomislav

    2017-01-01

    We develop a hybrid formalism suitable for modeling scalar field dark matter, in which the phase-space distribution associated with the real scalar field is modeled by statistical equal-time two-point functions and gravity is treated by two stochastic gravitational fields in the longitudinal gauge (in

  14. Stochastic hybrid systems with renewal transitions

    NARCIS (Netherlands)

    Guerreiro Tome Antunes, D.J.; Hespanha, J.P.; Silvestre, C.J.

    2010-01-01

    We consider Stochastic Hybrid Systems (SHSs) for which the lengths of times that the system stays in each mode are independent random variables with given distributions. We propose an analysis framework based on a set of Volterra renewal-type equations, which allows us to compute any statistical
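The renewal assumption — the dwell time in each mode is an independent draw from that mode's distribution — can be sketched by simulation. The two-mode system and its distributions below are invented for illustration; the paper's framework computes such statistics analytically via Volterra renewal-type equations rather than by Monte Carlo:

```python
import random

def mode_fractions(holding, t_end, start=0, seed=42):
    """Fraction of time spent in each of two alternating modes when the
    dwell time in each mode is an independent draw from that mode's
    distribution (the renewal-transition assumption)."""
    rng = random.Random(seed)
    t, mode, occupancy = 0.0, start, [0.0, 0.0]
    while t < t_end:
        dwell = holding[mode](rng)
        occupancy[mode] += min(dwell, t_end - t)  # clip the final dwell
        t += dwell
        mode = 1 - mode                           # alternate between modes
    return [o / t_end for o in occupancy]

holding = [lambda r: r.expovariate(2.0),     # mode 0: exponential, mean 0.5
           lambda r: r.uniform(0.5, 1.5)]    # mode 1: uniform, mean 1.0
frac = mode_fractions(holding, t_end=10_000.0)
```

Renewal theory predicts long-run occupancy proportional to the mean dwell times, here roughly 1/3 and 2/3, which the simulation reproduces.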

  15. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  16. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  17. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  18. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  19. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  20. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  1. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  2. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  3. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  4. Hybrid quantum information processing

    Energy Technology Data Exchange (ETDEWEB)

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

    2014-12-04

    I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  5. Multiple Intelligences in Online, Hybrid, and Traditional Business Statistics Courses

    Science.gov (United States)

    Lopez, Salvador; Patron, Hilde

    2012-01-01

    According to Howard Gardner, Professor of Cognition and Education at Harvard University, intelligence of humans cannot be measured with a single factor such as the IQ level. Instead, he and others have suggested that humans have different types of intelligence. This paper examines whether students registered in online or mostly online courses have…

  6. Statistics for NAEG: past efforts, new results, and future plans

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.; Engel, D.W.

    1983-06-01

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of the spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for the new analyses presented in this paper. Suggested NAEG activities and the statistical analyses needed before the projected termination date of NAEG studies in March 1986 are given.

  7. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  8. Usage statistics and demonstrator services

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    An understanding of the use of repositories and their contents is clearly desirable for authors and repository managers alike, as well as for those who are analysing the state of scholarly communications. A number of individual initiatives have produced statistics of various kinds for individual repositories, but the real challenge is to produce statistics that can be collected and compared transparently on a global scale. This presentation details the steps to be taken to address these issues and attain this capability.

  9. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  10. Hybrid fuel cells technologies for electrical microgrids

    Energy Technology Data Exchange (ETDEWEB)

    San Martin, Jose Ignacio; Zamora, Inmaculada; San Martin, Jose Javier; Aperribay, Victor; Eguia, Pablo [Department of Electrical Engineering, University of the Basque Country, Alda. de Urquijo, s/n, 48013 Bilbao (Spain)

    2010-09-15

    Hybrid systems are characterized by containing two or more electrical generation technologies, in order to optimize the global efficiency of the processes involved. These systems can present different operating modes. Besides, they take into account aspects that concern not only the electrical and thermal efficiencies but also the reduction of pollutant emissions. There is a wide range of possible configurations for hybrid systems, including hydrogen, renewable energies, and gas and vapour cycles. Nowadays, these technologies are mainly used for energy production in electrical microgrids. Some examples of these technologies are hybridization of fuel cells with wind turbines and photovoltaic plants, and cogeneration and trigeneration processes configured with fuel cell technologies. This paper reviews and analyses the main characteristics of electrical microgrids and of systems based on fuel cells for polygeneration and hybridization processes. (author)

  11. variability of in vitro and phenological behaviours of cocoa hybrids

    African Journals Online (AJOL)

    ACSS

    analyse the variability of the in vitro and phenological behaviours of 6 cocoa ... The 4 aforementioned hybrids could be used to produce cocoa aroma, ... hybrids using a multivariate approach. .... 3 clusters and variables was assessed through ... function, and (iv) analysis of the representation quality. Thus, the number of ...

  12. Treatment of Markup in Statistical Machine Translation

    OpenAIRE

    Müller, Mathias

    2017-01-01

    We present work on handling XML markup in Statistical Machine Translation (SMT). The methods we propose can be used to effectively preserve markup (for instance inline formatting or structure) and to place markup correctly in a machine-translated segment. We evaluate our approaches with parallel data that naturally contains markup or where markup was inserted to create synthetic examples. In our experiments, hybrid reinsertion has proven the most accurate method to handle markup, while alignm...

  13. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  14. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.; Kniss, J.; Riesenfeld, R.; Johnson, C.R.

    2010-01-01

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
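The descriptive statistics that such a summary plot aggregates can be computed in a few lines. This sketch is not the authors' plotting code; it simply derives the five quantities a canonical box plot encodes, using linear interpolation between the closest order statistics for the quartiles:

```python
def five_number_summary(data):
    """Min, quartiles and max: the statistics a canonical box plot encodes."""
    s = sorted(data)
    n = len(s)

    def quantile(q):
        # linear interpolation between the two closest order statistics
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, n - 1)
        return s[lo] + frac * (s[hi] - s[lo])

    return {"min": s[0], "q1": quantile(0.25), "median": quantile(0.5),
            "q3": quantile(0.75), "max": s[-1]}

summary = five_number_summary([1, 2, 3, 4, 5, 6, 7, 8, 9])
```

A hybrid summary plot would layer further descriptive statistics (mean, standard deviation, skew, outliers) onto the same axes to convey uncertainty alongside the data value.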

  15. Hybrid Model of Content Extraction

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah

    2012-01-01

    We present a hybrid model for content extraction from HTML documents. The model operates on the Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and associated statistical features, like link density and text distribution across the node, to predict the significance of the node towards the overall content provided by the document. Once the significance of the nodes is determined, formatting characteristics like fonts, styles and the position of the nodes are evaluated to identify nodes with formatting similar to that of the significant nodes. The proposed...
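The link-density feature mentioned above is easy to state concretely. The thresholds and the toy node records in this sketch are invented for illustration, not taken from the paper:

```python
def link_density(text_chars, link_chars):
    """Fraction of a node's text that sits inside anchor tags."""
    return link_chars / text_chars if text_chars else 1.0

def significant(node, max_density=0.33, min_text=25):
    """Heuristic: plenty of text with few links suggests main content."""
    return (node["text_chars"] >= min_text
            and link_density(node["text_chars"], node["link_chars"]) <= max_density)

# Toy stand-ins for DOM nodes: a link-heavy menu and an article body.
nodes = [
    {"id": "nav",     "text_chars": 120, "link_chars": 110},
    {"id": "article", "text_chars": 900, "link_chars": 40},
]
content_nodes = [n["id"] for n in nodes if significant(n)]
```

In the hybrid model, nodes passing such statistical tests would then seed the second, formatting-based pass that looks for similarly styled nodes.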

  16. Practical Statistics for Environmental and Biological Scientists

    CERN Document Server

    Townend, John

    2012-01-01

    All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific comp

  17. Comparison of Fluorescence In Situ Hybridization and Chromogenic In Situ Hybridization for Low and High Throughput HER2 Genetic Testing

    DEFF Research Database (Denmark)

    Poulsen, Tim S; Espersen, Maiken Lise Marcker; Kofoed, Vibeke

    2013-01-01

    cancer patients with HER2 immunohistochemistry (IHC) results scored as 0/1+, 2+, and 3+. HER2 genetic status was analysed using chromogenic in situ hybridization (CISH) and fluorescence in situ hybridization (FISH). Scoring results were documented through digital image analysis. The cancer region...

  18. Statistics in Schools

    Science.gov (United States)

    The Statistics in Schools program educates students about the value and everyday use of statistics, providing resources for teaching and learning with real-life data. The site offers standards-aligned, classroom-ready activities, including math and history activities.

  19. Transport Statistics - Transport - UNECE

    Science.gov (United States)


  20. Marine Fish Hybridization

    KAUST Repository

    He, Song

    2017-04-01

    Natural hybridization is reproduction (without artificial influence) between two or more species or populations that are distinguishable from each other by heritable characters. Natural hybridization among marine fishes has been highly underappreciated owing to limited research effort; the phenomenon seems to occur more often than is commonly recognized. As hybridization plays an important role in biodiversity processes in the marine environment, detecting and investigating hybridization events is important for understanding and protecting biodiversity. The first chapter sets the framework for this dissertation. The Cohesion Species Concept was selected as the working definition of a species because it can handle marine fish hybridization events; the concept does not require restrictive species boundaries. A general history and background of natural hybridization in marine fishes is also reviewed in this chapter. Four cases of marine fish hybridization are examined and documented in Chapters 2 to 5. In each case, at least one diagnostic nuclear marker, screened from among ~14 candidate markers, was found to discriminate the putative hybridizing parent species. To further support the hybrid status of each putative hybrid offspring, haploweb analysis of diagnostic markers (nuclear and/or mitochondrial) and DAPC/PCA analysis of microsatellite data were used. By combining the genetic evidence, morphological traits and ecological observations, the potential triggers of each hybridization event and the potential genetic and ecological effects are discussed. In the last chapter, sequences from 82 pairs of hybridizing parent species (for which COI barcoding sequences were available either on GenBank or in our lab) were collected. By comparing the COI fragment p-distance between each pair of hybridizing parent species, some general questions about marine fish hybridization are discussed: Is

  1. Modern applied statistics with S-plus

    CERN Document Server

    Venables, W N

    1994-01-01

    S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.

  2. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  3. Statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.

    1975-10-01

    This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968 and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low-levels of radioactivity is discussed. Applications of computer calculations for data processing are described

  4. Economic investigations of short rotation intensively cultured hybrid poplars

    Science.gov (United States)

    David C. Lothner

    1983-01-01

    The history of economic analyses of short-rotation, intensively cultured hybrid poplar at the North Central Forest Experiment Station is summarized. Early break-even analyses with limited data indicated that at a price of $25-30 per dry ton for fiber and low to medium production costs, several systems looked profitable. Later cash flow analyses indicated that two...

  5. Statistical Literacy in the Data Science Workplace

    Science.gov (United States)

    Grant, Robert

    2017-01-01

    Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…

  6. Propagation of singularities for linearised hybrid data impedance tomography

    DEFF Research Database (Denmark)

    Bal, Guillaume; Hoffmann, Kristoffer; Knudsen, Kim

    2017-01-01

    For a general formulation of linearised hybrid inverse problems in impedance tomography, the qualitative properties of the solutions are analysed. Using an appropriate scalar pseudo-differential formulation, the problems are shown to permit propagating singularities under certain non-elliptic con...

  7. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India.

  8. Simple Y-Autosomal Incompatibilities Cause Hybrid Male Sterility in Reciprocal Crosses Between Drosophila virilis and D. americana

    OpenAIRE

    Sweigart, Andrea L.

    2010-01-01

    Postzygotic reproductive isolation evolves when hybrid incompatibilities accumulate between diverging populations. Here, I examine the genetic basis of hybrid male sterility between two species of Drosophila, Drosophila virilis and D. americana. From these analyses, I reach several conclusions. First, neither species carries any autosomal dominant hybrid male sterility alleles: reciprocal F1 hybrid males are perfectly fertile. Second, later generation (backcross and F2) hybrid male sterility ...

  9. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics Stymied by statistics? No fear ? this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou

  10. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores

  11. Investigation of γ Dor - δ Sct hybrid stars based on high precision space photometry and complementary ground-based spectroscopy

    International Nuclear Information System (INIS)

    Hareter, M.

    2013-01-01

    Stellar pulsation carries information on the physical conditions within the star. While pressure modes (p modes) probe the outer layers of a star, gravity modes (g modes) penetrate deep into the radiative zone and thus carry valuable information on the physical conditions there. Apart from white dwarfs and slowly pulsating B (SPB) stars, gamma Dor stars are stars that pulsate in such modes. Therefore, these stars are important test benches for stellar evolution and pulsation theory. delta Sct - gamma Dor hybrids are stars that pulsate with g modes like the gamma Dor stars but also with p modes as the delta Sct stars do. This makes them even more suited for asteroseismology. The CoRoT long runs offer a great opportunity to analyse a large sample of stars observed homogeneously and without interruption over a long time base of about 150 days, which is practically unachievable with ground-based observation. Since space missions avoid the scintillation caused by the Earth's atmosphere, they allow stellar oscillations to be detected at a sub-millimagnitude level even for stars as faint as 15th magnitude. The photometric data are supplemented by AAOmega classification spectroscopy, allowing effective temperatures and surface gravities to be determined. With these data, a statistical approach was adopted to describe the pulsation behaviour of gamma Dor and delta Sct - gamma Dor hybrid stars. A temperature - period relation was found for gamma Dor and delta Sct stars, but none for delta Sct - gamma Dor hybrid stars, when considering their strongest g mode or p mode, respectively. The instability domain of hybrid stars is equal to that of delta Sct stars and is not confined to the overlapping region of the delta Sct and gamma Dor instability strips in the Hertzsprung-Russell diagram. Hybrid stars behave differently in the g mode regime than gamma Dor stars, which poses a serious question on how to properly define a delta Sct - gamma Dor hybrid. The convective flux blocking mechanism is supposed to work for stars

  12. New lager yeast strains generated by interspecific hybridization.

    Science.gov (United States)

    Krogerus, Kristoffer; Magalhães, Frederico; Vidgren, Virve; Gibson, Brian

    2015-05-01

    The interspecific hybrid Saccharomyces pastorianus is the most commonly used yeast in brewery fermentations worldwide. Here, we generated de novo lager yeast hybrids by mating a domesticated and strongly flocculent Saccharomyces cerevisiae ale strain with the Saccharomyces eubayanus type strain. The hybrids were characterized with respect to the parent strains in a wort fermentation performed at temperatures typical for lager brewing (12 °C). The resulting beers were analysed for sugar and aroma compounds, while the yeasts were tested for their flocculation ability and α-glucoside transport capability. These hybrids inherited beneficial properties from both parent strains (cryotolerance, maltotriose utilization and strong flocculation) and showed apparent hybrid vigour, fermenting faster and producing beer with higher alcohol content (5.6 vs 4.5 % ABV) than the parents. Results suggest that interspecific hybridization is suitable for production of novel non-GM lager yeast strains with unique properties and will help in elucidating the evolutionary history of industrial lager yeast.

  13. First-Generation Transgenic Plants and Statistics

    NARCIS (Netherlands)

    Nap, Jan-Peter; Keizer, Paul; Jansen, Ritsert

    1993-01-01

    The statistical analyses of populations of first-generation transgenic plants are commonly based on mean and variance and generally require a test of normality. Since in many cases the assumptions of normality are not met, analyses can result in erroneous conclusions. Transformation of data to
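    The issue the abstract raises can be sketched in a few lines (hypothetical values, not the paper's data): strongly right-skewed measurements violate the approximate symmetry that mean/variance analyses assume, and a log transformation pulls the sample skewness back toward zero.

    ```python
    # Minimal sketch of skewness before and after a log transformation.
    # The data below are made up to be strongly right-skewed.
    import math
    import statistics

    def skewness(xs):
        """Sample skewness: mean of the cubed z-scores."""
        m, s = statistics.mean(xs), statistics.pstdev(xs)
        return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

    raw = [1, 2, 2, 3, 4, 5, 8, 15, 40, 120]   # strongly right-skewed
    logged = [math.log(x) for x in raw]         # log transformation

    print(round(skewness(raw), 2))     # strong positive skew
    print(round(skewness(logged), 2))  # considerably reduced skew
    ```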

  14. Henkin and Hybrid Logic

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Huertas, Antonia; Manzano, Maria

    2014-01-01

    Leon Henkin was not a modal logician, but there is a branch of modal logic that has been deeply influenced by his work. That branch is hybrid logic, a family of logics that extend orthodox modal logic with special proposition symbols (called nominals) that name worlds. This paper explains why Henkin's techniques are so important in hybrid logic. We do so by proving a completeness result for a hybrid type theory called HTT, probably the strongest hybrid logic that has yet been explored. Our completeness result builds on earlier work with a system called BHTT, or basic hybrid type theory ... is due to the first-order perspective, which lies at the heart of Henkin's best known work and hybrid logic.

  15. Hybrid rail gun electromagnetic accelerators

    International Nuclear Information System (INIS)

    Chen, K.W.; Hachen, H.; Lee, A.; Legh, G.; Lin, T.; Mattay, S.; Wipf, S.

    1983-01-01

    Theoretical and experimental investigations of hybrid rail accelerators are presented. It is shown that the side surface areas, and in some cases the sabots, of the projectile can be used to provide a substantial amount of additional thrust. Moreover, it is shown that in most cases examined, external magnetic fields can be conveniently incorporated in the accelerator designs to supplement the rail-induced fields. Total thrusts in excess of 10 MN for kilogram-sized projectiles can in principle be established with driving currents of the order of 1 MA. No obvious stress limitations are foreseen. The percentages of thrust from external magnetic fields are sufficiently high that their use should be encouraged. The increased flexibility in the available projectile shapes permits the use of the proposed hybrid electromagnetic launcher technology in a variety of new areas, such as thrust boosts for conventional chemical rockets and other similar applications. Furthermore, the additional thrust obtained from the use of side surface areas greatly increases the maximum permissible thrust otherwise limited by material strength considerations. Thrust analyses for projectiles in several hybrid rail accelerator designs are discussed. Some laboratory experimental observations are presented.

  16. 47 CFR 1.363 - Introduction of statistical data.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall be...

  17. Hybrid Action Systems

    DEFF Research Database (Denmark)

    Ronkko, Mauno; Ravn, Anders P.

    1997-01-01

    ... a differential action, which allows differential equations as primitive actions. The extension allows us to model hybrid systems with both continuous and discrete behaviour. The main result of this paper is an extension of such a hybrid action system with parallel composition. The extension does not change the original meaning of the parallel composition, and therefore ordinary action systems can also be composed in parallel with hybrid action systems.

  18. Nanoscale Organic Hybrid Electrolytes

    KAUST Repository

    Nugent, Jennifer L.

    2010-08-20

    Nanoscale organic hybrid electrolytes are composed of organic-inorganic hybrid nanostructures, each with a metal oxide or metallic nanoparticle core densely grafted with an ion-conducting polyethylene glycol corona - doped with lithium salt. These materials form novel solvent-free hybrid electrolytes that are particle-rich, soft glasses at room temperature; yet manifest high ionic conductivity and good electrochemical stability above 5V. © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Nanoscale Organic Hybrid Electrolytes

    KAUST Repository

    Nugent, Jennifer L.; Moganty, Surya S.; Archer, Lynden A.

    2010-01-01

    Nanoscale organic hybrid electrolytes are composed of organic-inorganic hybrid nanostructures, each with a metal oxide or metallic nanoparticle core densely grafted with an ion-conducting polyethylene glycol corona - doped with lithium salt. These materials form novel solvent-free hybrid electrolytes that are particle-rich, soft glasses at room temperature; yet manifest high ionic conductivity and good electrochemical stability above 5V. © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. HYBRID VEHICLE CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    V. Dvadnenko

    2016-06-01

    The hybrid vehicle control system includes a start–stop system for the internal combustion engine. The system works both in hybrid mode and in normal vehicle operation. To simplify the start–stop system, new possibilities of the hybrid car, which appeared after the conversion, were used. Results of the circuit design of the basic blocks of the proposed system are analyzed.

  1. Hybrid radiator cooling system

    Science.gov (United States)

    France, David M.; Smith, David S.; Yu, Wenhua; Routbort, Jules L.

    2016-03-15

    A method and hybrid radiator-cooling apparatus for implementing enhanced radiator-cooling are provided. The hybrid radiator-cooling apparatus includes an air-side finned surface for air cooling; an elongated vertically extending surface extending outwardly from the air-side finned surface on a downstream air-side of the hybrid radiator; and a water supply for selectively providing evaporative cooling with water flow by gravity on the elongated vertically extending surface.

  2. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  3. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  4. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  5. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  6. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  7. Uterine Cancer Statistics

    Science.gov (United States)

    ... Doing AMIGAS Stay Informed Cancer Home Uterine Cancer Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool The Data Visualizations tool makes ...

  8. Tuberculosis Data and Statistics

    Science.gov (United States)

    ... Advisory Groups Federal TB Task Force Data and Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... Set) Mortality and Morbidity Weekly Reports Data and Statistics Decrease in Reported Tuberculosis Cases MMWR 2010; 59 ( ...

  9. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  10. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  11. Mental Illness Statistics

    Science.gov (United States)

    ... News & Events About Us Home > Health Information Share Statistics Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics Mental Illness Any Anxiety Disorder ...

  12. School Violence: Data & Statistics

    Science.gov (United States)

    ... Social Media Publications Injury Center School Violence: Data & Statistics Recommend on Facebook Tweet Share Compartir The first ... Vehicle Safety Traumatic Brain Injury Injury Response Data & Statistics (WISQARS) Funded Programs Press Room Social Media Publications ...

  13. Caregiver Statistics: Demographics

    Science.gov (United States)

    ... You are here Home Selected Long-Term Care Statistics Order this publication Printer-friendly version What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...

  14. Aortic Aneurysm Statistics

    Science.gov (United States)

    ... Summary Coverdell Program 2012-2015 State Summaries Data & Statistics Fact Sheets Heart Disease and Stroke Fact Sheets ... Roadmap for State Planning Other Data Resources Other Statistic Resources Grantee Information Cross-Program Information Online Tools ...

  15. Alcohol Facts and Statistics

    Science.gov (United States)

    ... Standard Drink? Drinking Levels Defined Alcohol Facts and Statistics Print version Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446 National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...

  16. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  17. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  18. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  19. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  20. Toronto hybrid taxi pilot

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, M. [CrossChasm Technologies, Cambridge, ON (Canada); Marans, B. [Toronto Atmospheric Fund, ON (Canada)

    2009-10-15

    This paper provided details of a hybrid taxi pilot program conducted to compare the on-road performance of Toyota Camry hybrid vehicles against conventional vehicles over a 1-year period in order to determine the business case and air emission reductions associated with the use of hybrid taxi cabs. Over 750,000 km worth of fuel consumption was captured from 10 Toyota Camry hybrids, a Toyota Prius, and 5 non-hybrid Camry vehicles over an 18-month period. The average real world fuel consumption for the taxis demonstrated that the Toyota Prius has the lowest cost of ownership, while the non-hybrid Camry has the highest cost of ownership. Carbon dioxide (CO{sub 2}) reductions associated with the 10 Camry hybrid taxis were calculated at 236 tonnes over a 7-year taxi service life. Results suggested that the conversion of Toronto's 5680 taxis would yield annual CO{sub 2} emission reductions of over 19,000 tonnes. All hybrid purchasers identified themselves as highly likely to purchase a hybrid again. 5 tabs., 9 figs.
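    As a back-of-envelope check using only the figures quoted above, the fleet-wide estimate follows from scaling the 10-taxi result:

    ```python
    # Scaling check of the figures quoted above: 236 t CO2 saved by 10 hybrid
    # taxis over a 7-year service life, extrapolated to a 5680-taxi fleet.
    tonnes_saved_10_taxis = 236
    per_taxi_per_year = tonnes_saved_10_taxis / (10 * 7)   # ~3.37 t/taxi/yr
    fleet_annual = per_taxi_per_year * 5680
    print(round(fleet_annual))  # 19150, consistent with "over 19,000 tonnes"
    ```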

  1. Managing hybrid marketing systems.

    Science.gov (United States)

    Moriarty, R T; Moran, U

    1990-01-01

    As competition increases and costs become critical, companies that once went to market only one way are adding new channels and using new methods - creating hybrid marketing systems. These hybrid marketing systems hold the promise of greater coverage and reduced costs. But they are also hard to manage; they inevitably raise questions of conflict and control: conflict because marketing units compete for customers; control because new indirect channels are less subject to management authority. Hard as they are to manage, however, hybrid marketing systems promise to become the dominant design, replacing the "purebred" channel strategy in all kinds of businesses. The trick to managing the hybrid is to analyze tasks and channels within and across a marketing system. A map - the hybrid grid - can help managers make sense of their hybrid system. What the chart reveals is that channels are not the basic building blocks of a marketing system; marketing tasks are. The hybrid grid forces managers to consider various combinations of channels and tasks that will optimize both cost and coverage. Managing conflict is also an important element of a successful hybrid system. Managers should first acknowledge the inevitability of conflict. Then they should move to bound it by creating guidelines that spell out which customers to serve through which methods. Finally, a marketing and sales productivity (MSP) system, consisting of a central marketing database, can act as the central nervous system of a hybrid marketing system, helping managers create customized channels and service for specific customer segments.

  2. Toronto hybrid taxi pilot

    International Nuclear Information System (INIS)

    Stevens, M.; Marans, B.

    2009-10-01

    This paper provided details of a hybrid taxi pilot program conducted to compare the on-road performance of Toyota Camry hybrid vehicles against conventional vehicles over a 1-year period in order to determine the business case and air emission reductions associated with the use of hybrid taxi cabs. Over 750,000 km worth of fuel consumption was captured from 10 Toyota Camry hybrids, a Toyota Prius, and 5 non-hybrid Camry vehicles over an 18-month period. The average real world fuel consumption for the taxis demonstrated that the Toyota Prius has the lowest cost of ownership, while the non-hybrid Camry has the highest cost of ownership. Carbon dioxide (CO 2 ) reductions associated with the 10 Camry hybrid taxis were calculated at 236 tonnes over a 7-year taxi service life. Results suggested that the conversion of Toronto's 5680 taxis would yield annual CO 2 emission reductions of over 19,000 tonnes. All hybrid purchasers identified themselves as highly likely to purchase a hybrid again. 5 tabs., 9 figs.

  3. Hybrid FOSS Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Armstrong researchers are continuing their efforts to further develop FOSS technologies. A hybrid FOSS technique (HyFOSS) employs conventional continuous grating...

  4. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  5. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  6. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  7. Youth Sports Safety Statistics

    Science.gov (United States)

    ... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPRpercent20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...

  8. From hybrid swarms to swarms of hybrids

    Science.gov (United States)

    Stohlgren, Thomas J.; Szalanski, Allen L; Gaskin, John F.; Young, Nicholas E.; West, Amanda; Jarnevich, Catherine S.; Tripodi, Amber

    2014-01-01

    Science has shown that the introgression or hybridization of modern humans (Homo sapiens) with Neanderthals up to 40,000 YBP may have led to the swarm of modern humans on earth. However, there is little doubt that modern trade and transportation in support of humans have continued to introduce additional species, genotypes, and hybrids to every country on the globe. We assessed the utility of species distribution modeling of genotypes to assess the risk of current and future invaders. We evaluated 93 locations of the genus Tamarix for which genetic data were available. Maxent models showed that habitat suitability for the hybrid, T. ramosissima x T. chinensis, was slightly greater than for the parent taxa (AUCs > 0.83). General linear models of Africanized honey bees, a hybrid cross of Tanzanian Apis mellifera scutellata and a variety of European honey bees including A. m. ligustica, showed that the Africanized bees (AUC = 0.81) may be displacing European honey bees (AUC > 0.76) over large areas of the southwestern U.S. More important, sub-populations (A1 and A26 mitotypes based on mtDNA) could be accurately modeled with Maxent (AUC > 0.9), and they responded differently to environmental drivers. This suggests that rapid evolutionary change may be under way in the Africanized bees, allowing them to spread into new areas and extend their total range. Protecting native species and ecosystems may benefit from risk maps of harmful invasive species, hybrids, and genotypes.
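    The AUC values reported above measure how well model scores separate presence sites from background sites. A minimal sketch of that statistic, using its rank (Mann-Whitney) formulation with illustrative scores rather than the study's data:

    ```python
    # AUC as the probability that a randomly chosen presence site receives a
    # higher suitability score than a randomly chosen background site
    # (ties counted as half). Scores below are invented for illustration.
    def auc(scores_pos, scores_neg):
        wins = sum((p > n) + 0.5 * (p == n)
                   for p in scores_pos for n in scores_neg)
        return wins / (len(scores_pos) * len(scores_neg))

    presence = [0.9, 0.8, 0.75, 0.6]   # model scores at occupied sites
    absence  = [0.7, 0.4, 0.3, 0.2]    # scores at background sites
    print(auc(presence, absence))      # 0.9375: strong discrimination
    ```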

  9. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  10. Statistics & probaility for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  11. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  12. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  13. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  14. Multivariate statistical methods a first course

    CERN Document Server

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

  15. The disagreeable behaviour of the kappa statistic.

    Science.gov (United States)

    Flight, Laura; Julious, Steven A

    2015-01-01

    It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
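
    The marginal-sensitivity described here (sometimes called the kappa paradox) is easy to demonstrate. A minimal sketch using NumPy and the standard formulas for observed agreement and Cohen's kappa; the two agreement tables are invented for illustration:

```python
import numpy as np

def agreement_stats(table):
    """Observed agreement and Cohen's kappa for a KxK table of rater counts."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    p_o = np.trace(t) / n                                   # proportion of concordance
    p_e = (t.sum(axis=0) * t.sum(axis=1)).sum() / n ** 2    # chance agreement from marginals
    kappa = (p_o - p_e) / (1.0 - p_e)
    return p_o, kappa

# Same observed agreement (85%), very different marginal totals:
balanced = [[45, 7], [8, 40]]   # raters use both categories evenly
skewed   = [[85, 7], [8, 0]]    # both raters favour the first category

print(agreement_stats(balanced))  # kappa ≈ 0.70
print(agreement_stats(skewed))    # kappa ≈ -0.08
```

    Both tables show 85% raw agreement, yet kappa drops from about 0.70 to below zero when the marginals are skewed, which is why reporting the proportion of concordance alongside kappa is recommended.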

  16. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  17. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  18. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever-increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, building tools for data analysis. Statistical datasets curated by National

  19. Breast cancer statistics, 2011.

    Science.gov (United States)

    DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin

    2011-01-01

    In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population. Copyright © 2011 American Cancer Society, Inc.

  20. Hybridization in geese

    NARCIS (Netherlands)

    Ottenburghs, Jente; Hooft, van Pim; Wieren, van Sipke E.; Ydenberg, Ronald C.; Prins, Herbert H.T.

    2016-01-01

    The high incidence of hybridization in waterfowl (ducks, geese and swans) makes this bird group an excellent study system to answer questions related to the evolution and maintenance of species boundaries. However, knowledge on waterfowl hybridization is biased towards ducks, with a large

  1. Mirror hybrid reactor studies

    International Nuclear Information System (INIS)

    Bender, D.J.

    1978-01-01

    The hybrid reactor studies are reviewed. The optimization of the point design and work on a reference design are described. The status of the nuclear analysis of fast-spectrum blankets, systems studies for the fissile-fuel-producing hybrid reactor, and the mechanical design of the machine are reviewed

  2. Hybrid Universities in Malaysia

    Science.gov (United States)

    Lee, Molly; Wan, Chang Da; Sirat, Morshidi

    2017-01-01

    Are Asian universities different from those in Western countries? Premised on the hypothesis that Asian universities are different because of hybridization between Western academic models and local traditional cultures, this paper investigates the hybrid characteristics in Malaysian universities resulting from interaction between contemporary…

  3. Cardiac hybrid imaging

    Energy Technology Data Exchange (ETDEWEB)

    Gaemperli, Oliver [University Hospital Zurich, Cardiac Imaging, Zurich (Switzerland); University Hospital Zurich, Nuclear Cardiology, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiac Imaging, Zurich (Switzerland); Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland)

    2014-05-15

    Hybrid cardiac single photon emission computed tomography (SPECT)/CT imaging allows combined assessment of anatomical and functional aspects of cardiac disease. In coronary artery disease (CAD), hybrid SPECT/CT imaging allows detection of coronary artery stenosis and myocardial perfusion abnormalities. The clinical value of hybrid imaging has been documented in several subsets of patients. In selected groups of patients, hybrid imaging improves the diagnostic accuracy to detect CAD compared to the single imaging techniques. Additionally, this approach facilitates functional interrogation of coronary stenoses and guidance with regard to revascularization procedures. Moreover, the anatomical information obtained from CT coronary angiography or coronary artery calcium scores (CACS) adds prognostic information over perfusion data from SPECT. The use of cardiac hybrid imaging has been favoured by the dissemination of dedicated hybrid systems and the release of dedicated image fusion software, which allow simple patient throughput for hybrid SPECT/CT studies. Further technological improvements such as more efficient detector technology to allow for low-radiation protocols, ultra-fast image acquisition and improved low-noise image reconstruction algorithms will be instrumental to further promote hybrid SPECT/CT in research and clinical practice. (orig.)

  4. Hybrid job shop scheduling

    NARCIS (Netherlands)

    Schutten, Johannes M.J.

    1995-01-01

    We consider the problem of scheduling jobs in a hybrid job shop. We use the term 'hybrid' to indicate that we consider a lot of extensions of the classic job shop, such as transportation times, multiple resources, and setup times. The Shifting Bottleneck procedure can be generalized to deal with

  5. Hybrid Shipboard Microgrids

    DEFF Research Database (Denmark)

    Othman @ Marzuki, Muzaidi Bin; Anvari-Moghaddam, Amjad; Guerrero, Josep M.

    2017-01-01

    Strict regulation on emissions of air pollutants imposed by the maritime authorities has led to the introduction of hybrid microgrids to shipboard power systems (SPSs), which act toward energy-efficient ships with less pollution. A hybrid energy system can include different means of generation...

  6. Hybrid intelligent engineering systems

    CERN Document Server

    Jain, L C; Adelaide, Australia University of

    1997-01-01

    This book on hybrid intelligent engineering systems is unique, in the sense that it presents the integration of expert systems, neural networks, fuzzy systems, genetic algorithms, and chaos engineering. It shows that these new techniques enhance the capabilities of one another. A number of hybrid systems for solving engineering problems are presented.

  7. Editorial: Hybrid Systems

    DEFF Research Database (Denmark)

    Olderog, Ernst-Rüdiger; Ravn, Anders Peter

    2007-01-01

    An introduction to three papers in a special issue on Hybrid Systems. These papers were first presented at an IFIP WG 2.2 meeting in Skagen in 2005.

  8. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat-affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing the beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  9. Course on hybrid calculation

    International Nuclear Information System (INIS)

    Weill, J.; Tellier; Bonnemay; Craigne; Chareton; Di Falco

    1969-02-01

    After a definition of hybrid calculation (combination of analogue and digital calculation) with a distinction between series and parallel hybrid computing, and a description of a hybrid computer structure and of task sharing between computers, this course proposes a description of hybrid hardware used in Saclay and Cadarache computing centres, and of operations performed by these systems. The next part addresses issues related to programming languages and software. The fourth part describes how a problem is organised for its processing on these computers. Methods of hybrid analysis are then addressed: resolution of optimisation problems, of partial differential equations, and of integral equations by means of different methods (gradient, maximum principle, characteristics, functional approximation, time slicing, Monte Carlo, Neumann iteration, Fischer iteration)

  10. Hybrid functional pseudopotentials

    Science.gov (United States)

    Yang, Jing; Tan, Liang Z.; Rappe, Andrew M.

    2018-02-01

    The consistency between the exchange-correlation functional used in pseudopotential construction and in the actual density functional theory calculation is essential for the accurate prediction of fundamental properties of materials. However, routine hybrid density functional calculations at present still rely on generalized gradient approximation pseudopotentials due to the lack of hybrid functional pseudopotentials. Here, we present a scheme for generating hybrid functional pseudopotentials, and we analyze the importance of pseudopotential density functional consistency for hybrid functionals. For the PBE0 hybrid functional, we benchmark our pseudopotentials for structural parameters and fundamental electronic gaps of the Gaussian-2 (G2) molecular dataset and some simple solids. Our results show that using our PBE0 pseudopotentials in PBE0 calculations improves agreement with respect to all-electron calculations.

  11. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  12. Metabolite variation in hybrid corn grain from a large-scale multisite study

    Directory of Open Access Journals (Sweden)

    Mingjie Chen

    2016-06-01

    Metabolite composition is strongly affected by genotype, environment, and interactions between genotype and environment, although the extent of variation caused by these factors may depend upon the type of metabolite. To characterize the complexity of genotype, environment, and their interaction in hybrid seeds, 50 genetically diverse non-genetically modified (GM) maize hybrids were grown in six geographically diverse locations in North America. Polar metabolites from 553 harvested corn grain samples were isolated and analyzed by gas chromatography–mass spectrometry, and 45 metabolites detected in all samples were used to generate a data matrix for statistical analysis. There was moderate variation among biological replicates and across genotypes and test sites. The genotype effects were detected by univariate and hierarchical clustering analyses (HCA) when environmental effects were excluded. Overall, environment exerted larger effects than genotype, and polar metabolite accumulation showed a geographic effect. We conclude that it is possible to increase seed polar metabolite content in hybrid corn by selection of appropriate inbred lines and growing regions.
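
    Hierarchical clustering of the kind used here to detect group effects can be sketched as follows. The data are synthetic stand-ins (two groups of three samples with a shifted mean playing the role of a site effect), not the study's measurements; the example uses SciPy's agglomerative clustering:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Hypothetical matrix: 6 grain samples x 5 metabolites, with two growing
# sites shifting the mean metabolite levels (site effect dominates)
site_a = rng.normal(0.0, 0.3, size=(3, 5))
site_b = rng.normal(2.0, 0.3, size=(3, 5))
X = np.vstack([site_a, site_b])

Z = linkage(X, method="ward")                    # agglomerative cluster tree
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into two clusters
print(labels)  # samples from the same site land in the same cluster
```

    With the large separation built into this toy data, the two-cluster cut recovers the two sites exactly; on real metabolite matrices the dendrogram structure is what reveals whether genotype or environment dominates.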

  13. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  14. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

    Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n b ,n f ) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  15. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  16. Performances Study of a Hybrid Rocket Engine

    Directory of Open Access Journals (Sweden)

    Adrian-Nicolae BUTURACHE

    2018-06-01

    This paper presents a study analysing the functioning and performance optimization of a hybrid rocket engine based on gaseous oxygen and a polybutadiene polymer (HTPB). Calculations were performed with the NASA CEA software in order to obtain the parameters resulting from the combustion process. Using these parameters, the main parameters of the hybrid rocket engine were optimized. Based on these calculations, an experimental rocket engine producing 100 N of thrust was pre-dimensioned, followed by an optimization of the rocket engine as a function of several parameters. With the geometry and the main parameters of the combustion process established, numerical simulations were performed in the ANSYS CFX commercial software, which allowed visualizing the flow field and the jet expansion. Finally, the analytical calculation was validated through numerical simulations.
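
    Pre-dimensioning a 100 N engine starts from the thrust relation F = ṁ·Isp·g₀. A back-of-the-envelope sketch; the 250 s specific impulse is an assumed, illustrative value for a GOX/HTPB hybrid, not a figure taken from the paper:

```python
# Assumed values for illustration (not from the paper)
G0 = 9.80665          # standard gravity, m/s^2
THRUST = 100.0        # N, the pre-dimensioned engine's target thrust
ISP = 250.0           # s, assumed effective specific impulse for GOX/HTPB

v_e = ISP * G0                # effective exhaust velocity, m/s
mdot = THRUST / v_e           # total propellant mass flow needed, kg/s

print(f"mass flow ≈ {mdot * 1000:.1f} g/s, exhaust velocity ≈ {v_e:.0f} m/s")
```

    The required mass flow (here roughly 41 g/s) is then split between oxidizer and fuel by the optimal mixture ratio from the CEA combustion calculation, which in turn sizes the injector and fuel grain.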

  17. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It was assumed that the pontoon is located in a

  18. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.

  19. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)
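
    The core of such an instrument — every ADC conversion increments one counter in the histogramming RAM — can be mimicked in software. A sketch with simulated 8-bit samples; the Gaussian peak is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated 8-bit ADC output: a Gaussian peak within the 0-255 code range
samples = np.clip(rng.normal(128, 10, size=100_000), 0, 255).astype(int)

# The histogramming RAM: one counter per ADC code, i.e. 256 channels
hist = np.bincount(samples, minlength=256)
# hist.argmax() recovers the peak channel; hist.sum() equals the sample count
```

    A hardware MCA performs the same one-increment-per-conversion operation, but at the 10⁷ s⁻¹ conversion rate quoted in the abstract rather than in a post-hoc batch.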

  20. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  1. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  2. Mineral industry statistics 1975

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Production, consumption, and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite, and other minerals quarried in France in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and Territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)

  3. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

  4. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  5. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  6. Interploidal hybridization and mating patterns in the Sphagnum subsecundum complex.

    Science.gov (United States)

    Ricca, M; Szövényi, P; Temsch, E M; Johnson, M G; Shaw, A J

    2011-08-01

    Polyploidization is thought to result in instant sympatric speciation, but several cases of hybrid zones between one of the parental species and its polyploid derivative have been documented. Previous work showed that diploid Sphagnum lescurii is an allopolyploid derived from the haploids S. lescurii (maternal progenitor) and S. subsecundum (paternal progenitor). Here, we report the results from analyses of a population where allodiploid and haploid S. lescurii co-occur and produce sporophytes. We tested (i) whether haploids and diploids form hybrid triploid sporophytes; (ii) how hybrid and nonhybrid sporophytes compare in fitness; (iii) whether hybrid sporophytes form viable spores; (iv) the ploidy of any viable gametophyte offspring from hybrid sporophytes; (v) the relative viability of sporelings derived from hybrid and nonhybrid sporophytes; and (vi) if interploidal hybridization results in introgression between the allopolyploid and its haploid progenitor. We found that triploid hybrid sporophytes do occur and are larger than nonhybrid sporophytes, but exhibit very low germination percentages and produce sporelings that develop more slowly than those from nonhybrid sporophytes. All sporophytes attached to haploid gametophytes were triploid and were sired by diploid males, but all sporophytes attached to diploid gametophytes were tetraploid. This asymmetric pattern of interploidal hybridization is related to an absence of haploid male gametophytes in the population. Surprisingly, all sporelings from triploid sporophytes were triploid, yet were genetically variable, suggesting some form of aberrant meiosis that warrants further study. There was limited (but some) evidence of introgression between allodiploid and haploid S. lescurii. © 2011 Blackwell Publishing Ltd.

  7. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  8. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  9. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  16. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  17. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  18. Annual Statistical Supplement, 2017

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  19. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  1. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  2. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  3. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  4. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  5. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again for...

  6. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  7. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
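
    The question this record poses is what Monte Carlo simulation answers directly. A minimal stdlib sketch (the exponential population and sample size are invented for illustration, not taken from the report):

```python
# Approximate the sampling distribution of a statistic by repeated simulation.
# Illustrative only: exponential population with mean 1.0, samples of size 30.
import random
import statistics

random.seed(0)

def sampling_distribution(statistic, draw_sample, n_reps=2000):
    """Evaluate `statistic` on many independently drawn samples."""
    return [statistic(draw_sample()) for _ in range(n_reps)]

# Sampling distribution of the sample mean for n = 30 exponential draws.
draws = sampling_distribution(
    statistics.fmean,
    lambda: [random.expovariate(1.0) for _ in range(30)],
)

est_mean = statistics.fmean(draws)  # near the population mean, 1.0
est_sd = statistics.stdev(draws)    # near 1/sqrt(30), about 0.18
```

The empirical spread of `draws` approximates the standard error that the analytic route (characteristic functions, as in the record) derives exactly.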

  8. A Statistical Analysis of Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Stephen Chan

    2017-05-01

    Full Text Available We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization), of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal; however, no single distribution fits all the analyzed cryptocurrencies well. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.
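
    The fitting exercise the abstract describes can be sketched as a likelihood comparison between candidate distributions. The snippet below is illustrative only: simulated data stand in for real exchange-rate returns, and only normal vs. Laplace fits are compared, whereas the paper uses richer families (generalized hyperbolic, normal inverse Gaussian, generalized t).

```python
# Compare normal vs. Laplace maximum-likelihood fits on a heavy-tailed
# "return" series. Data are simulated; parameters are invented.
import math
import random
import statistics

random.seed(1)
# Symmetric heavy-tailed sample (a Laplace draw) standing in for returns.
returns = [random.choice([-1, 1]) * random.expovariate(1 / 0.02)
           for _ in range(5000)]

def normal_loglik(xs):
    mu, sd = statistics.fmean(xs), statistics.pstdev(xs)  # Gaussian MLEs
    return sum(-0.5 * math.log(2 * math.pi * sd ** 2)
               - (x - mu) ** 2 / (2 * sd ** 2) for x in xs)

def laplace_loglik(xs):
    m = statistics.median(xs)                     # MLE of location
    b = statistics.fmean(abs(x - m) for x in xs)  # MLE of scale
    return sum(-math.log(2 * b) - abs(x - m) / b for x in xs)

# Heavier-than-Gaussian tails favor the Laplace fit on log-likelihood.
better = "laplace" if laplace_loglik(returns) > normal_loglik(returns) else "normal"
```

In practice one would compare many candidate families with an information criterion (AIC/BIC), as the paper does, rather than raw log-likelihoods alone.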

  9. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    Science.gov (United States)

    Agronomic and environmental research experiments produce data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data, are therefore subject to error. This error is of three types,...
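
    The decision error the chapter discusses can be checked empirically. A hedged sketch, assuming a two-sided z-test with known variance at alpha = 0.05 under a true null hypothesis, so every rejection is a Type I error (all parameters here are invented):

```python
# Monte Carlo estimate of the Type I error rate of a two-sided z-test.
import math
import random
import statistics

random.seed(2)

def z_test_rejects(sample, mu0=0.0, sigma=1.0, z_crit=1.96):
    """Two-sided z-test with known sigma; True means 'reject H0'."""
    z = (statistics.fmean(sample) - mu0) / (sigma / math.sqrt(len(sample)))
    return abs(z) > z_crit

# H0 is true (data really have mean 0), so rejections are false positives.
reps = 4000
false_rejections = sum(
    z_test_rejects([random.gauss(0.0, 1.0) for _ in range(25)])
    for _ in range(reps)
)
type1_rate = false_rejections / reps  # should land near the nominal 0.05
```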

  10. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
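
    Linear regression, one of the methods the article reviews, can be illustrated with a closed-form ordinary-least-squares fit; the toy data below are invented, not drawn from the article:

```python
# Simple linear regression (OLS) fit in closed form with the stdlib.
from statistics import fmean

def ols(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared errors."""
    mx, my = fmean(xs), fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

xs = [0, 1, 2, 3, 4]
ys = [2, 5, 8, 11, 14]          # exactly y = 2 + 3x
intercept, slope = ols(xs, ys)  # -> (2.0, 3.0)
```

Logistic regression, the other technique discussed, has no closed form and is fit by iterative maximum likelihood, which is why it is usually left to a statistics package.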

  11. ROBUST-HYBRID GENETIC ALGORITHM FOR A FLOW-SHOP SCHEDULING PROBLEM (A Case Study at PT FSCM Manufacturing Indonesia)

    Directory of Open Access Journals (Sweden)

    Johan Soewanda

    2007-01-01

    Full Text Available This paper discusses the application of a Robust Hybrid Genetic Algorithm to solve a flow-shop scheduling problem. The proposed algorithm attempts to minimize makespan. The case of PT FSCM Manufacturing Indonesia Plant 4 was used as a test case to evaluate the performance of the proposed algorithm, which was compared against Ant Colony, Genetic-Tabu, Hybrid Genetic Algorithm, and the company's own algorithm. We found that Robust Hybrid Genetic produces statistically better results than the company's algorithm, but results statistically equivalent to Ant Colony, Genetic-Tabu, and Hybrid Genetic. In addition, Robust Hybrid Genetic Algorithm required less computational time than Hybrid Genetic Algorithm.
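
    The makespan objective that all of these algorithms minimize follows the standard permutation flow-shop completion-time recurrence. A sketch under assumed job data (the processing times below are invented, not from the PT FSCM case):

```python
# Permutation flow-shop makespan via the standard recurrence:
# a job starts on machine m when both the previous job on m and the same
# job on machine m-1 have finished.
import itertools

def makespan(order, proc):
    """proc[j][m] = time of job j on machine m; jobs processed in `order`."""
    n_machines = len(proc[0])
    finish = [0] * n_machines  # completion times on each machine so far
    for j in order:
        for m in range(n_machines):
            prev_machine = finish[m - 1] if m > 0 else 0
            finish[m] = max(finish[m], prev_machine) + proc[j][m]
    return finish[-1]

proc = [[3, 2], [2, 4], [4, 1]]  # 3 jobs on 2 machines (invented times)
best = min(makespan(p, proc) for p in itertools.permutations(range(3)))
```

Exhaustive search over permutations is only feasible for tiny instances, which is why metaheuristics such as genetic algorithms are used for real shops.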

  12. PV-hybrid and mini-grid

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    photovoltaic - fuel cell - direct storage - power supply units; (11) Djungle power - A more remote AC bus; (12) Quality control tools applied to a PV micro-grid in Ecuador; (13) Integral Evaluation of energy supply systems at Mountain Refuges; (14) Hyress project: Study case of Tunisia. Installation, set-up and first results; (15) PV hybrid systems on Mountain Huts: The experience with the project CAI ENERGIA 2000; (16) Process management in zero-emission communities: Adaptation of consumption and production; (17) TRNSYS simulation of a system consisted of PV panels and H{sub 2} production and storage to feed a Remote Telecom Application (HildrosolarH2); (18) Photovoltaic forecasting: A state of the art; (19) PV*SOL 5.0 standalone - Simulation of a stand-alone AC system; (20) Effect of wind speed and solar irradiation on the optimization of a PV wind battery system to supply a Telecommunications station; (21) Polysun: PV, wind and power-heat cogeneration in one design tool; (22) Comparative study between distributed and centralised PV generation in Island power systems under variable weather conditions; (23) Development of a test facility for PV-wind hybrid energy systems; (24) Evaluation of a micro PV-wind hybrid system in Nordic climate conditions; (25) Prismes: The INES micro-grid platform; (26) Estimation of statistical electric energy demand profiles of non electrified regions in Northern Brazil.

  13. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Keywords: Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Subhash Chaturvedi is at the University of Hyderabad; his current research interests include phase space descriptions.
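
    For reference, the central result such derivations arrive at is the standard Fermi–Dirac occupation number, stated here for context:

```latex
% Mean occupation of a single-particle state of energy E, at temperature T,
% with chemical potential \mu and Boltzmann constant k_B:
\bar{n}(E) = \frac{1}{e^{(E-\mu)/k_B T} + 1}, \qquad 0 \le \bar{n}(E) \le 1,
% the upper bound reflecting the Pauli exclusion principle.
```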

  14. Generalized interpolative quantum statistics

    International Nuclear Information System (INIS)

    Ramanathan, R.

    1992-01-01

    A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation, achieved through a Bose-counting strategy, predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one recently discovered by Greenberg.

  15. Handbook of Spatial Statistics

    CERN Document Server

    Gelfand, Alan E

    2010-01-01

    Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point-referenced data); discrete spatial variation, including lattice and areal unit data; and spatial point patterns.

  16. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
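
    The diagnostic-test measures the review introduces (sensitivity, specificity, accuracy, likelihood ratios) all reduce to simple ratios over a 2x2 table of test results against disease status; the counts below are invented for illustration:

```python
# Diagnostic-test measures from a 2x2 table of
# true/false positives (tp, fp) and false/true negatives (fn, tn).
def diagnostic_measures(tp, fp, fn, tn):
    sens = tp / (tp + fn)              # sensitivity: P(test+ | disease)
    spec = tn / (tn + fp)              # specificity: P(test- | no disease)
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)         # positive likelihood ratio
    lr_neg = (1 - sens) / spec         # negative likelihood ratio
    return sens, spec, acc, lr_pos, lr_neg

sens, spec, acc, lr_pos, lr_neg = diagnostic_measures(tp=90, fp=10, fn=10, tn=90)
# sens = spec = acc = 0.9; lr_pos is about 9, lr_neg about 0.11
```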

  17. Statistical tables 2003

    International Nuclear Information System (INIS)

    2003-01-01

    These statistical tables present a selection of statistical data on energy and countries from 1997 to 2002. They cover petroleum, natural gas, coal, and electric power: production, foreign trade, consumption per sector, the 2002 energy balance, and graphs of long-term forecasts. (A.L.B.)

  18. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  19. Practical statistics for educators

    CERN Document Server

    Ravid, Ruth

    2014-01-01

    Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.

  20. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th-century astronomer T.N. Thiele, who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...