Preliminary results of statistical dynamic experiments on a heat exchanger
International Nuclear Information System (INIS)
Corran, E.R.; Cummins, J.D.
1962-10-01
The inherent noise signals present in a heat exchanger have been recorded and analysed in order to determine some of the statistical dynamic characteristics of the heat exchanger. These preliminary results show that the primary side temperature frequency response may be determined by analysing the inherent noise. The secondary side temperature frequency response and cross coupled temperature frequency responses between primary and secondary are poorly determined because of the presence of a non-stationary noise source in the secondary circuit of this heat exchanger. This may be overcome by correlating the dependent variables with an externally applied noise signal. Some preliminary experiments with an externally applied random telegraph type of signal are reported. (author)
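The correlation technique this abstract describes, cross-correlating an externally applied random telegraph signal with the measured response to recover the plant's impulse response (and from it the frequency response), can be illustrated as follows. All numbers here (filter taps, sample count, noise level) are illustrative assumptions, not values from the experiment, and the telegraph signal is made to switch rapidly so that it is approximately white.

```python
import random

def random_telegraph(n, p_switch=0.5, seed=1):
    """Two-level (+1/-1) random telegraph signal; with p_switch = 0.5 the
    samples are independent, so the input is approximately white."""
    rng = random.Random(seed)
    x, s = [], 1.0
    for _ in range(n):
        if rng.random() < p_switch:
            s = -s
        x.append(s)
    return x

def cross_correlation(x, y, max_lag):
    """R_xy(k) = average of x[t] * y[t+k], for k = 0 .. max_lag-1."""
    m = len(x) - max_lag
    return [sum(x[t] * y[t + k] for t in range(m)) / m for k in range(max_lag)]

# Hypothetical plant: a 4-tap FIR impulse response plus measurement noise.
h_true = [0.5, 0.3, 0.15, 0.05]
x = random_telegraph(20000)
rng = random.Random(2)
y = [sum(h_true[k] * x[t - k] for k in range(len(h_true)) if t >= k)
     + 0.05 * rng.gauss(0.0, 1.0) for t in range(len(x))]

# For a white, unit-variance input, R_xy(k) directly estimates h[k].
h_est = cross_correlation(x, y, 6)
```

For a slowly switching telegraph signal the input autocorrelation is no longer a spike, and the estimate would have to be deconvolved against it, which is essentially the complication the abstract reports for the secondary circuit.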
Statistical analyses of extreme food habits
International Nuclear Information System (INIS)
Breuninger, M.; Neuhaeuser-Berthold, M.
2000-01-01
This report summarizes the results of the project "Statistical analyses of extreme food habits", commissioned by the National Office for Radiation Protection as a contribution to the amendment of the "General Administrative Regulation to Paragraph 45 of the Decree on Radiation Protection: determination of radiation exposure due to the emission of radioactive substances from facilities of nuclear technology". Its aim is to determine whether the calculation of the radiation dose ingested via food intake by 95% of the population, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. The existence of this overestimation could be demonstrated, but its magnitude could only be estimated roughly. Identifying its real extent requires the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which relationships between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.)
Applied statistics a handbook of BMDP analyses
Snell, E J
1987-01-01
This handbook is a realization of a long-term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth to the point where it can serve many of the needs of accomplished statisticians, it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics: Principles and Examples by Cox and Snell appeared, we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later, discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...
Surface Properties of TNOs: Preliminary Statistical Analysis
Barucci, M. Antonietta; Fornasier, S.; Alvarez-Candal, A.; de Bergh, C.; Merlin, F.; DeMeo, F.; Dumas, C.
2009-09-01
An overview of the surface properties based on the latest results obtained during the Large Program performed at ESO-VLT (2007-2008) will be presented. Simultaneous high-quality visible and near-infrared spectroscopy and photometry have been carried out on 40 objects with various dynamical properties, using FORS1 (V), ISAAC (J) and SINFONI (H+K bands) mounted respectively at the UT2, UT1 and UT4 VLT-ESO telescopes (Cerro Paranal, Chile). For spectroscopy we computed the spectral slope for each object and searched for possible rotational inhomogeneities. A few objects show features in their visible spectra, such as Eris, whose spectral bands are displaced with respect to those of pure methane ice. We identify new faint absorption features on 10199 Chariklo and 42355 Typhon, possibly due to the presence of aqueously altered materials. The H+K band spectroscopy was performed with the new instrument SINFONI, which is a 3D integral field spectrometer. While some objects show no diagnostic spectral bands, others reveal surface deposits of ices of H2O, CH3OH, CH4, and N2. To investigate the surface properties of these bodies, a radiative transfer model has been applied to interpret the entire 0.4-2.4 micron spectral region. The diversity of the spectra suggests that these objects represent a substantial range of bulk compositions. These different surface compositions can be diagnostic of original compositional diversity, interior sources and/or different evolution with different physical processes affecting the surfaces. A statistical analysis is in progress to investigate the correlation of the TNOs' surface properties with size and dynamical properties.
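The spectral slope mentioned above is conventionally expressed as a percentage of reddening per 100 nm, normalised at a reference wavelength (often 550 nm). That convention and the synthetic spectrum below are assumptions for illustration, not the Large Program's actual reduction pipeline.

```python
def spectral_slope(wavelength_nm, reflectance, ref_nm=550.0):
    """Least-squares slope of a reflectance spectrum, returned in the
    conventional units of % reddening per 100 nm, normalised at ref_nm."""
    n = len(wavelength_nm)
    mx = sum(wavelength_nm) / n
    my = sum(reflectance) / n
    sxx = sum((w - mx) ** 2 for w in wavelength_nm)
    sxy = sum((w - mx) * (r - my) for w, r in zip(wavelength_nm, reflectance))
    slope = sxy / sxx                       # reflectance units per nm
    r_ref = my + slope * (ref_nm - mx)      # fitted reflectance at ref_nm
    return 100.0 * (slope / r_ref) * 100.0  # per cent, per 100 nm

# Synthetic red-sloped spectrum: 5%/100 nm by construction.
lam = [450.0 + 50.0 * i for i in range(9)]
refl = [1.0 + 0.0005 * (w - 550.0) for w in lam]
s = spectral_slope(lam, refl)
```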
The SNS target station preliminary Title I shielding analyses
International Nuclear Information System (INIS)
Johnson, J.O.; Santoro, R.T.; Lillie, R.A.; Barnes, J.M.; McNeilly, G.S.
2000-01-01
The Department of Energy (DOE) has given the Spallation Neutron Source (SNS) project approval to begin Title I design of the proposed facility to be built at Oak Ridge National Laboratory (ORNL). During the conceptual design phase of the SNS project, the target station bulk biological shield was characterized and the activation of the major target station components was calculated. Shielding requirements were assessed with respect to weight, space, and dose-rate constraints for operating, shutdown, and accident conditions utilizing the SNS shield design criteria, DOE Order 5480.25, and requirements specified in 10 CFR 835. Since completion of the conceptual design phase, there have been major design changes to the target station as a result of the initial shielding and activation analyses, modifications brought about by engineering concerns, and feedback from numerous external review committees. These design changes have impacted the results of the conceptual design analyses and consequently have required a re-investigation of the new design. Furthermore, the conceptual design shielding analysis did not address many of the details associated with the engineering design of the target station. In this paper, some of the proposed SNS target station preliminary Title I shielding design analyses will be presented. The SNS facility (with emphasis on the target station), shielding design requirements, calculational strategy, and source terms used in the analyses will be described. Preliminary results and conclusions, along with recommendations for additional analyses, will also be presented. (author)
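A zeroth-order illustration of the trade-off such shielding analyses resolve is the point-kernel attenuation estimate below. The attenuation coefficient, buildup factor, and dose rates are invented for the example; real Title I analyses rely on transport calculations, not this formula.

```python
import math

def slab_thickness_cm(dose_unshielded, dose_target, mu_per_cm, buildup=1.0):
    """Thickness of a homogeneous slab that attenuates a dose rate to a
    target level under the point-kernel model D = B * D0 * exp(-mu * t)."""
    return math.log(buildup * dose_unshielded / dose_target) / mu_per_cm

# Invented numbers: 1e6 mrem/h unshielded, 0.25 mrem/h target,
# effective mu = 0.08 /cm, buildup factor 5.
t_cm = slab_thickness_cm(1.0e6, 0.25, 0.08, buildup=5.0)
```

The roughly two-metre thickness this yields shows why weight and space constraints dominate such designs.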
Preliminary analyses of AP600 using RELAP5
International Nuclear Information System (INIS)
Modro, S.M.; Beelman, R.J.; Fisher, J.E.
1991-01-01
This paper presents results of preliminary analyses of the proposed Westinghouse Electric Corporation AP600 design. AP600 is a two-loop, 600 MW(e) pressurized water reactor (PWR) arranged in a two hot leg, four cold leg nuclear steam supply system (NSSS) configuration. In contrast to the present generation of PWRs, it is equipped with passive emergency core coolant (ECC) systems. Also, the containment and the safety systems of the AP600 interact with the reactor coolant system and with each other in a more integral fashion than in present-day PWRs. The containment in this design is the ultimate heat sink for removal of decay heat to the environment. Idaho National Engineering Laboratory (INEL) has studied the applicability of the RELAP5 code to AP600 safety analysis and has developed a model of the AP600 for the Nuclear Regulatory Commission. The model incorporates integral modeling of the containment, NSSS, and passive safety systems. The best available preliminary design data were used. Nodalization sensitivity studies were conducted to gain experience in modeling systems and conditions that are beyond the applicability of previously established RELAP5 modeling guidelines and experience. Exploratory analyses were then undertaken to investigate AP600 system response under postulated accident conditions. Four small-break LOCA calculations and two large-break LOCA calculations were conducted.
Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA
Thorndahl, S.; Smith, J. A.; Krajewski, W. F.
2012-04-01
During the last two decades, the midwestern states of the United States of America have been largely afflicted by heavy, flood-producing rainfall. Several of these storms seem to have similar hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether it is possible to derive general characteristics of the space-time structures of these heavy storms. This is important in order to understand hydrometeorological features, e.g. how storms evolve and with what frequency we can expect extreme storms to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, high-quality radar observation periods now exceeding two decades, it is possible to perform long-term spatio-temporal statistical analyses of extremes. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures which cause floods. However, statistical frequency analyses of rainfall based on radar observations introduce challenges, centered on converting radar reflectivity observations to "true" rainfall, that do not arise in traditional analyses of rain gauge data. It is, for example, difficult to distinguish the reflectivity of high-intensity rain from that of other hydrometeors such as hail, especially with the single-polarization radars used in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded and anomalous propagation should be corrected in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, Z-R relationships, etc. The present study analyzes radar rainfall observations from 1996 to 2011 based on the American NEXRAD network of radars over an area covering parts of Iowa, Wisconsin, Illinois, and
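One of the listed challenges, choosing a Z-R relationship, reduces in the simplest case to inverting a power law such as the classic Marshall-Palmer form Z = 200 R^1.6. The coefficients below are that textbook default, not values calibrated in this study.

```python
def rain_rate_mm_per_h(dbz, a=200.0, b=1.6):
    """Invert the power law Z = a * R**b, where Z = 10**(dBZ/10) is the
    linear reflectivity (mm^6/m^3) and R the rain rate (mm/h).  In practice
    reflectivities above a hail cap (around 53 dBZ) are truncated first."""
    z_linear = 10.0 ** (dbz / 10.0)
    return (z_linear / a) ** (1.0 / b)
```

By construction, a reflectivity of 10*log10(200) dBZ (about 23 dBZ) maps to 1 mm/h under these coefficients.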
Preliminary drift design analyses for nuclear waste repository in tuff
International Nuclear Information System (INIS)
Hardy, M.P.; Brechtel, C.E.; Goodrich, R.R.; Bauer, S.J.
1990-01-01
The Yucca Mountain Project (YMP) is examining the feasibility of siting a repository for high-level nuclear waste at Yucca Mountain, on and adjacent to the Nevada Test Site (NTS). The proposed repository will be excavated in the Topopah Spring Member, which is a moderately fractured, unsaturated, welded tuff. Excavation stability will be required during construction, waste emplacement, retrieval (if required), and closure to ensure worker safety. The subsurface excavations will be subject to stress changes resulting from thermal expansion of the rock mass and seismic events associated with regional tectonic activity and underground nuclear explosions (UNEs). Analyses of drift stability are required to assess the acceptable waste emplacement density, to design the drift shapes and ground support systems, and to establish schedules and cost of construction. This paper outlines the proposed methodology to assess drift stability and then focuses on an example of its application to the YMP repository drifts based on preliminary site data. Because site characterization activities have not begun, the database currently lacks the extensive site-specific field and laboratory data needed to form conclusions as to the final ground support requirements. This drift design methodology will be applied and refined as more site-specific data are generated and as analytical techniques and methodologies are verified during the site characterization process
Statistical analyses of conserved features of genomic islands in bacteria.
Guo, F-B; Xia, Z-K; Wei, W; Zhao, H-L
2014-03-17
We performed statistical analyses of five conserved features of genomic islands in bacteria. Analyses were based on 104 known genomic islands, which were identified by comparative methods. Four of these features, which are frequently investigated, are sequence size, abnormal G+C content, flanking tRNA genes, and embedded mobility genes. One relatively new feature, G+C homogeneity, was also investigated. Among the 104 known genomic islands, 88.5% were found to fall within the typical length of 10-200 kb and 80.8% had G+C deviations with absolute values larger than 2%. For the 88 genomic islands whose hosts have been sequenced and annotated, 52.3% were found to have flanking tRNA genes and 64.7% had embedded mobility genes. For the homogeneity feature, 85% had an h homogeneity index less than 0.1, indicating that their G+C content is relatively uniform. Taking all five features into account, 87.5% of the 88 genomic islands had three of them. Only one genomic island had just one conserved feature, and none had zero features. These statistical results should help in understanding the general structure of known genomic islands. We found that larger genomic islands tend to have relatively small absolute G+C deviations. For example, the absolute G+C deviations of 9 genomic islands longer than 100,000 bp were all less than 5%. This is a novel but reasonable result, given that larger genomic islands should face greater restrictions on their G+C contents in order to maintain the stable G+C content of the recipient genome.
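Two of the five conserved features (size and absolute G+C deviation) are easy to compute directly from sequence; a minimal sketch follows. The h homogeneity index is not defined in the abstract, so it is not reproduced here, and the 2% threshold is expressed as a fraction.

```python
def gc_content(seq):
    """G+C fraction of a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def gc_deviation(island_seq, host_seq):
    """Absolute difference in G+C fraction between island and host genome."""
    return abs(gc_content(island_seq) - gc_content(host_seq))

def has_typical_size_and_deviation(length_bp, deviation):
    """Two of the conserved features reported: 10-200 kb size and an
    absolute G+C deviation larger than 2 percentage points."""
    return 10_000 <= length_bp <= 200_000 and deviation > 0.02
```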
Statistical reliability analyses of two wood plastic composite extrusion processes
International Nuclear Information System (INIS)
Crookston, Kevin A.; Young, Timothy M.; Harper, David; Guess, Frank M.
2011-01-01
Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
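The Kaplan-Meier survival curves compared in the paper are produced by the standard product-limit estimator; a self-contained sketch on invented data (not the WPC measurements) follows.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.  events[i] is 1 for
    an observed failure at times[i], 0 for a right-censored observation.
    Returns the (time, S(t)) pairs at which the curve steps down."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # failures at time t
        m = sum(1 for tt, _ in data if tt == t)   # all leaving the risk set
        if d:
            s *= (n_at_risk - d) / n_at_risk
            curve.append((t, s))
        n_at_risk -= m
        i += m
    return curve

# Invented mini-sample: failures at 1, 2 and 4, one unit censored at 3.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

Curves built this way for two extrusion lines can then be compared fractile by fractile, as the abstract describes for MOE and MOR.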
Methodology development for statistical evaluation of reactor safety analyses
International Nuclear Information System (INIS)
Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.
1976-07-01
In February 1975, Westinghouse Electric Corporation, under contract to the Electric Power Research Institute, started a one-year program to develop methodology for the statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors, and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given by a closed-form relationship of the input variables, and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of this report is to document the results of the investigations completed under these tasks, giving the rationale for the choices of techniques and problems, and to present interim conclusions.
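The Monte Carlo approach investigated in the first task, sampling uncertain inputs and pushing them through a closed-form output relationship, can be sketched as follows. The two-input product model and its uncertainties are placeholders, not the actual thermal-hydraulic relationship studied.

```python
import random

def propagate(model, input_dists, n=20000, seed=42):
    """Monte Carlo propagation: sample each (mean, sd) input from a normal
    distribution, evaluate the closed-form model, return sorted outputs."""
    rng = random.Random(seed)
    out = [model(*[rng.gauss(mu, sd) for mu, sd in input_dists])
           for _ in range(n)]
    out.sort()
    return out

def percentile(sorted_vals, q):
    """Simple empirical quantile from a sorted sample."""
    return sorted_vals[min(len(sorted_vals) - 1, int(q * len(sorted_vals)))]

# Placeholder closed-form model: output = a * b, illustrative uncertainties.
samples = propagate(lambda a, b: a * b, [(2.0, 0.1), (3.0, 0.1)])
mean_out = sum(samples) / len(samples)
p95 = percentile(samples, 0.95)
```

The analytical techniques the report compares against would approximate the same output distribution without the sampling cost.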
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
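For the simplest case in this family, the two-sided test of a single correlation against zero, power can be approximated with the Fisher z transformation. G*Power itself uses exact routines, so the sketch below is only the classical large-sample approximation of the idea.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse normal CDF by bisection; plenty accurate for a sketch."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def power_correlation(rho, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0, using the
    Fisher z transform with standard error 1/sqrt(n - 3)."""
    delta = math.atanh(rho) * math.sqrt(n - 3)
    z_crit = norm_ppf(1.0 - alpha / 2.0)
    return norm_cdf(delta - z_crit) + norm_cdf(-delta - z_crit)

# n = 84 is roughly the classical sample size for 80% power at rho = 0.3.
p = power_correlation(0.3, 84)
```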
Non-Statistical Methods of Analysing of Bankruptcy Risk
Directory of Open Access Journals (Sweden)
Pisula Tomasz
2015-06-01
The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness over one-year and two-year time horizons. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties: the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, as well as the proper calibration of the models to the data from the training sample sets.
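The classification properties mentioned at the end, accuracy in flagging at-risk firms versus clearing healthy ones, are typically summarised as sensitivity and specificity. A minimal sketch with invented labels (not the study's data) follows.

```python
def classification_report(y_true, y_pred):
    """Sensitivity (bankrupt firms caught), specificity (healthy firms
    cleared) and accuracy for a binary classifier; 1 = bankrupt."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / len(y_true),
    }

# Invented labels: 4 bankrupt (1) and 4 healthy (0) firms.
m = classification_report([1, 1, 1, 0, 0, 0, 0, 1], [1, 1, 0, 0, 0, 1, 0, 1])
```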
A weighted U statistic for association analyses considering genetic heterogeneity.
Wei, Changshuai; Elston, Robert C; Lu, Qing
2016-07-20
Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we show the advantage of HWU when the underlying genetic etiology of a disease is heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence with the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 h, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
Statistical and extra-statistical considerations in differential item functioning analyses
Directory of Open Access Journals (Sweden)
G. K. Huysamen
2004-10-01
This article briefly describes the main procedures for performing differential item functioning (DIF) analyses and points out some of the statistical and extra-statistical implications of these methods. Research findings on the sources of DIF, including those associated with translated tests, are reviewed. As DIF analyses are oblivious of correlations between a test and relevant criteria, the elimination of differentially functioning items does not necessarily improve predictive validity or reduce any predictive bias. The implications of the results of past DIF research for test development in the multilingual and multicultural South African society are considered.
SARDA HITL Preliminary Human Factors Measures and Analyses
Hayashi, Miwa; Dulchinos, Victoria
2012-01-01
Human factors data collected during the SARDA HITL simulation experiment include a variety of subjective measures, including the NASA TLX and questionnaire items on situational awareness, advisory usefulness, UI usability, and controller trust. Preliminary analysis of the TLX data indicates that workload may not be adversely affected by use of the advisories; additionally, the controllers' subjective ratings of the advisories may suggest acceptance of the tool.
Temporal scaling and spatial statistical analyses of groundwater level fluctuations
Sun, H.; Yuan, L., Sr.; Zhang, Y.
2017-12-01
Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, and quantifying their statistics requires efficient methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analyses. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic dynamic system. Second, spatial statistical analysis shows that the increments of groundwater level fluctuations exhibit a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., the fractal, non-Gaussian property) of groundwater levels, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
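A scaling exponent of the kind underlying the TS-LHE can be estimated, in its simplest global form, by classical rescaled-range (R/S) analysis. The sketch below estimates a single Hurst exponent for synthetic white noise; the paper's time-scale local estimator is more elaborate, so treat this only as the basic ingredient.

```python
import math
import random

def rescaled_range(block):
    """R/S statistic of one block: range of cumulative mean-deviations
    divided by the block standard deviation."""
    n = len(block)
    mean = sum(block) / n
    dev = [v - mean for v in block]
    cum, z = 0.0, []
    for d in dev:
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in dev) / n)
    return r / s

def hurst_rs(series, sizes=(16, 32, 64, 128, 256)):
    """Slope of log(mean R/S) against log(window size) estimates H."""
    xs, ys = [], []
    for m in sizes:
        blocks = [series[i * m:(i + 1) * m] for i in range(len(series) // m)]
        mean_rs = sum(rescaled_range(b) for b in blocks) / len(blocks)
        xs.append(math.log(m))
        ys.append(math.log(mean_rs))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

rng = random.Random(0)
white = [rng.gauss(0.0, 1.0) for _ in range(4096)]
H = hurst_rs(white)  # near 0.5 for uncorrelated increments (small-sample bias pushes it slightly higher)
```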
Using statistical inference for decision making in best estimate analyses
International Nuclear Information System (INIS)
Sermer, P.; Weaver, K.; Hoppe, F.; Olive, C.; Quach, D.
2008-01-01
For broad classes of safety analysis problems, one needs to make decisions when faced with randomly varying quantities which are also subject to errors. The means for doing this involves a statistical approach which takes into account the nature of the physical problems, and the statistical constraints they impose. We describe the methodology for doing this which has been developed at Nuclear Safety Solutions, and we draw some comparisons to other methods which are commonly used in Canada and internationally. Our methodology has the advantages of being robust and accurate and compares favourably to other best estimate methods. (author)
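A standard statistical device behind such best-estimate decision making is the distribution-free tolerance limit of Wilks, which fixes the number of code runs needed so that the largest observed output bounds a given population quantile at a given confidence. The abstract does not state that this is the NSS method; the sketch below is the generic first-order, one-sided formula.

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n i.i.d. runs bounds the
    `coverage` quantile with probability `confidence` (first-order,
    one-sided Wilks criterion): 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n
```

The classic 95%/95% case gives the well-known 59 runs.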
Additional methodology development for statistical evaluation of reactor safety analyses
International Nuclear Information System (INIS)
Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.
1977-03-01
The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.
2015-12-01
Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior works indicate that reduced insolation on northern (pole-facing) aspects prolongs snow pack persistence, and is associated with thicker, finer-grained soils, that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water-balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random forest based statistical model to predict valley slope asymmetry based upon numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. When slope-related statistics are excluded, due to possible autocorrelation, valley
Fluctuations of Lake Orta water levels: preliminary analyses
Directory of Open Access Journals (Sweden)
Helmi Saidi
2016-04-01
While the effects of past industrial pollution on the chemistry and biology of Lake Orta have been well documented, annual and seasonal fluctuations of lake levels have not yet been studied. Considering their potential impacts on both the ecosystem and human safety, fluctuations in lake levels are an important aspect of limnological research. In the enormous catchment of Lake Maggiore there are many rivers and lakes, and annual precipitation is both high and concentrated in spring and autumn. This has produced major flood events, most recently in November 2014. Flood events are also frequent on Lake Orta, occurring roughly triennially since 1917. The 1926, 1951, 1976 and 2014 floods were severe, with lake levels from 2.30 m to 3.46 m above hydrometric zero. The most important event occurred in 1976, with a maximum level equal to 292.31 m asl and a return period of 147 years. In 2014 the lake level reached 291.89 m asl, with a return period of 54 years. In this study, we defined trends and temporal fluctuations in Lake Orta water levels from 1917 to 2014, focusing on extremes. We report both annual maxima and seasonal variations of the lake water levels over this period. Both Mann-Kendall trend tests and simple linear regression were used to detect monotonic trends in annual and seasonal extremes, and logistic regression was used to detect trends in the number of flood events. Lake levels decreased during the winter and summer seasons, and a small but statistically non-significant positive trend was found in the number of flood events over the period. We provide estimates of return periods for lake levels, a metric which could be used in planning lake flood protection measures.
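Return periods like the 147-year figure quoted above are commonly obtained by fitting an extreme-value distribution to annual maxima. The sketch below uses a method-of-moments Gumbel fit on invented water levels, not the actual Lake Orta series, and the paper does not state which distribution was used.

```python
import math

def gumbel_fit(annual_maxima):
    """Method-of-moments Gumbel fit (0.5772 is the Euler-Mascheroni
    constant); returns location mu and scale beta."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772 * beta
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def return_period(mu, beta, level):
    """Inverse mapping: expected recurrence interval of a given level."""
    F = math.exp(-math.exp(-(level - mu) / beta))
    return 1.0 / (1.0 - F)

# Invented annual maxima (m asl), loosely in the range quoted above.
mu, beta = gumbel_fit([291.0, 290.5, 291.9, 290.2, 292.3, 290.8, 291.3, 290.6])
x100 = return_level(mu, beta, 100.0)
```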
Preliminary analyses of Li jet flows for the IFMIF target
International Nuclear Information System (INIS)
Ida, Mizuho; Kato, Yoshio; Nakamura, Hideo; Maekawa, Hiroshi
1997-03-01
The characteristics of a liquid lithium (Li) plane jet flowing along a concave wall were studied using a multi-dimensional numerical code, FLOW-3D, as part of the two-year conceptual design activity (CDA) of the International Fusion Materials Irradiation Facility (IFMIF) that started in February 1995. The IFMIF will provide a high-flux, high-energy (∼14 MeV) neutron irradiation field by deuteron-Li reactions in the Li jet target for testing and development of low-activation and damage-resistant fusion materials. The Li target jet flows at high velocity (≤ 20 m/s) in vacuum and should adequately remove the intense deuteron beam power (≤ 10 MW). Two-dimensional analyses of the thermal and hydraulic responses of the target flow, under the conditions proposed in the IFMIF-CDA, indicated sufficient temperature margins to avoid significant vaporization at the jet free surface and voiding at the peak-temperature location in the jet, while maintaining flow stability. (author)
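The beam-power-removal statement can be checked with a one-line bulk energy balance, dT = P / (m_dot * cp). The jet cross-section (25 mm x 260 mm) and the lithium properties below are typical values assumed for illustration, not numbers taken from the CDA report.

```python
def jet_temperature_rise(beam_power_w, thickness_m, width_m, velocity_m_s,
                         rho=512.0, cp=4180.0):
    """Bulk temperature rise of a Li jet absorbing the full beam power:
    dT = P / (m_dot * cp), with m_dot = rho * v * A.  rho and cp are
    assumed liquid-lithium properties near operating temperature."""
    m_dot = rho * velocity_m_s * thickness_m * width_m
    return beam_power_w / (m_dot * cp)

# Assumed geometry: 25 mm thick, 260 mm wide jet at 20 m/s, 10 MW beam.
dT = jet_temperature_rise(10e6, 0.025, 0.26, 20.0)
```

A bulk rise of a few tens of kelvin is consistent with the abstract's claim of comfortable margins against vaporization, although the local peak temperature (which the 2-D analyses resolve) is higher.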
Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.
Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M
2014-01-01
Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
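The automated consistency check described above recomputes each p-value from its reported test statistic and degrees of freedom. The sketch below illustrates the idea for a t statistic; the tail probability is obtained here by direct numerical integration of the t density (a statistics library routine would normally be used), and the tolerance is an arbitrary illustrative choice, not the criterion used in the study:

```python
import math

def t_pdf(x, df):
    """Density of Student's t distribution with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1.0 + x * x / df) ** (-(df + 1) / 2)

def two_sided_p(t, df, upper=60.0, steps=20000):
    """Two-sided p-value from trapezoidal integration of the upper tail."""
    a = abs(t)
    h = (upper - a) / steps
    tail = 0.5 * (t_pdf(a, df) + t_pdf(upper, df))
    tail += sum(t_pdf(a + i * h, df) for i in range(1, steps))
    return 2.0 * tail * h

def consistent(reported_p, t, df, tol=0.005):
    """Flag whether a reported p-value matches its t statistic and df."""
    return abs(two_sided_p(t, df) - reported_p) < tol

# t(30) = 2.00 gives a two-sided p of about 0.055, so reporting "p < .05"
# for this statistic would be flagged as an inconsistency
print(round(two_sided_p(2.00, 30), 3))
```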
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to help understand statistical significance tests through the lens of sample size. "What if" analyses can be used to teach students what statistical significance tests really do, and in applied research either prospectively to estimate what sample size…
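A "what if" analysis of this kind asks how the p-value for a fixed effect size changes as the sample size grows. The sketch below uses a normal approximation to the two-sample t statistic and an assumed effect size of d = 0.4; both are illustrative simplifications, not taken from the paper:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sample_p(d, n_per_group):
    """Approximate two-sided p-value for a two-sample comparison with
    standardized effect size d (normal approximation to the t statistic)."""
    z = d * math.sqrt(n_per_group / 2.0)
    return 2.0 * (1.0 - normal_cdf(z))

# The same effect size crosses the .05 threshold purely because n grows
for n in (20, 50, 100, 200):
    print(n, round(two_sample_p(0.4, n), 4))
```

The table this prints makes the pedagogical point directly: significance is a joint property of effect size and sample size, not of the effect alone.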
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
Energy Technology Data Exchange (ETDEWEB)
Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)
2013-01-01
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
International Nuclear Information System (INIS)
Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe
2013-01-01
Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physics modelling and safety analysis capability, based on a code package consisting of best-estimate 3D neutronics (PANTHER), system thermal-hydraulics (RELAP5), core sub-channel thermal-hydraulics (COBRA-3C), and fuel thermal-mechanics (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and license reactor safety analyses and core reload designs, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physics modelling and licensing safety analyses. In this paper, the TE multi-physics modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)
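The non-parametric order statistic method mentioned above is commonly based on Wilks' formula, which fixes the number of code runs needed for a one-sided tolerance limit. A minimal first-order version is sketched below; this is the generic textbook formula, not TE's specific BESUAM procedure:

```python
import math

def wilks_sample_size(coverage, confidence):
    """Minimum number of runs n so that the sample maximum bounds the
    `coverage` quantile of the output with the given confidence
    (first-order, one-sided Wilks criterion): 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# The classic 95% coverage / 95% confidence criterion yields 59 runs
print(wilks_sample_size(0.95, 0.95))
```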
Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd
2012-01-01
Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…
SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit
Directory of Open Access Journals (Sweden)
Annie Chu
2009-04-01
The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been shown that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, and one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Examples in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and utilization of SOCR Analyses.
International Nuclear Information System (INIS)
Hill, C.A.; Livingston, D.E.
1993-09-01
This chemical analysis study is part of the research program of the Yucca Mountain Project intended to provide the State of Nevada with a detailed assessment of the geology and geochemistry of Yucca Mountain and adjacent regions. This report is preliminary in the sense that more chemical analyses may be needed in the future, and also in the sense that these chemical analyses should be considered a small part of a much larger geological data base. The interpretations discussed herein may be modified as that larger data base is examined and established. All of the chemical analyses performed to date are shown in Table 1. There are three parts to this table: (1) trace element analyses on rocks (limestone and tuff) and minerals (calcite/opal), (2) rare earth analyses on rocks (tuff) and minerals (calcite/opal), and (3) major element analyses plus CO2 on rocks (tuff) and detritus sand. In this report, for each of the three parts of the table, the data and their possible significance are discussed first, then some overall conclusions are drawn, and finally some recommendations for future work are offered.
Kanda, Junya
2016-01-01
The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and have changed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA loci of patients and donors was collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of the Stata or EZR/R scripts offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The direction of HLA mismatch, the mismatch counting method, and the differing impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific to hematopoietic stem cell transplantation, such as competing-risk analysis, landmark analysis, and time-dependent analysis, in order to analyze transplant data correctly. The data center of the JSHCT can be contacted if statistical assistance is required.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (simple linear regression, multiple linear regression, one-way and two-way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis-testing models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model as well as general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most recent updates.
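Several of the non-parametric tests listed above reduce to simple rank arithmetic. As a sketch (not SOCR's Java implementation), the Wilcoxon rank sum test with the large-sample normal approximation can be written as follows; the example data are invented, and ties are assumed absent:

```python
import math

def rank_sum_test(x, y):
    """Wilcoxon rank sum test with the large-sample normal approximation
    (assumes no tied values; no continuity or tie correction)."""
    combined = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(combined)}
    w = sum(rank[v] for v in x)                  # rank sum of the first sample
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2.0              # E[W] under H0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mean) / sd
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# Two small samples with clearly shifted locations (illustrative values)
x = [1.8, 2.1, 2.4, 2.9, 3.0, 3.3]
y = [3.8, 4.0, 4.4, 4.9, 5.1, 5.6]
z, p = rank_sum_test(x, y)
```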
International Nuclear Information System (INIS)
Somolova, Marketa; Terada, Atsuhiko; Takegami, Hiroaki; Iwatsuki, Jin
2008-12-01
The Japan Atomic Energy Agency has been conducting a conceptual design study of a nuclear hydrogen demonstration plant, that is, a thermochemical IS process hydrogen plant coupled with the High Temperature Engineering Test Reactor (HTTR-IS), which is planned to produce hydrogen at rates of up to 1000 m3/h. As part of the conceptual design work for the HTTR-IS system, preliminary analyses of a small break in a hydrogen pipeline in the IS process hydrogen plant were carried out as a first step in the safety analyses. This report presents analytical results for hydrogen diffusion behavior predicted with a CFD code, in which a diffusion model focused on the turbulent Schmidt number was incorporated. By modifying the diffusion model, in particular a constant accompanying the turbulent Schmidt number in the diffusion term, the analytical results were brought into good agreement with the experimental results. (author)
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and of the statistical methods used in drug utilization studies in the pediatric population, drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four received a markedly lower score. Only a minority of the studies applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
European passive plant program preliminary safety analyses to support system design
International Nuclear Information System (INIS)
Saiu, Gianfranco; Barucca, Luciana; King, K.J.
1999-01-01
In 1994, a group of European utilities, together with Westinghouse and its industrial partner GENESI (an Italian consortium including ANSALDO and FIAT), initiated a program designated EPP (European Passive Plant) to evaluate Westinghouse passive nuclear plant technology for application in Europe. In Phase 1 of the European Passive Plant Program, completed in 1996, a 1000 MWe passive plant reference design (EP1000) was established which conforms to the European Utility Requirements (EUR) and is expected to meet the requirements of the European safety authorities. Phase 2 of the program was initiated in 1997 with the objective of developing the Nuclear Island design details and performing supporting analyses to start development of the Safety Case Report (SCR) for submittal to European licensing authorities. The first part of Phase 2, the 'Design Definition' phase (Phase 2A), was completed at the end of 1998, the main efforts being the design definition of key systems and structures, development of the Nuclear Island layout, and preliminary safety analyses to support design efforts. Incorporation of the EUR has been a key design requirement for the EP1000 from the beginning of the program. Detailed design solutions to meet the EUR have been defined, and the safety approach has also been developed based on the EUR guidelines. The present paper describes the EP1000 approach to safety analysis and, in particular, to the Design Extension Conditions which, according to the EUR, represent the preferred method for giving consideration to complex sequences and severe accidents at the design stage without including them in the design basis conditions. Preliminary results of some DEC analyses and an overview of the probabilistic safety assessment (PSA) are also presented. (author)
Preliminary study of energy confinement data with a statistical analysis system in HL-2A tokamak
International Nuclear Information System (INIS)
Xu Yuan; Cui Zhengying; Ji Xiaoquan; Dong Chunfeng; Yang Qingwei; O J W F Kardaun
2010-01-01
Taking advantage of HL-2A experimental data, an energy confinement database conforming to the ITER L-mode DB2.0 format has been established for the first time. For this database, the widely used Statistical Analysis System (SAS) has been adopted for the first time to analyze and evaluate the confinement data from HL-2A; research on scaling laws of the energy confinement time with respect to plasma density has been carried out, and some preliminary results have been achieved. Finally, through comparison with both the ITER scaling law and the earlier ASDEX database, the L-mode confinement quality on HL-2A and the influence of temperature on the Spitzer resistivity are discussed. (authors)
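A scaling law of the energy confinement time against plasma density typically takes the power-law form tau_E = C * n**alpha, fitted by ordinary least squares on the logarithms. The sketch below uses synthetic data; the coefficient and exponent are invented for illustration, not HL-2A results:

```python
import math

def fit_power_law(n_values, tau_values):
    """Fit tau = C * n**alpha by ordinary least squares on log-log data."""
    xs = [math.log(n) for n in n_values]
    ys = [math.log(t) for t in tau_values]
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    alpha = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
            / sum((x - xbar) ** 2 for x in xs)
    c = math.exp(ybar - alpha * xbar)
    return c, alpha

# Synthetic "shots" generated from tau = 0.02 * n**0.5 (illustrative only)
density = [1.0, 2.0, 3.0, 5.0, 8.0]
tau = [0.02 * n ** 0.5 for n in density]
c, alpha = fit_power_law(density, tau)
```

With noise-free synthetic data the fit recovers the generating parameters exactly; with real shot data the residual scatter is what the regression diagnostics in a package like SAS would characterize.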
Statistical analyses in the study of solar wind-magnetosphere coupling
International Nuclear Information System (INIS)
Baker, D.N.
1985-01-01
Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
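The time-lagged linear cross-correlation analysis emphasized above slides one series against the other and computes a correlation at each lag; the lag of maximum correlation estimates the response delay. A minimal sketch with a synthetic driver and a delayed response follows (the data are illustrative, not solar wind measurements):

```python
def cross_correlation(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag], for lag >= 0."""
    n = len(x) - lag
    xs, ys = x[:n], y[lag:lag + n]
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

# Synthetic driver (e.g. a solar wind input) and a response delayed by 3 steps
driver = [0, 1, 4, 2, 7, 3, 9, 1, 5, 2, 8, 3, 6, 1, 4]
response = [0.0, 0.0, 0.0] + [0.9 * v for v in driver[:-3]]
best_lag = max(range(7), key=lambda k: cross_correlation(driver, response, k))
```

Here the correlation peaks at the built-in 3-step delay, which is exactly the kind of response time-scale information the review says hourly averages cannot resolve.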
Preliminary thermal-hydraulic and structural strength analyses for pre-moderator of cold moderator
International Nuclear Information System (INIS)
Aso, Tomokazu; Kaminaga, Masanori; Terada, Atsuhiko; Hino, Ryutaro
2001-08-01
A light-water-cooled pre-moderator with a thin-walled structure made of aluminum alloy is installed around a liquid-hydrogen moderator in order to enhance the neutron performance of a MW-scale spallation target system being developed at the Japan Atomic Energy Research Institute (JAERI). Since the pre-moderator needs to be located close to the target, which works as the neutron source, it is indispensable to remove the nuclear heat deposited in the pre-moderator effectively by means of a smooth water flow without stagnation. The structural integrity of the thin-walled structure must also be maintained against the water pressure. Preliminary thermal-hydraulic analytical results showed that the water temperature rise could be suppressed to less than 1 °C while maintaining a smooth water flow, which would assure the expected neutron performance. As for structural integrity, several measures to meet the allowable stress conditions of the aluminum alloy were proposed on the basis of the preliminary structural strength analyses. (author)
Boysen, Elena; Schiller, Birgitta; Mörtl, Kathrin; Gündel, Harald; Hölzer, Michael
2018-01-10
Psychosocial working conditions attract more and more attention when it comes to mental health in the workplace. To support managers in dealing with their own as well as their employees' psychological risk factors, we conducted a specific manager training. Within this investigation, we wanted to learn about the training's effects and acceptance. A single-day manager training was provided in a large industrial company in Germany. The participants were asked to fill out questionnaires regarding their own physical and mental health as well as their working situation. Questionnaires were distributed at baseline and at the 3-month and 12-month follow-ups. The investigation is still ongoing; the current article focuses on preliminary short-term effects. Analyses only included participants who had already completed both the baseline and the three-month follow-up. Preliminary results from the three-month follow-up survey (n = 33, 30 of them male, mean age 47.5) indicated positive changes in the managers' mental health as measured by the Patient Health Questionnaire depression module (PHQ-9: M = 3.82 at baseline vs. M = 3.15 at follow-up). Training managers about common mental disorders and risk factors at the workplace within a single-day workshop seems to have positive effects on their own mental health. In particular, working with the managers on their own early stress symptoms might have been an important element.
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
1985-10-01
Preliminary analyses of scenarios for human interference with the performance of a radioactive waste repository in a deep salt formation are presented. The following scenarios are analyzed: (1) the U-Tube Connection Scenario involving multiple connections between the repository and the overlying aquifer system; (2) the Single Borehole Intrusion Scenario involving penetration of the repository by an exploratory borehole that simultaneously connects the repository with overlying and underlying aquifers; and (3) the Pressure Release Scenario involving inflow of water to saturate any void space in the repository prior to creep closure with subsequent release under near lithostatic pressures following creep closure. The methodology to evaluate repository performance in these scenarios is described and this methodology is applied to reference systems in three candidate formations: bedded salt in the Palo Duro Basin, Texas; bedded salt in the Paradox Basin, Utah; and the Richton Salt Dome, Mississippi, of the Gulf Coast Salt Dome Basin
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
Energy Technology Data Exchange (ETDEWEB)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
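The first two pattern-detection steps above, correlation coefficients for linear relationships and rank correlations for monotonic ones, can be contrasted on a monotonic but nonlinear relationship, which the rank correlation captures perfectly while the linear coefficient does not. This is a generic sketch, not the two-phase fluid flow analysis itself:

```python
def pearson(x, y):
    """Pearson (linear) correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def spearman(x, y):
    """Spearman rank correlation: Pearson applied to ranks (no ties assumed)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for pos, i in enumerate(order):
            r[i] = pos + 1
        return r
    return pearson(ranks(x), ranks(y))

# A monotonic but strongly nonlinear relationship
x = list(range(1, 11))
y = [2.0 ** v for v in x]
# The rank correlation is exactly 1; the linear coefficient is noticeably lower
```

This is precisely why the procedure escalates from linear to rank-based statistics: each step detects a pattern the previous one can miss.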
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees.
International Nuclear Information System (INIS)
Clerc, F; Njiki-Menga, G-H; Witschger, O
2013-01-01
Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations in real time (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the field is still searching for an appropriate and robust method. In this context, this exploratory study investigates a statistical method for analysing time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data were used from a workplace study that investigated the potential for inhalation exposure during clean-out operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters: one measuring at the source of the released particles, the other measuring in parallel in the far field. The Bayesian probabilistic approach allows probabilistic modelling of the data series, with the observed task modelled in the form of probability distributions. The probability distributions obtained from the time-resolved data at the source can then be compared with those obtained in the far field, leading to a direct probabilistic comparison of the two.
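The abstract does not specify the authors' Bayesian model in detail. As a generic stand-in, a conjugate Gamma-Poisson model for particle counts lets posterior draws for a near-field and a far-field rate be compared directly; all counts, prior parameters, and seeds below are illustrative assumptions, not measured data:

```python
import random

def posterior_rate_samples(counts, draws=20000, a=1.0, b=0.001, seed=1):
    """Posterior draws for a Poisson rate under a Gamma(a, b) prior.
    Conjugate update: the posterior is Gamma(a + sum(counts), b + len(counts));
    random.gammavariate takes (shape, scale), so scale = 1 / rate."""
    rng = random.Random(seed)
    shape = a + sum(counts)
    rate = b + len(counts)
    return [rng.gammavariate(shape, 1.0 / rate) for _ in range(draws)]

# Hypothetical particle counts per sampling interval near the source vs far away
near_field = [52, 61, 58, 49, 65, 57]
far_field = [31, 28, 35, 30, 33, 29]
near = posterior_rate_samples(near_field, seed=1)
far = posterior_rate_samples(far_field, seed=2)

# Monte Carlo estimate of P(near-field rate > far-field rate | data)
p_elevated = sum(n > f for n, f in zip(near, far)) / len(near)
```

The single number p_elevated is the kind of direct probabilistic statement, "how likely is the source concentration to exceed background", that graph inspection alone cannot provide.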
Statistical analyses of the data on occupational radiation exposure at JPDR
International Nuclear Information System (INIS)
Kato, Shohei; Anazawa, Yutaka; Matsuno, Kenji; Furuta, Toshishiro; Akiyama, Isamu
1980-01-01
In the statistical analyses of the data on occupational radiation exposure at JPDR, the following statistical features were obtained. (1) The individual doses followed a log-normal distribution. (2) In the distribution of doses from one job in a controlled area, the logarithm of the mean (μ) depended on the exposure rate r (mR/h), and σ correlated with the nature of the job and was normally distributed. These relations were: μ = 0.48 ln r − 0.24, σ = 1.2 ± 0.58. (3) For data containing different groups, the distribution of doses showed a polygonal line on log-normal probability paper. (4) Under dose limitation, the distribution of doses showed an asymptotic curve along the limit on log-normal probability paper. (author)
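A minimal sketch of how the reported log-normal model could be used to estimate job-dose percentiles. Only the relations μ = 0.48 ln r − 0.24 and σ ≈ 1.2 come from the abstract; the dose units and the percentile application are assumptions.

```python
import math
from statistics import NormalDist

def dose_percentile(exposure_rate, sigma=1.2, p=0.95):
    """p-th percentile of the job-dose distribution under the reported
    log-normal model: mu = 0.48*ln(r) - 0.24 with r in mR/h, sigma ~ 1.2."""
    mu = 0.48 * math.log(exposure_rate) - 0.24
    z = NormalDist().inv_cdf(p)  # standard normal quantile
    return math.exp(mu + sigma * z)

# Median dose (p = 0.5) at an exposure rate of 10 mR/h
print(round(dose_percentile(10.0, p=0.5), 3))
```

Because the model is log-normal, the median is simply exp(μ), while upper percentiles grow quickly with σ.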
Preliminary Results of Ancillary Safety Analyses Supporting TREAT LEU Conversion Activities
Energy Technology Data Exchange (ETDEWEB)
Brunett, A. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Fei, T. [Argonne National Lab. (ANL), Argonne, IL (United States); Strons, P. S. [Argonne National Lab. (ANL), Argonne, IL (United States); Papadias, D. D. [Argonne National Lab. (ANL), Argonne, IL (United States); Hoffman, E. A. [Argonne National Lab. (ANL), Argonne, IL (United States); Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)
2015-10-01
Report (FSAR) [3]. Where historical data from HEU TREAT operation are available, results calculated for the LEU core are compared to the corresponding HEU measurements. While all analyses in this report are largely considered complete and have been reviewed for technical content, it is important to note that all topics will be revisited once the LEU design approaches its final stages of maturity. For most safety-significant issues, the analyses presented here are expected to be bounding, but additional calculations will be performed as necessary to support safety analyses and safety documentation. It should also be noted that these analyses were completed as the LEU design evolved, and therefore utilized different LEU reference designs. Preliminary shielding, neutronic, and thermal-hydraulic analyses have been completed and have generally demonstrated that the various LEU core designs will satisfy the existing safety limits and standards also satisfied by the existing HEU core. These analyses include assessment of the dose rate in the hodoscope room, near a loaded fuel transfer cask, above the fuel storage area, and near the HEPA filters. The potential change in the concentration of tramp uranium and the change in neutron flux reaching instrumentation have also been assessed. Safety-significant thermal-hydraulic items addressed in this report include thermally-induced mechanical distortion of the grid plate, and heating in the radial reflector.
arXiv Statistical Analyses of Higgs- and Z-Portal Dark Matter Models
Ellis, John; Marzola, Luca; Raidal, Martti
2018-06-12
We perform frequentist and Bayesian statistical analyses of Higgs- and Z-portal models of dark matter particles with spin 0, 1/2 and 1. Our analyses incorporate data from direct detection and indirect detection experiments, as well as LHC searches for monojet and monophoton events, and we also analyze the potential impacts of future direct detection experiments. We find acceptable regions of the parameter spaces for Higgs-portal models with real scalar, neutral vector, Majorana or Dirac fermion dark matter particles, and Z-portal models with Majorana or Dirac fermion dark matter particles. In many of these cases, there are interesting prospects for discovering dark matter particles in Higgs or Z decays, as well as dark matter particles weighing $\gtrsim 100$ GeV. Negative results from planned direct detection experiments would still allow acceptable regions for Higgs- and Z-portal models with Majorana or Dirac fermion dark matter particles.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, some of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside back cover of the Review shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 2000; Energy exports by recipient country in January-June 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products
The intervals method: a new approach to analyse finite element outputs using multivariate statistics
Directory of Open Access Journals (Sweden)
Jordi Marcé-Nogué
2017-10-01
Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.
The intervals method: a new approach to analyse finite element outputs using multivariate statistics
De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep
2017-01-01
Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107
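The variable-generation step of the intervals' method can be sketched in a few lines. The stress values, element areas and interval bounds below are hypothetical; in the published workflow, vectors like these would then be fed into PCA or other multivariate analyses.

```python
def interval_variables(stress, area, bounds):
    """Intervals' method: percentage of total area whose stress falls in each
    interval [bounds[i], bounds[i+1]). stress/area are per-element values."""
    total = sum(area)
    fractions = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        covered = sum(a for s, a in zip(stress, area) if lo <= s < hi)
        fractions.append(100.0 * covered / total)
    return fractions

# Hypothetical von Mises stress values (MPa) and element areas from an FE mesh
stress = [0.5, 1.2, 2.8, 3.1, 0.9, 4.5]
area = [2.0, 1.0, 1.0, 2.0, 1.0, 1.0]
print(interval_variables(stress, area, bounds=[0, 1, 2, 3, 5]))
# → [37.5, 12.5, 12.5, 37.5]
```

Because each variable is an area percentage rather than a per-element value, the resulting vector is comparable across models with different mesh densities, which is the mesh-independence property the abstract highlights.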
Directory of Open Access Journals (Sweden)
H. Kojima
1999-01-01
We present the characteristics of the Electrostatic Solitary Waves (ESW) observed by the Geotail spacecraft in the plasma sheet boundary layer based on statistical analyses. We also discuss the results with reference to a model of ESW generation due to electron beams, which has been proposed on the basis of computer simulations. In this generation model, the nonlinear evolution of Langmuir waves excited by electron bump-on-tail instabilities leads to the formation of isolated electrostatic potential structures corresponding to "electron holes" in phase space. The statistical analyses of the Geotail data, which we conducted under the assumption that the polarity of ESW potentials is positive, show that most ESW propagate in the same direction as the electron beams observed simultaneously by the plasma instrument. Further, we also find that the ESW potential energy is much smaller than the background electron thermal energy and that the ESW potential widths are typically shorter than 60 times the local electron Debye length when we assume that the ESW potentials travel at the same velocity as the electron beams. These results are very consistent with the ESW generation model in which the nonlinear evolution of the electron bump-on-tail instability leads to the formation of electron holes in phase space.
A weighted U-statistic for genetic association analyses of sequencing data.
Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing
2014-12-01
With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low-density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
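A heavily simplified, generic sketch of a weighted U-statistic in the spirit of WU-SEQ: average a phenotype similarity kernel over all subject pairs, weighted by genetic similarity. The kernels and weights below are illustrative choices only, not the published ones.

```python
def weighted_u(phenotypes, genotypes):
    """Generic weighted U-statistic sketch: for every pair of subjects,
    multiply a phenotype kernel (closer phenotypes -> larger value) by a
    genotype similarity weight (shared variant calls), then average."""
    n = len(phenotypes)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            pheno_kernel = -abs(phenotypes[i] - phenotypes[j])
            geno_weight = sum(a == b for a, b in zip(genotypes[i], genotypes[j]))
            total += pheno_kernel * geno_weight
            pairs += 1
    return total / pairs

# Toy data: subjects 0 and 1 share variants and have similar phenotypes
print(weighted_u([1.0, 1.1, 3.0], [[0, 1], [0, 1], [1, 0]]))
```

Being nonparametric, such a statistic avoids assumptions about the phenotype distribution; significance in practice would come from permutation or asymptotic theory, which is beyond this sketch.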
Emoto, K.; Saito, T.; Shiomi, K.
2017-12-01
Short-period (2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
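The estimated heterogeneity spectrum can be evaluated directly. This sketch only restates the reported formula with ε = 0.05 and a = 3.1 km; the interpretation of 1/a as the corner wavenumber follows from the functional form.

```python
import math

def power_spectrum(m, eps=0.05, a=3.1):
    """Power spectral density of the random inhomogeneity reported in the
    study: P(m) = 8*pi*eps^2*a^3 / (1 + a^2*m^2)^2, m in km^-1, a in km."""
    return 8.0 * math.pi * eps ** 2 * a ** 3 / (1.0 + a ** 2 * m ** 2) ** 2

corner = 1.0 / 3.1  # km^-1: the corner wavenumber, where a*m = 1
print(power_spectrum(corner) / power_spectrum(0.0))  # ≈ 0.25 at the corner
```

At the corner wavenumber the spectrum has dropped to one quarter of its flat low-wavenumber level, and beyond it P(m) decays as m⁻⁴; detecting this roll-off is exactly what required long-period data.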
Directory of Open Access Journals (Sweden)
Sunando Roy
2009-10-01
Feline immunodeficiency virus (FIV) and human immunodeficiency virus (HIV) are recently identified lentiviruses that cause progressive immune decline and ultimately death in infected cats and humans. It is of great interest to understand how to prevent the immune system collapse caused by these lentiviruses. We recently described that disease caused by a virulent FIV strain in cats can be attenuated if animals are first infected with a feline immunodeficiency virus derived from a wild cougar. The detailed temporal tracking of cat immunological parameters in response to the two viral infections resulted in high-dimensional datasets containing variables that exhibit strong co-variation. Initial analyses of these complex data using univariate statistical techniques did not account for interactions among immunological response variables and therefore potentially obscured significant effects between infection state and immunological parameters. Here, we apply a suite of multivariate statistical tools, including Principal Component Analysis, MANOVA and Linear Discriminant Analysis, to temporal immunological data resulting from FIV superinfection in domestic cats. We investigated the co-variation among immunological responses, the differences in immune parameters among four groups of five cats each (uninfected, single- and dual-infected animals), and the "immune profiles" that discriminate among them over the first four weeks following superinfection. Dual-infected cats mount an immune response by 24 days post superinfection that is characterized by elevated levels of CD8 and CD25 cells and increased expression of IL-4, IFN-γ and FAS. This profile discriminates dual-infected cats from cats infected with FIV alone, which show high IL-10 and lower numbers of CD8 and CD25 cells. Multivariate statistical analyses demonstrate both the dynamic nature of the immune response to FIV single and dual infection and the development of a unique immunological profile in dual
DEFF Research Database (Denmark)
Frandsen, Tove Faber; Nicolaisen, Jeppe
2017-01-01
Using statistical methods to analyse digital material makes it possible to detect patterns in big data that we would otherwise not be able to detect. This paper seeks to exemplify this fact by statistically analysing a large corpus of references in systematic reviews. The aim...
Statistical analyses to support guidelines for marine avian sampling. Final report
Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris
2012-01-01
distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and to map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic "decision tree" showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
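A Monte Carlo significance test for a hotspot in one lease block, in the spirit of step 4, might be sketched as follows. The uniform-redistribution null and the counts are illustrative only; the report selects species- and season-specific count distributions instead.

```python
import random

def hotspot_p_value(counts, block_index, trials=5000, seed=7):
    """One-sided Monte Carlo p-value that one lease block is a 'hotspot':
    redistribute the total count uniformly across blocks and record how often
    the focal block matches or exceeds its observed count."""
    rng = random.Random(seed)
    total, n_blocks = sum(counts), len(counts)
    observed = counts[block_index]
    hits = 0
    for _ in range(trials):
        simulated = sum(rng.randrange(n_blocks) == block_index
                        for _ in range(total))
        hits += simulated >= observed
    return (hits + 1) / (trials + 1)  # add-one correction for permutation tests

# Hypothetical seabird counts in five lease blocks; block 3 looks like a hotspot
print(hotspot_p_value([2, 1, 0, 14, 3], 3))
```

The same machinery flipped to `simulated <= observed` would test for coldspots; retrospective power then follows from repeating the test over counts simulated from the fitted species distribution.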
Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.
Schmitt, M; Grub, J; Heib, F
2015-06-01
Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed two/three approaches, by sigmoid fitting and by independent and dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data that are independent of the "user skills" and subjectivity of the operator, which is also urgently needed to evaluate dynamic measurements of contact angles. We show in this contribution that the slightly modified procedures are also applicable to find specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
Statistical analyses of incidents on onshore gas transmission pipelines based on PHMSA database
International Nuclear Information System (INIS)
Lam, Chio; Zhou, Wenxing
2016-01-01
This article reports statistical analyses of the mileage and pipe-related incident data for onshore gas transmission pipelines in the US between 2002 and 2013, collected by the Pipeline and Hazardous Materials Safety Administration (PHMSA) of the US Department of Transportation. The analysis indicates that there are approximately 480,000 km of gas transmission pipelines in the US, approximately 60% of them more than 45 years old as of 2013. Eighty percent of the pipelines are Class 1 pipelines, and about 20% are Class 2 and 3 pipelines. Third-party excavation, external corrosion, material failure and internal corrosion are found to be the four leading failure causes, responsible for more than 75% of the total incidents. The 12-year average rupture rate equals 3.1 × 10⁻⁵ per km-year due to all failure causes combined. External corrosion is the leading cause of ruptures: the 12-year average rupture rate due to external corrosion equals 1.0 × 10⁻⁵ per km-year and is twice the rupture rate due to third-party excavation or material failure. The study provides insights into the current state of gas transmission pipelines in the US and baseline failure statistics for quantitative risk assessments of such pipelines. - Highlights: • Analyze PHMSA pipeline mileage and incident data between 2002 and 2013. • Focus on gas transmission pipelines. • Leading causes of pipeline failures are identified. • Provide baseline failure statistics for risk assessments of gas transmission pipelines.
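The baseline rate computation implied by the article is straightforward. The incident count below is hypothetical, chosen only to reproduce the order of magnitude of the reported all-cause rupture rate; the actual count is not given in the abstract.

```python
def rupture_rate(incidents, mileage_km, years):
    """Baseline failure rate (per km-year) from an incident count, the system
    mileage and the observation window, as in the article's 12-year averages."""
    return incidents / (mileage_km * years)

# ~180 ruptures (hypothetical) over ~480,000 km and 12 years
print(rupture_rate(180, 480_000, 12))  # → 3.125e-05, i.e. ~3.1e-5 per km-year
```

Dividing an exposure-normalized count this way is what makes rates from systems of different sizes and observation periods directly comparable in quantitative risk assessment.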
Directory of Open Access Journals (Sweden)
Masnawi MUSTAFFA
2016-12-01
Port Klang, Malaysia, is the 13th busiest port in the world and the busiest port in Malaysia, with capacity at the port expected to meet demand until 2018. Although statistics published by the Port Klang Authority show that many ships use this port, these numbers are based only on ships entering Port Klang. No study has yet investigated how dense the traffic is in Port Klang, Malaysia, and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang and its surrounding sea using statistical analysis of AIS data. As a preliminary study, AIS data were collected for 7 days to represent weekly traffic. As a result, hourly and daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.
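Deriving hourly vessel counts from AIS position reports can be sketched as below. The record layout, MMSI values and timestamps are assumptions for illustration; real AIS feeds carry many more fields.

```python
from collections import defaultdict

def hourly_vessel_counts(ais_records):
    """Count distinct vessels (by MMSI) per hour from AIS position reports.
    Each record is (mmsi, timestamp) with timestamp as 'YYYY-MM-DD HH:MM'."""
    buckets = defaultdict(set)
    for mmsi, ts in ais_records:
        hour = ts[:13]  # 'YYYY-MM-DD HH'
        buckets[hour].add(mmsi)  # set membership deduplicates repeat reports
    return {hour: len(vessels) for hour, vessels in sorted(buckets.items())}

# Hypothetical AIS messages around Port Klang
records = [
    (533000001, "2016-05-01 08:05"), (533000001, "2016-05-01 08:40"),
    (533000002, "2016-05-01 08:12"), (533000003, "2016-05-01 09:55"),
]
print(hourly_vessel_counts(records))
# → {'2016-05-01 08': 2, '2016-05-01 09': 1}
```

Deduplicating by MMSI matters because a single vessel broadcasts position reports every few seconds to minutes; counting raw messages would badly overstate traffic density.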
Csillik, O.; Evans, I. S.; Drăguţ, L.
2015-03-01
Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
Bartsch, Adam J; Benzel, Edward C; Miele, Vincent J; Morr, Douglas R; Prakash, Vikas
2012-05-01
In spite of ample literature pointing to rotational and combined impact dosage as key contributors to head and neck injury, boxing and mixed martial arts (MMA) padding is still designed primarily to reduce linear cranium acceleration. The objectives of this study were to quantify preliminary linear and rotational head impact dosage for selected boxing and MMA padding in response to hook punches; to compute theoretical skull, brain, and neck injury risk metrics; and to statistically compare the protective effect of various glove and head padding conditions. An instrumented Hybrid III 50th percentile anthropomorphic test device (ATD) was struck in 54 pendulum impacts replicating hook punches at low (27-29 J) and high (54-58 J) energy. Five padding combinations were examined: unpadded (control), MMA glove-unpadded head, boxing glove-unpadded head, unpadded pendulum-boxing headgear, and boxing glove-boxing headgear. A total of 17 injury risk parameters were measured or calculated. All padding conditions reduced linear impact dosage. Other parameters significantly decreased, significantly increased, or were unaffected depending on the padding condition. Of the real-world conditions (MMA glove-bare head, boxing glove-bare head, and boxing glove-headgear), the boxing glove-headgear condition showed the most meaningful reduction in most of the parameters. In equivalent impacts, the MMA glove-bare head condition induced higher rotational dosage than the boxing glove-bare head condition. Finite element analysis indicated a risk of brain strain injury in spite of the significant reduction of linear impact dosage. In the replicated hook punch impacts, all padding conditions reduced linear but not rotational impact dosage. Head and neck dosage theoretically accumulates fastest in MMA and boxing bouts without use of protective headgear. The boxing glove-headgear condition provided the best overall reduction in impact dosage. More work is needed to develop improved protective padding to minimize
International Nuclear Information System (INIS)
Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard
2015-01-01
Highlights: • A tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level classification (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
Energy Technology Data Exchange (ETDEWEB)
Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)
2015-02-15
Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level tiered approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.
Deng, Yangqing; Pan, Wei
2017-12-01
There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical, and other, reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and, for these patients, it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential for characterising surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Various procedures and definitions for determining specific angles exist, which are often neither comprehensible nor reproducible. One of the most important tasks in this area is therefore to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for analysing dynamic contact angle measurements (sessile drop) in detail, applicable to axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis are explained in detail for the first time. These approaches yield contact angle data and different access to specific contact angles which are independent of the skills and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique while inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Due to the small covered distance for the dependent analysis (contact angle determination. They are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for the drop on inclined surfaces and detailed relations about the reactivity of the freshly cleaned silicon wafer surface resulting in acceleration
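The sigmoid-function fit mentioned above can be sketched with a standard nonlinear least-squares routine. The data below are synthetic contact-angle-versus-inclination values with assumed plateau angles, not measurements from this work.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, lower, upper, x0, k):
    """Logistic step between plateaus `lower` and `upper`, centred at x0."""
    return lower + (upper - lower) / (1.0 + np.exp(-k * (x - x0)))

# Synthetic data: contact angle (deg) vs inclination angle (deg).
# True plateaus of 35 and 65 deg are assumptions for illustration only.
x = np.linspace(0, 60, 61)
rng = np.random.default_rng(0)
y = sigmoid(x, 35.0, 65.0, 30.0, 0.3) + rng.normal(0.0, 0.5, x.size)

popt, pcov = curve_fit(sigmoid, x, y, p0=[30.0, 70.0, 25.0, 0.1])
lower, upper, x0, k = popt
```

The fitted lower and upper plateaus play the role of the specific angles that the abstract describes as independent of operator skill and subjectivity.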
Directory of Open Access Journals (Sweden)
Sean Ekins
Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.
Preliminary Statistics from the NASA Alphasat Beacon Receiver in Milan, Italy
Nessel, James; Zemba, Michael; Morse, Jacquelynne; Luini, Lorenzo; Riva, Carlo
2015-01-01
NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have initiated a joint propagation campaign within the framework of the Alphasat propagation experiment to characterize rain attenuation, scintillation, and gaseous absorption effects of the atmosphere in the 40 gigahertz band. NASA GRC has developed and installed a K/Q-band (20/40 gigahertz) beacon receiver at the POLIMI campus in Milan, Italy, which receives the 20/40 gigahertz signals broadcast from the Alphasat Aldo Paraboni TDP no. 5 beacon payload. The primary goal of these measurements is to develop a physical model to improve predictions of communications systems performance within the Q-band. Herein, we provide an overview of the design and data calibration procedure, and present 6 months of preliminary statistics of the NASA propagation terminal, which has been installed and operating in Milan since May 2014. The Q-band receiver has demonstrated a dynamic range of 40 decibels at an 8-hertz sampling rate. A weather station with an optical disdrometer is also installed to characterize rain drop size distribution for correlation with physical based models.
A preliminary study on identification of Thai rice samples by INAA and statistical analysis
Kongsri, S.; Kukusamude, C.
2017-09-01
This study aims to investigate the elemental composition of 93 Thai rice samples using instrumental neutron activation analysis (INAA) and to identify rice according to type and cultivar using statistical analysis. As, Mg, Cl, Al, Br, Mn, K, Rb and Zn in Thai jasmine rice and Sung Yod rice samples were successfully determined by INAA. The accuracy and precision of the INAA method were verified using SRM 1568a Rice Flour. All elements were found to be in good agreement with the certified values. The precisions in terms of %RSD were lower than 7%. The LODs were in the range of 0.01 to 29 mg kg-1. The concentrations of the 9 elements in the Thai rice samples were evaluated and used as chemical indicators to identify the type of rice sample. The results showed that the Mg, Cl, As, Br, Mn, K, Rb, and Zn concentrations in Thai jasmine rice samples are significantly different from those in Sung Yod rice samples, but there was no evidence of a significant difference for Al, at the 95% confidence level. Our results may provide preliminary information for the discrimination of rice samples and a useful database of Thai rice.
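The per-element significance comparison described above can be sketched as a two-sample test at the 95% confidence level; the concentrations below are invented for illustration and are not the study's INAA measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical Mg concentrations (mg/kg) for two rice cultivars
jasmine = rng.normal(280.0, 15.0, 30)
sung_yod = rng.normal(320.0, 15.0, 30)

# Welch's t-test: do the mean concentrations differ at the 95% level?
t, p = stats.ttest_ind(jasmine, sung_yod, equal_var=False)
significant = p < 0.05
```

Repeating such a test element by element gives the kind of per-indicator discrimination the abstract reports.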
McKinley, C. C.; Scudder, R.; Thomas, D. J.
2016-12-01
The neodymium isotopic composition (Nd IC) of oxide coatings has been applied as a tracer of water mass composition and used to address fundamental questions about past ocean conditions. The leached authigenic oxide coating from marine sediment is widely assumed to reflect the dissolved trace metal composition of the bottom water interacting with sediment at the seafloor. However, recent studies have shown that readily reducible sediment components, in addition to trace metal fluxes from the pore water, are incorporated into the bottom water, influencing the trace metal composition of leached oxide coatings. This challenges the prevailing application of the authigenic oxide Nd IC as a proxy of seawater composition. Therefore, it is important to identify the component end-members that create sediments of different lithology and determine if, or how, they might contribute to the Nd IC of oxide coatings. To investigate lithologic influence on the results of sequential leaching, we selected two sites with complete bulk sediment statistical characterization. Site U1370, in the South Pacific Gyre, is predominantly composed of rhyolite (~60%) and has a distinguishable (~10%) Fe-Mn oxyhydroxide component (Dunlea et al., 2015). Site 1149, near the Izu-Bonin Arc, is predominantly composed of dispersed ash (~20-50%) and eolian dust from Asia (~50-80%) (Scudder et al., 2014). We perform a two-step leaching procedure: a 14 mL leach of 0.02 M hydroxylamine hydrochloride (HH) in 20% acetic acid buffered to pH 4 for one hour, targeting metals bound to the Fe- and Mn-oxide fractions, and a second HH leach for 12 hours, designed to remove any remaining oxides from the residual component. We analyze all three resulting fractions for a large suite of major, trace and rare earth elements; a subset of the samples is also analyzed for Nd IC. We use multivariate statistical analyses of the resulting geochemical data to identify how each component of the sediment partitions across the sequential
Review of Statistical Analyses Resulting from Performance of HLDWD-DWPF-005
International Nuclear Information System (INIS)
Beck, R.S.
1997-01-01
The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for analysis of small sample inserts (references 1 and 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide analyses equivalent to the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed method compares favorably with that for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will be required at first to provide an accurate analysis which will routinely meet the 95% to 105% average sum of oxides limit for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) determine the calcine/vitrification factor for radioactive feed; (2) evaluate covariance matrix changes against process operating ranges to determine optimum sample size; (3) evaluate sources of the low sum of oxides; and (4) improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of
Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.
2017-11-01
We define tephras and cryptotephras and their components (mainly ash-sized particles of glass ± crystals in distal deposits) and summarize the basis of tephrochronology as a chronostratigraphic correlational and dating tool for palaeoenvironmental, geological, and archaeological research. We then document and appraise recent advances in analytical methods used to determine the major, minor, and trace elements of individual glass shards from tephra or cryptotephra deposits to aid their correlation and application. Protocols developed recently for the electron probe microanalysis of major elements in individual glass shards help to improve data quality and standardize reporting procedures. A narrow electron beam (diameter ∼3-5 μm) can now be used to analyze smaller glass shards than previously attainable. Reliable analyses of 'microshards' (defined here as glass shards T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain multiple compositional groups. Currently, incomplete databases are tending to limit correlation efficacy. The development of an open, online global database to facilitate progress towards integrated, high
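The sequential Mahalanobis-distance / Hotelling two-sample T2 procedure described above can be sketched as follows; the glass-shard compositions are simulated, not data from the paper.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    """Two-sample Hotelling T^2 test for equal mean vectors.
    Returns the T^2 statistic and the p-value from its F reference
    distribution (classical formulation, pooled covariance)."""
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)          # mean difference vector
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    pval = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, pval

rng = np.random.default_rng(1)
# Hypothetical major-element compositions (e.g. SiO2, Al2O3, K2O wt%)
tephra_a = rng.normal([62.0, 15.0, 4.0], 0.5, size=(20, 3))
tephra_b = rng.normal([62.2, 15.1, 4.1], 0.5, size=(20, 3))
t2, pval = hotelling_t2(tephra_a, tephra_b)
```

The same routine applied to an "unknown" against each candidate in a reference set, ranked by Mahalanobis distance, mirrors the sequential correlation procedure the abstract outlines.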
Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E
2014-02-01
Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality in recent years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes over the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality ranges from low to medium. Although the number of MAs of medium and high quality seems to have risen lately, several other aspects need improvement to increase their overall quality.
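Risk ratios with 95% confidence intervals, as used above to track changes in reporting over the years, are conventionally computed on the log scale; the counts below are hypothetical, not the review's data.

```python
import numpy as np

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio with a Wald 95% CI computed on the log scale."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rr = p_a / p_b
    # Standard error of log(RR)
    se_log = np.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo, hi = np.exp(np.log(rr) + np.array([-z, z]) * se_log)
    return rr, lo, hi

# Hypothetical counts: 30 of 40 recent MAs vs 15 of 40 older MAs
# report a key quality item
rr, lo, hi = risk_ratio_ci(30, 40, 15, 40)
```

A CI that excludes 1 indicates a statistically detectable change in reporting of the item between the two periods.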
Development and preliminary analyses of material balance evaluation model in nuclear fuel cycle
International Nuclear Information System (INIS)
Matsumura, Tetsuo
1994-01-01
A material balance evaluation model for the nuclear fuel cycle has been developed using the ORIGEN-2 code as its basic engine. The model's features are: it can treat more than 1000 nuclides, including minor actinides and fission products, and it offers flexible modeling and graph output using an engineering workstation. A preliminary calculation was made of the effect of high LWR fuel burnup (a reloading-fuel average burnup of 60 GWd/t) on the nuclear fuel cycle. It shows that high LWR fuel burnup has a large effect on the Japanese Pu balance problem. (author)
Energy Technology Data Exchange (ETDEWEB)
Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.
2009-12-17
number of zeros. QQ plots of these data show a lack of normality after contamination; normality is improved when working with log(CFU/cm2). Variance component analysis (VCA) and analysis of variance (ANOVA) were used to estimate the amount of variance due to each source and to determine which sources of variability were statistically significant. In general, the sampling methods interacted with the across-event variability and with the across-room variability. For this reason, it was decided to do the analyses for each sampling method individually. The between-event variability and between-room variability were significant for each method, except for the between-event variability for the swabs. For both the wipes and the vacuums, the within-room standard deviation was much larger (26.9 for wipes and 7.086 for vacuums) than the between-event standard deviation (6.552 for wipes and 1.348 for vacuums) and the between-room standard deviation (6.783 for wipes and 1.040 for vacuums). The swabs' between-room standard deviation was 0.151, while both the within-room and between-event standard deviations were less than 0.10 (all measurements in CFU/cm2).
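A one-way random-effects split of the kind reported above (between-room vs within-room variance) can be sketched from ANOVA mean squares; the log(CFU/cm2) readings below are simulated for a balanced design, not the study's data.

```python
import numpy as np

def variance_components(groups):
    """One-way random-effects variance components from ANOVA mean squares.
    groups: list of 1-D arrays, one per room (balanced design assumed)."""
    k = len(groups)
    n = len(groups[0])                       # samples per room
    grand = np.mean(np.concatenate(groups))
    ms_between = n * sum((g.mean() - grand) ** 2 for g in groups) / (k - 1)
    ms_within = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (n - 1))
    var_within = ms_within
    # Method-of-moments estimate, truncated at zero
    var_between = max((ms_between - ms_within) / n, 0.0)
    return var_between, var_within

rng = np.random.default_rng(2)
# Hypothetical log(CFU/cm2) readings: 4 rooms, 6 samples each
rooms = [rng.normal(mu, 1.0, 6) for mu in (2.0, 2.5, 3.0, 3.5)]
vb, vw = variance_components(rooms)
```

The square roots of these components correspond to the between-room and within-room standard deviations quoted in the abstract.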
Directory of Open Access Journals (Sweden)
N. Tartaglione
2006-01-01
Full Text Available The speed of Atlantic surface depressions that occurred during the autumn and winter seasons and led to intense precipitation over Italy from 1951 to 2000 was investigated. Italy was divided into 5 regions, as documented in previous climatological studies (based on Principal Component Analysis). Intense precipitation events were selected on the basis of in situ rain gauge data and clustered according to the region that they hit. For each intense precipitation event we tried to identify an associated surface depression, and we tracked it within a large domain covering the Mediterranean and Atlantic regions, from its formation to cyclolysis, in order to estimate its speed. Depression speeds were estimated at 6-h resolution and clustered into slow and non-slow classes by means of a threshold coinciding with the first quartile of the speed distribution, and depression centre speeds were associated with their positions. Slow speeds occurring over an area including Italy and the western Mediterranean basin showed frequencies higher than 25% for all the Italian regions but one. The probability of obtaining the observed success rate of more than 25% by chance was estimated by means of a binomial distribution. The statistical reliability of the result is confirmed for only one region. For Italy as a whole, the results were confirmed at the 95% confidence level. The stability of the statistical inference, with respect to errors in estimating depression speed and to changes in the threshold for slow depressions, was analysed and essentially confirmed the previous results.
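The binomial check described above (is an observed success rate above the 25% chance level?) can be reproduced with an exact one-sided test; the counts here are hypothetical, not the study's.

```python
from scipy.stats import binomtest

# Hypothetical counts: 18 of 40 slow-depression positions fell over the
# Italy/western-Mediterranean area; under chance we expect 25%
# (the first quartile of the speed distribution).
result = binomtest(k=18, n=40, p=0.25, alternative='greater')
reliable_at_95 = result.pvalue < 0.05
```

A p-value below 0.05 means the observed frequency exceeds the 25% chance level at the 95% confidence level, the criterion used for Italy as a whole.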
Essentials of Excel, Excel VBA, SAS and Minitab for statistical and financial analyses
Lee, Cheng-Few; Chang, Jow-Ran; Tai, Tzu
2016-01-01
This introductory textbook for business statistics teaches statistical analysis and research methods via business case studies and financial data using Excel, MINITAB, and SAS. Every chapter in this textbook engages the reader with data of individual stock, stock indices, options, and futures. One studies and uses statistics to learn how to study, analyze, and understand a data set of particular interest. Some of the more popular statistical programs that have been developed to use statistical and computational methods to analyze data sets are SAS, SPSS, and MINITAB. Of those, we look at MINITAB and SAS in this textbook. One of the main reasons to use MINITAB is that it is the easiest to use among the popular statistical programs. We look at SAS because it is the leading statistical package used in industry. We also utilize the much less costly and ubiquitous Microsoft Excel to do statistical analysis, as the benefits of Excel have become widely recognized in the academic world and its analytical capabilities...
Energy Technology Data Exchange (ETDEWEB)
NONE
2011-07-01
This report is an assessment of the current model and presentation form of bioenergy statistics. It proposes revisions and enhancements to both the collection and the presentation of the data. In the context of market developments, both for energy in general and for bioenergy in particular, and of government targets, good bioenergy statistics form the basis for following up the objectives and measures. (eb)
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
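Simulating the statistical power of shock detection can be sketched in miniature: generate series with a known step shock, apply a detection test, and count rejections. The two-sample t-test below is a deliberate simplification standing in for the article's univariate and bivariate intervention models.

```python
import numpy as np
from scipy import stats

def detection_power(shock, n=100, change=50, n_sims=500, seed=0):
    """Monte Carlo power: fraction of simulated series in which a step
    shock at index `change` is detected at the 5% level."""
    rng = np.random.default_rng(seed)
    detected = 0
    for _ in range(n_sims):
        y = rng.normal(0.0, 1.0, n)
        y[change:] += shock                       # step intervention
        _, p = stats.ttest_ind(y[:change], y[change:])
        detected += p < 0.05
    return detected / n_sims

power_with_shock = detection_power(shock=1.0)     # shock of 1 s.d.
false_alarm = detection_power(shock=0.0)          # notional 5% type 1 rate
```

Running this across shock sizes and detection methods is the essence of the power comparison the abstract describes.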
Mach, D. A.; Blakeslee, R. J.; Bailey, J. C.; Farrell, W. M.; Goldberg, R. A.; Desch, M. D.; Houser, J. G.
2003-01-01
The Altus Cumulus Electrification Study (ACES) was conducted during August 2002 in an area near Key West, Florida. One of the goals of this uninhabited aerial vehicle (UAV) study was to collect high resolution optical pulse and electric field data from thunderstorms. During the month-long campaign, we acquired 5294 lightning-generated optical pulses with associated electric field changes. Most of these observations were made while close to the tops of the storms. We found filtered mean and median 10-10% optical pulse widths of 875 and 830 microseconds respectively, while the 50-50% mean and median optical pulse widths are 422 and 365 microseconds respectively. These values are similar to previous results, as are the 10-90% mean and median rise times of 327 and 265 microseconds. The mean and median peak electrical-to-optical pulse delays were 209 and 145 microseconds, which is longer than one would expect from theoretical results. The results of the pulse analysis will contribute to further validation of the Optical Transient Detector (OTD) and the Lightning Imaging Sensor (LIS) satellites. Pre-launch estimates of the flash detection efficiency were based on a small sample of optical pulse measurements associated with fewer than 350 lightning discharges collected by NASA U-2 aircraft in the early 1980s. Preliminary analyses of the ACES measurements show that we have greatly increased the number of optical pulses available for validation of the LIS and other orbital lightning optical sensors. Since the Altus was often close to the cloud tops, many of the optical pulses are low-energy pulses. From these low-energy pulses, we can determine the fraction of optical lightning pulses below the thresholds of LIS, OTD, and any future satellite-based optical sensors such as the geostationary Lightning Mapping Sensor.
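Pulse statistics such as the 10-90% rise time and the 50-50% width can be computed from a sampled waveform by threshold-crossing interpolation; the sketch below uses a synthetic Gaussian pulse (times in microseconds), not ACES data.

```python
import numpy as np

def threshold_time(t, y, level, rising=True):
    """Interpolated time at which a single smooth pulse crosses `level`
    (the pulse is assumed to rise monotonically to its peak, then fall)."""
    peak = int(np.argmax(y))
    if rising:
        i = int(np.where(y[:peak + 1] < level)[0][-1])      # last sample below
    else:
        i = peak + int(np.where(y[peak:] >= level)[0][-1])  # last sample above
    return t[i] + (level - y[i]) * (t[i + 1] - t[i]) / (y[i + 1] - y[i])

# Synthetic optical pulse: unit-amplitude Gaussian with 400 us FWHM
t = np.linspace(0.0, 2000.0, 4001)                  # time in microseconds
sigma = 400.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
y = np.exp(-0.5 * ((t - 1000.0) / sigma) ** 2)

rise_10_90 = threshold_time(t, y, 0.9) - threshold_time(t, y, 0.1)
width_50_50 = (threshold_time(t, y, 0.5, rising=False)
               - threshold_time(t, y, 0.5))
```

Applied to each recorded pulse, such metrics yield the mean and median widths and rise times reported in the abstract.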
Neutronic analyses of the preliminary design of a DCLL blanket for the EUROfusion DEMO power plant
Energy Technology Data Exchange (ETDEWEB)
Palermo, Iole, E-mail: iole.palermo@ciemat.es; Fernández, Iván; Rapisarda, David; Ibarra, Angel
2016-11-01
Highlights: • We perform neutronic calculations for the preliminary DCLL Blanket design. • We study the tritium breeding capability of the reactor. • We determine the nuclear heating in the main components. • We verify if the shielding of the TF coil is maintained. - Abstract: In the frame of the newly established EUROfusion WPBB Project for the period 2014–2018, four breeding blanket options are being investigated for use in the fusion power demonstration plant DEMO. CIEMAT is leading the development of the conceptual design of the Dual Coolant Lithium Lead (DCLL) breeding blanket. The primary roles of the blanket are energy extraction, tritium production, and radiation shielding. To this end, the DCLL uses LiPb as primary coolant, tritium breeder and neutron multiplier, and Eurofer as structural material. Focusing on the achievement of the fundamental neutronic responses, a preliminary blanket model has been designed. Detailed 3D neutronic models of the whole blanket modules have thus been generated, arranged in a specific DCLL segmentation and integrated into the generic DEMO model. The initial design has been studied to demonstrate its viability. The neutronic behaviour of the blanket and shield systems, in terms of tritium breeding capability, power generation and shielding efficiency, has been assessed in this paper. The results demonstrate that the primary nuclear performances are already satisfactory at this preliminary stage of the design, with tritium self-sufficiency and adequate shielding obtained.
Buttigieg, Pier Luigi; Ramette, Alban Nicolas
2014-01-01
The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynami...
DEFF Research Database (Denmark)
Denwood, M.J.; McKendrick, I.J.; Matthews, L.
Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power ... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple
International Nuclear Information System (INIS)
Kaufmann, R.K.; Kauppi, H.; Stock, J.H.
2006-01-01
Comparing statistical estimates of the long-run temperature effect of doubled CO2 with those generated by climate models begs the question: is the long-run temperature effect of doubled CO2 estimated from the instrumental temperature record using statistical techniques consistent with the transient climate response, the equilibrium climate sensitivity, or the effective climate sensitivity? Here, we attempt to answer the question of what statistical analyses of the observational record measure by using these same statistical techniques to estimate the temperature effect of a doubling of the atmospheric concentration of carbon dioxide from seventeen simulations run for the Coupled Model Intercomparison Project 2 (CMIP2). The results indicate that the temperature effect estimated by the statistical methodology is consistent with the transient climate response, and that this consistency is relatively unaffected by sample size or by the increase in radiative forcing in the sample.
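In caricature, the statistical approach amounts to regressing temperature on radiative forcing and reading off the warming implied by the roughly 3.7 W/m2 forcing of doubled CO2; the series below is synthetic with a known sensitivity, not CMIP2 output.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "record": temperature responds to radiative forcing with an
# assumed sensitivity of 0.5 K per W/m2, plus observational noise.
forcing = np.linspace(0.0, 3.7, 80)                 # W/m2
temp = 0.5 * forcing + rng.normal(0.0, 0.1, 80)     # K

slope, intercept = np.polyfit(forcing, temp, 1)
# Warming implied by doubled CO2, taking ~3.7 W/m2 as the 2xCO2 forcing
t2x = slope * 3.7
```

Because the regression sees only the realized (transient) response, an estimate of this kind tracks the transient climate response rather than the equilibrium sensitivity, which is the abstract's central point.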
Lasaponara, Rosa; Masini, Nicola
2014-05-01
within the Basilicata and Puglia Regions, southern Patagonia, and the Payunia-Campos Volcánicos Llancanelo y Payún Matrú, in Italy and Argentina respectively. We focused our attention on diverse surfaces and soil types in different periods of the year in order to assess the capabilities of both optical and radar data to detect archaeological marks in different ecosystems and seasons. We investigated not only crop cultures during the "favourable vegetative period", to enhance the presence of subsurface remains, but also the "spectral response" of spontaneous, sparse herbaceous covers during periods considered and expected to be less favourable (for example summer and winter) for this type of investigation. The most interesting result was the capability of radar (COSMO-SkyMed) and multispectral optical satellite data (Pléiades, QuickBird, GeoEye) to highlight the presence of structures below the surface, even (i) during periods of the year generally considered not suitable for crop mark investigations and (ii) in areas covered only by sparse, spontaneous herbaceous plants, in several test sites investigated in both the Argentinian and Italian areas of interest. Preliminary results from both the Italian and Argentinian sites showed that Earth Observation (EO) technology can be successfully used to extract useful information on traces of past human activities still fossilized in the modern landscape, in different ecosystems and seasons. Moreover, multitemporal analyses of satellite data can be fruitfully applied to: (i) improve knowledge, (ii) support monitoring of natural and cultural sites, and (iii) assess natural and man-made risks, including emerging threats to heritage sites. References: Lasaponara R., Masini N., 2009. Full-waveform Airborne Laser Scanning for the detection of medieval archaeological microtopographic relief. Journal of Cultural Heritage 10, e78-e82. Ciminale M., Gallo D., Lasaponara R., Masini N., 2009. A multiscale approach for reconstructing archaeological
Statistical analyses of the magnet data for the advanced photon source storage ring magnets
International Nuclear Information System (INIS)
Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.
1995-01-01
The statistics of the measured magnetic data for the 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive design for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section, and the quadrupole and sextupole cross sections have 180 degree and 120 degree symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce an elementary statistics course with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
International Nuclear Information System (INIS)
Anantatmula, R.P.
1984-01-01
A preliminary sensitivity analysis was performed for the corrosion models developed for Basalt Waste Isolation Project container materials. The models describe the corrosion behavior of the candidate container materials (low-carbon steel and Fe9Cr1Mo), in the various environments expected in the vicinity of the waste package, by separate equations. The present sensitivity analysis yields an uncertainty in total uniform corrosion on the basis of assumed uncertainties in the parameters comprising the corrosion equations. Based on the sample scenario and the preliminary corrosion models, the uncertainties in total uniform corrosion of low-carbon steel and Fe9Cr1Mo for the 1000-yr containment period are 20% and 15%, respectively. For containment periods ≥ 1000 yr, the uncertainty in corrosion during the post-closure aqueous periods controls the uncertainty in total uniform corrosion for both low-carbon steel and Fe9Cr1Mo. The key parameters controlling the corrosion behavior of the candidate container materials are temperature, radiation, groundwater species, etc. Tests are planned in the Basalt Waste Isolation Project containment materials test program to determine in detail the sensitivity of corrosion to these parameters. We also plan to expand the sensitivity analysis to include sensitivity coefficients and other parameters in future studies. 6 refs., 3 figs., 9 tabs.
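The kind of uncertainty propagation described in this abstract can be sketched with a simple Monte Carlo calculation. The Arrhenius-type rate law, the parameter values and the assumed uncertainties below are purely illustrative; they are not the report's actual corrosion models:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical Arrhenius-type uniform corrosion rate: r = A * exp(-Q / (R*T)),
# with assumed uncertainties on the pre-exponential factor A and the
# activation energy Q (all values illustrative).
R = 8.314                              # gas constant, J/(mol*K)
T = 423.0                              # assumed temperature near the package, K
A = rng.normal(1.0e3, 1.0e2, n)        # um/yr, assumed 10% relative uncertainty
Q = rng.normal(4.0e4, 5.0e2, n)        # J/mol, assumed uncertainty

rate = A * np.exp(-Q / (R * T))        # sampled corrosion rate, um/yr
total = rate * 1000.0                  # total uniform corrosion over 1000 yr, um

rel = total.std() / total.mean()
print(f"total corrosion: {total.mean():.2f} um, relative uncertainty {100*rel:.0f}%")
```

Sampling the parameters rather than differentiating the equations gives the full distribution of total corrosion, not just a linearized error bar, which is useful when the rate law is strongly nonlinear in its parameters.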
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
International Nuclear Information System (INIS)
Beck, W.
1984-01-01
The complexity of computer programs for the solution of scientific and technical problems gives rise to many questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of output data to input data, and the substitution of complex models by simpler ones that provide equivalent results within certain ranges. These questions have general practical importance, and answers can in principle be found by statistical methods based on the Monte Carlo method. In this report such statistical methods are selected, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR takes into account: users with different levels of knowledge of data processing and statistics; the variety of statistical methods and of generating and evaluating procedures; the processing of large data sets in complex structures; the coupling to other components of RSYST and to programs outside RSYST; and the need for the system to be easily modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de]
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
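The first three procedures in this sequence can be illustrated on a synthetic scatterplot. The sampled input, the monotonic response function, and the five-class binning used for the Kruskal-Wallis step are assumptions for illustration only, not the paper's two-phase flow model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)                 # sampled input variable
y = np.sqrt(x) + rng.normal(0.0, 0.05, 200)    # monotonic, mildly nonlinear response

# (i) linear relationship: Pearson correlation coefficient
r, _ = stats.pearsonr(x, y)
# (ii) monotonic relationship: Spearman rank correlation coefficient
rho, _ = stats.spearmanr(x, y)
# (iii) trend in central tendency: Kruskal-Wallis across bins of x
bins = np.array_split(np.argsort(x), 5)        # 5 equal-count classes of x
h, p = stats.kruskal(*(y[idx] for idx in bins))

print(f"Pearson r={r:.2f}, Spearman rho={rho:.2f}, Kruskal-Wallis p={p:.3g}")
```

Running the tests in this order mirrors the paper's strategy of looking for increasingly complex patterns: a variable missed by the linear test can still be flagged by the rank-based or binned tests.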
Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum
Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd
2011-01-01
The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…
Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics
DEFF Research Database (Denmark)
Khanmohammadi, Mahdieh
This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach based on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...
Statistic analyses of the color experience according to the age of the observer.
Hunjet, Anica; Parac-Osterman, Durdica; Vucaj, Edita
2013-04-01
The psychological experience of color is an actual state of communication between the environment and color, and it depends on the light source, the viewing angle, and in particular on the observer and his or her health condition. Hering's theory, or the theory of opponent processes, supposes that the cones situated in the retina of the eye are not sensitive to three chromatic domains (areas, fields, zones) (red, green and purple-blue), but produce a signal based on the principle of opposed pairs of colors. Support for this theory comes from the fact that certain disorders of color vision, which include blindness to certain colors, cause blindness to pairs of opponent colors. This paper presents the experience of blue and yellow tones according to the age of the observer. To test for statistically significant differences in deviations in the color experience according to the color of the background, we used the following statistical tests: the Mann-Whitney U test, Kruskal-Wallis ANOVA and the median test. It was shown that the differences are statistically significant for older persons (older than 35 years).
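The three nonparametric tests named in this abstract are all available in SciPy. The data below are simulated color-matching deviations for two hypothetical age groups, with the older group assumed to deviate more (as in the study's conclusion); none of the numbers come from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical color-experience deviations for two age groups (values assumed)
young = rng.normal(2.0, 0.8, 40)   # observers under 35
older = rng.normal(3.5, 1.0, 40)   # observers over 35

# The three tests used in the study, applied to the two groups
u, p_mw = stats.mannwhitneyu(young, older, alternative="two-sided")
h, p_kw = stats.kruskal(young, older)
stat_m, p_med, _, _ = stats.median_test(young, older)

print(f"Mann-Whitney p={p_mw:.2g}, Kruskal-Wallis p={p_kw:.2g}, median test p={p_med:.2g}")
```

All three are rank- or count-based, so they make no normality assumption about the deviation scores, which is why they suit perceptual data of this kind.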
Statistical analyses of the performance of Macedonian investment and pension funds
Directory of Open Access Journals (Sweden)
Petar Taleski
2015-10-01
Full Text Available The foundation of post-modern portfolio theory is creating a portfolio based on a desired target return. This applies specifically to the performance of investment and pension funds that must provide a rate of return meeting payment requirements. A desired target return is the goal of an investment or pension fund. It is the primary benchmark used to measure performance and for dynamic monitoring and evaluation of the risk–return ratio of investment funds. The analysis in this paper is based on the monthly returns of Macedonian investment and pension funds (June 2011 - June 2014). The analysis utilizes basic but highly informative statistical characteristics such as skewness, kurtosis, the Jarque–Bera test and Chebyshev's inequality. The objective of this study is to perform a thorough analysis, utilizing the above-mentioned and other statistical techniques (Sharpe, Sortino, omega, upside potential, Calmar, Sterling), to draw relevant conclusions regarding the risks and characteristic moments of Macedonian investment and pension funds. Pension funds are the second-largest segment of the financial system and have great potential for further growth due to constant inflows from pension insurance. The importance of investment funds for the financial system of the Republic of Macedonia is still small, although open-end investment funds have been the fastest-growing segment of the financial system. The statistical analysis shows that pension funds delivered a significantly positive volatility-adjusted risk premium in the analyzed period, more so than investment funds.
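Several of the statistics named in this abstract can be computed directly from a series of monthly returns. The return series, the target return, and the ratio definitions below are illustrative assumptions (the Sortino ratio in particular has several variants; the downside-deviation form is used here), not the paper's data or exact formulas:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical monthly fund returns over 36 months (values assumed)
returns = rng.normal(0.006, 0.02, 36)
target = 0.0                       # assumed desired target (minimum acceptable) return

skew = stats.skew(returns)
kurt = stats.kurtosis(returns)             # excess kurtosis
jb, jb_p = stats.jarque_bera(returns)      # normality test

excess = returns - target
sharpe = excess.mean() / returns.std(ddof=1)
downside = np.minimum(excess, 0.0)         # only below-target deviations
sortino = excess.mean() / np.sqrt((downside**2).mean())

print(f"skew={skew:.2f} kurtosis={kurt:.2f} JB p={jb_p:.2f} "
      f"Sharpe={sharpe:.2f} Sortino={sortino:.2f}")
```

The Jarque–Bera p-value indicates whether the moment-based statistics are consistent with normal returns; the Sharpe and Sortino ratios then summarize the risk-adjusted premium relative to the target.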
International Nuclear Information System (INIS)
Allen, Bruce; Creighton, Jolien D.E.; Flanagan, Eanna E.; Romano, Joseph D.
2003-01-01
In a previous paper (paper I), we derived a set of near-optimal signal detection techniques for gravitational wave detectors whose noise probability distributions contain non-Gaussian tails. The methods modify standard methods by truncating or clipping sample values which lie in those non-Gaussian tails. The methods were derived, in the frequentist framework, by minimizing false alarm probabilities at fixed false detection probability in the limit of weak signals. For stochastic signals, the resulting statistic consisted of a sum of an autocorrelation term and a cross-correlation term; it was necessary to discard 'by hand' the autocorrelation term in order to arrive at the correct, generalized cross-correlation statistic. In the present paper, we present an alternative derivation of the same signal detection techniques from within the Bayesian framework. We compute, for both deterministic and stochastic signals, the probability that a signal is present in the data, in the limit where the signal-to-noise ratio squared per frequency bin is small, where the signal is nevertheless strong enough to be detected (integrated signal-to-noise ratio large compared to 1), and where the total probability in the non-Gaussian tail part of the noise distribution is small. We show that, for each model considered, the resulting probability is to a good approximation a monotonic function of the detection statistic derived in paper I. Moreover, for stochastic signals, the new Bayesian derivation automatically eliminates the problematic autocorrelation term
Births: Preliminary Data for 2011. National Vital Statistics Reports. Volume 61, Number 5
Hamilton, Brady E.; Martin, Joyce A.; Ventura, Stephanie J.
2012-01-01
Objectives: This report presents preliminary data for 2011 on births in the United States. U.S. data on births are shown by age, live-birth order, race, and Hispanic origin of mother. Data on marital status, cesarean delivery, preterm births, and low birthweight are also presented. Methods: Data in this report are based on approximately 100…
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
The Use of Statistical Process Control Tools for Analysing Financial Statements
Directory of Open Access Journals (Sweden)
Niezgoda Janusz
2017-06-01
Full Text Available This article presents a proposed application of a modified Shewhart control chart to the monitoring of changes in the aggregated level of financial ratios. The x̅ control chart has been used as the basis of the analysis. The examined variable from the sample in this chart is the arithmetic mean. The author proposes to substitute it with a synthetic measure determined from the selected ratios. As the ratios mentioned above are expressed in different units, the author applies standardisation. The results of selected comparative analyses are presented for both bankrupt and non-bankrupt firms. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
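The standardise-then-aggregate step described above can be sketched as follows. The panel of ratios, the equal-weight synthetic measure, and the 3-sigma limits are illustrative assumptions; the article's actual synthetic measure and chart constants may differ:

```python
import numpy as np

# Hypothetical panel: rows = firms, columns = selected financial ratios
# on very different scales (all values assumed).
rng = np.random.default_rng(4)
ratios = rng.normal(0.0, 1.0, (30, 4)) * [0.1, 2.0, 50.0, 1.0] + [0.2, 5.0, 100.0, 1.5]

# Standardise each ratio (different units), then aggregate into a
# synthetic measure per firm -- the statistic plotted on the chart.
z = (ratios - ratios.mean(axis=0)) / ratios.std(axis=0, ddof=1)
synthetic = z.mean(axis=1)

# Shewhart-style centre line and 3-sigma control limits
center = synthetic.mean()
sigma = synthetic.std(ddof=1)
ucl, lcl = center + 3.0 * sigma, center - 3.0 * sigma

out_of_control = np.flatnonzero((synthetic > ucl) | (synthetic < lcl))
print(f"centre={center:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}, signals={out_of_control}")
```

Firms whose synthetic measure falls outside the limits would be flagged for closer financial analysis, which is the "auxiliary tool" role the article proposes.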
Statistical methods for analysing the relationship between bank profitability and liquidity
Boguslaw Guzik
2006-01-01
The article analyses the most popular methods for the empirical estimation of the relationship between bank profitability and liquidity. Owing to the fact that profitability depends on various factors (both economic and non-economic), a simple correlation coefficient, two-dimensional (profitability/liquidity) graphs or models where profitability depends only on liquidity variable do not provide good and reliable results. Quite good results can be obtained only when multifactorial profitabilit...
Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.
Banks, N C; Hodda, M; Singh, S K; Matveeva, E M
2012-06-01
Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.
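The central estimate in this abstract, a spread rate from a regression of detection distance on time since first detection, can be reproduced in miniature. The detection records below are fabricated around a ~5.7 km/year slope for illustration; they are not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical detection records: years since first detection vs. linear
# distance (km) from the first detection site (values assumed).
years = np.array([1, 3, 5, 8, 12, 15, 20, 25, 30, 38], dtype=float)
dist = 5.7 * years + np.array([2, -4, 6, -3, 8, -10, 5, -6, 12, -8], dtype=float)

# Linear regression of distance on time gives the radial spread rate
slope, intercept, r, p, se = stats.linregress(years, dist)
print(f"spread rate ~ {slope:.1f} km/yr (r^2={r**2:.2f}, p={p:.2g})")
```

A statistically significant slope with a small standard error is what supports the paper's conclusion of a roughly constant radial dispersal rate from the introduction site.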
Directory of Open Access Journals (Sweden)
Mohammad D. AL-Tahat
2012-01-01
Full Text Available This paper provides a review of and introduction to agile manufacturing. Tactics of agile manufacturing are mapped onto different production areas (eight latent constructs: manufacturing equipment and technology, process technology and know-how, quality and productivity improvement, production planning and control, shop floor management, product design and development, supplier relationship management, and customer relationship management. The implementation level of agile manufacturing tactics is investigated in each area. A structural equation model is proposed. Hypotheses are formulated. Feedback from 456 firms is collected using a five-point Likert-scale questionnaire. Statistical analysis is carried out using IBM SPSS and AMOS. Multicollinearity, content validity, consistency, construct validity, ANOVA analysis, and relationships between agile components are tested. The results of this study show that agile manufacturing tactics have a positive effect on the overall agility level. This conclusion can be used by manufacturing firms to manage challenges when trying to be agile.
DEFF Research Database (Denmark)
Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona
2015-01-01
Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste...
Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples
Energy Technology Data Exchange (ETDEWEB)
Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J
2007-10-24
Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples
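The analysis strategy this abstract recommends, PCA as an exploratory first step followed by LDA for classification, can be sketched on surrogate data. The simulated "spectra" below (60 samples of 100 peak intensities, with a class difference in a few channels) stand in for real ToF-SIMS data, which is far higher-dimensional:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical stand-in for ToF-SIMS spectra: two classes of samples whose
# peak intensities differ in a few mass channels (all values assumed).
rng = np.random.default_rng(5)
n, p = 60, 100
labels = np.repeat([0, 1], n // 2)
X = rng.normal(0.0, 1.0, (n, p))
X[labels == 1, :5] += 2.5              # class-dependent peak intensities

# PCA for dimension reduction, then LDA for classification
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, labels, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

Running PCA first also avoids the singular within-class covariance that plain LDA would face when the number of channels exceeds the number of samples, which is one practical reason for the two-step strategy.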
Preliminary analyses for HTTR`s start-up physics tests by Monte Carlo code MVP
Energy Technology Data Exchange (ETDEWEB)
Nojiri, Naoki [Science and Technology Agency, Tokyo (Japan); Nakano, Masaaki; Ando, Hiroei; Fujimoto, Nozomu; Takeuchi, Mitsuo; Fujisaki, Shingo; Yamashita, Kiyonobu
1998-08-01
Analyses of start-up physics tests for the High Temperature Engineering Test Reactor (HTTR) have been carried out with the Monte Carlo code MVP, based on the continuous-energy method. Heterogeneous core structures were modeled precisely, including the fuel compacts, fuel rods, coolant channels, burnable poisons, control rods, control rod insertion holes, reserved shutdown pellet insertion holes, gaps between graphite blocks, etc. Such precise modeling of the core structures is difficult with diffusion calculations. From the analytical results, the following were confirmed: the first criticality will be achieved at around 16 fuel columns loaded; the reactivity at the first criticality can be controlled by only one control rod located at the center of the core, with the other fifteen control rods fully withdrawn. The excess reactivity, reactor shutdown margin and control rod critical positions have been evaluated. These results were used for planning the start-up physics tests. This report presents the analyses of start-up physics tests for the HTTR by the MVP code. (author)
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
International Nuclear Information System (INIS)
Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo
1993-05-01
Statistical analyses were conducted using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons were made of the variability and/or reproducibility between data obtained by ΔK-increasing and by ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type from the viewpoint of variability and/or reproducibility of the data. This tendency was more pronounced in tests conducted in simulated LWR primary coolants than in those conducted in air. (author)
Preservation of photographic and cinematographic films by gamma radiation: Preliminary analyses
Energy Technology Data Exchange (ETDEWEB)
Nagai, Maria Luiza E.; Santos, Paulo S.; Otubo, Larissa; Oliveira, Maria José A.; Vasquez, Pablo A.S., E-mail: malunagai@usp.br, E-mail: pavsalva@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)
2017-07-01
Brazilian weather conditions directly affect tangible materials, causing deterioration that is notably worsened by insect and fungal attack. In this sense, gamma radiation from cobalt-60 is an excellent alternative to traditional preservation processes, mainly because of its biocidal action. Radiation processing using gamma radiation for the disinfection of cultural heritage materials has been widely used around the world in recent decades. Many cultural heritage objects, especially those made of paper and wood, have been studied in scientific publications examining changes in mechanical, physical and chemical properties. Over the last fifteen years, the Multipurpose Gamma Irradiation Facility of the Nuclear and Energy Research Institute, located on the Sao Paulo University campus, has irradiated many collections of archived materials, books, paintings and furniture. Adequate storage of photographic and cinematographic materials is a challenge for conservators at preservation institutions. Contamination by fungi is one of the leading causes of problems in photographic and cinematographic collections. Several Sao Paulo University libraries have had their photographic and cinematographic collections affected by fungi, making it impossible to do research on these materials or to manipulate them, for health and safety reasons. In this work, preliminary results on the effects of ionizing radiation on photographic and cinematographic films are presented. Selected film samples made of cellulose acetate were prepared and characterized by FTIR-ATR spectroscopy. Samples were irradiated with gamma rays to absorbed doses between 2 kGy and 50 kGy. Irradiated samples were analyzed by UV-VIS spectroscopy and electron microscopy techniques. The results showed that disinfection by gamma radiation can be achieved safely by applying a disinfection dose between 6 kGy and 15 kGy, with no significant change in or modification of the main properties of the constituent materials. (author)
International Nuclear Information System (INIS)
Bithell, J.F.; Stone, R.A.
1989-01-01
This paper sets out to show that the epidemiological methods most commonly used can be improved. When analysing geographical data it is necessary to consider location. The most obvious quantification of location is ranked distance, though other measures that may be more meaningful in relation to aetiology may be substituted. A test based on distance ranks, the ''Poisson maximum test'', depends on the maximum of the observed relative risk in regions of increasing size, with the significance level adjusted for selection. Applying this test to data from Sellafield and Sizewell shows that the excess of leukaemia incidence observed at Seascale, near Sellafield, is not an artefact of data selection by region, and that the excess probably results from a genuine, if as yet unidentified, cause (there being little evidence of any other locational association once the Seascale cases have been removed). As far as Sizewell is concerned, geographical proximity to the nuclear power station does not seem particularly important. (author)
Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E
2014-05-01
Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
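The censoring step this abstract evaluates can be sketched in a few lines. The framewise-displacement formula (sum of absolute backward differences, rotations converted to arc length on a 50 mm head radius), the simulated realignment parameters, and the 0.9 mm threshold are common conventions assumed here for illustration, not necessarily the exact choices in this paper:

```python
import numpy as np

rng = np.random.default_rng(6)
n_vols = 200
# Hypothetical realignment estimates: 3 translations (mm) and 3 rotations (rad)
trans = np.cumsum(rng.normal(0.0, 0.05, (n_vols, 3)), axis=0)
rots = np.cumsum(rng.normal(0.0, 0.001, (n_vols, 3)), axis=0)
trans[50] += 2.0                       # an abrupt head movement at volume 50

# Framewise displacement (FD): sum of absolute backward differences,
# with rotations converted to arc length on an assumed 50 mm head radius.
d_trans = np.abs(np.diff(trans, axis=0)).sum(axis=1)
d_rots = np.abs(np.diff(rots, axis=0)).sum(axis=1) * 50.0
fd = np.concatenate([[0.0], d_trans + d_rots])     # mm per volume

keep = fd <= 0.9                       # censoring threshold (mm), assumed
print(f"censored {int(np.sum(~keep))} of {n_vols} volumes")
# Only rows of the data and design matrix with keep == True would then
# enter GLM estimation.
```

Note that a single abrupt movement censors two volumes, since it inflates the backward difference both into and out of the affected frame.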
Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.
Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi
2013-12-01
Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Commonly, this data generated by mass spectrometry has many missing values resulting when a compound is absent from a sample or is present but at a concentration below the detection limit. Several strategies are available for statistically analyzing data with missing values. The accelerated failure time (AFT) model assumes all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compare power and estimation of a mixture model to an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point mass proportions between groups. However, the AFT model yielded biased estimates with the bias increasing as the proportion of observations in the point mass increased while estimates were unbiased with the mixture model except if all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and mixture model for estimation. We demonstrated this approach through application to glycomics data of serum samples from women with ovarian cancer and matched controls.
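The mixture model's two sources of missingness, true absence (a point mass) and censoring below the detection limit, can be separated in a simulation. The lognormal abundance model, the 20% point-mass proportion, and the detection limit below are illustrative assumptions, not the glycomics data:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 10_000
lod = 1.0                          # assumed detection limit (arbitrary units)

# Mixture model: with probability pi_absent the compound is truly absent
# (point mass at zero); otherwise its abundance is lognormal.
pi_absent = 0.20
present = rng.random(n) >= pi_absent
conc = np.where(present, rng.lognormal(mean=0.5, sigma=1.0, size=n), 0.0)

# Values below the detection limit are recorded as missing
observed = np.where(conc >= lod, conc, np.nan)
frac_missing = np.isnan(observed).mean()
frac_censored = np.mean(present & (conc < lod))   # present but undetected

print(f"missing={frac_missing:.2f} "
      f"(absent={1 - present.mean():.2f} + censored={frac_censored:.2f})")
```

An AFT model would attribute all of `frac_missing` to censoring, which is exactly the source of the estimation bias the authors report when a genuine point mass is also present.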
International Nuclear Information System (INIS)
Okulitch, A.V.; Laveridge, W.D.; Sullivan, R.W.
1981-01-01
The isotopic ratios resulting from Pb and U analyses on three zircon fractions from syenite gneiss intrusive into metasediments of the Shuswap Metamorphic Complex are collinear on a concordia plot and yield upper and lower intercepts of about 773 Ma and 70 Ma. The upper intercept is tentatively interpreted as the minimum age of emplacement. The lower intercept is suggested to be the time of uplift and cooling associated with tectonic denudation of the Shuswap Complex. The implied age of the country rocks is pre-late Proterozoic and they may be correlatives of the Purcell Supergroup. (auth)
Directory of Open Access Journals (Sweden)
Hudon Catherine
2006-10-01
females with more than one chronic condition were more likely to have a negative perception of their general health and mental health. Conclusion The study provides a useful preliminary measure of the importance of living arrangements and the quality of the couple relationship in chronic illness management broadly conceived as a measure of the patient's efforts at self-care and an illness status indicator. Results of this study prod us to examine more closely, within longitudinal designs, the influence of living arrangements and the presence of conflict in the couple on chronic illness management as well as the modifying effect of gender on these associations.
Directory of Open Access Journals (Sweden)
A. V. Nikitenko
2014-04-01
Full Text Available Purpose. This paper defines and analyses the probabilistic and spectral characteristics of the random current in the regenerative braking mode of DC electric rolling stock. Methodology. Elements and methods of probability theory (particularly the theory of stationary and non-stationary processes) and methods of sampling theory are used for computer processing of the regenerated-current data arrays. Findings. The regenerated-current records were obtained from locomotives and trains on Ukrainian railways and from trams in Poland. It was established that the current exhibits both continuous and jump-like variations in time (especially in trams). For the random current in the regenerative braking mode, the mathematical expectation, dispersion and standard deviation functions are calculated. Histograms, probabilistic characteristics and correlation functions are also calculated and plotted for this current. It was established that the current in the regenerative braking mode can be treated as a stationary but non-ergodic process. Spectral analysis of these records and of the "tail" of the correlation function revealed weak periodic (low-frequency) components, known as interharmonics. Originality. First, the theory of non-stationary random processes was adapted to the analysis of the recuperated current, which exhibits both continuous and jump-like variations in time. Second, the presence of interharmonics in the stochastic process of the regenerated current was established for the first time. Finally, the patterns of temporal change of the current correlation function are defined, which makes it possible to apply the correlation-function method soundly in the identification of electric traction system devices. Practical value. The results of the probabilistic and statistical analysis of the recuperated current make it possible to estimate the quality of the recovered energy and the energy quality indices of electric rolling stock in the
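The two analysis steps named in the abstract — the sample correlation function and a spectral search for low-frequency (interharmonic) components — can be sketched as follows. The sampling rate and signal below are synthetic; this is not the authors' processing chain.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation r(k), k = 0..max_lag, of a record x."""
    x = np.asarray(x, float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

def dominant_frequency(x, fs):
    """Frequency (Hz) of the largest non-DC peak in the periodogram."""
    x = np.asarray(x, float) - np.mean(x)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[1:][np.argmax(spec[1:])]     # skip the DC bin
```

Applied to a recuperated-current record, a persistent peak in the low-frequency part of the spectrum, together with a slowly decaying oscillation in the "tail" of r(k), would be the signature of an interharmonic component.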
Measurements and statistical analyses of indoor radon concentrations in Tokyo and surrounding areas
International Nuclear Information System (INIS)
Sugiura, Shiroharu; Suzuki, Takashi; Inokoshi, Yukio
1995-01-01
Since the UNSCEAR report published in 1982, radiation exposure of the respiratory tract due to radon and its progeny has been regarded as the single largest contributor to the natural radiation exposure of the general public. In Japan, radon gas concentrations in many types of buildings have been surveyed by national and private institutes. We also measured radon gas concentrations in different types of residential buildings in Tokyo and its adjoining prefectures from October 1988 to September 1991, to evaluate the potential radiation risk to the people living there. One or two simplified passive radon monitors were set up in each of 34 residential buildings located in the above-mentioned area, each for an exposure period of 3 months. Comparing the average concentrations among buildings of different materials and structures, those in steel-reinforced concrete buildings were consistently higher than those in the wooden and the prefabricated mortared buildings. Radon concentrations proved to be higher in autumn and winter, and lower in spring and summer. Radon concentrations in an underground room of a steel-reinforced concrete building showed the highest values throughout our investigation, and statistically significant seasonal variation was detected by the X-11 method developed by the U.S. Bureau of the Census. The values measured in a first-floor room of the same building also showed seasonal variation, but with a different phase. A further multivariate analysis suggested that building material and structure are the most important factors determining radon concentration levels, ahead of factors such as the age of the building and the use of ventilators. (author)
Directory of Open Access Journals (Sweden)
Le Bao
2014-11-01
Full Text Available Endogenous retroviruses (ERVs are a class of transposable elements found in all vertebrate genomes that contribute substantially to genomic functional and structural diversity. A host species acquires an ERV when an exogenous retrovirus infects a germ cell of an individual and becomes part of the genome inherited by viable progeny. ERVs that colonized ancestral lineages are fixed in contemporary species. However, in some extant species, ERV colonization is ongoing, which results in variation in ERV frequency in the population. To study the consequences of ERV colonization of a host genome, methods are needed to assign each ERV to a location in a species’ genome and determine which individuals have acquired each ERV by descent. Because well annotated reference genomes are not widely available for all species, de novo clustering approaches provide an alternative to reference mapping that are insensitive to differences between query and reference and that are amenable to mobile element studies in both model and non-model organisms. However, there is substantial uncertainty in both identifying ERV genomic position and assigning each unique ERV integration site to individuals in a population. We present an analysis suitable for detecting ERV integration sites in species without the need for a reference genome. Our approach is based on improved de novo clustering methods and statistical models that take the uncertainty of assignment into account and yield a probability matrix of shared ERV integration sites among individuals. We demonstrate that polymorphic integrations of a recently identified endogenous retrovirus in deer reflect contemporary relationships among individuals and populations.
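The paper's de novo clustering pipeline is considerably more sophisticated, but the core idea — grouping candidate integration-site coordinates by proximity without a reference assignment step — can be illustrated minimally. The 500 bp gap is an arbitrary assumption for the sketch:

```python
def cluster_positions(positions, gap=500):
    """Group genomic coordinates of candidate integration sites:
    a new cluster starts whenever the distance to the previous
    site exceeds `gap` base pairs.  Input need not be sorted."""
    sites = sorted(positions)
    clusters, current = [], [sites[0]]
    for p in sites[1:]:
        if p - current[-1] <= gap:
            current.append(p)
        else:
            clusters.append(current)
            current = [p]
    clusters.append(current)
    return clusters
```

Each resulting cluster defines a putative integration site whose presence or absence across individuals can then be tabulated, which is the input to the kind of shared-site probability matrix the authors describe.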
Preliminary results of ⁴⁰Ca(e,e'c) reaction analysis, c = p, α, based on the statistical model
International Nuclear Information System (INIS)
Herdade, S.B.; Emrich, H.J.
1990-01-01
Statistical model calculations for the reactions ⁴⁰Ca(e,e'p)³⁹K and ⁴⁰Ca(e,e'p₀)³⁹K(g.s.), using a modified version of the program STAPRE, are compared with experimental results obtained from coincidence experiments carried out at the Mainz microtron MAMI A. Preliminary results indicate that the statistical decay of a 1⁻ level in the ⁴⁰Ca compound nucleus, at an excitation energy of about 20 MeV, to the ground state of the ³⁹K residual nucleus accounts for only about 15% of the total decay, indicating that direct and/or semi-direct mechanisms contribute the major part of the decay. (author)
Energy Technology Data Exchange (ETDEWEB)
Dunn, Floyd E. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Wilson, Erik H. [Argonne National Lab. (ANL), Argonne, IL (United States); Sun, Kaichao S. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Newton, Jr., Thomas H. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Hu, Lin-wen [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
2013-09-30
The Massachusetts Institute of Technology Reactor (MITR-II) is a research reactor in Cambridge, Massachusetts designed primarily for experiments using neutron beam and in-core irradiation facilities. It delivers a neutron flux comparable to current LWR power reactors in a compact 6 MW core using Highly Enriched Uranium (HEU) fuel. In the framework of its non-proliferation policies, the international community presently aims to minimize the amount of nuclear material available that could be used for nuclear weapons. In this geopolitical context most research and test reactors, both domestic and international, have started a program of conversion to the use of LEU fuel. A new type of LEU fuel based on an alloy of uranium and molybdenum (U-Mo) is expected to allow the conversion of U.S. domestic high performance reactors like MITR. This report presents the preliminary accident analyses for MITR cores fueled with LEU monolithic U-Mo alloy fuel with 10 wt% Mo. Preliminary results demonstrate adequate performance, including thermal margin to expected safety limits, for the LEU accident scenarios analyzed.
International Nuclear Information System (INIS)
Zhang Mingchun; Feng Kaiming; Li Zaixin; Zhao Fengchao
2012-01-01
The neutronics schemes of a helium-cooled, a sodium-cooled and a FLiBe-cooled waste transmutation blanket were preliminarily calculated and analysed using a spherical tokamak (ST) plasma configuration, and the neutronics properties of the blankets were compared. The results show that, for the transmutation of ²³⁷Np, the FLiBe-cooled waste transmutation blanket has the best transmutation performance. The calculations for the helium-cooled waste transmutation blanket show that it can run at a steady effective multiplication factor (k_eff), steady power (P), and steady tritium breeding ratio (TBR) for a long operating time (9.62 years) by adjusting the initial loading of the minor actinide (MA) ²³⁷Np. (authors)
Analysing the Severity and Frequency of Traffic Crashes in Riyadh City Using Statistical Models
Directory of Open Access Journals (Sweden)
Saleh Altwaijri
2012-12-01
Full Text Available Traffic crashes in Riyadh city cause losses in the form of deaths, injuries and property damage, in addition to the pain and social tragedy affecting the families of victims. In 2005, a total of 47,341 injury traffic crashes occurred in Riyadh city (19% of all crashes in the KSA), and 9% of those crashes were severe. Road safety in Riyadh city may have been adversely affected by high car ownership, migration of people to the city, a high number of daily trips (about 6 million), high incomes, low-cost petrol, drivers of many different nationalities, young drivers, and tremendous population growth, which together create a high level of mobility and transport activity in the city. The primary objective of this paper is therefore to explore factors affecting the severity and frequency of road crashes in Riyadh city using appropriate statistical models, with the aim of establishing effective safety policies that can be implemented to reduce crash severity and frequency. Crash data for Riyadh city were collected from the Higher Commission for the Development of Riyadh (HCDR) for a period of five years, from 1425H to 1429H (roughly corresponding to 2004-2008). Crashes were classified into three categories: fatal, serious-injury and slight-injury. Two nominal response models were developed for the injury-related crash data: a standard multinomial logit model (MNL) and a mixed logit model. Because of severe underreporting of slight-injury crashes, binary and mixed binary logistic regression models were also estimated for two categories of severity: fatal and serious crashes. For frequency, two count models based on the Negative Binomial (NB) were employed, with 168 HAIs (wards) in Riyadh city as the unit of analysis. Ward-level crash data are disaggregated by severity of the crash (fatal and serious-injury crashes). The results from both multinomial and binary response models are found to be fairly consistent but
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples
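The first two steps in the screening hierarchy described above — linear relationships via correlation coefficients and monotonic relationships via rank correlation — can be sketched with NumPy; the central-tendency, variability and chi-square steps follow the same scatterplot-binning pattern. Function names are illustrative, not the paper's code.

```python
import numpy as np

def pearson(x, y):
    """Linear association: ordinary correlation coefficient."""
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    """Monotonic association: rank correlation (assumes no ties)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

def binned_means(x, y, bins=5):
    """Trend in central tendency: mean of y within equal-count bins of x,
    the quantity examined by the mean/median trend tests."""
    order = np.argsort(x)
    return [float(c.mean()) for c in
            np.array_split(np.asarray(y, float)[order], bins)]
```

A monotonic but nonlinear relationship, for example y = x³, is picked up more strongly by the rank correlation than by the linear correlation, which motivates applying the tests in order of increasing pattern complexity.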
International Nuclear Information System (INIS)
Ellingson, D.R.
1993-06-01
The review guide is based on the shared experiences, approaches, and philosophies of the Environmental Restoration/Decontamination and Decommissioning (ER/D&D) subgroup members. It is presented in the form of a review guide to maximize the benefit to both the safety analysis practitioner and the reviewer. The guide focuses on those challenges that tend to be unique to ER/D&D cleanup activities, although some of these experiences, approaches, and philosophies may apply to, or benefit, a broader spectrum of activities such as terminal cleanout or even new operations. Challenges unique to ER/D&D activities include (1) consent agreements requiring activity startup on designated dates; (2) increased uncertainty about specific hazards; and (3) the highly variable activities covered under the broad category of ER/D&D. These unique challenges come in addition to the challenges encountered in all activities, e.g., new and changing requirements and multiple interpretations. The experiences in approaches, methods, and solutions to these challenges are documented from the practitioner's and reviewer's perspective, thereby explaining why a direction was taken and the concerns expressed. Site cleanup consent agreements with predetermined dates for restoration activity startup add the dimension of punitive actions imposed for failure to meet the date; approval of the safety analysis is a prerequisite to startup. Actions that increase expediency are (1) assuring activity safety; (2) documenting that assurance; and (3) acquiring the necessary approvals. These actions increase the timeliness of startup and decrease the potential for punitive action. Expediency has been improved by using safety analysis techniques to provide input to the line management decision process rather than as a review of line management decisions. Expediency is also improved by sharing the safety input and resultant decisions with
Directory of Open Access Journals (Sweden)
Yu-Chai Lai
2015-10-01
Full Text Available Transmedia narratives are a key topic of communication research. Transmedia adaptations occur when a narrator transposes an original work across various media platforms. In light of the pictorial turn, this study used the intermediality of an animated work adapted from an award-winning picture book as an example with which to propose an analysis for theory building. After examining the literature on transmedia narratives, intermediality, and aesthetic communication, this study proposed analyses of the dimensions of transmedia adaptations of pictorial narratives and of aesthetic-pole interpretations. Focusing on three layers (the artistic pole, the aesthetic pole, and their interactions and effects), this study drew on the cases of award-winning picture books and their adapted animated works as a basis for reflecting on aesthetic communication. The artistic pole of transmedia adaptation was used as an example of how a narrator employs intermediality by citing the pictures, plot, or art forms of an original work to reinvent structures and then adapt them according to intermediality. If the aesthetic pole views the adapted animated work after reading the picture book, or forms an expectation of the adaptation because of intermediality, then when the audience watches the adapted animated work their imagination can be stimulated by the intermediality (i.e., picture book graphics and scene depictions). For transmedia narrative interaction, the aesthetic pole must "fill blanks" or "negate" in order to continue viewing the adaptation. For filling blanks, because of intermediality, the aesthetic pole must construct an "intracompositional intermediality" to connect the visual and audio links within the same work, or construct an "extracompositional intermediality" to associate the original work with the transmedia adaptation. For negation, when viewing the adaptation (i.e., the presentation of picture book graphics, added music, or a theme song), the aesthetic pole
Roberts, James S.
Stone and colleagues (C. Stone, R. Ankenman, S. Lane, and M. Liu, 1993; C. Stone, R. Mislevy and J. Mazzeo, 1994; C. Stone, 2000) have proposed a fit index that explicitly accounts for the measurement error inherent in an estimated theta value, here called χ²_i*. The elements of this statistic are natural…
Directory of Open Access Journals (Sweden)
Gabriel Fricout
2011-05-01
Full Text Available The normal human adult kidney contains between 300,000 and 1 million nephrons (the functional units of the kidney). Nephrons develop at the tips of the branching ureteric duct, and therefore ureteric duct branching morphogenesis is critical for normal kidney development. Current methods for analysing ureteric branching are mostly qualitative, and those quantitative methods that do exist do not account for the 3-dimensional (3D) shape of the ureteric "tree". We have developed a method for measuring the total length of the ureteric tree in 3D. This method is described and preliminary data are presented. The algorithm performs a semi-automatic segmentation of a set of grey-level confocal images and an automatic skeletonisation of the resulting binary object. Measurements of length are obtained automatically, and numbers of branch points are counted manually. The final representation can be reconstructed by means of 3D volume rendering software, providing a fully rotating 3D perspective of the skeletonised tree and making it possible to identify and accurately measure branch lengths. Preliminary data show the total length estimates obtained with the technique to be highly reproducible: repeat estimates of total tree length vary by just 1-2%. We will now use this technique to further define the growth of the ureteric tree in vitro, both under normal culture conditions and in the presence of various levels of specific molecules suspected of regulating ureteric growth. The data obtained will provide fundamental information on the development of renal architecture, as well as on the regulation of nephron number.
Energy Technology Data Exchange (ETDEWEB)
Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis; Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis
2016-09-01
Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and the knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim, with Versions 9.60.300, 10.5 and 11.1.6, was installed on the cluster head node, and its distributed processing capability was mapped onto the cluster processors. Other supporting software was tested and installed to support TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and a preliminary upgrade of the TSPA-LA model from Version 9.60.300 to the latest Version 11.1. All TSPA-LA uncertainty and sensitivity analysis modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling-case output generated in FY15 with GoldSim Version 9.60.300, as documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1; upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.
Directory of Open Access Journals (Sweden)
Samek Lucyna
2016-03-01
Full Text Available Samples of the PM10 and PM2.5 fractions were collected between 2010 and 2013 in the urban area of Krakow, Poland. Numerous types of air pollution sources are present at the site, including the steel and cement industries, traffic, municipal emission sources and biomass burning. Energy-dispersive X-ray fluorescence was used to determine the concentrations of the following elements in the collected samples: Cl, K, Ca, Ti, Mn, Fe, Ni, Cu, Zn, Br, Rb, Sr, As and Pb. Using these elements as indicators, airborne particulate matter (APM) source profiles were prepared by applying principal component analysis (PCA), factor analysis (FA) and multiple linear regression (MLR). Four factors identifying possible air pollution sources for both the PM10 and PM2.5 fractions were attributed to municipal emissions, biomass burning, the steel industry, traffic, the cement and metal industries, the Zn and Pb industry, and secondary aerosols. The uncertainty associated with each loading was determined by a statistical simulation method that took into account the individual elemental concentrations and their corresponding uncertainties. A single factor may correspond to two or more sources of airborne particulate matter pollution when those sources are extremely difficult to separate.
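The PCA-plus-MLR receptor-modelling step can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' procedure: the number of retained factors is assumed, and no varimax rotation or uncertainty simulation is included.

```python
import numpy as np

def pca_scores(X, n_components):
    """Principal-component scores of the column-standardised data matrix
    (rows = samples, columns = elemental concentrations)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_components].T

def mlr_apportionment(scores, pm_mass):
    """Regress total PM mass on factor scores; each coefficient indicates
    the corresponding factor's contribution to the measured mass."""
    A = np.column_stack([np.ones(len(scores)), scores])
    coef, *_ = np.linalg.lstsq(A, pm_mass, rcond=None)
    return coef
```

When the measured elements really are linear mixtures of a few hidden sources, the factor scores span the source subspace and the regression reconstructs the total mass almost exactly; with real data the residual instead quantifies unexplained mass.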
International Nuclear Information System (INIS)
Wang Yuming; Cao Hao; Chen Junhong; Zhang Tengfei; Yu Sijie; Zheng Huinan; Shen Chenglong; Wang, S.; Zhang Jie
2010-01-01
In this paper, we present an automated system that can catch and track solar limb prominences based on observations in the extreme-ultraviolet (EUV) 304 Å passband. The characteristic parameters and their evolution, including height, position angle, area, length, and brightness, are obtained without manual intervention. By applying the system to the STEREO-B/SECCHI/EUVI 304 Å data from 2007 April to 2009 October, we obtain a total of 9477 well-tracked prominences and a catalog of these events available online. A detailed analysis of these prominences suggests that the system performs rather well. We have obtained several interesting statistical results based on the catalog. Most prominences appear below a latitude of 60° and at a height of about 26 Mm above the solar surface. Most of them are quite stable during the period they are tracked. Nevertheless, some prominences have an upward speed of more than 100 km s⁻¹, and some others show significant downward and/or azimuthal speeds. There are strong correlations among the brightness, area, and height. The expansion of a prominence is probably one major cause of its fading during the rising or erupting process.
Statistical analysis of nuclear power plant pump failure rate variability: some preliminary results
International Nuclear Information System (INIS)
Martz, H.F.; Whiteman, D.E.
1984-02-01
In-Plant Reliability Data System (IPRDS) pump failure data on over 60 selected pumps in four nuclear power plants are statistically analyzed using the Failure Rate Analysis Code (FRAC). A major purpose of the analysis is to determine which environmental, system, and operating factors adequately explain the variability in the failure data. Catastrophic, degraded, and incipient failure severity categories are considered for both demand-related and time-dependent failures. For catastrophic demand-related pump failures, the variability is explained by the following factors listed in their order of importance: system application, pump driver, operating mode, reactor type, pump type, and unidentified plant-specific influences. Quantitative failure rate adjustments are provided for the effects of these factors. In the case of catastrophic time-dependent pump failures, the failure rate variability is explained by three factors: reactor type, pump driver, and unidentified plant-specific influences. Finally, point and confidence interval failure rate estimates are provided for each selected pump by considering the influential factors. Both types of estimates represent an improvement over the estimates computed exclusively from the data on each pump
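The final step mentioned above — point and confidence-interval failure rate estimates — can be illustrated generically. The estimators below (a normal-approximation binomial interval for demand-related failures and a Poisson maximum-likelihood rate for time-dependent failures) are common textbook choices and are assumptions for this sketch, not FRAC's actual methods.

```python
import math

def demand_failure_rate(failures, demands, z=1.96):
    """Point estimate and approximate 95% confidence interval for a
    demand-related failure probability (normal approximation to the
    binomial).  Returns (lower, point, upper)."""
    p = failures / demands
    half = z * math.sqrt(p * (1 - p) / demands)
    return max(0.0, p - half), p, min(1.0, p + half)

def time_failure_rate(failures, exposure_hours):
    """Maximum-likelihood point estimate of a time-dependent failure
    rate (failures per hour), assuming a Poisson failure process."""
    return failures / exposure_hours
```

In a FRAC-style analysis these per-pump estimates would additionally be adjusted by the influential factors (system application, pump driver, operating mode, and so on) identified from the pooled data.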
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
Hopewell, Sally; Witt, Claudia M; Linde, Klaus; Icke, Katja; Adedire, Olubusola; Kirtley, Shona; Altman, Douglas G
2018-01-11
Selective reporting of outcomes in clinical trials is a serious problem. We aimed to investigate the influence of the peer review process within biomedical journals on the reporting of primary outcome(s) and statistical analyses within reports of randomised trials. Each month, PubMed (May 2014 to April 2015) was searched to identify primary reports of randomised trials published in six high-impact general and 12 high-impact specialty journals. The corresponding author of each trial was invited to complete an online survey asking about changes made to their manuscript as part of the peer review process. Our main outcomes were to assess: (1) the nature and extent of changes made as part of the peer review process in relation to reporting of the primary outcome(s) and/or primary statistical analysis; (2) how often authors followed these requests; and (3) whether this was related to specific journal or trial characteristics. Of the 893 corresponding authors invited to take part in the online survey, 258 (29%) responded. The majority of trials were multicentre (n = 191; 74%); the median sample size was 325 (IQR 138 to 1010). The primary outcome was clearly defined in 92% (n = 238), and the direction of the treatment effect was statistically significant in 49% of these. On a 1-10 Likert scale, the majority of respondents were satisfied with the overall handling (mean 8.6, SD 1.5) and quality of peer review (mean 8.5, SD 1.5) of their manuscript. Only 3% (n = 8) said that the editor or peer reviewers had asked them to change or clarify the trial's primary outcome. However, 27% (n = 69) reported being asked to change or clarify the statistical analysis of the primary outcome; most had fulfilled the request, the main motivation being to improve the statistical methods (n = 38; 55%) or to avoid rejection (n = 30; 44%). Overall, there was little association between authors being asked to make this change and the type of journal, intervention, significance of the
Directory of Open Access Journals (Sweden)
Viviane Moura Rocha
2015-04-01
Full Text Available This paper presents a topographic analysis of the fields of consumer loyalty and loyalty programs, widely studied in recent decades and still relevant in the marketing literature. After the identification of 250 scientific papers published in indexed journals over the last ten years, a subset of 76 was chosen and their 3223 references were extracted. The journals in which these papers were published, their keywords, abstracts, authors, institutions of origin and citation patterns were identified and analyzed using bibliometrics, spatial statistics techniques and network analyses. The results allow the identification of the central components of the field, as well as of the main authors, journals, institutions and countries that intermediate the diffusion of knowledge, contributing to an understanding of the constitution of the field by researchers and students.
Energy Technology Data Exchange (ETDEWEB)
Hermanson, Jan; Forssberg, Ola [Golder Associates AB, Stockholm (Sweden); Fox, Aaron; La Pointe, Paul [Golder Associates Inc., Redmond, WA (United States)
2005-10-15
The goal of this summary report is to document the data sources, software tools, experimental methods, assumptions, and model parameters in the discrete-fracture network (DFN) model for the local model volume in Laxemar, version 1.2. The model parameters presented herein are intended for use by other project modeling teams. Individual modeling teams may elect to simplify or use only a portion of the DFN model, depending on their needs. This model is not intended to be a flow model or a mechanical model; as such, only the geometrical characterization is presented. The derivations of the hydraulic or mechanical properties of the fractures or their subsurface connectivities are not within the scope of this report. This model represents analyses carried out on particular data sets. If additional data are obtained, or values for existing data are changed or excluded, the conclusions reached in this report, and the parameter values calculated, may change as well. The model volume is divided into two subareas; one located on the Simpevarp peninsula adjacent to the power plant (Simpevarp), and one further to the west (Laxemar). The DFN parameters described in this report were determined by analysis of data collected within the local model volume. As such, the final DFN model is only valid within this local model volume and the modeling subareas (Laxemar and Simpevarp) within.
Buttigieg, Pier Luigi; Ramette, Alban
2014-12-01
The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.
Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.
2012-12-01
Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent
Energy Technology Data Exchange (ETDEWEB)
Mendell, Mark J.; Cozen, Myrna
2002-09-01
The authors assessed relationships between health symptoms in office workers and risk factors related to moisture and contamination, using data collected from a representative sample of U.S. office buildings in the U.S. EPA BASE study. Methods: Analyses assessed associations between three types of weekly, work-related symptoms (lower respiratory, mucous membrane, and neurologic) and risk factors for moisture or contamination in these office buildings. Multivariate logistic regression models were used to estimate the strength of associations for these risk factors as odds ratios (ORs) adjusted for personal-level potential confounding variables related to demographics, health, job, and workspace. A number of risk factors were significantly associated (i.e., 95% confidence limits excluded 1.0) with small to moderate increases in one or more symptom outcomes. Significantly elevated ORs for mucous membrane symptoms were associated with the following risk factors: presence of a humidification system in good condition versus none (OR = 1.4); air handler inspection annually versus daily (OR = 1.6); current water damage in the building (OR = 1.2); and less than daily vacuuming in the study space (OR = 1.2). Significantly elevated ORs for lower respiratory symptoms were associated with: air handler inspection annually versus daily (OR = 2.0); air handler inspection less than daily but at least semi-annually (OR = 1.6); less than daily cleaning of offices (OR = 1.7); and less than daily vacuuming of the study space (OR = 1.4). Only two statistically significant risk factors for neurologic symptoms were identified: presence of any humidification system versus none (OR = 1.3); and less than daily vacuuming of the study space (OR = 1.3). Dirty cooling coils, dirty or poorly draining drain pans, and standing water near outdoor air intakes, evaluated by inspection, were not identified as risk factors in these analyses, despite predictions based on previous findings elsewhere, except that very
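The adjusted odds ratios reported above come out of a multivariate logistic regression. As a rough illustration of the mechanics, the following sketch fits such a model by Newton-Raphson on synthetic data (not the BASE dataset); the variable names and effect sizes are invented for the example.

```python
# Sketch: adjusted odds ratio from a logistic regression,
# log-odds(symptom) = b0 + b1*risk + b2*confounder, then OR = exp(b1).
# All data below are synthetic stand-ins, not BASE study data.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
risk = rng.integers(0, 2, n)             # e.g. water damage present (0/1)
confounder = rng.normal(0, 1, n)         # e.g. a standardized personal covariate
logit = -1.0 + np.log(1.4) * risk + 0.3 * confounder  # true OR = 1.4
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), risk, confounder])
beta = np.zeros(3)
for _ in range(25):                      # Newton-Raphson for the MLE
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))

print(f"adjusted OR for the risk factor: {np.exp(beta[1]):.2f}")  # true value is 1.4 by construction
```

With enough data the estimated OR recovers the value built into the simulation; the same exp-of-coefficient step is how ORs such as 1.4 or 2.0 above are obtained from fitted models.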
Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.
2018-01-01
Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments with large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease-spreading dynamics there were little explored in previous studies. Given the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and contact duration in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, the Weibull distribution fitted the histogram values of individual-based exposure very well in each case. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and help refine existing disease-spreading models.
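The Weibull fit reported above can be sketched as follows; the "exposure" values here are synthetic stand-ins drawn for illustration, not CityFlow output, and the shape/scale values are assumptions of the example.

```python
# Sketch: fitting a two-parameter Weibull distribution to individual
# exposure values and checking the fit, as described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical individual exposure values (arbitrary units)
exposure = rng.weibull(1.8, size=2000) * 5.0

# Fit a Weibull with the location parameter fixed at zero
shape, loc, scale = stats.weibull_min.fit(exposure, floc=0)

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted law
ks_stat, p_value = stats.kstest(exposure, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.2f}, KS p-value={p_value:.3f}")
```

A large KS p-value indicates the Weibull hypothesis is not rejected, which is the kind of evidence behind the "fitted very well" statement in the abstract.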
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
International Nuclear Information System (INIS)
Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T.; Van Laere, K.; Jamart, J.; D'Asseler, Y.; Minoshima, S.
2009-01-01
Fully automated analysis programs have been applied more and more to aid in the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of Three-Dimensional Stereotactic Surface Projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine 99mTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with region as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)
Energy Technology Data Exchange (ETDEWEB)
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
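The levelized prices quoted above (e.g. $0.092/kWh for coal, $0.069/kWh for gas at an 80% capacity factor) are constant per-kWh prices whose discounted value equals the discounted stream of contract payments. A minimal sketch, with illustrative numbers that are not from the paper's sample:

```python
# Sketch: a 20-year levelized contract price, computed as
# PV(payments) / PV(energy). Capacity, price and rates are invented.
def levelized_price(payments, energy_kwh, rate):
    """Constant $/kWh with the same present value as the payment stream."""
    pv_pay = sum(p / (1 + rate) ** t for t, p in enumerate(payments, start=1))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_kwh, start=1))
    return pv_pay / pv_energy

years = 20
energy = [700_000_000.0] * years  # kWh/yr: ~100 MW at an 80% capacity factor
# Hypothetical contract: $0.069/kWh starting price, escalating 3%/yr
payments = [e * 0.07 * 1.03 ** t for t, e in enumerate(energy)]

print(f"{levelized_price(payments, energy, 0.08):.3f} $/kWh")
```

Because escalation (3%/yr here) is below the discount rate (8%), the levelized price sits between the first-year and final-year nominal prices, which is why levelization makes contracts with different escalation profiles comparable.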
Ferraccioli, Fausto; Anderson, Lester; Jamieson, Stewart; Bell, Robin; Rose, Kathryn; Jordan, Tom; Finn, Carol; Damaske, Detlef
2013-04-01
whether the modern Gamburtsevs may have been uplifted solely in response to significant changes in Cenozoic erosion patterns during the early stages of East Antarctic ice sheet formation that were superimposed upon an old remnant Permian-age rift flank. To address this question we combine results from: i) analyses of the subglacial landscape of the GSM that include valley-network, hypsometry and geomorphic studies of the fluvial and glacial features identified within the range (Rose et al., 2013 EPSL in review) with; ii) preliminary flexural models of peak uplift caused by the isostatic responses to fluvial and glacial valley incision processes, both within the range and in the adjacent Lambert Glacier region. We also include in our geophysical relief and isostatic model calculations considerations on the major change in erosion rates since Oligocene times and the total amount of incision estimated for the Lambert Glacier system using the values proposed by Tochlin et al. (2012). Our models yield new estimates of peak uplift and regional lowering for continuous and broken-plate approximations that can also be used to assess the range of pre-incision elevation of the "Gamburtsev plateau". Our modelling outputs were also calibrated against the present-day elevations of up to 1500 m a.s.l. of uplifted Oligocene-early Miocene glacial-marine sediments in the Lambert Glacier (Hambrey et al., 2000, Geology).
Melsen, W G; Rovers, M M; Bonten, M J M; Bootsma, M C J|info:eu-repo/dai/nl/304830305
Variance between studies in a meta-analysis will exist. This heterogeneity may be of clinical, methodological or statistical origin. The last of these is quantified by the I(2)-statistic. We investigated, using simulated studies, the accuracy of I(2) in the assessment of heterogeneity and the
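The I(2) statistic discussed above is conventionally computed from Cochran's Q as I² = max(0, (Q − df)/Q) × 100. A minimal sketch with made-up effect sizes and variances:

```python
# Sketch of the I^2 heterogeneity statistic: the share of total variation
# across studies attributable to heterogeneity rather than chance.
# Effect sizes and variances below are illustrative, not from real studies.
import numpy as np

def i_squared(effects, variances):
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)       # fixed-effect pooled estimate
    q = np.sum(w * (effects - pooled) ** 2)        # Cochran's Q
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Homogeneous studies -> I^2 near 0; widely scattered studies -> large I^2
print(i_squared([0.30, 0.32, 0.29, 0.31], [0.01] * 4))
print(i_squared([0.10, 0.80, 0.20, 0.90], [0.01] * 4))
```

The truncation of negative (Q − df)/Q to zero is what makes I² a percentage in [0, 100], and is one reason its small-sample accuracy, as studied above, is of interest.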
International Nuclear Information System (INIS)
1992-12-01
This report describes preliminary probabilistic sensitivity analyses of long term gas and brine migration at the Waste Isolation Pilot Plant (WIPP). Because gas and brine are potential transport media for organic compounds and heavy metals, understanding two-phase flow in the repository and the surrounding Salado Formation is essential to evaluating long-term compliance with 40 CFR 268.6, which is the portion of the Land Disposal Restrictions of the Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act that states the conditions for disposal of specified hazardous wastes. Calculations described here are designed to provide guidance to the WIPP Project by identifying important parameters and helping to recognize processes not yet modeled that may affect compliance. Based on these analyses, performance is sensitive to shaft-seal permeabilities, parameters affecting gas generation, and the conceptual model used for the disturbed rock zone surrounding the excavation. Brine migration is less likely to affect compliance with 40 CFR 268.6 than gas migration. However, results are preliminary, and additional iterations of uncertainty and sensitivity analyses will be required to provide the confidence needed for a defensible compliance evaluation. Specifically, subsequent analyses will explicitly include effects of salt creep and, when conceptual and computational models are available, pressure-dependent fracturing of anhydrite marker beds
Berto, Daniela; Rampazzo, Federico; Gion, Claudia; Noventa, Seta; Ronchi, Francesca; Traldi, Umberto; Giorgi, Giordano; Cicero, Anna Maria; Giovanardi, Otello
2017-06-01
Plastic waste is a growing global environmental problem, particularly in marine ecosystems, in consideration of its persistence. The monitoring of plastic waste has become a global issue, as reported by several surveillance guidelines proposed by Regional Sea Conventions (OSPAR, UNEP) and appointed by the EU Marine Strategy Framework Directive. Policy responses to plastic waste vary at many levels, ranging from beach clean-ups to bans on the commercialization of plastic bags and to Regional Plans for waste management and recycling. Moreover, in recent years, the production of plant-derived biodegradable plastic polymers has assumed increasing importance. This study reports the first preliminary characterization of carbon stable isotopes (δ13C) of different plastic polymers (petroleum- and plant-derived) in order to increase the dataset of isotopic values as a tool for further investigation in different fields of polymer research as well as in marine environment surveillance. The δ13C values determined in different packaging for food uses reflect the plant origin of "BIO" materials, whereas the recycled plastic materials displayed δ13C signatures between plant- and petroleum-derived polymer sources. In a preliminary estimation, the different colours of plastic did not affect the variability of δ13C values, whereas the abiotic and biotic degradation processes that occurred in the plastic materials collected on beaches and in seawater showed less negative δ13C values. A preliminary experimental field test confirmed these results. The advantages offered by isotope ratio mass spectrometry with respect to other analytical methods used to characterize the composition of plastic polymers are high sensitivity, the small amount of material required, rapidity of analysis, low cost and no limitation on black/dark samples compared with spectroscopic analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi
2009-01-01
Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions and analytical codes. The safety analyses for licensing purposes are inherently deterministic. Therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, the use of conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up a standard for using best-estimate codes in safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)
Tay, Louis; Drasgow, Fritz
2012-01-01
Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X²/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…
Energy Technology Data Exchange (ETDEWEB)
Lee, Jong Youl; Lee, Min Soo; Choi, Heui Joo; Kim, Geon Young; Kim, Kyung Su [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2016-06-15
Spent fuels from nuclear power plants, as well as high-level radioactive waste from the recycling of spent fuels, should be safely isolated from the human environment for an extremely long time. Recently, meaningful studies on the development of deep borehole radioactive waste disposal systems at 3-5 km depth have been carried out in the USA and some countries in Europe, owing to great advances in deep borehole drilling technology. In this paper, domestic deep geoenvironmental characteristics are preliminarily investigated to analyze the applicability of deep borehole disposal technology in Korea. To do this, state-of-the-art technologies in the USA and some countries in Europe are reviewed, and geological and geothermal data from deep boreholes drilled for geothermal use are analyzed. Based on the results on crystalline rock depth, the geothermal gradient and the spent fuel types generated in Korea, a preliminary deep borehole concept, including the disposal canister and sealing system, is suggested.
Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos
2011-03-01
The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The example of an ordinal variable under this study is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both biological and statistical senses and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is a need to model variables that can only be assessed through an ordinal scale of values.
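The ordered logit (proportional odds) model described above posits a latent continuous imposex level cut into observed VDS stages by thresholds. The sketch below computes stage probabilities and P(VDS ≥ 2) from the cumulative-logit form; the cut-points and linear predictor value are invented for illustration, not the fitted Portuguese-survey estimates.

```python
# Sketch of an ordered logit model: P(VDS <= j) = sigmoid(theta_j - xb),
# where xb is the linear predictor (e.g. from TBT level and shell size)
# and theta_j are the cut-points on the latent imposex scale.
import math

def vds_probabilities(xb, cutpoints):
    """Per-stage probabilities P(VDS = j) from cumulative logits."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [sigmoid(t - xb) for t in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Hypothetical cut-points for stages 0..5 and a hypothetical predictor value
cuts = [-2.0, -0.5, 1.0, 2.5, 4.0]
xb = 1.8
p = vds_probabilities(xb, cuts)
p_ge2 = sum(p[2:])   # OSPAR-relevant probability of a female with VDS >= 2
print(round(p_ge2, 3))
```

Note that P(VDS ≥ 2) collapses to 1 − sigmoid(θ₁ − xb), which is the quantity the paper estimates per year to track environmental risk.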
Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.
2015-03-01
Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fibre-reinforced resins) are considered. Layered materials still suffer drastic reductions of maximum allowable stress values during the design phase, as well as costly recurrent inspections during the life-cycle phase, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the possible presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonds). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate possible flaws (an SHM system) can be considered, once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system aimed at delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented towards characterizing the statistical distribution of measurements and damage metrics, as well as the system's detection capability under this approach. Numerically, it is not possible to substitute the part of the experimental tests aimed at POD, where the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.
Energy Technology Data Exchange (ETDEWEB)
Reddy, T.A. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States)); Claridge, D.E. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States))
1994-01-01
Multiple regression modeling of monitored building energy use data is often faulted as a means of predicting energy use, on the grounds that multicollinearity between the regressor variables can lead both to improper interpretation of the relative importance of the various physical regressor parameters and to a model with unstable regressor coefficients. Principal component analysis (PCA) has the potential to overcome such drawbacks. While a few case studies have already attempted to apply this technique to building energy data, the objectives of this study were to make a broader evaluation of PCA and multiple regression analysis (MRA) and to establish guidelines under which one approach is preferable to the other. Four geographic locations in the US with different climatic conditions were selected, and synthetic data sequences representative of daily energy use in large institutional buildings were generated in each location using a linear model with outdoor temperature, outdoor specific humidity and solar radiation as the three regression variables. MRA and PCA approaches were then applied to these data sets and their relative performances were compared. Conditions under which PCA seems to perform better than MRA were identified, and preliminary recommendations on the use of either modeling approach were formulated. (orig.)
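The MRA-versus-PCA comparison described above can be sketched on synthetic data of the same flavour: daily energy use driven by three correlated weather regressors, fitted by ordinary least squares on the raw variables and on principal components. The coefficients, correlations and noise level are illustrative assumptions, not the study's values.

```python
# Sketch: multiple regression vs principal-component regression on
# collinear synthetic weather data (temperature, humidity, solar).
import numpy as np

rng = np.random.default_rng(0)
n = 365
temp = rng.normal(20, 8, n)
humidity = 0.6 * temp + rng.normal(0, 2, n)    # deliberately collinear with temp
solar = 0.4 * temp + rng.normal(0, 3, n)
energy = 5.0 + 2.0 * temp + 1.0 * humidity + 0.5 * solar + rng.normal(0, 1, n)

X = np.column_stack([temp, humidity, solar])
Xc = X - X.mean(axis=0)

# MRA: ordinary least squares on the raw (collinear) regressors
beta = np.linalg.lstsq(np.column_stack([np.ones(n), X]), energy, rcond=None)[0]

# PCA: project onto principal components, regress, map back to variable space
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt.T                              # principal-component scores
gamma = np.linalg.lstsq(scores, energy - energy.mean(), rcond=None)[0]
beta_pca = vt.T @ gamma

print(beta[1:], beta_pca)
```

With all components retained, the two slope vectors coincide; PCA only changes the answer (and stabilizes coefficients) when low-variance components are dropped, which is the trade-off the study evaluates.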
Directory of Open Access Journals (Sweden)
Putnik-Delić Marina I.
2010-01-01
Full Text Available Eleven sugar beet genotypes were tested for their capacity to tolerate drought. Plants were grown in semi-controlled conditions, in the greenhouse, and watered daily. After 90 days, water deficit was imposed by the cessation of watering, while the control plants continued to be watered up to 80% of FWC. Five days later, the concentration of free proline in leaves was determined. Analysis was done in three replications. Statistical analysis was performed using STATISTICA 9.0, Minitab 15, and R 2.11.1. Differences between genotypes were statistically processed by the Duncan test. Because of the non-normality of the data distribution and the heterogeneity of variances in different groups, two types of transformations of the raw data were applied. For this type of data, the Johnson transformation was more effective at eliminating non-normality than the Box-Cox transformation. Based on both transformations it may be concluded that in all genotypes except genotype 10, the concentration of free proline differs significantly between the treatment (drought) and the control.
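The normality-restoring transformation step above can be sketched with scipy's Box-Cox (the Johnson family has no comparable scipy one-liner); the "proline" values below are synthetic right-skewed stand-ins, not the measured data.

```python
# Sketch: Box-Cox transformation of skewed measurements before parametric
# testing, with Shapiro-Wilk checks of normality before and after.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
proline = rng.lognormal(mean=1.0, sigma=0.6, size=60)  # skewed, positive data

transformed, lmbda = stats.boxcox(proline)  # lambda chosen by maximum likelihood

# Shapiro-Wilk: the transformed data should look much more normal
p_raw = stats.shapiro(proline).pvalue
p_tr = stats.shapiro(transformed).pvalue
print(f"lambda={lmbda:.2f}, raw p={p_raw:.4f}, transformed p={p_tr:.4f}")
```

For lognormal-like data the fitted lambda lands near zero (a log transform), after which means can be compared with tests, such as Duncan's, that assume normal, homoscedastic errors.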
Directory of Open Access Journals (Sweden)
Carles Comas
2015-04-01
Full Text Available Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H) as hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area = 888 m2). We tested the hypothesis that both species take up water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency, depending on whether neighbouring trees belong to one species or the other. Material and Methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair-correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a mark point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰) were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰), pointing to a greater contribution of deeper soil layers for water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.
Energy Technology Data Exchange (ETDEWEB)
La Pointe, Paul R. [Golder Associate Inc., Redmond, WA (United States); Olofsson, Isabelle; Hermanson, Jan [Golder Associates AB, Uppsala (Sweden)
2005-04-01
Compared to version 1.1, a much larger amount of data especially from boreholes is available. Both one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals varies from borehole to borehole but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is not a consistent pattern of intervals of high fracture intensity at or near to the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (whether a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated to the presence of first and second generation minerals (epidote, calcite). No clear correlation for these fracture intensity intervals has been identified between holes. Based on these results the fracture frequency has been calculated in each rock domain for the
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new-generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system (LIS) with image and statistical analysis tools. Consecutive sections of a tissue microarray (TMA) containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were
Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert
2007-03-01
This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
International Nuclear Information System (INIS)
Heard, F.J.
1997-01-01
The SAS4A/SASSYS-1 computer code is used to perform a series of analyses for the limiting protected design basis transient events for a representative tritium and medical isotope production core design proposed for the Fast Flux Test Facility (FFTF). The FFTF tritium and isotope production mission will require a different core loading which features higher-enrichment fuel, tritium targets, and medical isotope production assemblies. Changes in several key core parameters, such as the Doppler coefficient and delayed neutron fraction, will affect the transient response of the reactor. Both reactivity insertion and reduction of heat removal events were analyzed. The analysis methods and modeling assumptions are described. Results of the analyses and comparison against fuel pin performance criteria are presented to provide quantification that the plant protection system is adequate to maintain the necessary safety margins and assure cladding integrity.
Ortolano, Gaetano; Visalli, Roberto; Godard, Gaston; Cirrincione, Rosolino
2018-06-01
We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
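The calibration step described above, a multiple linear regression from raw X-ray counts to concentrations constrained by internal-standard spot analyses, can be sketched in Python with numpy. This is an illustrative sketch only: the 10% inter-element cross-talk matrix and the synthetic standard spots are invented, not taken from the tool itself.

```python
import numpy as np

def calibrate(counts, wt_pct):
    """Fit the wt% of each element as a linear function of the raw counts
    of *all* elements (plus an intercept), which accounts for the
    interdependence among elements. Rows are internal-standard spots."""
    X = np.column_stack([np.ones(len(counts)), counts])  # add intercept
    coeffs, *_ = np.linalg.lstsq(X, wt_pct, rcond=None)
    return coeffs  # shape: (n_elements + 1, n_elements)

def apply_calibration(coeffs, counts):
    """Apply the fitted calibration to new (map-pixel) count vectors."""
    X = np.column_stack([np.ones(len(counts)), counts])
    return X @ coeffs

# Synthetic example: 6 standard spots, 2 elements with 10% cross-talk.
rng = np.random.default_rng(0)
true_wt = rng.uniform(1.0, 20.0, size=(6, 2))
counts = true_wt @ np.array([[100.0, 10.0], [10.0, 100.0]])
coeffs = calibrate(counts, true_wt)
predicted = apply_calibration(coeffs, counts)
```

Because the synthetic counts are an exact linear mixture, the regression recovers the standard compositions; in practice the fit residuals at the spot analyses serve as the accuracy-check indexes mentioned above.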
Directory of Open Access Journals (Sweden)
Jing-Chzi Hsieh
2016-05-01
The recycled Kevlar®/polyester/low-melting-point polyester (recycled Kevlar®/PET/LPET nonwoven geotextiles are immersed in neutral, strong acid, and strong alkali solutions, respectively, at different temperatures for four months. Their tensile strength is then tested according to various immersion periods at various temperatures, in order to determine their durability to chemicals. For the purpose of analyzing the possible factors that influence mechanical properties of geotextiles under diverse environmental conditions, the experimental results and statistical analyses are incorporated in this study. Therefore, influences of the content of recycled Kevlar® fibers, implementation of thermal treatment, and immersion periods on the tensile strength of recycled Kevlar®/PET/LPET nonwoven geotextiles are examined, after which their influential levels are statistically determined by performing multiple regression analyses. According to the results, the tensile strength of nonwoven geotextiles can be enhanced by adding recycled Kevlar® fibers and thermal treatment.
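A multiple regression of this kind, comparing the "influential levels" of several factors, is often done on standardized predictors so the coefficient magnitudes are directly comparable. The sketch below uses invented data (Kevlar content, a 0/1 thermal-treatment flag, and immersion period against tensile strength), not the paper's measurements:

```python
import numpy as np

# Hypothetical dataset: columns are Kevlar content (wt%), thermal treatment
# (0 = no, 1 = yes), and immersion period (months); response is tensile
# strength (arbitrary units). Values are invented for illustration.
X_raw = np.array([
    [10, 0, 1], [10, 1, 2], [20, 0, 3], [20, 1, 4],
    [30, 0, 1], [30, 1, 2], [40, 0, 3], [40, 1, 4],
], dtype=float)
y = np.array([50.0, 58.0, 55.0, 61.0, 60.0, 70.0, 63.0, 71.0])

# Standardize predictors so fitted coefficients are comparable measures
# of influence, as in the multiple regression analyses described above.
Xs = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)
A = np.column_stack([np.ones(len(Xs)), Xs])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
influence = dict(zip(["kevlar", "thermal", "immersion"], np.abs(beta[1:])))
```

Ranking the absolute standardized coefficients in `influence` then gives the statistically determined ordering of factor importance.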
International Nuclear Information System (INIS)
Ghantous, N.Y.; Raines, G.E.
1987-10-01
This report presents thermal/thermomechanical analyses of the Site Characterization Plan Conceptual Design for horizontal package emplacement at the Deaf Smith County site, Texas. The repository was divided into three geometric regions, and two-dimensional finite-element models were set up to approximate the three-dimensional nature of each region. Thermal and quasistatic thermomechanical finite-element analyses were performed to evaluate the thermal/thermomechanical responses of the three regions. The exponential-time creep law was used to represent the creep behavior of salt rock. The repository design was evaluated by comparing the thermal/thermomechanical responses obtained for the three regions with interim performance constraints. The preliminary results show that all the performance constraints are met except for those of the waste package. The following factors were considered in interpreting these results: (1) the qualitative description of the analytical responses; (2) the limitations of the analyses; and (3) whether conclusions can be drawn from an overall evaluation of the limitations and analytical results, or whether the repository design can be evaluated only after further analyses. Furthermore, a parametric analysis was performed to estimate the effect of material parameters on the predicted thermal/thermomechanical response. 23 refs., 34 figs., 9 tabs
Yokoyama, Shozo; Takenaka, Naomi
2005-04-01
Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.
Wafer, Lucas; Kloczewiak, Marek; Luo, Yin
2016-07-01
Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
Gregoire, Alexandre David
2011-07-01
The goal of this research was to accurately predict the ultimate compressive load of impact-damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data-treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impact-damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen SOM neural network to accurately classify the material failure mechanisms present in the low proof load data. The numbers of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst-case ultimate load prediction error of 16.17%, just outside the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process, which was largely responsible for the accuracy of the results.
International Nuclear Information System (INIS)
Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo
2015-01-01
Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
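The PLS-plus-RMSE workflow described above can be sketched with a minimal NIPALS-style PLS1 fit in numpy. The seven synthetic input columns stand in for the plant parameters; the data and coefficients are invented, and this is a sketch of the technique, not the authors' model:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal NIPALS PLS1: returns regression coefficients B (and the
    training means) such that y_hat = (X - x_mean) @ B + y_mean."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    x_mean, y_mean = X.mean(axis=0), y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                  # weight vector for this component
        w /= np.linalg.norm(w)
        t = Xc @ w                     # scores
        tt = t @ t
        p = Xc.T @ t / tt              # X loadings
        c = yc @ t / tt                # y loading
        Xc = Xc - np.outer(t, p)       # deflate
        yc = yc - c * t
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.inv(P.T @ W) @ q
    return B, x_mean, y_mean

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 7))   # seven input parameters, as in the study
y = X @ np.array([1.0, 0.5, 0, 0, 0.2, 0, 0]) + 0.01 * rng.normal(size=50)
B, xm, ym = pls1_fit(X, y, n_components=3)
pred = (X - xm) @ B + ym
err = rmse(y, pred)
```

Comparing `err` between two candidate process datasets is exactly the RMSE comparison used above to decide which process is modeled more accurately.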
Energy Technology Data Exchange (ETDEWEB)
Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)
2015-08-15
Deng, Xueliang; Nie, Suping; Deng, Weitao; Cao, Weihua
2018-04-01
In this study, we compared the following four different gridded monthly precipitation products: the National Centers for Environmental Prediction version 2 (NCEP-2) reanalysis data, the satellite-based Climate Prediction Center Morphing technique (CMORPH) data, the merged satellite-gauge Global Precipitation Climatology Project (GPCP) data, and the merged satellite-gauge-model data from the Beijing Climate Center Merged Estimation of Precipitation (BMEP). We evaluated the performances of these products using monthly precipitation observations spanning the period of January 2003 to December 2013 from a dense, national, rain gauge network in China. Our assessment involved several statistical techniques, including spatial pattern, temporal variation, bias, root-mean-square error (RMSE), and correlation coefficient (CC) analysis. The results show that NCEP-2, GPCP, and BMEP generally overestimate monthly precipitation at the national scale and CMORPH underestimates it. However, all of the datasets successfully characterized the northwest to southeast increase in the monthly precipitation over China. Because they include precipitation gauge information from the Global Telecommunication System (GTS) network, GPCP and BMEP have much smaller biases, lower RMSEs, and higher CCs than NCEP-2 and CMORPH. When the seasonal and regional variations are considered, NCEP-2 has a larger error over southern China during the summer. CMORPH poorly reproduces the magnitude of the precipitation over southeastern China and the temporal correlation over western and northwestern China during all seasons. BMEP has a lower RMSE and higher CC than GPCP over eastern and southern China, where the station network is dense. In contrast, BMEP has a lower CC than GPCP over western and northwestern China, where the gauge network is relatively sparse.
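The bias, RMSE, and correlation-coefficient statistics used in this product evaluation are straightforward to compute; the sketch below uses invented gauge values and a product with a uniform 20% overestimate (the kind of positive bias reported for NCEP-2, GPCP, and BMEP):

```python
import numpy as np

def evaluate(product, gauge):
    """Bias, RMSE, and Pearson correlation coefficient (CC) between a
    gridded product and gauge observations (1-D arrays of monthly values)."""
    product = np.asarray(product, dtype=float)
    gauge = np.asarray(gauge, dtype=float)
    bias = float(np.mean(product - gauge))
    rmse = float(np.sqrt(np.mean((product - gauge) ** 2)))
    cc = float(np.corrcoef(product, gauge)[0, 1])
    return bias, rmse, cc

# Hypothetical monthly gauge precipitation (mm) at one station:
gauge = np.array([10.0, 30.0, 80.0, 120.0, 60.0, 20.0])
overestimating_product = gauge * 1.2   # invented 20% wet bias
bias, rmse, cc = evaluate(overestimating_product, gauge)
```

A positive `bias` flags overestimation, `rmse` measures the typical error magnitude, and `cc` captures how well the temporal variation is reproduced, the three axes on which the four products are compared above.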
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an on-going management concern in municipalities of all sizes. In both combined and separated sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of the BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time consuming, relying on significant sources of funds, which a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
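The kind of power calculation described above (the study itself used R) can be sketched in Python with a normal-approximation sample-size formula for comparing two means. The effect sizes below are invented for illustration; exact t-based answers differ slightly:

```python
from scipy.stats import norm

def samples_needed(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for detecting a
    difference of `effect_size` (Cohen's d) between two means at the
    given significance level and power."""
    z_a = norm.ppf(1 - alpha / 2)   # two-sided critical value
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) / effect_size) ** 2

# An invented small water-quality improvement (d = 0.2) needs hundreds of
# weekly samples, while a large one (d = 0.8) needs only a few dozen.
n_small = samples_needed(0.2)
n_large = samples_needed(0.8)
```

This illustrates the paper's central point: when the BMP-induced change is small relative to the natural variability, the required number of measurements climbs into the hundreds.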
International Nuclear Information System (INIS)
Amorim, E.S. do; Castro Lobo, P.D. de.
1980-11-01
A reduction in computing effort was achieved by applying space-independent continuous slowing-down theory to the spectrum-averaged cross sections and then expressing them as a quadratic correlation with temperature and composition. The decoupling between variables that express some of the important nuclear characteristics allowed a sensitivity analysis treatment to be introduced for the full prediction of the behavior, over the fuel cycle, of the LMFBR considered. A potential application of the method developed here is to predict the nuclear characteristics of another reactor relative to a reference reactor of the family considered. Excellent agreement with exact calculation is observed only when perturbations occur in nuclear data and/or fuel isotopic characteristics, but fair results are obtained with variations in system components other than the fuel. (Author) [pt
Directory of Open Access Journals (Sweden)
Wang Xiaoqiang
2012-04-01
Abstract Background Quantitative trait loci (QTL) detection on a huge amount of phenotypes, like eQTL detection on transcriptomic data, can be dramatically impaired by the statistical properties of interval mapping methods. One major outcome is the high number of QTL detected at marker locations. The present study aims at identifying and specifying the sources of this bias, in particular in the case of analysis of data issued from outbred populations. Analytical developments were carried out in a backcross situation in order to specify the bias and to propose an algorithm to control it. The outbred population context was studied through simulated data sets in a wide range of situations. The likelihood ratio test was first analyzed under the "one QTL" hypothesis in a backcross population. Designs of sib families were then simulated and analyzed using the QTL Map software. On the basis of the theoretical results in backcross, parameters such as the population size, the density of the genetic map, the QTL effect and the true location of the QTL were taken into account under the "no QTL" and the "one QTL" hypotheses. A combination of two non-parametric tests - the Kolmogorov-Smirnov test and the Mann-Whitney-Wilcoxon test - was used in order to identify the parameters that affected the bias and to specify how much they influenced the estimation of QTL location. Results A theoretical expression of the bias of the estimated QTL location was obtained for a backcross type population. We demonstrated a common source of bias under the "no QTL" and the "one QTL" hypotheses and qualified the possible influence of several parameters. Simulation studies confirmed that the bias exists in outbred populations under both the hypotheses of "no QTL" and "one QTL" on a linkage group. The QTL location was systematically closer to marker locations than expected, particularly in the case of low QTL effect, small population size or low density of markers, i
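The two non-parametric tests named above are available in scipy; the sketch below applies them to invented distributions of estimated QTL locations, one uniform and one with an excess of estimates piled on a marker at 0 cM, to mimic the kind of marker-attraction bias the study quantifies:

```python
import numpy as np
from scipy.stats import ks_2samp, mannwhitneyu

rng = np.random.default_rng(2)
# Hypothetical estimated QTL locations (cM) from two simulated designs:
# one unbiased, one attracted toward a marker placed at 0 cM.
unbiased = rng.uniform(0, 20, size=200)
biased = np.concatenate([np.zeros(80), rng.uniform(0, 20, size=120)])

# Kolmogorov-Smirnov compares the full distributions; Mann-Whitney-
# Wilcoxon compares their locations (stochastic ordering).
ks_stat, ks_p = ks_2samp(unbiased, biased)
mw_stat, mw_p = mannwhitneyu(unbiased, biased, alternative="two-sided")
```

Low p-values from both tests indicate that the design parameter in question (here, the invented marker attraction) has shifted the distribution of estimated locations, which is how the study identifies the parameters driving the bias.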
Directory of Open Access Journals (Sweden)
Deverick J Anderson
The rate of community-acquired Clostridium difficile infection (CA-CDI) is increasing. While receipt of antibiotics remains an important risk factor for CDI, studies related to acquisition of C. difficile outside of hospitals are lacking. As a result, risk factors for exposure to C. difficile in community settings have been inadequately studied. To identify novel environmental risk factors for CA-CDI, we performed a population-based retrospective cohort study of patients with CA-CDI from 1/1/2007 through 12/31/2014 in a 10-county area in central North Carolina. 360 Census Tracts in these 10 counties were used as the demographic Geographic Information System (GIS) base-map. Longitude and latitude (X, Y) coordinates were generated from patient home addresses and overlaid to Census Tract polygons using ArcGIS; ArcView was used to assess "hot spots", or clusters, of CA-CDI. We then constructed a mixed hierarchical model to identify environmental variables independently associated with increased rates of CA-CDI. A total of 1,895 unique patients met our criteria for CA-CDI. The mean patient age was 54.5 years; 62% were female and 70% were Caucasian. 402 (21%) patient addresses were located in "hot spots", or clusters, of CA-CDI (p < 0.001). "Hot spot" census tracts were scattered throughout the 10 counties. After adjusting for clustering and population density, age ≥ 60 years (p = 0.03), race (p < 0.001), proximity to a livestock farm (p = 0.01), proximity to farming raw materials services (p = 0.02), and proximity to a nursing home (p = 0.04) were independently associated with increased rates of CA-CDI. Our study is the first to use spatial statistics and mixed models to identify important environmental risk factors for acquisition of C. difficile and adds to the growing evidence that farm practices may put patients at risk for important drug-resistant infections.
Calisto, A; Bramanti, A; Galeano, M; Angileri, F; Campobello, G; Serrano, S; Azzerboni, B
2009-01-01
The objective of this study is to investigate Idiopathic Normal Pressure Hydrocephalus (INPH) through a multidimensional and multiparameter analysis of statistical data obtained from accurate analysis of Intracranial Pressure (ICP) recordings. Such a study could permit the detection of new factors, correlated with therapeutic response, that are able to validate a predictive significance for the infusion test. The algorithm developed by the authors computes 13 ICP parameter trends on each of the recordings; afterwards, 9 statistical descriptors are determined from each trend. All data are transferred to the data-mining software WEKA. According to the exploited feature-selection techniques, WEKA revealed that the most significant statistical parameter is the maximum of the Single-Wave-Amplitude: setting a 27 mmHg threshold leads to over 90% correct classification.
International Nuclear Information System (INIS)
Bennett, J.T.; Crowder, C.A.; Connolly, M.J.
1994-01-01
Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
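The two alternatives proposed above, a moving average of control-sample P&A and a one-way analysis of variance across months of replicates, can be sketched as follows. The monthly recovery values are invented for illustration:

```python
import numpy as np
from scipy.stats import f_oneway

def moving_average(values, window=3):
    """Simple moving average of control-sample recoveries over
    successive reporting periods (a sketch of the suggested P&A
    criterion, not the program's actual procedure)."""
    values = np.asarray(values, dtype=float)
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")

# Hypothetical monthly replicate recoveries (%) for one analyte:
month1 = [98.0, 101.0, 99.5]
month2 = [97.5, 100.5, 102.0]
month3 = [99.0, 98.5, 101.5]

# One-way ANOVA: is between-month variation larger than within-month?
f_stat, p_value = f_oneway(month1, month2, month3)

smoothed = moving_average([99.5, 100.0, 99.7, 100.3, 99.9])
```

A non-significant ANOVA p-value says the month-to-month drift is within the replicate scatter, so the smoothed P&A series, rather than any single short-term estimate, is the fairer basis for control limits.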
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
1993-08-01
Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
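Monte Carlo sensitivity analyses of this kind commonly rank sampled input parameters by rank (Spearman) correlation with the response. The sketch below is purely illustrative: the parameter names echo the abstract, but the response model is invented and stands in for the linked WIPP process codes:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 500  # number of Monte Carlo realizations

# Hypothetical sampled input parameters:
permeability = rng.lognormal(mean=-18, sigma=1.0, size=n)
saturation = rng.uniform(0.0, 1.0, size=n)
gas_rate = rng.uniform(0.1, 1.0, size=n)

# Invented brine-migration response, dominated by permeability:
response = np.log(permeability) + 0.3 * saturation \
    + 0.05 * rng.normal(size=n)

# Rank each input by the magnitude of its Spearman correlation with
# the response across the Monte Carlo sample.
ranks = {
    name: abs(spearmanr(x, response)[0])
    for name, x in [("permeability", permeability),
                    ("saturation", saturation),
                    ("gas_rate", gas_rate)]
}
most_important = max(ranks, key=ranks.get)
```

Parameters with the largest rank correlations are the ones "for which uncertainty has the potential to affect" the output, which is how lists like the one above are produced.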
International Nuclear Information System (INIS)
Beckman, R.; Thomas, K.; Crowe, B.
1988-05-01
This report studies the transport of radionuclides from a repository to the environment by dissolution of the stored solid-waste form and subsequent transport in water. The sorption process may retard this movement of radionuclides from the repository to the accessible environment. A measure of this retardation process is the sorption ratio, R_D, where R_D = (activity in solid phase per unit mass of solid)/(activity in solution per unit volume of solution). In this study, predictions of the R_D values for the elements barium, cerium, cesium, europium, and strontium are developed from linear regression techniques. An R_D value was obtained for numerous drill core samples. Additional data include the particle size of the rock, temperature conditions during the experiment, concentration of the sorbing element, and length of the sorption experiment. Preliminary regression results based on these data show that the temperature and length of the experiment are the most significant factors influencing the R_D values. Particle size has a slight effect, and based on a small amount of data, it appears that concentration had no effect. The x-ray diffraction data are used to classify the samples by mineralogy, and regression techniques are used to develop estimates of the R_D values. Zeolite abundance of 10% or more with some addition of clay increases the sorption values significantly. 12 refs., 3 figs., 6 tabs
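A linear regression of the kind described can be sketched as follows. The data are invented (sorption ratios span orders of magnitude, so log10(R_D) is regressed on temperature, experiment length, and particle size); none of the numbers come from the report:

```python
import numpy as np

# Hypothetical sorption data: columns are temperature (deg C), experiment
# length (days), and particle size (um); response is log10(R_D).
X_raw = np.array([
    [25, 7, 100], [25, 14, 100], [70, 7, 100], [70, 14, 100],
    [25, 7, 500], [25, 14, 500], [70, 7, 500], [70, 14, 500],
], dtype=float)
log_rd = np.array([2.0, 2.3, 2.6, 2.9, 1.9, 2.2, 2.5, 2.8])

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X_raw)), X_raw])
coef, *_ = np.linalg.lstsq(A, log_rd, rcond=None)
intercept, b_temp, b_len, b_size = coef
```

In this invented dataset, temperature and experiment length carry positive coefficients and particle size a small negative one, mirroring the qualitative pattern reported above (temperature and duration most significant, particle size a slight effect).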
Energy Technology Data Exchange (ETDEWEB)
Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2013-10-15
Uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data; a reliable uncertainty result is then obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data using a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed and the results compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative-sampling error, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, cross sections were sampled from both the normal and lognormal distributions, and the uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative-sampling problem reported in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
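The lognormal sampling idea can be sketched as follows: choose the lognormal's parameters so its mean and standard deviation match the target cross section (standard moment matching), so that even large relative uncertainties can never produce a negative sample. The numbers below are illustrative, not nuclear data.

```python
import numpy as np

# Sketch of the abstract's idea: sample a cross section with mean m and a
# (large) standard deviation s from a moment-matched lognormal instead of a
# normal, so no negative values can ever be drawn.
def sample_xs(mean, std, size, rng):
    sigma2 = np.log(1.0 + (std / mean) ** 2)   # lognormal shape parameter
    mu = np.log(mean) - 0.5 * sigma2           # matches the target mean
    return rng.lognormal(mu, np.sqrt(sigma2), size)

rng = np.random.default_rng(1)
xs = sample_xs(mean=2.0, std=1.0, size=100_000, rng=rng)  # 50% rel. std. dev.
print(xs.min(), xs.mean(), xs.std())
```

A normal distribution with the same 50% relative standard deviation would yield a negative value roughly 2% of the time; the lognormal samples are strictly positive while preserving the first two moments.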
International Nuclear Information System (INIS)
Lee, Tae-Hong; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Kyung-Pil
2007-01-01
Statistical parametric mapping (SPM) and statistical probabilistic anatomical mapping (SPAM) were applied to basal/acetazolamide Tc-99m ECD brain perfusion SPECT images in patients with middle cerebral artery (MCA) stenosis to assess the efficacy of endovascular stenting of the MCA. Enrolled in the study were 11 patients (8 men and 3 women, mean age 54.2 ± 6.2 years) who had undergone endovascular stent placement for MCA stenosis. Using SPM and SPAM analyses, we compared the number of significant voxels and cerebral counts in basal and acetazolamide SPECT images before and after stenting, and assessed the perfusion changes and cerebral vascular reserve index (CVRI). The numbers of hypoperfusion voxels in SPECT images decreased from 10,083 ± 8,326 to 4,531 ± 5,091 in basal images (P = 0.0317) and from 13,398 ± 14,222 to 7,699 ± 10,199 in acetazolamide images (P = 0.0142) after MCA stenting. On SPAM analysis, the increases in cerebral counts were significant in acetazolamide images (90.9 ± 2.2 to 93.5 ± 2.3, P = 0.0098) but not in basal images (91 ± 2.7 to 92 ± 2.6, P = 0.1602). The CVRI also showed a statistically significant increase from before stenting (median 0.32; 95% CI -2.19 to 2.37) to after stenting (median 1.59; 95% CI -0.85 to 4.16; P = 0.0068). This study revealed the usefulness of voxel-based analysis of basal/acetazolamide brain perfusion SPECT after MCA stent placement, and showed that SPM and SPAM analyses of basal/acetazolamide Tc-99m brain SPECT could be used to evaluate the short-term hemodynamic efficacy of successful MCA stent placement. (orig.)
Directory of Open Access Journals (Sweden)
Ian T. Kracalik
2012-11-01
We compared a local clustering and a cluster morphology statistic using anthrax outbreaks in large (cattle) and small (sheep and goats) domestic ruminants across Kazakhstan. The Getis-Ord (Gi*) statistic and a multidirectional optimal ecotope algorithm (AMOEBA) were compared using 1st, 2nd and 3rd order Rook contiguity matrices. Multivariate statistical tests were used to evaluate the environmental signatures between clusters and non-clusters from the AMOEBA and Gi* tests. A logistic regression was used to define a risk surface for anthrax outbreaks and to compare agreement between clustering methodologies. Tests revealed differences in the spatial distribution of clusters as well as in the total number of clusters: AMOEBA identified more large ruminant clusters (n = 149) and fewer small ruminant clusters (n = 9), whereas Gi* revealed fewer large ruminant clusters (n = 122) and more small ruminant clusters (n = 61). Significant environmental differences were found between groups using the Kruskal-Wallis and Mann-Whitney U tests. Logistic regression was used to model the presence/absence of anthrax outbreaks and to define a risk surface for large ruminants for comparison with the cluster analyses. The model predicted 32.2% of the landscape as high risk. Approximately 75% of AMOEBA clusters corresponded to predicted high risk, compared with ~64% of Gi* clusters. In general, AMOEBA predicted more irregularly shaped clusters of outbreaks in both livestock groups, while Gi* tended to predict larger, circular clusters. Here we provide an evaluation of both tests and a discussion of the use of each to detect environmental conditions associated with anthrax outbreak clusters in domestic livestock. These findings illustrate important differences in spatial statistical methods for defining local clusters and highlight the importance of selecting appropriate levels of data aggregation.
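As a minimal illustration of the Getis-Ord Gi* statistic compared above, the sketch below computes standardized Gi* z-scores for a toy one-dimensional chain of locations with first-order binary adjacency. The data and weights are invented; a real analysis over contiguity matrices would typically use a spatial library such as PySAL.

```python
import numpy as np

# Toy Getis-Ord Gi* illustration: standardized local sums of a variable over
# each location's neighbourhood, including the location itself (the "star").
def gi_star(x, w):
    """x: values, shape (n,); w: binary weights, shape (n, n), w[i, i] = 1."""
    n = len(x)
    xbar = x.mean()
    s = x.std(ddof=0)                    # Gi* uses the population std. dev.
    wi = w.sum(axis=1)                   # total weight at each location
    num = w @ x - xbar * wi
    den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wi ** 2) / (n - 1))
    return num / den                     # z-scores; large positive = hot spot

x = np.array([1.0, 1.0, 1.0, 8.0, 9.0, 8.0, 1.0, 1.0, 1.0])
w = np.eye(len(x))
for i in range(len(x) - 1):              # first-order chain adjacency
    w[i, i + 1] = w[i + 1, i] = 1.0
z = gi_star(x, w)
print(z.round(2))
```

The central run of high values yields a large positive z-score (a local "hot spot"), while locations in the low-valued background score negative, which is exactly how Gi* flags outbreak clusters.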
Mihajilov-Krstev, Tatjana M; Denić, Marija S; Zlatković, Bojan K; Stankov-Jovanović, Vesna P; Mitić, Violeta D; Stojanović, Gordana S; Radulović, Niko S
2015-04-01
In Serbia, delicatessen fruit alcoholic drinks are produced from autochthonous fruit-bearing species such as cornelian cherry, blackberry, elderberry, wild strawberry, European wild apple, European blueberry and blackthorn fruits. There are no chemical data on many of these and herein we analysed volatile minor constituents of these rare fruit distillates. Our second goal was to determine possible chemical markers of these distillates through a statistical/multivariate treatment of the herein obtained and previously reported data. Detailed chemical analyses revealed a complex volatile profile of all studied fruit distillates with 371 identified compounds. A number of constituents were recognised as marker compounds for a particular distillate. Moreover, 33 of them represent newly detected flavour constituents in alcoholic beverages or, in general, in foodstuffs. With the aid of multivariate analyses, these volatile profiles were successfully exploited to infer the origin of raw materials used in the production of these spirits. It was also shown that all fruit distillates possessed weak antimicrobial properties. It seems that the aroma of these highly esteemed wild-fruit spirits depends on the subtle balance of various minor volatile compounds, whereby some of them are specific to a certain type of fruit distillate and enable their mutual distinction. © 2014 Society of Chemical Industry.
International Nuclear Information System (INIS)
Max Morris
2001-01-01
Recent advances in sensor technology and engineering have made it possible to assemble many related sensors in a common array, often of small physical size. Sensor arrays may report an entire vector of measured values in each data collection cycle, typically one value per sensor per sampling time. The larger quantities of data provided by larger arrays certainly contain more information; however, in some cases experience suggests that dramatic increases in array size do not always lead to corresponding improvements in the practical value of the data. The work leading to this report was motivated by the need to develop computational planning tools to approximate the relative effectiveness of arrays of different size (or scale) in a wide variety of contexts. The basis of the work is a statistical model of a generic sensor array. It includes features representing measurement error, both common to all sensors and independent from sensor to sensor, and the stochastic relationships between the quantities to be measured by the sensors. The model can be used to assess the effectiveness of hypothetical arrays in classifying objects or events from two classes. A computer program is presented for evaluating the misclassification rates that can be expected when arrays are calibrated using a given number of training samples, or the number of training samples required to attain a given level of classification accuracy. The program is also available via email from the first author for a limited time.
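A toy Monte Carlo version of the common-plus-independent error model described above shows why enlarging an array eventually stops helping: averaging across sensors shrinks only the independent noise, so the misclassification rate plateaus at a floor set by the common error. All parameter values here are illustrative assumptions, not the report's model.

```python
import numpy as np

# Each sensor sees: true class signal + noise common to the whole array +
# its own independent noise. Averaging k sensors reduces only the latter.
def error_rate(k, d=1.0, sig_common=0.3, sig_indep=1.0,
               trials=200_000, seed=2):
    rng = np.random.default_rng(seed)
    truth = rng.integers(0, 2, trials)             # class 0 or class 1
    common = rng.normal(0, sig_common, trials)     # shared across the array
    indep = rng.normal(0, sig_indep / np.sqrt(k), trials)  # mean of k sensors
    reading = truth * d + common + indep
    return np.mean((reading > d / 2) != truth)     # threshold classifier

rates = {k: error_rate(k) for k in (1, 4, 16, 64)}
print(rates)
```

The error rate falls steeply from 1 to 16 sensors but barely improves from 16 to 64: the common error component caps the achievable accuracy, matching the report's observation that larger arrays do not always pay off.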
Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R
2016-12-01
MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach, which assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects.
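An I²-type statistic of the kind described can be sketched from the SNP-exposure estimates and their standard errors via Cochran's Q. The generic Cochran's-Q construction and the toy numbers below are assumptions for illustration, not a transcription of the paper's exact formula or of real GWAS data.

```python
import numpy as np

# Generic I^2 over the SNP-exposure estimates g with standard errors se:
# the fraction of between-instrument variation not attributable to
# estimation error. Values near 1 suggest little regression dilution.
def i2_gx(g, se):
    w = 1.0 / se ** 2                      # inverse-variance weights
    g_bar = np.sum(w * g) / np.sum(w)      # weighted mean association
    q = np.sum(w * (g - g_bar) ** 2)       # Cochran's Q over k instruments
    k = len(g)
    return max(0.0, (q - (k - 1)) / q)     # bounded below by 0

g = np.array([0.10, 0.15, 0.22, 0.08, 0.18, 0.25])   # toy estimates
se = np.array([0.02, 0.02, 0.03, 0.02, 0.03, 0.03])  # toy standard errors
print(round(i2_gx(g, se), 3))
```

With these toy inputs the statistic lands around 0.86, i.e., roughly 14% expected dilution of the MR-Egger estimate under the interpretation given in the abstract.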
Kommers, Petrus A.M.; Smyrnova-Trybulska, Eugenia; Morze, Natalia; Issa, Tomayess; Issa, Theodora
2015-01-01
This paper, prepared by an international team of authors, focuses on the conceptual aspects: it analyses legal, ethical, human, technical, and social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary
Directory of Open Access Journals (Sweden)
Michael Robert Cunningham
2016-10-01
The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria, and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect, contrary to their title.
Directory of Open Access Journals (Sweden)
Hiroo Hayashi
2009-01-01
The GPS radio occultation (RO) soundings by the FORMOSAT-3/COSMIC (Taiwan's Formosa Satellite Mission #3/Constellation Observing System for Meteorology, Ionosphere and Climate) satellites launched in mid-April 2006 are compared with high-resolution balloon-borne (radiosonde and ozonesonde) observations. This paper presents preliminary results of validation of the COSMIC RO measurements in terms of refractivity through the troposphere and lower stratosphere. With the use of COSMIC RO soundings within 2 hours and 300 km of sonde profiles, statistical comparisons between the collocated refractivity profiles are performed for some tropical regions (Malaysia and Western Pacific islands), where moisture-rich air is expected in the lower troposphere, and for both northern and southern polar areas with a very dry troposphere. The results of the comparisons show good agreement between COSMIC RO and sonde refractivity profiles throughout the troposphere (1 - 1.5% difference at most), with a positive bias generally becoming larger at progressively higher altitudes in the lower stratosphere (1 - 2% difference around 25 km), and a very small standard deviation (about 0.5% or less) for a few kilometers below the tropopause level. A large standard deviation of fractional differences in the lowermost troposphere, which reaches up to as much as 3.5 - 5% at 3 km, is seen in the tropics, while a much smaller standard deviation (1 - 2% at most) is evident throughout the polar troposphere.
Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman
2017-08-01
Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, are at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guiding diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from single studies. Meta-analyses should be performed on homogeneous studies of high methodological quality (Level I or II evidence, i.e., randomized studies) to minimize confounding-variable bias. When it is known that the literature is inadequate, or a recent systematic review has already been performed and demonstrated insufficient data, then a new systematic review does not add anything meaningful to the literature. PROSPERO registration and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency of the conduct of the review permits reproducibility and improves fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided; this particularly applies to Level IV evidence, that is, noncomparative investigations.
da Costa Lobato, Tarcísio; Hauser-Davis, Rachel Ann; de Oliveira, Terezinha Ferreira; Maciel, Marinalva Cardoso; Tavares, Maria Regina Madruga; da Silveira, Antônio Morais; Saraiva, Augusto Cesar Fonseca
2015-02-15
The Amazon area has been increasingly suffering from anthropogenic impacts, especially due to the construction of hydroelectric power plant reservoirs. The analysis and categorization of the trophic status of these reservoirs are of interest to indicate man-made changes in the environment. In this context, the present study aimed to categorize the trophic status of a hydroelectric power plant reservoir located in the Brazilian Amazon by constructing a novel Water Quality Index (WQI) and Trophic State Index (TSI) for the reservoir using major ion concentrations and physico-chemical water parameters determined in the area and taking into account the sampling locations and the local hydrological regimes. After applying statistical analyses (factor analysis and cluster analysis) and establishing a rule base of a fuzzy system to these indicators, the results obtained by the proposed method were then compared to the generally applied Carlson and a modified Lamparelli trophic state index (TSI), specific for trophic regions. The categorization of the trophic status by the proposed fuzzy method was shown to be more reliable, since it takes into account the specificities of the study area, while the Carlson and Lamparelli TSI do not, and, thus, tend to over or underestimate the trophic status of these ecosystems. The statistical techniques proposed and applied in the present study, are, therefore, relevant in cases of environmental management and policy decision-making processes, aiding in the identification of the ecological status of water bodies. With this, it is possible to identify which factors should be further investigated and/or adjusted in order to attempt the recovery of degraded water bodies. Copyright © 2014 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Todd C. Pataky
2016-11-01
One-dimensional (1D) kinematic, force, and EMG trajectories are often analyzed using zero-dimensional (0D) metrics like local extrema. Recently, whole-trajectory 1D methods have emerged in the literature as alternatives. Since 0D and 1D methods can yield qualitatively different results, the two approaches may appear to be theoretically distinct. The purposes of this paper were (a) to clarify that 0D and 1D approaches are actually just special cases of a more general region-of-interest (ROI) analysis framework, and (b) to demonstrate how ROIs can augment statistical power. We first simulated millions of smooth, random 1D datasets to validate theoretical predictions of the 0D, 1D and ROI approaches and to emphasize how ROIs provide a continuous bridge between 0D and 1D results. We then analyzed a variety of public datasets to demonstrate potential effects of ROIs on biomechanical conclusions. Results showed, first, that a priori ROI particulars can qualitatively affect the biomechanical conclusions that emerge from analyses and, second, that ROIs derived from exploratory/pilot analyses can detect smaller biomechanical effects than are detectable using full 1D methods. We recommend regarding ROIs, like data filtering particulars and Type I error rate, as parameters which can affect hypothesis testing results, and thus as sensitivity analysis tools to ensure arbitrary decisions do not influence scientific interpretations. Last, we describe open-source Python and MATLAB implementations of 1D ROI analysis for arbitrary experimental designs ranging from one-sample t tests to MANOVA.
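The power argument can be illustrated with a small null simulation: under smooth 1D noise, the 95th-percentile maximum-|t| threshold over an a priori ROI is lower than over the full trajectory, so smaller effects become detectable. Node counts, smoothness and sample sizes below are illustrative assumptions; a real analysis would use a package such as spm1d.

```python
import numpy as np

# Generate smooth 1-D Gaussian noise curves (Gaussian-kernel smoothing of
# white noise), then estimate family-wise max-|t| thresholds under the null.
def smooth_noise(rng, n_curves, n_nodes=101, fwhm=15):
    white = rng.normal(size=(n_curves, n_nodes + 4 * fwhm))
    grid = np.arange(-2 * fwhm, 2 * fwhm + 1)
    kernel = np.exp(-0.5 * (grid / (fwhm / 2.355)) ** 2)
    sm = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, white)
    sm = sm[:, 2 * fwhm: 2 * fwhm + n_nodes]
    return sm / sm.std(axis=1, keepdims=True)

rng = np.random.default_rng(4)
maxima_full, maxima_roi = [], []
for _ in range(2000):                  # null one-sample t across 10 curves
    y = smooth_noise(rng, 10)
    t = y.mean(0) / (y.std(0, ddof=1) / np.sqrt(10))
    maxima_full.append(np.abs(t).max())
    maxima_roi.append(np.abs(t[40:61]).max())  # a priori ROI: nodes 40-60
thr_full = np.quantile(maxima_full, 0.95)
thr_roi = np.quantile(maxima_roi, 0.95)
print(thr_full, thr_roi)
```

Because the ROI maximum can never exceed the full-trajectory maximum, the ROI threshold is strictly smaller here, which is precisely the sense in which an a priori ROI augments statistical power.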
Lind, Mads V; Savolainen, Otto I; Ross, Alastair B
2016-08-01
Data quality is critical for epidemiology, and as scientific understanding expands, the range of data available for epidemiological studies and the types of tools used for measurement have also expanded. It is essential for the epidemiologist to have a grasp of the issues involved with different measurement tools. One tool that is increasingly being used for measuring biomarkers in epidemiological cohorts is mass spectrometry (MS), because of the high specificity and sensitivity of MS-based methods and the expanding range of biomarkers that can be measured. Further, the ability of MS to quantify many biomarkers simultaneously is advantageous compared to single-biomarker methods. However, as with all methods used to measure biomarkers, there are a number of pitfalls to consider which may have an impact on results when used in epidemiology. In this review we discuss the use of MS for biomarker analyses, focusing on metabolites and their application and potential issues related to large-scale epidemiology studies, the use of MS "omics" approaches for biomarker discovery, and how MS-based results can be used for increasing biological knowledge gained from epidemiological studies. Better understanding of the possibilities and possible problems related to MS-based measurements will help the epidemiologist in their discussions with analytical chemists and lead to the use of the most appropriate statistical tools for these data.
Energy Technology Data Exchange (ETDEWEB)
Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu
2014-01-15
According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
International Nuclear Information System (INIS)
Wenrich-Verbeek, K.J.; Suits, V.J.
1979-01-01
This report presents the chemical analyses and statistical evaluation of 62 water samples collected in the north-central part of New Mexico near Rio Ojo Caliente. Both spring and surface-water samples were taken throughout the Rio Ojo Caliente drainage basin above and a few miles below the town of La Madera. A high U concentration (15 μg/l) found in the water of the Rio Ojo Caliente near La Madera, Rio Arriba County, New Mexico, during a regional sampling-technique study in August 1975 by the senior author, was investigated further in May 1976 to determine whether stream waters could be effectively used to trace the source of a U anomaly. A detailed study of the tributaries to the Rio Ojo Caliente, involving 29 samples, was conducted during a moderate discharge period, May 1976, so that small tributaries would contain water. This study isolated Canada de la Cueva as the tributary contributing the anomalous U, so that in May 1977, an extremely low discharge period due to the 1977 drought, an additional 33 samples were taken to further define the anomalous area. 6 references, 3 figures, 6 tables
Energy Technology Data Exchange (ETDEWEB)
Bassant, Marie-Helene
1971-01-15
The aim of this work was to study the statistical properties of the amplitude of the electroencephalographic signal. The experimental method is described (implantation of electrodes, acquisition and treatment of data). The program of the mathematical analysis is given (calculation of probability density functions, study of stationarity) and the validity of the tests discussed. The results concerned ten rabbits. Sequences of 40 s of EEG were sampled at very short intervals (500 μs). The probability density functions established for different brain structures (especially the dorsal hippocampus) and areas were compared during sleep, arousal and visual stimulation. Using a χ² test, it was found that the Gaussian distribution assumption was rejected in 96.7 per cent of the cases. For a given physiological state, there was no mathematical reason to reject the assumption of stationarity (in 96 per cent of the cases). (author)
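The χ² test of the Gaussian assumption used above can be sketched as follows: bin a sampled signal and compare observed bin counts with those expected under a fitted normal distribution. Synthetic data stand in for the EEG amplitudes, and the binning scheme is an illustrative choice, not the study's protocol.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
gaussian = rng.normal(0, 1, 80_000)         # amplitudes that ARE Gaussian
skewed = rng.lognormal(0, 0.5, 80_000)      # clearly non-Gaussian amplitudes

def chi2_normality(x, bins=20):
    """p-value of a chi-squared goodness-of-fit test against a fitted normal."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))  # equal-count bins
    observed, _ = np.histogram(x, edges)
    cdf = stats.norm(x.mean(), x.std()).cdf(edges)
    expected = len(x) * np.diff(cdf)
    chi2 = np.sum((observed - expected) ** 2 / expected)
    dof = bins - 1 - 2                        # two parameters fitted from data
    return stats.chi2.sf(chi2, dof)

print(chi2_normality(gaussian), chi2_normality(skewed))
```

The skewed signal is rejected decisively while the Gaussian one is not, mirroring the study's per-sequence accept/reject decisions on normality.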
Ben Ayed, Rayda; Ennouri, Karim; Ercişli, Sezai; Ben Hlima, Hajer; Hanana, Mohsen; Smaoui, Slim; Rebai, Ahmed; Moreau, Fabienne
2018-04-10
Virgin olive oil is appreciated for its particular aroma and taste and is recognized worldwide for its nutritional value and health benefits. The olive oil contains a vast range of healthy compounds such as monounsaturated free fatty acids, especially oleic acid. The SAD.1 polymorphism localized in the stearoyl-acyl carrier protein desaturase gene (SAD) was genotyped and shown to be associated with the oleic acid composition of olive oil samples. However, the effect of polymorphisms in fatty acid-related genes on the distribution of monounsaturated and saturated fatty acids in Tunisian olive oil varieties is not understood. Seventeen Tunisian olive-tree varieties were selected for fatty acid content analysis by gas chromatography. The association of SAD.1 genotypes with the fatty acid composition was studied by statistical and Bayesian modeling analyses. Fatty acid content analysis showed, interestingly, that some Tunisian virgin olive oil varieties could be classified as functional foods and nutraceuticals due to their particular richness in oleic acid. In fact, the TT-SAD.1 genotype was found to be associated with a higher proportion of monounsaturated fatty acids (MUFA), mainly oleic acid (C18:1) (r = -0.79). A SAD.1 association with the oleic acid composition of olive oil was identified among the studied varieties. This correlation fluctuated between the studied varieties, which might elucidate variability in lipidic composition among them, therefore reflecting genetic diversity through differences in gene expression and biochemical pathways. The SAD locus would represent an excellent marker for identifying virgin olive oils with an interesting lipidic composition.
Shin, Yong Beom; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Jae Heung; Yeom, Seok-Ran
2006-06-01
Statistical parametric mapping (SPM) was applied to brain perfusion single photon emission computed tomography (SPECT) images in patients with traumatic brain injury (TBI) to investigate regional cerebral abnormalities compared to age-matched normal controls. Thirteen patients with TBI who underwent brain perfusion SPECT were included in this study (10 males, 3 females; mean age 39.8 +/- 18.2, range 21 - 74). SPM2 software implemented in MATLAB 5.3 was used for spatial pre-processing and analysis and to determine the quantitative differences between TBI patients and age-matched normal controls. Three large voxel clusters of significantly decreased cerebral blood perfusion were found in patients with TBI. The largest cluster included the medial frontal gyrus (voxel number 3642, peak Z-value = 4.31, 4.27, p = 0.000) in both hemispheres. The second largest cluster included the cingulate gyrus and anterior cingulate gyrus of the left hemisphere (voxel number 381, peak Z-value = 3.67, 3.62, p = 0.000). Other clusters were the parahippocampal gyrus (voxel number 173, peak Z-value = 3.40, p = 0.000) and hippocampus (voxel number 173, peak Z-value = 3.23, p = 0.001) in the left hemisphere. The false discovery rate (FDR) was less than 0.04. From this study, group and individual analyses with SPM2 could clearly identify the perfusion abnormalities of brain SPECT in patients with TBI. Group analysis showed a hypoperfusion pattern in the areas including the medial frontal gyrus of both hemispheres and the cingulate gyrus, anterior cingulate gyrus, parahippocampal gyrus and hippocampus of the left hemisphere compared to age-matched normal controls. However, these findings deserve further investigation on a larger number of patients to allow a better validation of objective SPM analysis in patients with TBI.
Sullivan, Sharon G.; Barr, Catherine; Grabois, Andrew
2002-01-01
Includes six articles that report on prices of U.S. and foreign published materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and review media statistics. (LRW)
Sullivan, Sharon G.; Grabois, Andrew; Greco, Albert N.
2003-01-01
Includes six reports related to book trade statistics, including prices of U.S. and foreign materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and numbers of books and other media reviewed by major reviewing publications. (LRW)
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Directory of Open Access Journals (Sweden)
L. Bressan
2016-01-01
reconstructed sea level (RSL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral fast Fourier transform (FFT) analysis and also through a statistical analysis, showing that they can be characterised by probability distribution functions (PDFs) such as the Student's t distribution (IS and RSL) and the beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.
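The distribution-fitting step described above can be sketched with SciPy; the arrays below are synthetic stand-ins for real tide-gauge signals, and all variable names and parameter values are illustrative assumptions rather than anything taken from the paper.

```python
# Hedged sketch: fitting a Student's t PDF to an RSL-like signal and a
# beta PDF to a bounded CF-like signal, as in the analysis described.
# The data here are synthetic, not real tide-gauge measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic stand-ins for the real signals
rsl = rng.standard_t(df=5, size=5000) * 0.02   # sea-level residual (m), heavy-tailed
cf = rng.beta(a=2.0, b=5.0, size=5000)         # control function, bounded in [0, 1]

# Fit a Student's t distribution to the RSL-like signal
df_t, loc_t, scale_t = stats.t.fit(rsl)

# Fit a beta distribution to the CF-like signal (support fixed to [0, 1])
a_b, b_b, loc_b, scale_b = stats.beta.fit(cf, floc=0, fscale=1)

# Check the t fit with a Kolmogorov-Smirnov test
ks_t = stats.kstest(rsl, 't', args=(df_t, loc_t, scale_t))
print(f"t fit: df={df_t:.2f}, KS p-value={ks_t.pvalue:.3f}")
print(f"beta fit: a={a_b:.2f}, b={b_b:.2f}")
```

A large KS p-value here indicates that the fitted PDF is consistent with the signal's empirical distribution.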
NWS Weather Fatality, Injury and Damage Statistics
NWS Natural Hazard Statistics: U.S. summaries, a 78-year list of severe weather fatalities, and preliminary hazardous weather statistics for 2017.
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; and Excise taxes and turnover taxes included in consumer prices of some energy sources
Directory of Open Access Journals (Sweden)
Susanne Unverzagt
Full Text Available This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random-effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention and definition of the primary outcome, unit of randomization, duration of follow-up, and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices, on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of the statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval (CI) 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses, respectively, were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (30%, 95% CI -2 to 71% and 31%, 95% CI 9 to 57%, respectively), compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence. Especially the cooperation of different health
DEFF Research Database (Denmark)
Edberg, Anna; Freyhult, Eva; Sand, Salomon
- and inter-national data excerpts. For example, major PCA loadings helped deciphering both shared and disparate features, relating to food groups, across Danish and Swedish preschool consumers. Data interrogation, reliant on the above-mentioned composite techniques, disclosed one outlier dietary prototype...... prototype with the latter property was identified also in the Danish data material, but without low consumption of Vegetables or Fruit & berries. The second MDA-type of data interrogation involved Supervised Learning, also known as Predictive Modelling. These exercises involved the Random Forest (RF...... not elaborated on in-depth, output from several analyses suggests a preference for energy-based consumption data for Cluster Analysis and Predictive Modelling, over those appearing as weight....
International Nuclear Information System (INIS)
El-Arabi, A.M.Abd El-Gabar M.; Khalifa, Ibrahim H.
2002-01-01
Factor and cluster analyses as well as the Pearson correlation coefficient have been applied to geochemical data obtained from phosphorite and phosphatic rocks of the Duwi Formation exposed at the Red Sea coast, Nile Valley and Western Desert. Sixty-six samples from a total of 71 collected samples were analysed for SiO2, TiO2, Al2O3, Fe2O3, CaO, MgO, Na2O, K2O, P2O5, Sr, U and Pb by XRF, and their mineral constituents were determined by XRD techniques. In addition, the natural radioactivity of the phosphatic samples due to their uranium, thorium and potassium contents was measured by gamma-spectrometry. The uranium content in the phosphate rocks with P2O5 >15% (average of 106.6 ppm) is higher than in rocks with P2O5 <15%; the uranium content varies with P2O5 and CaO, whereas it is not related to changes in SiO2, TiO2, Al2O3, Fe2O3, MgO, Na2O and K2O concentrations. Factor analysis and the Pearson correlation coefficient revealed that uranium behaves geochemically in different ways in the phosphatic sediments and phosphorites of the Red Sea, Nile Valley and Western Desert. In the Red Sea and Western Desert phosphorites, uranium occurs mainly in the oxidized U6+ state, where it seems to be fixed by the phosphate ion, forming secondary uranium phosphate minerals such as phosphuranylite. In the Nile Valley phosphorites, ionic substitution of Ca2+ by U4+ is the main controlling factor in the concentration of uranium in phosphate rocks. Moreover, fixation of U6+ by the phosphate ion and adsorption of uranium on phosphate minerals play subordinate roles.
Directory of Open Access Journals (Sweden)
C. Hoareau
2012-06-01
Full Text Available A ground-based Rayleigh lidar has provided continuous observations of tropospheric water vapour profiles and cirrus clouds, using preliminary Raman channels set up on an existing Rayleigh lidar above La Reunion over the period 2002–2005. With this instrument, we performed a first measurement campaign of 350 independent water vapour profiles. A statistical study of the distribution of water vapour profiles is presented and some investigations concerning the calibration are discussed. An analysis of the cirrus clouds is presented and a classification has been performed, showing 3 distinct classes. Based on these results, the characteristics and the design of a future lidar system, to be implemented at the new Reunion Island altitude observatory (2200 m) for long-term monitoring, are presented, and numerical simulations of system performance have been realised to compare both instruments.
International Nuclear Information System (INIS)
Ridone, S.; Arginelli, D.; Bortoluzzi, S.; Canuto, G.; Montalto, M.; Nocente, M.; Vegro, M.
2006-01-01
Many biological samples (urine and faeces) have been analysed by means of chromatographic extraction columns, utilising two different resins (AG 1-X2 resin chloride and T.R.U.), in order to detect possible internal contamination with 239+240Pu and 241Am in some workers of a reprocessing nuclear plant in the decommissioning phase. The results obtained show on the one hand the great suitability of the first resin for the determination of plutonium, and on the other the great selectivity of the second one for the determination of americium.
International Nuclear Information System (INIS)
Roach, J.F.
1992-01-01
The electrical insulation system of the SSC long dipole magnets is reviewed and potential dielectric failure modes are discussed. Electrical insulation fabrication and assembly issues with respect to rate-production manufacturability are addressed. The automation required for rate assembly of electrical insulation components will require critical online visual and dielectric screening tests to ensure production quality. Storage and assembly areas must be designed to prevent foreign particles from becoming entrapped in the insulation during critical coil winding, molding, and collaring operations. All hand-assembly procedures involving dielectrics must be performed with rigorous attention to their impact on insulation integrity. Individual dipole magnets must have a sufficiently low probability of electrical insulation failure under all normal and fault-mode voltage conditions that the series of magnets in the SSC rings has an acceptable Mean Time Between Failures (MTBF) with respect to dielectric failure events. Statistical models appropriate for large electrical system breakdown failure analysis are applied to the SSC magnet rings. The MTBF of the SSC system is related to the failure database for individual dipole magnet samples.
Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis
2016-07-01
A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
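The abstract does not give the toolkit's exact formulas, but the standard linear-quadratic BED expression that such a toolkit would typically apply voxel-by-voxel, and the summing of per-phase BEDs in a multi-phase (primary plus boost) treatment, can be sketched as follows; the doses and the alpha/beta value below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the standard linear-quadratic BED formula, applied
# per voxel and summed across treatment phases. Illustrative numbers only.
import numpy as np

def bed(total_dose, n_fractions, alpha_beta):
    """BED = D * (1 + d / (alpha/beta)), where d = D / n is dose per fraction (Gy)."""
    d = total_dose / n_fractions
    return total_dose * (1.0 + d / alpha_beta)

# Two example voxels, each receiving a primary phase and a boost phase
primary = np.array([50.0, 48.0])   # total voxel doses in the primary phase (Gy)
boost = np.array([20.0, 10.0])     # total voxel doses in the boost phase (Gy)

# Multi-phase BED: sum the per-phase BEDs voxel-wise (alpha/beta = 10 Gy assumed)
total_bed = bed(primary, 25, alpha_beta=10.0) + bed(boost, 10, alpha_beta=10.0)
print(total_bed)
```

For the first voxel this gives 50*(1 + 2/10) + 20*(1 + 2/10) = 84 Gy of biological effective dose.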
Energy Technology Data Exchange (ETDEWEB)
1993-08-01
Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Additional information about the 1992 PA is provided in other volumes. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions, the choice of parameters selected for sampling, and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect compliance with 40 CFR 191B are: drilling intensity, intrusion borehole permeability, halite and anhydrite permeabilities, radionuclide solubilities and distribution coefficients, fracture spacing in the Culebra Dolomite Member of the Rustler Formation, porosity of the Culebra, and spatial variability of Culebra transmissivity. Performance with respect to 40 CFR 191B is insensitive to uncertainty in other parameters; however, additional data are needed to confirm that reality lies within the assigned distributions.
International Nuclear Information System (INIS)
Gilbert, R.O.; Klover, W.J.
1988-09-01
Radiation detection surveys are used at the US Department of Energy's Hanford Reservation near Richland, Washington, to determine areas that need posting as radiation zones or to measure dose rates in the field. The relationship between measurements made by sodium iodide (NaI) detectors mounted on the mobile Road Monitor vehicle and those made by hand-held GM P-11 probes and Micro-R meters is of particular interest because the Road Monitor can survey land areas in much less time than hand-held detectors. Statistical regression methods are used here to develop simple equations to predict GM P-11 probe gross gamma count-per-minute (cpm) and Micro-R meter μR/h measurements on the basis of NaI gross gamma count-per-second (cps) measurements obtained using the Road Monitor. These equations were estimated using data collected near the 116-K-2 Trench in the 100-K area on the Hanford Reservation. Equations are also obtained for estimating upper and lower limits within which the GM P-11 or Micro-R meter measurement corresponding to a given NaI Road Monitor measurement at a new location is expected to fall with high probability. An equation and limits for predicting GM P-11 measurements on the basis of Micro-R meter measurements are also estimated. Also, we estimate an equation that may be useful for approximating the 90Sr measurement of a surface soil sample on the basis of a spectroscopy measurement for 137Cs on that sample. 3 refs., 16 figs., 44 tabs.
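The kind of calibration described, an ordinary least-squares regression with prediction limits within which a new measurement is expected to fall, can be sketched as follows; the readings below are simulated, not the Hanford data, and all numeric values are illustrative assumptions.

```python
# Hedged sketch: regress hand-held probe readings (cpm) on vehicle-mounted
# NaI readings (cps), then compute a prediction interval for a new reading.
# Synthetic data only; slope, intercept, and noise level are made up.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nai_cps = np.linspace(100, 1000, 30)                    # Road Monitor readings
gm_cpm = 5.0 * nai_cps + 200 + rng.normal(0, 50, 30)    # simulated probe readings

res = stats.linregress(nai_cps, gm_cpm)
n = len(nai_cps)

def predict_with_limits(x_new, conf=0.95):
    """Point prediction and prediction limits for one new NaI reading."""
    y_hat = res.intercept + res.slope * x_new
    resid = gm_cpm - (res.intercept + res.slope * nai_cps)
    s = np.sqrt(np.sum(resid**2) / (n - 2))             # residual standard error
    # Standard error of a single new observation at x_new
    se = s * np.sqrt(1 + 1/n + (x_new - nai_cps.mean())**2
                     / np.sum((nai_cps - nai_cps.mean())**2))
    t_crit = stats.t.ppf((1 + conf) / 2, df=n - 2)
    return y_hat, y_hat - t_crit * se, y_hat + t_crit * se

y, lo, hi = predict_with_limits(500.0)
print(f"predicted GM cpm: {y:.0f}  limits: ({lo:.0f}, {hi:.0f})")
```

The interval uses the single-observation prediction variance (the extra "1 +" term), which is wider than a confidence interval for the regression line itself.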
Szostek, Krzysztof; Głab, Henryk; Pudło, Aleksandra
2009-01-01
Barium and strontium analyses yield an important perspective on temporal shifts in diet in relation to social and environmental circumstances. This research focuses on reconstructing dietary strategies of individuals in the early medieval (12-13th century) population of Gdańsk on the coast of the Baltic Sea. To describe these strategies where seafood rich in minerals was included in the diet, levels of strontium, barium, calcium and phosphorus were measured in first permanent molars of adult men and women whose remains were excavated from the churchyard in the city centre. Faunal remains from the excavation were analysed as an environmental background with respect to the content of the above-mentioned elements. The impact of diagenesis on the odontological material under study was also determined by an analysis of the soil derived from the grave and non-grave surroundings. For verification of diagenetic processes, the calcium/phosphorus index was used. Strontium, calcium, phosphorus and barium levels were determined with the spectrophotometric method using the latest generation plasma spectrophotometer Elan 6100 ICP-MS. From the results of the analysis of taphonomic parameters such as the soil pH, potential ion exchange in the grave surroundings and the Ca/P ratio, it can be inferred that diagenetic factors had little impact on the studied material. From this pilot study we can conclude that in the early Middle Ages in the Baltic Sea basin, seafood was included in the diet from early childhood and at the same time the diet contained calcium-rich milk products (also rich in minerals). The lack of sex differences may indicate the absence of a sex-specific nutritional strategy in childhood and early adolescence.
National Center for Educational Statistics (DHEW/OE), Washington, DC.
In response to needs expressed by the community of higher education institutions, the National Center for Educational Statistics has produced early estimates of a selected group of mean salaries of instructional faculty in institutions of higher education in 1972-73. The number and salaries of male and female instructional staff by rank are of…
International Nuclear Information System (INIS)
Aoyama, Takashi; Ishizaka, Masatsuna; Yamamoto, Yoichi; Kano, Eiichi; Nikaido, Osamu.
1981-01-01
The Japan Association of Radiologic Technologists reported that, from 1941 to 1978, 395 deaths occurred among Japanese radiological technologists belonging to the association. Using these data, Sakka, Kitabatake and colleagues, and the present authors studied mortality and cause of death among these technologists for the 11 years from 1955 to 1965, the 7 years from 1966 to 1972, and the 5 years from 1973 to 1977, respectively. In general, the number of cancer deaths in the three studies was less than expected. However, Kitabatake et al. and the present authors found that deaths from skin cancer were significantly more frequent than expected. The present authors recently estimated the cumulative doses of radiation exposure for the majority of the deaths (268 out of 395). The mean dose of radiation for cancer deaths was then compared with that for non-cancer deaths, and the proportional mortality ratios for cancers were examined in relation to the estimated dose level. In the present study, however, statistical tests showed no correlation between mortality and dose of radiation exposure for the majority of deaths from cancer. (author)
Liu, Yang; Chiaromonte, Francesca; Ross, Howard; Malhotra, Raunaq; Elleder, Daniel; Poss, Mary
2015-06-30
in G to A substitutions, but found no evidence for this host defense strategy. Our error correction approach for minor allele frequencies (more sensitive and computationally efficient than other algorithms) and our statistical treatment of variation (ANOVA) were critical for effective use of high-throughput sequencing data in understanding viral diversity. We found that co-infection with PLV shifts FIV diversity from bone marrow to lymph node and spleen.
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Franklin, Daniel; Cardini, Andrea; Flavel, Ambika; Kuliukas, Algis
2012-07-01
A current limitation of forensic practice in Western Australia is a lack of contemporary population-specific standards for biological profiling; this directly relates to the unavailability of documented human skeletal collections. With rapidly advancing technology, however, it is now possible to acquire accurate skeletal measurements from 3D scans contained in medical databases. The purpose of the present study, therefore, is to explore the accuracy of using cranial form to predict sex in adult Australians. Both traditional and geometric morphometric methods are applied to data derived from 3D landmarks acquired in CT-reconstructed crania. The sample comprises multi-detector computed tomography scans of 200 adult individuals; following 3D volume rendering, 46 anatomical landmarks are acquired using OsiriX (version 3.9). Centroid size and shape (first 20 PCs of the Procrustes coordinates) and the inter-landmark (ILD) distances between all possible pairs of landmarks are then calculated. Sex classification effectiveness of the 3D multivariate descriptors of size and shape and selected ILD measurements are assessed and compared; robustness of findings is explored using resampling statistics. Cranial shape and size and the ILD measurements are sexually dimorphic and explain 3.2 to 54.3 % of sample variance; sex classification accuracy is 83.5-88.0 %. Sex estimation using 3D shape appears to have some advantages compared to approaches using size measurements. We have, however, identified a simple and biologically meaningful single non-traditional linear measurement (glabella-zygion) that classifies Western Australian individuals according to sex with a high degree of expected accuracy (87.5-88 %).
Sieverink, Floor; Kelders, Saskia M; Braakman-Jansen, Louise M A; van Gemert-Pijnen, Julia E W C
2014-03-01
The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided only little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), in order to increase the efficiency of the system and improve long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users, and with it long-term adherence, has the potential to increase. © 2014 Diabetes Technology Society.
Directory of Open Access Journals (Sweden)
Justyna Mojsa-Kaja
2016-12-01
Full Text Available Objectives: Obsessive-compulsive disorder (OCD) is an anxiety-spectrum disorder that affects 1–2% of the adult population. People with OCD are more likely to report impaired social and occupational functioning. Although effective treatments for OCD exist, many sufferers from this disorder are continuously misdiagnosed. Therefore, improving the assessment of OCD remains an important area of scientific research. The main goal of the study is the initial verification of the psychometric properties of the Polish version of the Obsessive-Compulsive Inventory-Revised (OCI-R) in a college student sample. Material and Methods: A group of students completed a battery of measures of obsessive-compulsive symptoms (the OCI-R, the Yale-Brown Obsessive Compulsive Scale), depression (the Beck Depression Inventory) and anxiety trait (the State-Trait Anxiety Inventory). Results: A confirmatory factor analysis, conducted on data from 334 university students, supported a solid and replicable 6-fold factor structure of the OCI-R. Further analyses of test-retest reliability (following a 1-month interval) and of the convergent and divergent validity of the OCI-R were conducted in a group of 137 students who had completed the battery of measures mentioned above. The results showed adequate test-retest reliability for the full scale and subscale scores, high internal consistency, and satisfactory convergent and divergent validity. Conclusions: The study constitutes the first phase of work on a Polish version of a measure of obsessive-compulsive symptoms. Satisfactory results obtained in a non-clinical sample allow this method to be regarded as promising for further research. Int J Occup Med Environ Health 2016;29(6):1011–1021
Butterworth, Anna L.; Westphal, Andrew J.; Frank, David R.; Allen, Carlton C.; Bechtel, Hans A.; Sandford, Scott A.; Tsou, Peter; Zolensky, Michael E.
2014-01-01
We report the quantitative characterization by synchrotron soft X-ray spectroscopy of 31 potential impact features in the aerogel capture medium of the Stardust Interstellar Dust Collector. Samples were analyzed in aerogel by acquiring high spatial resolution maps and high energy-resolution spectra of major rock-forming elements Mg, Al, Si, Fe, and others. We developed diagnostic screening tests to reject spacecraft secondary ejecta and terrestrial contaminants from further consideration as interstellar dust candidates. The results support an extraterrestrial origin for three interstellar candidates: I1043,1,30 (Orion) is a 3 pg particle with Mg-spinel, forsterite, and an iron-bearing phase. I1047,1,34 (Hylabrook) is a 4 pg particle comprising an olivine core surrounded by low-density, amorphous Mg-silicate and amorphous Fe, Cr, and Mn phases. I1003,1,40 (Sorok) has the track morphology of a high-speed impact, but contains no detectable residue that is convincingly distinguishable from the background aerogel. Twenty-two samples with an anthropogenic origin were rejected, including four secondary ejecta from impacts on the Stardust spacecraft aft solar panels, nine ejecta from secondary impacts on the Stardust Sample Return Capsule, and nine contaminants lacking evidence of an impact. Other samples in the collection included I1029,1,6, which contained surviving solar system impactor material. Four samples remained ambiguous: I1006,2,18, I1044,2,32, and I1092,2,38 were too dense for analysis, and we did not detect an intact projectile in I1044,3,33. We detected no radiation effects from the synchrotron soft X-ray analyses; however, we recorded the effects of synchrotron hard X-ray radiation on I1043,1,30 and I1047,1,34.
Statistical analysis of lightning electric field measured under Malaysian condition
Salimi, Behnam; Mehranzamir, Kamyar; Abdul-Malek, Zulkurnain
2014-02-01
Lightning is an electrical discharge during thunderstorms that can occur either within clouds (inter-cloud) or between clouds and ground (cloud-to-ground). Lightning characteristics and their statistical information are the foundation for the design of lightning protection systems as well as for the calculation of lightning radiated fields. Nowadays, there are various techniques to detect lightning signals and to determine the various parameters produced by a lightning flash, each with its own claimed performance. In this paper, the characteristics of captured broadband electric fields generated by cloud-to-ground lightning discharges in the south of Malaysia are analyzed. A total of 130 cloud-to-ground lightning flashes from 3 separate thunderstorm events (each lasting about 4-5 hours) were examined. Statistical analyses of the following signal parameters are presented: preliminary breakdown pulse train time duration, time interval between preliminary breakdown and return stroke, stroke multiplicity, and the percentage of single-stroke flashes. The BIL model is also introduced to characterize the lightning signature patterns. The statistical analyses show that about 79% of lightning signals fit well with the BIL model. The maximum and minimum preliminary breakdown time durations of the observed lightning signals are 84 ms and 560 µs, respectively. The statistical results also show that 7.6% of the flashes were single-stroke flashes, and the maximum number of strokes recorded was 14 strokes per flash. A preliminary breakdown signature could be identified in more than 95% of the flashes.
International Nuclear Information System (INIS)
Fatmi, H.; Ababou, R.; Matray, J.M.; Joly, C.
2010-01-01
Document available in extended abstract form only. This paper presents methods of statistical analysis and interpretation of hydrogeological signals in clayey formations, e.g., pore water pressure and atmospheric pressure. The purpose of these analyses is to characterize the hydraulic behaviour of this type of formation in the case of a deep repository of Mid-Level/High-Level and Long-lived radioactive wastes, and to study the evolution of the geologic formation and its EDZ (Excavation Damaged Zone) during the excavation of galleries. We focus on galleries Ga98 and Ga03 in the sites of Mont Terri (Jura, Switzerland) and Tournemire (France, Aveyron), through data collected in the BPP-1 and PH2 boreholes, respectively. The Mont Terri site, crossing the Aalenian Opalinus clay-stone, is an underground laboratory managed by an international consortium, namely the Mont Terri project (Switzerland). The Tournemire site, crossing the Toarcian clay-stone, is an Underground Research facility managed by IRSN (France). We have analysed pore water and atmospheric pressure signals at these sites, sometimes in correlation with other data. The methods of analysis are based on the theory of stationary random signals (correlation functions, Fourier spectra, transfer functions, envelopes), and on multi-resolution wavelet analysis (adapted to nonstationary and evolutionary signals). These methods are also combined with filtering techniques, and they can be used for single signals as well as pairs of signals (cross-analyses). The objective of this work is to exploit pressure measurements in selected boreholes from the two compacted clay sites, in order to: - evaluate phenomena affecting the measurements (earth tides, barometric pressures, ...); - estimate hydraulic properties (specific storage, ...) of the clay-stones prior to excavation works and compare them with those estimated by pulse or slug tests on shorter time scales; - analyze the effects of drift excavation on pore pressures
International Nuclear Information System (INIS)
Suzuki, Nobuhiro; Yamazaki, Yasuo; Fujimoto, Zui; Morita, Takashi; Mizuno, Hiroshi
2005-01-01
Crystals of pseudechetoxin and pseudecin, potent peptidic inhibitors of cyclic nucleotide-gated ion channels, have been prepared and X-ray diffraction data have been collected to 2.25 and 1.90 Å resolution, respectively. Cyclic nucleotide-gated (CNG) ion channels play pivotal roles in sensory transduction of retinal and olfactory neurons. The elapid snake toxins pseudechetoxin (PsTx) and pseudecin (Pdc) are the only known protein blockers of CNG channels. These toxins are structurally classified as cysteine-rich secretory proteins and exhibit structural features that are quite distinct from those of other known small peptidic channel blockers. This article describes the crystallization and preliminary X-ray diffraction analyses of these toxins. Crystals of PsTx belonged to space group P2₁2₁2₁, with unit-cell parameters a = 60.30, b = 61.59, c = 251.69 Å, and diffraction data were collected to 2.25 Å resolution. Crystals of Pdc also belonged to space group P2₁2₁2₁, with similar unit-cell parameters a = 60.71, b = 61.67, c = 251.22 Å, and diffraction data were collected to 1.90 Å resolution.
Directory of Open Access Journals (Sweden)
Olivier Buchsenschutz
2009-05-01
Full Text Available The development of Geographical Information Systems (GIS) allows information in archaeological databases to be georeferenced. It is thus possible to obtain distribution maps which can then be interpreted using statistical and spatial analyses. Maps and statistics highlight the state of research, the condition of sites and, beyond that, historical and cultural phenomena. Through a research programme on the Iron Age in France (Basefer), a global database was established for the entire country. This article puts forward a number of analyses of the general descriptive criteria represented in a corpus of 11,000 sites (departments along the Mediterranean coast are excluded from this test). The control and development of finer descriptors will be undertaken by an enlarged team, before the data are networked.
DEFF Research Database (Denmark)
Buus, Niels; Gonge, Henrik
2013-01-01
for the translation of the MCSS from English into Danish and to present a preliminary psychometric validation of the Danish version of the scale. Methods included a formal translation/back-translation procedure and statistical analyses. The sample consisted of MCSS scores from 139 Danish mental health nursing staff...
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk gives an overview of basic statistical principles and focuses on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory applied to liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensible for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
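Among the applications of stochastic analysis the abstract lists Kolmogorov-Smirnov testing. A minimal, self-contained sketch of the one-sample KS statistic (all data here are simulated, not from the book):

```python
import math
import random

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDF and a hypothesised CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(1)
# A sample actually drawn from N(0, 1) should give a small statistic;
# a sample shifted to N(1, 1) should give a large one.
normal_sample = [random.gauss(0, 1) for _ in range(1000)]
shifted_sample = [random.gauss(1, 1) for _ in range(1000)]
d_good = ks_statistic(normal_sample, std_normal_cdf)
d_bad = ks_statistic(shifted_sample, std_normal_cdf)
```

Converting the statistic into a p-value requires the Kolmogorov distribution, which is left out of this sketch.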
Whole Frog Project and Virtual Frog Dissection Statistics
Energy Technology Data Exchange (ETDEWEB)
Pirson, A.S.; George, J.; Krug, B.; Vander Borght, T. [Universite Catholique de Louvain, Service de Medecine Nucleaire, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); Van Laere, K. [Leuven Univ. Hospital, Nuclear Medicine Div. (Belgium); Jamart, J. [Universite Catholique de Louvain, Dept. de Biostatistiques, Cliniques Universitaires de Mont-Godinne, Yvoir (Belgium); D'Asseler, Y. [Ghent Univ., Medical Signal and Image Processing Dept. (MEDISIP), Faculty of applied sciences (Belgium); Minoshima, S. [Washington Univ., Dept. of Radiology, Seattle (United States)
2009-10-15
Fully automated analysis programs are applied more and more to aid the reading of regional cerebral blood flow SPECT studies. They are increasingly based on the comparison of the patient study with a normal database. In this study, we evaluate the ability of three-dimensional stereotactic surface projection (3D-SSP) to isolate the effects of age and gender in a previously studied normal population. The results were also compared with those obtained using Statistical Parametric Mapping (SPM99). Methods: Eighty-nine ⁹⁹ᵐTc-ECD SPECT studies performed in carefully screened healthy volunteers (46 females, 43 males; age 20-81 years) were analysed using 3D-SSP. A multivariate analysis based on the general linear model was performed with region as intra-subject factor, gender as inter-subject factor and age as covariate. Results: Both age and gender had a significant interaction effect with regional tracer uptake. An age-related decline (p < 0.001) was found in the anterior cingulate gyrus, left frontal association cortex and left insula. Bilateral occipital association and left primary visual cortical uptake showed a significant relative increase with age (p < 0.001). Concerning the gender effect, women showed higher uptake (p < 0.01) in the parietal and right sensorimotor cortices. An age-by-gender interaction (p < 0.01) was found only in the left medial frontal cortex. The results were consistent with those obtained with SPM99. Conclusion: 3D-SSP analysis of normal rCBF variability is consistent with the literature and with other automated voxel-based techniques, which highlight the effects of both age and gender. (authors)
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
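The harmonic Poisson process described above, a Poisson process on the positive half-line with intensity proportional to 1/x, is straightforward to simulate on a finite window [a, b]: the expected count is c·ln(b/a), and the normalised intensity inverts in closed form. A hedged sketch (the parameter values are arbitrary, chosen only for illustration):

```python
import math
import random

def harmonic_poisson(c, a, b, rng):
    """One realisation of a Poisson process on [a, b] with harmonic
    intensity lambda(x) = c / x.  The expected number of points is
    c * ln(b / a); given the count, each point is placed by inverting
    the normalised cumulative intensity: x = a * (b/a)**U."""
    mean_count = c * math.log(b / a)
    # Poisson-distributed count via Knuth's product-of-uniforms method
    limit = math.exp(-mean_count)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            break
        k += 1
    return [a * (b / a) ** rng.random() for _ in range(k)]

rng = random.Random(42)
# With c = 2 on [1, e**3] the mean count is 2 * 3 = 6; check empirically.
counts = [len(harmonic_poisson(2.0, 1.0, math.e ** 3, rng)) for _ in range(5000)]
avg = sum(counts) / len(counts)
```

Note the scale invariance visible in the placement rule: multiplying a and b by a common factor rescales every point by that factor without changing the count distribution.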
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Directory of Open Access Journals (Sweden)
Eliana Freire Gaspar de Carvalho Dores
2001-02-01
Full Text Available A preliminary analysis of the possible contamination of surface water and groundwater by the active ingredients of the pesticide products used in the surroundings of the urban area of Primavera do Leste, Mato Grosso, Brazil, was carried out. A description of the study region and of the environmental characteristics that can favor contamination of the local aquatic environment is presented. The EPA screening criteria, the groundwater ubiquity score (GUS) and the criteria proposed by Goss were used to evaluate which pesticides might contaminate the local waters. Among the active ingredients studied, several present risks to the local aquatic environment.
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Kohler, Helmut
The purpose of this study was to analyze the available statistics concerning teachers in schools of general education in the Federal Republic of Germany. An analysis of the demographic structure of the pool of full-time teachers showed that in 1971 30 percent of the teachers were under age 30, and 50 percent were under age 35. It was expected that…
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
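The measures of location and spread the chapter discusses are available directly in Python's standard library; a small illustration with invented data (the sample values below are hypothetical, not taken from the chapter):

```python
import statistics

# Hypothetical sample: fasting glucose values (mg/dL) for ten patients.
values = [88, 92, 95, 101, 84, 90, 110, 97, 89, 93]

# Measures of location
location = {
    "mean": statistics.mean(values),
    "median": statistics.median(values),
}

# Measures of spread
q = statistics.quantiles(values, n=4)      # quartiles
spread = {
    "stdev": statistics.stdev(values),     # sample standard deviation
    "range": max(values) - min(values),
    "iqr": q[2] - q[0],                    # interquartile range
}
```

The skew visible here (one high value of 110 pulling the mean above the median) is exactly the kind of pattern descriptive statistics are meant to surface before any inferential analysis.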
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Hoffmeyer, P.
Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull...
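Of the distribution families listed, the lognormal is the simplest to fit by maximum likelihood, since the estimates reduce to the mean and standard deviation of the log-transformed strengths. A sketch with simulated data (the timber dataset itself is not available here, so the "strengths" are drawn from a known lognormal):

```python
import math
import random

def fit_lognormal(data):
    """Maximum-likelihood fit of a lognormal distribution: mu and sigma
    are the mean and (population) standard deviation of the logs."""
    logs = [math.log(v) for v in data]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / n)
    return mu, sigma

# Simulated timber-like strengths with known parameters mu=3.5, sigma=0.2
random.seed(7)
sample = [math.exp(random.gauss(3.5, 0.2)) for _ in range(20000)]
mu_hat, sigma_hat = fit_lognormal(sample)
```

Weibull fits, by contrast, have no closed-form ML solution and require a numerical solver, which is one reason several candidate families are typically compared on the same data.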
Authentication by Keystroke Timing: Some Preliminary Results
National Research Council Canada - National Science Library
Gaines, R. S; Lisowski, William; Press, S. J; Shapiro, Norman
1980-01-01
... of an individual seeking access to the computer. This report summarizes preliminary efforts to establish whether an individual can be identified by the statistical characteristics of his or her typing...
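The idea summarized in the report, identifying an individual by the statistical characteristics of his or her typing, can be illustrated with a toy decision rule on inter-key intervals. The samples, tolerance and rule below are invented for illustration and are far cruder than the statistical tests a real system would use:

```python
import statistics

def timing_profile(keystroke_times):
    """Summarise a typing sample by the mean and spread of its
    inter-key intervals (latencies between successive keystrokes)."""
    intervals = [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]
    return statistics.mean(intervals), statistics.stdev(intervals)

def same_typist(profile_a, profile_b, tol=0.03):
    """Crude decision rule: accept if mean latencies agree within tol seconds."""
    return abs(profile_a[0] - profile_b[0]) < tol

# Hypothetical enrolment and login samples (key-down times in seconds)
enrolled = timing_profile([0.00, 0.12, 0.25, 0.36, 0.50, 0.61])
login_ok = timing_profile([0.00, 0.13, 0.24, 0.37, 0.49, 0.62])
impostor = timing_profile([0.00, 0.22, 0.45, 0.66, 0.90, 1.10])
```

A practical system would compare digraph-specific latencies with a proper statistical test rather than a single mean, but the principle, a behavioural signature extracted from timing data, is the same.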
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Notices about using elementary statistics in psychology
松田, 文子; 三宅, 幹子; 橋本, 優花里; 山崎, 理央; 森田, 愛子; 小嶋, 佳子
2003-01-01
Improper uses of elementary statistics that were often observed in beginners' manuscripts and papers were collected, and better practices were suggested. This paper consists of three parts: descriptive statistics, multivariate analyses, and statistical tests.
Does environmental data collection need statistics?
Pulles, M.P.J.
1998-01-01
The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract)
Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija
2015-07-15
Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
Statistical learning and prejudice.
Madison, Guy; Ullén, Fredrik
2012-12-01
Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
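The three occupation statistics covered in the gas chapters differ only by a sign in the denominator of the mean occupation number, n(ε) = 1/(exp((ε − μ)/kT) ± 1), with Maxwell-Boltzmann as the common classical limit. A short numerical check (energies in units of kT, values chosen for illustration):

```python
import math

def occupancy(eps, mu, kT, kind):
    """Mean occupation number of a single-particle state at energy eps
    under Maxwell-Boltzmann, Fermi-Dirac or Bose-Einstein statistics."""
    x = (eps - mu) / kT
    if kind == "MB":
        return math.exp(-x)
    if kind == "FD":
        return 1.0 / (math.exp(x) + 1.0)
    if kind == "BE":
        return 1.0 / (math.exp(x) - 1.0)   # requires eps > mu
    raise ValueError(kind)

# Far above the chemical potential all three statistics agree
# (the classical, non-degenerate limit).
hi = [occupancy(10.0, 0.0, 1.0, k) for k in ("MB", "FD", "BE")]

# At eps = mu, Fermi-Dirac gives exactly one half: the Fermi level.
half = occupancy(0.0, 0.0, 1.0, "FD")
```

The sign flip encodes the physics: the +1 (Fermi-Dirac) caps occupation at one particle per state, while the −1 (Bose-Einstein) lets occupation diverge as ε approaches μ, the seed of Bose-Einstein condensation.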
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Energy Technology Data Exchange (ETDEWEB)
Ridone, S.; Arginelli, D.; Bortoluzzi, S.; Canuto, G.; Montalto, M.; Nocente, M.; Vegro, M. [Italian National Agency for New Technologies, Energy and the Environment (ENEA), Research Centre of Saluggia, Radiation Protection Institute, Saluggia, VC (Italy)
2006-07-01
Many biological samples (urines and faeces) have been analysed by means of chromatographic extraction columns, utilising two different resins (AG 1-X2 resin chloride and T.R.U.), in order to detect possible internal contamination by ²³⁹⁺²⁴⁰Pu and ²⁴¹Am in some workers of a reprocessing nuclear plant in the decommissioning phase. The results obtained show on one hand the great suitability of the first resin for the determination of plutonium, and on the other the great selectivity of the second one for the determination of americium.
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affect the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
Energy Technology Data Exchange (ETDEWEB)
Grieser, J.; Staeger, T.; Schoenwiese, C.D.
2000-03-01
The report answers the question of where, why and how different climate variables have changed within the last 100 years. The analysed variables are observed time series of temperature (mean, maximum, minimum), precipitation, air pressure, and water vapour pressure at monthly resolution. The time series are given both as station data and as grid-box data. Two kinds of time-series analysis are performed. The first is applied to find significant changes in the mean and variance of the time series; this also reveals changes in the annual cycle and in the frequency of extreme events. The second approach is used to detect significant spatio-temporal patterns in the variations of climate variables, which are most likely driven by known natural and anthropogenic climate forcings. Furthermore, an estimation of climate noise makes it possible to indicate regions where certain climate variables have changed significantly due to the enhanced anthropogenic greenhouse effect. (orig.)
Directory of Open Access Journals (Sweden)
Handan YOLSAL
2012-06-01
This paper introduces the most commonly applied seasonal adjustment programs developed by official statistical agencies for time series. These programs fall into two main groups. One is the CENSUS II X-11 family, first developed by the NBER, which uses moving-average filters; this family includes the X-11 ARIMA and X-12 ARIMA techniques. The other is the TRAMO/SEATS program, a model-based approach developed by the Bank of Spain. The paper discusses the seasonal decomposition procedures of these techniques, the special effects they handle such as trading-day and calendar effects, their advantages and disadvantages, and their forecasting performances.
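The moving-average filtering at the core of the X-11 family can be sketched in a few lines: a 2x12 centered moving average estimates the trend of a monthly series, and averaging the detrended values by month recovers a stable seasonal pattern. The data below are a toy construction, not from the paper; real X-11/TRAMO-SEATS programs add many refinements (asymmetric end filters, outlier and calendar-effect regressors) on top of this core idea.

```python
# Toy monthly series: linear trend plus a fixed, zero-sum seasonal pattern.
seasonal = [3, 1, -2, -4, -3, -1, 0, 2, 4, 3, 1, -4]
series = [10 + 0.5 * t + seasonal[t % 12] for t in range(48)]

def centered_ma12(y, t):
    """2x12 centered moving average: half weight on the two endpoint months,
    so each calendar month gets total weight 1; removes stable seasonality."""
    return (0.5 * y[t - 6] + sum(y[t - 5:t + 6]) + 0.5 * y[t + 6]) / 12.0

# Detrend, then average the detrended values by month to estimate seasonality.
trend = {t: centered_ma12(series, t) for t in range(6, 42)}
detrended = {t: series[t] - trend[t] for t in trend}
est = [sum(v for t, v in detrended.items() if t % 12 == m) /
       sum(1 for t in detrended if t % 12 == m) for m in range(12)]
# For this noise-free toy series, est recovers `seasonal` (up to rounding).
```

Because the toy series has no irregular component, the estimated monthly effects match the true seasonal pattern; with real data the X-11 family iterates this trend/seasonal separation several times.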
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption; world natural gas plant liquids production; world LP-gas production, imports, exports, and consumption; world residual fuel oil production, imports, exports, and consumption; world lignite production, imports, exports, and consumption; world peat production and consumption; world electricity production, imports, exports, and consumption (Table 80); and world nuclear electric power production
Statistical analysis of management data
Gatignon, Hubert
2013-01-01
This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.
The quality improvement attitude survey: Development and preliminary psychometric characteristics.
Dunagan, Pamela B
2017-12-01
To report the development of a tool to measure nurses' attitudes about quality improvement in their practice setting and to examine preliminary psychometric characteristics of the Quality Improvement Nursing Attitude Scale. Human factors such as nursing attitudes of complacency have been identified as root causes of sentinel events. Attitudes of nurses concerning the use of Quality and Safety Education for Nurses competencies can be among the most challenging to teach and to change. No tool has been developed measuring the attitudes of nurses concerning their role in quality improvement. A descriptive study design with preliminary psychometric evaluation was used to examine the preliminary psychometric characteristics of the Quality Improvement Nursing Attitude Scale. Registered bedside clinical nurses comprised the sample for the study (n = 57). Quantitative data were analysed using descriptive statistics and Cronbach's alpha reliability. Total score and individual item statistics were evaluated. Two open-ended items were used to collect statements about nurses' feelings regarding their experience in quality improvement efforts. Strong support was found for the internal consistency reliability and face validity of the Quality Improvement Nursing Attitude Scale. Total scale scores were high, indicating that nurse participants valued Quality and Safety Education for Nurses competencies in practice. However, item-level statistics indicated that nurses felt powerless when other nurses deviate from care standards. Additionally, the sample indicated they did not consistently report patient safety issues and did not feel valued in efforts to improve care. Findings suggested that organisational culture fosters nurses' reporting of safety issues and their feeling valued in efforts to improve care. Participants' narrative comments and item analysis revealed the need to generate new items for the Quality Improvement Nursing Attitude Scale focused on nurses' perception of their importance in quality and
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking feature prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
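The interplay the review describes between least squares, sampling, and probability distributions can be illustrated with a minimal Monte Carlo sketch. The straight-line data below are hypothetical, not from the article: repeatedly simulating noisy data and refitting shows the approximately Gaussian sampling distribution of the slope estimate.

```python
import random

random.seed(42)

# Hypothetical linear calibration model: y = b0 + b1*x + Gaussian noise.
xs = [i * 0.5 for i in range(21)]           # x grid from 0.0 to 10.0
b0_true, b1_true, sigma = 1.0, 2.0, 0.5

def ols_slope(x, y):
    """Ordinary least squares slope estimate: Sxy / Sxx."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    return sxy / sxx

# Monte Carlo: refit 2000 synthetic data sets to sample the slope's
# distribution, as the Central Limit Theorem discussion suggests.
slopes = []
for _ in range(2000):
    ys = [b0_true + b1_true * xi + random.gauss(0.0, sigma) for xi in xs]
    slopes.append(ols_slope(xs, ys))

mean_slope = sum(slopes) / len(slopes)      # close to the true slope 2.0
```

The spread of `slopes` matches the textbook standard error sigma/sqrt(Sxx), which is the kind of consistency check Monte Carlo calculations provide for the analytic theory.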
SWORDS: A statistical tool for analysing large DNA sequences
Indian Academy of Sciences (India)
Unknown
These techniques are based on frequency distributions of DNA words in a large sequence, and have been packaged into a software called SWORDS. Using sequences available in ... tions with the cellular processes like recombination, replication .... in DNA sequences using certain specific probability laws. (Pevzner et al ...
Statistical methods for analysing responses of wildlife to human disturbance.
Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom
2006-01-01
1. Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To illustrate these methods, we studied the...
Statistical analyses of local transport coefficients in Ohmic ASDEX discharges
International Nuclear Information System (INIS)
Simmet, E.; Stroth, U.; Wagner, F.; Fahrbach, H.U.; Herrmann, W.; Kardaun, O.J.W.F.; Mayer, H.M.
1991-01-01
Tokamak energy transport is still an unsolved problem. Many theoretical models have been developed that try to explain the anomalously high energy-transport coefficients. Up to now these models have been applied to global plasma parameters. A comparison of transport coefficients with global confinement time is only conclusive if the transport is dominated by one process across the plasma diameter. This, however, is not the case in most Ohmic confinement regimes, where at least three different transport mechanisms play an important role. Sawtooth activity leads to increased energy transport in the plasma centre. In the intermediate region turbulent transport is expected; candidates here are drift waves and resistive fluid turbulence. At the edge, ballooning modes or rippling modes could dominate the transport. For the intermediate region, one can deduce theoretical scaling laws for τ_E from turbulence theories. The predicted scalings reproduce the experimentally found density dependence of τ_E in the linear Ohmic confinement regime (LOC) and the saturated regime (SOC), but they do not show the correct dependence on the isotope mass. The relevance of these transport theories can only be tested by comparing them to experimental local transport coefficients. To this purpose we have performed transport calculations on more than a hundred Ohmic ASDEX discharges. By principal component analysis we determine the dimensionless components which dominate the transport coefficients, and we compare the results to the predictions of various theories. (author) 6 refs., 2 figs., 1 tab
Statistical considerations for grain-size analyses of tills
Jacobs, A.M.
1971-01-01
Relative percentages of sand, silt, and clay from samples of the same till unit are not identical because of different lithologies in the source areas, sorting in transport, random variation, and experimental error. Random variation and experimental error can be isolated from the other two as follows. For each particle-size class of each till unit, a standard population is determined by using a normally distributed, representative group of data. New measurements are compared with the standard population and, if they compare satisfactorily, the experimental error is not significant and random variation is within the expected range for the population. The outcome of the comparison depends on numerical criteria derived from a graphical method rather than on a more commonly used one-way analysis of variance with two treatments. If the number of samples and the standard deviation of the standard population are substituted in a t-test equation, a family of hyperbolas is generated, each of which corresponds to a specific number of subsamples taken from each new sample. The axes of the graphs of the hyperbolas are the standard deviation of new measurements (horizontal axis) and the difference between the means of the new measurements and the standard population (vertical axis). The area between the two branches of each hyperbola corresponds to a satisfactory comparison between the new measurements and the standard population. Measurements from a new sample can be tested by plotting their standard deviation vs. difference in means on axes containing a hyperbola corresponding to the specific number of subsamples used. If the point lies between the branches of the hyperbola, the measurements are considered reliable. But if the point lies outside this region, the measurements are repeated. 
Because the critical segment of the hyperbola is approximately a straight line parallel to the horizontal axis, the test is simplified to a comparison between the means of the standard population and the means of the subsample. The minimum number of subsamples required to prove significant variation between samples caused by different lithologies in the source areas and sorting in transport can be determined directly from the graphical method. The minimum number of subsamples required is the maximum number to be run for economy of effort. ?? 1971 Plenum Publishing Corporation.
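The simplified comparison described above, testing whether the mean of new subsample measurements is consistent with the standard population mean, reduces to a one-sample t-test. The numbers below are hypothetical, and the hyperbola-based graphical criterion is collapsed here to its equivalent t-statistic comparison.

```python
import math

# Hypothetical standard-population mean for one particle-size class
# (percent sand) of a till unit, established from representative data.
standard_mean = 42.0

# New measurements: five subsamples from one new till sample.
new = [41.2, 43.1, 42.8, 40.9, 42.5]

def t_statistic(sample, mu0):
    """One-sample t statistic: (xbar - mu0) / (s / sqrt(n))."""
    n = len(sample)
    xbar = sum(sample) / n
    s2 = sum((v - xbar) ** 2 for v in sample) / (n - 1)
    return (xbar - mu0) / math.sqrt(s2 / n)

# Two-sided 5% critical value for df = 4 (from standard t tables).
t_crit = 2.776
t = t_statistic(new, standard_mean)
reliable = abs(t) <= t_crit   # inside the hyperbola's branches: accept
```

If `reliable` is false, the point falls outside the hyperbola's branches and, per the method, the measurements would be repeated.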
Practical Statistics for Particle Physics Analyses: Likelihoods (1/4)
CERN. Geneva; Lyons, Louis
2016-01-01
This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, with exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercise sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. The requirement to use the SWAN service is to have a CERN account and to also have access to CERNBox, the shared storage service at CERN. New users are invited to activate CERNBox beforehand by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....
Late neolithic pottery standardization: Application of statistical analyses
Directory of Open Access Journals (Sweden)
Vuković Jasna
2011-01-01
This paper defines the notion of standardization, presents the methodological approach to analysis, points to the problems and limitations arising in the examination of materials from archaeological excavations, and presents the results of the analysis of coefficients of variation of metric parameters of the Late Neolithic vessels recovered at the sites of Vinča and Motel Slatina. [Projekat Ministarstva nauke Republike Srbije, br. 177012: Society, the spiritual and material culture and communications in prehistory and early history of the Balkans]
Statistical and regression analyses of detected extrasolar systems
Czech Academy of Sciences Publication Activity Database
Pintr, Pavel; Peřinová, V.; Lukš, A.; Pathak, A.
2013-01-01
Roč. 75, č. 1 (2013), s. 37-45 ISSN 0032-0633 Institutional support: RVO:61389021 Keywords : Exoplanets * Kepler candidates * Regression analysis Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 1.630, year: 2013 http://www.sciencedirect.com/science/article/pii/S0032063312003066
Preliminary Multiphysics Analyses of HFIR LEU Fuel Conversion using COMSOL
Energy Technology Data Exchange (ETDEWEB)
Freels, James D [ORNL; Bodey, Isaac T [ORNL; Arimilli, Rao V [ORNL; Curtis, Franklin G [ORNL; Ekici, Kivanc [ORNL; Jain, Prashant K [ORNL
2011-06-01
The research documented herein was performed by several individuals across multiple organizations. We have previously acknowledged our funding for the project, but another common thread among the authors of this document, and hence the research performed, is the analysis tool COMSOL. The research has been divided into categories to allow the COMSOL analysis to be performed independently to the extent possible. As will be seen herein, the research has progressed to the point where it is expected that next year (2011) a large fraction of the research will require collaboration of our efforts as we progress almost exclusively into three-dimensional (3D) analysis. To the extent possible, we have tried to segregate the development effort into two-dimensional (2D) analysis in order to arrive at techniques and methodology that can be extended to 3D models in a timely manner. The Research Reactors Division (RRD) of ORNL has contracted with the University of Tennessee, Knoxville (UTK) Mechanical, Aerospace and Biomedical Engineering Department (MABE) to perform a significant fraction of this research. This group has been chosen due to their expertise and long-term commitment in using COMSOL and also because the participating students are able to work onsite on a part-time basis due to the close proximity of UTK to the ORNL campus. The UTK research has been governed by a statement of work (SOW) which clearly defines the specific tasks reported herein in the respective areas of research. Ph.D. student Isaac T. Bodey has focused on heat transfer, fluid flow, modeling, and meshing issues; he has been aided by his major professor Dr. Rao V. Arimilli and is the primary contributor to Section 2 of this report. Ph.D. student Franklin G. Curtis has been focusing exclusively on fluid-structure interaction (FSI) due to the mechanical forces acting on the plate caused by the flow; he has been aided by his major professor Dr. Kivanc Ekici and is the primary contributor to Section 4 of this report. The HFIR LEU conversion project has also obtained the services of Dr. Prashant K. Jain of the Reactor & Nuclear Systems Division (RNSD) of ORNL. Prashant has quickly adapted to the COMSOL tools and has been focusing on thermal-structure interaction (TSI) issues and the development of alternative 3D model approaches that could yield faster-running solutions. Prashant is the primary contributor to Section 5 of the report. Finally, while incorporating findings from all members of the COMSOL team and contributing as the senior COMSOL leader and advocate, Dr. James D. Freels has focused on 3D model development and cluster deployment, and has contributed primarily to Section 3 and the overall integration of this report. The team has migrated to the current release of COMSOL, version 4.1, for all the work described in this report, except where stated otherwise. Just as in the performance of the research, each of the respective sections has been originally authored by the respective authors. The reader will therefore observe a contrast in writing style throughout this document.
Preliminary Analyses of BeiDou Signal-In-Space Anomalies Since 2013
Wu, Y.; Ren, J.; Liu, W.
2016-06-01
As the BeiDou navigation system has been operational since December 2012, there is an increasing desire to use multiple constellations to improve positioning performance. The signal-in-space (SIS) anomaly caused by the ground control segment and the space vehicle is one of the major threats to integrity. For a young Global Navigation Satellite System, knowledge about historical SIS anomalies is very important, not only for assessing the SIS integrity performance of a constellation but also for providing the assumptions for ARAIM (Advanced Receiver Autonomous Integrity Monitoring). In this paper, the broadcast ephemerides and the precise ones are pre-processed to avoid false anomaly identification. The SIS errors over the period March 2013 to February 2016 are computed by comparing the broadcast ephemerides with the precise ones. The time offsets between GPST (GPS time) and BDT (BeiDou time) are estimated and removed by an improved estimation algorithm. SIS worst-UREs are computed, and an RMS criterion is investigated to identify the SIS anomalies. The results show that the probability of BeiDou SIS anomalies was at the 10⁻³ level over the last three years. Even though BeiDou SIS integrity performance currently cannot match GPS integrity performance, the results indicate that BeiDou is improving its integrity performance.
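An RMS-based anomaly criterion of the kind described can be sketched as follows. The worst-URE values below are hypothetical, and this is not the authors' exact algorithm (which works on broadcast-minus-precise ephemeris errors over 2013-2016 with time-offset removal); it only illustrates flagging epochs whose error is far outside the nominal RMS level.

```python
import math

# Hypothetical worst-URE time series (metres) for one satellite; the large
# value stands in for an SIS anomaly such as a bad broadcast ephemeris.
ure = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 27.5, 1.1, 1.0, 1.2]

def rms(values):
    """Root mean square of a list of values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def flag_anomalies(series, k=4.5):
    """Flag each epoch whose worst-URE exceeds k times the RMS of the other
    epochs -- a simple leave-one-out RMS criterion (k is an assumption)."""
    flags = []
    for i, v in enumerate(series):
        rest = series[:i] + series[i + 1:]
        flags.append(abs(v) > k * rms(rest))
    return flags

anomalies = [i for i, f in enumerate(flag_anomalies(ure)) if f]
```

Counting flagged epochs against the total observation time is what yields an empirical anomaly probability of the kind quoted in the abstract.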
Preliminary Monthly Climatological Summaries
National Oceanic and Atmospheric Administration, Department of Commerce — Preliminary Local Climatological Data, recorded since 1970 on Weather Bureau Form 1030 and then National Weather Service Form F-6. The preliminary climate data pages...
Statistical methods for astronomical data analysis
Chattopadhyay, Asis Kumar
2014-01-01
This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomena will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...
Multivariate statistical characterization of groundwater quality in Ain ...
African Journals Online (AJOL)
Administrator
depends much on the sustainability of the available water resources. Water of .... 18 wells currently in use were selected based on the preliminary field survey carried out to ... In recent times, multivariate statistical methods have been applied ...
Methodology in robust and nonparametric statistics
Jurecková, Jana; Picek, Jan
2012-01-01
Introduction and Synopsis: Introduction; Synopsis. Preliminaries: Introduction; Inference in Linear Models; Robustness Concepts; Robust and Minimax Estimation of Location; Clippings from Probability and Asymptotic Theory; Problems. Robust Estimation of Location and Regression: Introduction; M-Estimators; L-Estimators; R-Estimators; Minimum Distance and Pitman Estimators; Differentiable Statistical Functions; Problems. Asymptotic Representations for L-Estimators
Lies, damn lies and statistics
International Nuclear Information System (INIS)
Jones, M.D.
2001-01-01
Statistics are widely employed within archaeological research, increasingly so as user-friendly statistical packages make ever more sophisticated analyses available to non-statisticians. However, all statistical techniques rest on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of those assumptions, there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology, and it is illustrated here with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab
Preliminary In Vivo Experiments on Adhesion of Geckos
Lepore, E.; Brianza, S.; Antoniolli, F.; Buono, M.; Carpinteri, A.; Pugno, N.
2008-01-01
We performed preliminary experiments on the adhesion of a Tokay gecko on surfaces of different roughness, with or without particles of significantly different granulometry, before, after, or during the moult. The results were analysed using Weibull statistics.
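Weibull statistics of adhesion data are commonly fitted via the linearised form ln(-ln(1-F)) = m·ln(x) - m·ln(x0), with median-rank plotting positions. A least-squares sketch under that assumption (not necessarily the authors' exact procedure), checked against synthetic detachment forces:

```python
import math

def weibull_fit(samples):
    """Estimate the Weibull modulus m (shape) and scale x0 from measured
    detachment forces by least squares on the linearised Weibull form,
    using median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(samples)
    n = len(xs)
    t = [math.log(x) for x in xs]
    y = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    mt, my = sum(t) / n, sum(y) / n
    sxx = sum((a - mt) ** 2 for a in t)
    sxy = sum((a - mt) * (b - my) for a, b in zip(t, y))
    m = sxy / sxx                  # shape (Weibull modulus)
    x0 = math.exp(mt - my / m)     # scale parameter
    return m, x0

# Synthetic forces drawn deterministically from a Weibull(m=3, x0=10)
forces = [10.0 * (-math.log(1.0 - (i + 0.5) / 200)) ** (1 / 3)
          for i in range(200)]
m, x0 = weibull_fit(forces)
print(round(m, 2), round(x0, 2))
```

The fit recovers the generating shape and scale to within a few percent, which is the usual sanity check before applying the method to experimental pull-off forces.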
[Statistics for statistics?--Thoughts about psychological tools].
Berger, Uwe; Stöbel-Richter, Yve
2007-12-01
Statistical methods take a prominent place in psychologists' educational programmes. Known as difficult to understand and heavy to learn, this content is feared by students, and those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many practising psychologists, statistical education seems to make sense only by commanding respect from other professions, namely physicians; for their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether, and how, statistical methods are represented in psychotherapeutic and psychological research. We therefore analysed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write, and critically read, original articles as a backbone of research presumes a high degree of statistical education. To ignore statistics is to ignore research, and ultimately to surrender one's own professional work to arbitrariness.
Statistics for Learning Genetics
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also seen to be true of the genetics syllabi reviewed for this study. Nonetheless
Childhood Cancer Statistics – Graphs and Infographics: number of diagnoses, incidence rates ...
Statistical analysis and data management
International Nuclear Information System (INIS)
Anon.
1981-01-01
This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses, and the problems associated with these data sets, are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between the variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found
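The kind of correlation screen described, precipitation records against capture counts, can be sketched with a Pearson correlation and its t statistic. The data below are invented for illustration, not the WIPP measurements:

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t = r*sqrt(n-2)/sqrt(1-r^2); compare against t(n-2) critical
    values to judge whether the correlation is statistically significant."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

precip   = [1.2, 0.8, 2.5, 0.3, 1.9, 1.1]   # hypothetical precipitation (cm)
captures = [14, 10, 15, 9, 13, 12]          # hypothetical capture counts
r = pearson_r(precip, captures)
print(round(r, 3), round(t_statistic(r, len(precip)), 2))
```

With real field data, a non-significant t at the chosen level corresponds to the report's finding of no statistically significant relationship.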
MQSA National Statistics (Mammography Quality Standards Act and Program). Archived Scorecard Statistics: 2018, 2017, 2016 ...
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Entropy statistics and information theory
Frenken, K.; Hanusch, H.; Pyka, A.
2007-01-01
Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are suitable for decomposition analysis, which renders the
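The between/within decomposition that makes entropy statistics attractive for decomposition analysis can be sketched as follows; the group labels and share values are invented for illustration:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a list of shares summing to 1."""
    return -sum(q * math.log(q) for q in p if q > 0)

def decompose(groups):
    """Decompose the entropy of shares into between-group and
    within-group parts: H_total = H_between + sum_g P_g * H(within g),
    where P_g is the total share of group g. Exact identity."""
    totals = {g: sum(v) for g, v in groups.items()}
    h_between = entropy(list(totals.values()))
    h_within = sum(totals[g] * entropy([s / totals[g] for s in v])
                   for g, v in groups.items())
    return h_between, h_within

# Hypothetical market shares split into two groups of firms
groups = {"incumbents": [0.4, 0.2], "entrants": [0.25, 0.1, 0.05]}
hb, hw = decompose(groups)
total = entropy([s for v in groups.values() for s in v])
print(round(hb, 3), round(hw, 3))
```

Because the decomposition is exact, hb + hw reproduces the total entropy, which is what lets variety change over time be attributed to between- versus within-group dynamics.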
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
Rényi statistics in the canonical and microcanonical ensembles are examined, both in general and in particular for the ideal gas. In the microcanonical ensemble, Rényi statistics is equivalent to Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, on taking the thermodynamic limit, Rényi statistics is also equivalent to Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that in this limit Rényi statistics arrives at the same thermodynamical relations as those stemming from Boltzmann-Gibbs statistics.
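The equivalence in the limit can be seen directly from the definition of the Rényi entropy; the q → 1 limit below is the standard derivation (by L'Hôpital's rule), not a result specific to the paper:

```latex
S_q = \frac{1}{1-q}\,\ln\!\Big(\sum_i p_i^{\,q}\Big),
\qquad
\lim_{q\to 1} S_q
= \lim_{q\to 1}\frac{-\sum_i p_i^{\,q}\ln p_i}{\sum_i p_i^{\,q}}
= -\sum_i p_i \ln p_i = S_{\mathrm{BG}}.
```

The paper's stronger statement is that for the ideal gas the equivalence also holds at fixed q once the thermodynamic limit is taken.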
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Preliminary ECLSS waste water model
Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.
1991-01-01
A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.
Fundamental data analyses for measurement control
International Nuclear Information System (INIS)
Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.
1987-02-01
A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
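The control-chart logic described above can be sketched as a three-sigma Shewhart chart; a minimal illustration with invented bias measurements, not the laboratory's actual procedure:

```python
import math

def control_limits(baseline):
    """Three-sigma Shewhart limits computed from in-control baseline
    measurements (e.g. instrument bias on a reference standard)."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in baseline) / (n - 1))
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(values, lo, hi):
    """Indices of points breaching the limits -- the alarm an alert
    reviewer would also spot visually on the chart."""
    return [i for i, v in enumerate(values) if not lo <= v <= hi]

baseline = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, -0.3, 0.1]  # in-control biases
lo, center, hi = control_limits(baseline)
print(out_of_control([0.05, 1.5, -0.1], lo, hi))
```

The supplementary statistical tests mentioned in the abstract (e.g. normality tests) would run on the same data to automate what the chart shows visually.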
Thiese, Matthew S; Walker, Skyler; Lindsey, Jenna
2017-10-01
Distribution of valuable research discoveries is needed for the continual advancement of patient care, and publication of, and subsequent reliance on, false study results would be detrimental to it. Unfortunately, research misconduct may originate from many sources. While there is evidence of ongoing research misconduct in all its forms, it is challenging to identify its actual occurrence, and this is especially true for misconduct in clinical trials. Research misconduct is difficult to measure, and there are few studies reporting the prevalence or underlying causes of research misconduct among biomedical researchers. Reported prevalence estimates of misconduct are probably underestimates, and range from 0.3% to 4.9%. There have been efforts to measure the prevalence of research misconduct; however, the relatively few published studies are not freely comparable because of varying characterizations of research misconduct and the methods used for data collection. There are some signs which may point to an increased possibility of research misconduct; however, there is a need for continued self-policing by biomedical researchers. There are existing resources to assist in ensuring appropriate statistical methods and preventing other types of research fraud. These include the "Statistical Analyses and Methods in the Published Literature", also known as the SAMPL guidelines, which help scientists determine the appropriate way of reporting various statistical methods; the "Strengthening Analytical Thinking for Observational Studies" (STRATOS) initiative, which emphasizes the execution and interpretation of results; and the Committee on Publication Ethics (COPE), which was created in 1997 to deliver guidance about publication ethics. COPE has a series of views and strategies grounded in the values of honesty and accuracy.
Isotopic safeguards statistics
International Nuclear Information System (INIS)
Timmerman, C.L.; Stewart, K.B.
1978-06-01
The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are paired comparison and regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories; paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model, y = α + βx, and the initial point model, y - y₀ = β(x - x₀). The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models: (1) the usual least-squares approach, in which x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and assumes that x and y are both measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated is the ratio of Pu/U versus ²³⁵D, where ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent. Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach, by simple comparison of the resulting regression variances with the random variance from the paired comparison results
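Deming's fitting approach for the intercept model can be sketched as follows. The error-variance ratio and the data points are invented for illustration, not the Yankee Rowe measurements:

```python
import math

def deming(x, y, lam=1.0):
    """Deming regression for the intercept model y = a + b*x when both
    variables carry measurement error. lam is the ratio of the y-error
    variance to the x-error variance (lam = 1 gives orthogonal
    regression); in practice it would come from the paired-comparison
    variance estimates. Returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    b = ((syy - lam * sxx) + math.sqrt((syy - lam * sxx) ** 2
         + 4 * lam * sxy ** 2)) / (2 * sxy)
    return my - b * mx, b

x = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical exposure values
y = [2.1, 3.9, 6.2, 7.8, 10.1]  # hypothetical isotopic-function values
a, b = deming(x, y)
print(round(a, 3), round(b, 3))
```

Unlike ordinary least squares, the fitted slope here is invariant to which variable is treated as the response, which is why it suits two-laboratory comparisons where neither measurement is error-free.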
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Statistical ecology comes of age.
Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-12-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.
R statistical application development by example : beginner's guide
Tattar, Narayanachart Prabhanjan
2013-01-01
Full of screenshots and examples, this Beginner's Guide by Example will teach you practically everything you need to know about R statistical application development from scratch. You will begin by learning the first concepts of statistics in R, which is vital in this fast-paced era, and it is also a bargain since you do not need to do a preliminary course on the subject.
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon; exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Statistics for NAEG: past efforts, new results, and future plans
International Nuclear Information System (INIS)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.; Engel, D.W.
1983-06-01
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Usage statistics and demonstrator services
CERN. Geneva
2007-01-01
An understanding of the use of repositories and their contents is clearly desirable for authors and repository managers alike, as well as for those who are analysing the state of scholarly communications. A number of individual initiatives have produced statistics of various kinds for individual repositories, but the real challenge is to produce statistics that can be collected and compared transparently on a global scale. This presentation details the steps to be taken to address the issues and attain this capability.
Statistical methods in spatial genetics
DEFF Research Database (Denmark)
Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie
2009-01-01
The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
Directory of Open Access Journals (Sweden)
Brady T West
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817
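The core of the analytic-error problem can be illustrated with a toy example: ignoring the survey weights that encode a complex design biases even a simple mean. The numbers below are invented for illustration, not SESTAT data:

```python
def naive_mean(values):
    """Unweighted mean -- correct only for a simple random sample."""
    return sum(values) / len(values)

def weighted_mean(values, weights):
    """Design-based estimate of the population mean: each respondent is
    scaled by the number of population members they represent (the
    inverse of their selection probability)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# A stratum oversampled by design carries small weights: two of the
# five respondents represent only one population member each.
values  = [50, 52, 48, 90, 88]   # hypothetical outcome values
weights = [10, 10, 10, 1, 1]     # hypothetical design weights
print(naive_mean(values), round(weighted_mean(values, weights), 2))
```

The naive mean is pulled far above the weighted estimate by the oversampled stratum; real design-based analysis would additionally correct the standard errors for stratification and clustering, which is where the surveyed publications most often went wrong.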
Practical Statistics for Environmental and Biological Scientists
Townend, John
2012-01-01
All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific comp
Statistics in Schools. Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities in math and history.
Transport Statistics - Transport - UNECE
UNECE Transport: Areas of Work — Transport Statistics, including terms of reference, meetings and events, and the Working Party on Transport Statistics (WP.6).
Preliminary results from NOAMP deep drifting floats
International Nuclear Information System (INIS)
Ollitrault, M.
1989-01-01
This paper is a very brief and preliminary outline of the first results obtained with deep SOFAR floats in the NOAMP area. The work is now moving toward more precise statistical estimation of mean and variable currents, together with better tracking to resolve submesoscales and estimate diffusivities due to mesoscale and smaller-scale motions. However, the preliminary results confirm that the NOAMP region (and its surroundings) has a deep mesoscale eddy field that is considerably more energetic than the mean field (r.m.s. velocities are of order 5 cm s⁻¹), although both values are diminished compared to the western basin. A data report containing trajectories and statistics is scheduled to be published by IFREMER in the near future. The project's main task is to study the dispersion of radioactive substances
Modern applied statistics with S-plus
Venables, W N
1994-01-01
S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
Statistical analysis of environmental data
International Nuclear Information System (INIS)
Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.
1975-10-01
This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968 and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low-levels of radioactivity is discussed. Applications of computer calculations for data processing are described
Statistical Literacy in the Data Science Workplace
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. It is essential not only for communicating with the consumers of information but also within the team. Writing from the…
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India.
2013-02-28
... identify and resolve issues involved in the preliminary analyses. Chapter 2 of the preliminary technical... DOE conducted in-depth technical analyses in the following areas for GSFLs and IRLs currently under... also begun work on the manufacturer impact analysis and identified the methods to be used for the LCC...
Statistics For Dummies
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first-semester statistics cou
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores
First-Generation Transgenic Plants and Statistics
Nap, Jan-Peter; Keizer, Paul; Jansen, Ritsert
1993-01-01
The statistical analyses of populations of first-generation transgenic plants are commonly based on mean and variance and generally require a test of normality. Since in many cases the assumptions of normality are not met, analyses can result in erroneous conclusions. Transformation of data to
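The normality caution in the abstract above can be illustrated with a small pure-Python sketch (the sample data and the `skewness` helper are my own illustration, not from the paper): right-skewed measurements violate the normality assumption behind mean-and-variance analyses, and a log transformation brings the sample skewness closer to zero.

```python
# Illustrative only: shows why a normality check matters before
# mean/variance-based analyses, and how a log transform can help.
import math

def skewness(xs):
    """Population skewness: the third standardized moment."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    return sum((x - mean) ** 3 for x in xs) / n / sd ** 3

raw = [1, 1, 2, 2, 3, 10, 30]        # heavily right-skewed sample
logged = [math.log(x) for x in raw]  # log-transformed copy

print(round(skewness(raw), 2))     # strong positive skew
print(round(skewness(logged), 2))  # skew much closer to zero
```

A formal test (e.g. Shapiro-Wilk) would normally follow, but even this simple moment check shows the transformation moving the data toward symmetry.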
47 CFR 1.363 - Introduction of statistical data.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall be...
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
CMS Program Statistics
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Uterine Cancer Statistics
... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool: the Data Visualizations tool makes ...
Tuberculosis Data and Statistics
... Morbidity and Mortality Weekly Reports. Data and Statistics: Decrease in Reported Tuberculosis Cases, MMWR 2010; 59 ( ...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
... Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Topics: Mental Illness, Any Anxiety Disorder ...
School Violence: Data & Statistics
... The first ... Injury Response Data & Statistics (WISQARS) ...
Caregiver Statistics: Demographics
Selected Long-Term Care Statistics. What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
... Coverdell Program 2012-2015 State Summaries; Data & Statistics Fact Sheets; Heart Disease and Stroke Fact Sheets ...
Alcohol Facts and Statistics
Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446. National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
Statistics for Finance
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPRpercent20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...
Statistics for Research
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition: "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal. Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages has made it possible f
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Head First Statistics
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Preliminary design report for the NAC combined transport cask
International Nuclear Information System (INIS)
1990-04-01
Nuclear Assurance Corporation (NAC) is under contract to the United States Department of Energy (DOE) to design, license, develop and test models, and fabricate a prototype cask transportation system for nuclear spent fuel. The design of this combined transport (rail/barge) transportation system has been divided into two phases, a preliminary design phase and a final design phase. This Preliminary Design Package (PDP) describes the NAC Combined Transport Cask (NAC-CTC), the results of work completed during the preliminary design phase and identifies the additional detailed analyses, which will be performed during final design. Preliminary analytical results are presented in the appropriate sections and supplemented by summaries of procedures and assumptions for performing the additional detailed analyses of the final design. 60 refs., 1 fig., 2 tabs
Multivariate statistical methods a first course
Marcoulides, George A
2014-01-01
Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin
The disagreeable behaviour of the kappa statistic.
Flight, Laura; Julious, Steven A
2015-01-01
It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
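The behaviour described above is easy to reproduce. Below is a minimal pure-Python sketch (the function and the sample matrices are my own illustration, not from the paper) of Cohen's kappa on two 2x2 rater tables: with balanced marginal totals kappa tracks raw agreement, while a heavily skewed table yields high raw agreement but a negative kappa.

```python
# Illustrative only: Cohen's kappa from a square inter-rater confusion
# matrix, demonstrating its sensitivity to the marginal totals.

def cohens_kappa(matrix):
    """Kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = sum(sum(row) for row in matrix)
    k = len(matrix)
    row_totals = [sum(row) for row in matrix]
    col_totals = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    p_observed = sum(matrix[i][i] for i in range(k)) / n
    p_expected = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

balanced = [[20, 5], [10, 15]]  # 70% agreement, balanced marginals
skewed = [[45, 2], [3, 0]]      # 90% agreement, one category dominates

print(round(cohens_kappa(balanced), 3))  # 0.4
print(round(cohens_kappa(skewed), 3))    # -0.05: high agreement, poor kappa
```

This is exactly why the authors recommend reporting the proportion of concordance and prevalence- and bias-adjusted kappa alongside kappa itself.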
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: we design data models, develop infrastructures for data sharing, and build tools for data analysis. Statistical datasets curated by National
Breast cancer statistics, 2011.
DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin
2011-01-01
In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population. Copyright © 2011 American Cancer Society, Inc.
Preliminary conceptual design and analysis on KALIMER reactor structures
International Nuclear Information System (INIS)
Kim, Jong Bum
1996-10-01
The objectives of this study are to perform preliminary conceptual design and structural analyses for the KALIMER (Korea Advanced Liquid Metal Reactor) reactor structures, to assess the design feasibility, and to identify detailed analysis requirements. Since KALIMER thermal hydraulic system analysis results and neutronic analysis results are not available at present, only limited preliminary structural analyses have been performed, with assumptions made for the thermal loads. The assumed responses of the reactor vessel and reactor internal structures were based on the temperature difference between core inlet and outlet and on engineering judgment. Thermal stresses from the assumed temperatures were calculated using the ANSYS code through parametric finite element heat transfer and elastic stress analyses. While the preliminary conceptual design and structural analyses showed that the ASME Code limits for the reactor structures were satisfied for the pressure boundary, the need for inelastic analyses was indicated for evaluating the design adequacy of the support barrel and the thermal liner. To reduce thermal striping effects in the bottom area of the UIS due to sodium up-flowing from the reactor core, installation of an Inconel-718 liner in the bottom area was proposed, and to mitigate thermal shock loads, an additional stainless steel liner was also suggested. The design feasibility of these measures was validated through simplified preliminary analyses. In the conceptual design phase, these results will be implemented in the design of the reactor structures and the reactor internal structures in conjunction with the thermal hydraulic, neutronic, and seismic analysis results. 4 tabs., 24 figs., 4 refs. (Author)
DEFF Research Database (Denmark)
Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove
2007-01-01
The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
Contesting Citizenship: Comparative Analyses
DEFF Research Database (Denmark)
Siim, Birte; Squires, Judith
2007-01-01
... importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Risico-analyse brandstofpontons [Risk analysis of fuel pontoons]
Uijt de Haag P; Post J; LSO
2001-01-01
To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It was assumed that the pontoon is located in a
Energy Technology Data Exchange (ETDEWEB)
Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.
1982-10-01
A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
International Nuclear Information System (INIS)
Berry, A.; Przybylski, M.M.; Sumner, I.
1982-01-01
A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Contributions to Statistics
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
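The abstract describes drawing samples from a posterior whose density is known only up to its shape, i.e., without the normalizing constant. A minimal illustration of that idea is a random-walk Metropolis sampler; the sketch below is an illustrative assumption of mine (the book's own algorithms and notation may differ), applied to an unnormalized standard-normal "posterior":

```python
import math
import random

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler: draws from a distribution whose
    density is known only up to a normalizing constant (its 'shape')."""
    rng = random.Random(seed)
    x, logp = x0, log_density(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        logp_prop = log_density(prop)
        # Accept with probability min(1, p(prop)/p(x)); the unknown
        # normalizing constant cancels out of this ratio.
        if math.log(rng.random()) < logp_prop - logp:
            x, logp = prop, logp_prop
        samples.append(x)
    return samples

# Unnormalized log-density of a standard normal "posterior":
# only the shape exp(-x^2/2) is supplied, not the 1/sqrt(2*pi) constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
post_mean = sum(draws) / len(draws)
```

Bayesian inferences (means, intervals) are then computed directly from the retained samples, exactly as the abstract suggests.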
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
A Statistical Analysis of Cryptocurrencies
Directory of Open Access Journals (Sweden)
Stephen Chan
2017-05-01
We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization), of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal; however, no single distribution fits well jointly to all the cryptocurrencies analysed. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.
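The non-normality the authors report is typically diagnosed through heavy tails, which show up as excess kurtosis in log-returns. The sketch below uses synthetic prices of my own construction (not the paper's data) to show how a jump-contaminated series produces the large excess kurtosis that motivates heavy-tailed fits such as the generalized hyperbolic:

```python
import math
import random

def log_returns(prices):
    """Log-returns of a price series."""
    return [math.log(b / a) for a, b in zip(prices, prices[1:])]

def excess_kurtosis(xs):
    """Sample excess kurtosis: 0 for a normal distribution,
    positive for heavy-tailed returns."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

rng = random.Random(42)
# Synthetic heavy-tailed "exchange rate": mostly small moves with
# occasional large jumps, mimicking non-normal crypto returns.
prices = [100.0]
for _ in range(5000):
    shock = rng.gauss(0, 0.01) if rng.random() < 0.95 else rng.gauss(0, 0.08)
    prices.append(prices[-1] * math.exp(shock))

k = excess_kurtosis(log_returns(prices))  # well above 0: non-normal tails
```

In practice one would then fit candidate parametric distributions (e.g., with maximum likelihood) and compare their fits, as the paper does.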
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Indian Academy of Sciences (India)
IAS Admin
Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Keywords: Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Subhash Chaturvedi is at the University of Hyderabad. His current research interests include phase space descriptions.
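The central result of Fermi–Dirac statistics is the mean occupation of a single-particle energy level; a minimal numerical sketch (the function name and the illustrative kT value are mine, not the article's):

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupation of a single-particle level at energy E:
        n(E) = 1 / (exp((E - mu) / kT) + 1).
    The Pauli exclusion principle bounds the occupation between 0 and 1."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

# At the chemical potential mu the occupation is exactly 1/2; levels far
# below mu are essentially filled, levels far above are essentially empty.
n_at_mu = fermi_dirac(1.0, 1.0, 0.025)
n_below = fermi_dirac(0.0, 1.0, 0.025)
n_above = fermi_dirac(2.0, 1.0, 0.025)
```

As kT shrinks, the occupation approaches a step function at mu, which is the degenerate Fermi gas limit discussed in such derivations.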
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation, achieved through a Bose-counting strategy, predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one recently discovered by Greenberg
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
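The diagnostic-test measures the review covers (sensitivity, specificity, accuracy, likelihood ratios) all follow from a 2x2 confusion table. A small sketch with hypothetical study counts of my own choosing:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Core diagnostic-test statistics from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_positive = sensitivity / (1.0 - specificity)  # LR+: odds boost of a positive test
    lr_negative = (1.0 - sensitivity) / specificity  # LR-: odds drop of a negative test
    return {"sensitivity": sensitivity, "specificity": specificity,
            "accuracy": accuracy, "LR+": lr_positive, "LR-": lr_negative}

# Hypothetical screening study: of 100 diseased patients, 90 test positive
# and 10 test negative; of 100 healthy patients, 80 test negative and 20
# test positive.
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
```

Sweeping the test's positivity threshold and plotting sensitivity against (1 - specificity) at each setting yields the receiver operating characteristic curve also discussed in the review.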
International Nuclear Information System (INIS)
2003-01-01
The energy statistics table is a selection of statistical data for energy sources and countries from 1997 to 2002. It covers petroleum, natural gas, coal and electric power: production, foreign trade, consumption per sector, the energy accounts for 2002, and graphs of long-term forecasts. (A.L.B.)
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Temperature dependent anomalous statistics
International Nuclear Information System (INIS)
Das, A.; Panda, S.
1991-07-01
We show that the anomalous statistics which arises in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs
Integrating Statistical Visualization Research into the Political Science Classroom
Draper, Geoffrey M.; Liu, Baodong; Riesenfeld, Richard F.
2011-01-01
The use of computer software to facilitate learning in political science courses is well established. However, the statistical software packages used in many political science courses can be difficult to use and counter-intuitive. We describe the results of a preliminary user study suggesting that visually-oriented analysis software can help…
Statistical studies on quasars and active nuclei of galaxies
International Nuclear Information System (INIS)
Stasinska, G.
1987-01-01
A catalogue of optical, radio and X-ray properties of quasars and other active galactic nuclei, now in preparation, is presented. This catalogue may serve as a data base for statistical studies. As an example, we give some preliminary results concerning the determination of the quasar masses [fr]
International Nuclear Information System (INIS)
Geiser, Achim
2015-12-01
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented on. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is no longer available, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.
Energy Technology Data Exchange (ETDEWEB)
Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies
1996-12-31
The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)
Statistical criteria for characterizing irradiance time series.
Energy Technology Data Exchange (ETDEWEB)
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
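The report examines statistics for comparing simulated irradiance time series against observations. The exact three criteria are not named in this abstract, so the sketch below shows two plausible examples of such criteria (persistence via lag-1 autocorrelation, and short-term variability via ramp rates), with illustrative series of my own:

```python
def lag1_autocorr(xs):
    """Lag-1 autocorrelation: how strongly each value predicts the next
    (persistence), a natural criterion for irradiance time series."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[t] - m) * (xs[t + 1] - m) for t in range(n - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

def ramp_rates(xs):
    """Step-to-step changes; their spread measures the short-term
    variability that drives PV output fluctuations."""
    return [b - a for a, b in zip(xs, xs[1:])]

# A smooth clear-sky-like series versus a rapidly fluctuating (cloudy) one:
smooth = [i * 0.1 for i in range(100)]
choppy = [0.0, 1.0] * 50
```

A simulated series would be judged adequate if these statistics, computed on the simulation, match those of the observed series to within tolerances set by the intended PV performance analysis.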
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Southard, Rodney E.
2013-01-01
estimates on one of these streams can be calculated at an ungaged location that has a drainage area that is between 40 percent of the drainage area of the farthest upstream streamgage and within 150 percent of the drainage area of the farthest downstream streamgage along the stream of interest. The second method may be used on any stream with a streamgage that has operated for 10 years or longer and for which anthropogenic effects have not changed the low-flow characteristics at the ungaged location since collection of the streamflow data. A ratio of the drainage area of the stream at the ungaged location to the drainage area of the stream at the streamgage was computed to estimate the statistic at the ungaged location. The range of applicability is between 40 and 150 percent of the drainage area of the streamgage, and the ungaged location must be located on the same stream as the streamgage. The third method uses regional regression equations to estimate selected low-flow frequency statistics for unregulated streams in Missouri. This report presents regression equations to estimate frequency statistics for the 10-year recurrence interval and for the N-day durations of 1, 2, 3, 7, 10, 30, and 60 days. Basin and climatic characteristics were computed using geographic information system software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses based on existing digital geospatial data and previous studies. Spatial analyses for geographical bias in the predictive accuracy of the regional regression equations defined three low-flow regions within the State, representing the three major physiographic provinces in Missouri. Region 1 includes the Central Lowlands, Region 2 includes the Ozark Plateaus, and Region 3 includes the Mississippi Alluvial Plain. A total of 207 streamgages were used in the regression analyses for the regional equations. Of the 207 U.S. Geological Survey streamgages, 77 were
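The second method described above scales a gaged statistic by the drainage-area ratio, subject to the 40-150 percent applicability range. A minimal sketch, assuming the standard proportional form of the drainage-area-ratio transfer (the function name and example numbers are illustrative, not from the report):

```python
def drainage_area_ratio_estimate(flow_at_gage, area_at_gage, area_ungaged):
    """Transfer a low-flow statistic from a streamgage to an ungaged site
    on the same stream by the ratio of drainage areas. Applicable only
    when the ungaged drainage area is within 40 to 150 percent of the
    streamgage's drainage area."""
    ratio = area_ungaged / area_at_gage
    if not 0.40 <= ratio <= 1.50:
        raise ValueError("drainage-area ratio outside 40-150 percent range")
    return flow_at_gage * ratio

# Hypothetical values: a low-flow statistic of 12.0 cfs at a gage draining
# 200 square miles, transferred to an ungaged site draining 150 square miles.
q_ungaged = drainage_area_ratio_estimate(12.0, 200.0, 150.0)
```

Outside the applicability range, the report's third method (regional regression equations on basin and climatic characteristics) would be used instead.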
Preliminary hazards analysis -- vitrification process
International Nuclear Information System (INIS)
Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.
1994-06-01
This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment
Preliminary hazards analysis -- vitrification process
Energy Technology Data Exchange (ETDEWEB)
Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)
1994-06-01
This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility`s construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
International Nuclear Information System (INIS)
Gunbey, Hediye Pinar; Bilgici, Meltem Ceyhan; Aslan, Kerim; Incesu, Lutfi; Has, Arzu Ceylan; Ogur, Methiye Gonul; Alhan, Aslihan
2017-01-01
To provide an initial assessment of white matter (WM) integrity with diffusion tensor imaging (DTI) and the accompanying volumetric changes in WM and grey matter (GM) through volumetric analyses of young children with Down's syndrome (DS). Ten children with DS and eight healthy control subjects were included in the study. Tract-based spatial statistics (TBSS) were used in the DTI study for whole-brain voxelwise analysis of fractional anisotropy (FA) and mean diffusivity (MD) of WM. Volumetric analyses were performed with an automated segmentation method to obtain regional measurements of cortical volumes. Children with DS showed significantly reduced FA in association tracts of the fronto-temporo-occipital regions as well as the corpus callosum (CC) and anterior limb of the internal capsule (p < 0.05). Volumetric reductions included total cortical GM, cerebellar GM and WM volume, basal ganglia, thalamus, brainstem and CC in DS compared with controls (p < 0.05). These preliminary results suggest that DTI and volumetric analyses may reflect the earliest complementary changes of the neurodevelopmental delay in children with DS and can serve as surrogate biomarkers of the specific elements of WM and GM integrity for cognitive development. (orig.)
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value", with little grounding in statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language; others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and a transcription of the best pages I have found.
UVISS preliminary visibility analysis
DEFF Research Database (Denmark)
Betto, Maurizio
1998-01-01
The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the work is also to set up the kernel of a software tool for the visibility analysis that should be easily expandable to consider more complex structures in future activities. This analysis is part of the UVISS assessment study and is meant to provide elements for the definition and the selection...
Preliminary In Vivo Experiments on Adhesion of Geckos
Directory of Open Access Journals (Sweden)
E. Lepore
2008-01-01
Full Text Available We performed preliminary experiments on the adhesion of a Tokay gecko on surfaces with different roughness, with or without particles of significantly different granulometry, before/after or during the moult. The results were analyzed using Weibull statistics.
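A Weibull analysis of the kind this record mentions can be sketched as follows; the detachment-force values, the median-rank plotting positions, and the regression-based fit are illustrative assumptions, not the paper's data or method details:

```python
import numpy as np

# Hypothetical detachment-force measurements (arbitrary units) standing in
# for adhesion data; illustrative only.
forces = np.array([4.2, 5.1, 5.8, 6.0, 6.7, 7.3, 7.9, 8.4, 9.1, 10.2])

# Weibull CDF: F(x) = 1 - exp(-(x/lam)**k).  Linearize with median-rank
# plotting positions:  ln(-ln(1 - F)) = k*ln(x) - k*ln(lam).
x = np.sort(forces)
n = len(x)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
k, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
lam = np.exp(-intercept / k)

print(f"Weibull shape k = {k:.2f}, scale lam = {lam:.2f}")
```

The slope of the linearized plot estimates the shape parameter, the intercept the scale; a maximum-likelihood fit would be an alternative design choice.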
Preliminary Test for Nonlinear Input Output Relations in SISO Systems
DEFF Research Database (Denmark)
Knudsen, Torben
2000-01-01
This paper discusses and develops preliminary statistical tests for detecting nonlinearities in the deterministic part of SISO systems with noise. The most referenced method is unreliable for common noise processes, e.g. colored noise. Therefore two new methods based on superposition and sinus input...
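A superposition-based nonlinearity check of the general kind the abstract names can be sketched as follows; the first-order system, noise level, and residual measure are hypothetical stand-ins, not the paper's actual test statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

def system(u, square=0.0):
    """First-order discrete-time system plus measurement noise.

    square=0.0 gives a purely linear system; square>0 adds a squared-input
    term, the nonlinearity the test should detect."""
    y = np.zeros_like(u)
    for t in range(1, len(u)):
        y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + square * u[t - 1] ** 2
    return y + 0.01 * rng.standard_normal(len(u))

u1 = rng.standard_normal(500)
u2 = rng.standard_normal(500)

def superposition_residual(square):
    # For a linear system, y(u1+u2) - y(u1) - y(u2) is pure noise;
    # a residual well above the noise level indicates nonlinearity.
    r = system(u1 + u2, square) - system(u1, square) - system(u2, square)
    return np.std(r)

print("linear system residual   :", superposition_residual(0.0))
print("nonlinear system residual:", superposition_residual(0.3))
```

For the linear case the residual stays at the noise level; the squared-input term leaves a residual orders of magnitude larger.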
Preliminary Multi-Variable Parametric Cost Model for Space Telescopes
Stahl, H. Philip; Hendrichs, Todd
2010-01-01
This slide presentation reviews creating a preliminary multi-variable cost model for the contract costs of making a space telescope. There is discussion of the methodology for collecting the data, definition of the statistical analysis methodology, single-variable model results, testing of historical models and an introduction of the multi-variable models.
Energy Technology Data Exchange (ETDEWEB)
Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division
1998-03-01
The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with {sup 14}C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for {sup 14}C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)
International Nuclear Information System (INIS)
Lawson, E.M.
1998-01-01
The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)
Antares: preliminary demonstrator results
International Nuclear Information System (INIS)
Kouchner, A.
2000-05-01
The ANTARES collaboration is building an undersea neutrino telescope off Toulon (Mediterranean Sea) with an effective area of ∼0.1 km². An extensive study of the site properties has been carried out, together with software analysis, in order to optimize the performance of the detector. The results are summarized here. An instrumented line, linked to shore for the first time via an electro-optical cable, was immersed in late 1999. The preliminary results of this demonstrator line are reported. (author)
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at the JINR in the field of statistical mechanics is presented, beginning with the fundamental works by N.N. Bogolyubov on the microscopic theory of superconductivity. The ideas introduced in these works, and the methods developed in them, have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review is given of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times. "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Mineral statistics yearbook 1994
International Nuclear Information System (INIS)
1994-01-01
A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals such as potash, sodium sulphate, salt and uranium, was provided in a wide variety of tables. Production statistics and the disposition and value of sales of industrial and metallic minerals were also made available. Statistical data on the drilling of oil and gas reservoirs and on crown land disposition were also included. figs., tabs
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be
Licensing Support System: Preliminary data scope analysis
International Nuclear Information System (INIS)
1989-01-01
The purpose of this analysis is to determine the content and scope of the Licensing Support System (LSS) data base. Both user needs and currently available data bases that, at least in part, address those needs have been analyzed. This analysis, together with the Preliminary Needs Analysis (DOE, 1988d) is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. These reports are preliminary. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This document provides a baseline for what is known at this time. Additional analyses, currently being conducted, will provide more precise information on the content and scope of the LSS data base. 23 refs., 4 figs., 8 tabs
Safety performance of preliminary KALIMER conceptual design
Energy Technology Data Exchange (ETDEWEB)
Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong [Korea atomic Energy Resarch Inst., Taejon (Korea)
1999-07-01
The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium-cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion, the gas expansion module, and the thermal-hydraulic model for the reactor pool, and the results of preliminary analyses for unprotected loss of flow and loss of heat sink. (author)
Safety performance of preliminary KALIMER conceptual design
International Nuclear Information System (INIS)
Hahn Dohee; Kim Kyoungdoo; Kwon Youngmin; Chang Wonpyo; Suk Soodong
1999-01-01
The Korea Atomic Energy Research Institute (KAERI) is developing KALIMER (Korea Advanced Liquid Metal Reactor), which is a sodium-cooled, 150 MWe pool-type reactor. The safety design of KALIMER emphasizes accident prevention by using passive processes, which can be accomplished by the safety design objectives including the utilization of inherent safety features. In order to assess the effectiveness of the inherent safety features in achieving the safety design objectives, a preliminary evaluation of ATWS performance for the KALIMER design has been performed with the SSC-K code, which is a modified version of the SSC-L code. KAERI's modification of the code includes development of reactivity feedback models for the core and a pool model for the KALIMER reactor vessel. This paper describes the models for control rod driveline expansion, the gas expansion module, and the thermal-hydraulic model for the reactor pool, and the results of preliminary analyses for unprotected loss of flow and loss of heat sink. (author)
Cancer Data and Statistics Tools
United States Cancer Statistics: Data Visualizations ...
International Nuclear Information System (INIS)
Takeda, Tatsuoki
1985-01-01
In this article, analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue of tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. Next, we describe the nonlinear MHD instabilities related to disruption phenomena. Lastly, we describe vectorization of the MHD codes. The MHD codes for fusion plasma analyses are relatively simple though very time-consuming; the parts of the codes which consume most of the CPU time are concentrated in a small portion of the code, and the codes are usually used by their own developers. This makes it comparatively easy to attain a high performance ratio on a vector processor. (author)
Reproducible statistical analysis with multiple languages
DEFF Research Database (Denmark)
Lenth, Russell; Højsgaard, Søren
2011-01-01
This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use the two together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.
Preliminary cost estimating for the nuclear industry
International Nuclear Information System (INIS)
Klumpar, I.V.; Soltz, K.M.
1985-01-01
The nuclear industry has higher costs for personnel, equipment, construction, and engineering than conventional industry, which means that cost estimation procedures may need adjustment. The authors account for the special technical and labor requirements of the nuclear industry in making adjustments to equipment and installation cost estimations. Using illustrative examples, they show that conventional methods of preliminary cost estimation are flexible enough for application to emerging industries if their cost structure is similar to that of the process industries. If not, modifications can provide enough engineering and cost data for a statistical analysis. 9 references, 14 figures, 4 tables
Uncertainty Analyses and Strategy
International Nuclear Information System (INIS)
Kevin Coppersmith
2001-01-01
The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways, using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total-system-level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository...
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
Department of Veterans Affairs — National-level, VISN-level, and/or VAMC-level statistics on the numbers and percentages of users of VHA care from the Northeast Program Evaluation Center (NEPEC)....
International Nuclear Information System (INIS)
Hilaire, S.
2001-01-01
A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)
Plague in the United States: Plague was first introduced ... them at higher risk. Reported Cases of Human Plague - United States, 1970-2016. Since the mid-20th ...
Statistical Measures of Marksmanship
National Research Council Canada - National Science Library
Johnson, Richard
2001-01-01
This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...
Titanic: A Statistical Exploration.
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
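A chi-square exploration of categorical survival data of the kind this activity describes can be sketched as follows; the 2x2 counts are approximate illustrative figures, not an exact passenger manifest:

```python
import numpy as np

# Illustrative 2x2 survival-by-sex table (approximate counts for
# demonstration only, not an exact passenger manifest).
#                      survived   died
observed = np.array([[339,       125],    # female
                     [161,      1022]])   # male

# Expected counts under independence, then Pearson's chi-square statistic.
row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row * col / observed.sum()

chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi-square = {chi2:.1f}, degrees of freedom = {dof}")
```

Comparing the statistic against the chi-square distribution with one degree of freedom (e.g. via a table or `scipy.stats.chi2`) shows the survival-sex association is far beyond any conventional significance threshold.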
Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
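The central-limit-theorem argument mentioned above can be illustrated numerically: superposing many waves with independent random phases drives the in-phase field component toward a zero-mean Gaussian. The wave count and sample size below are arbitrary illustrative choices, not parameters from the review:

```python
import numpy as np

rng = np.random.default_rng(1)

# Superpose many unit-amplitude contributions with independent uniform
# phases; by the central limit theorem the in-phase (real) part of the
# total field tends to a zero-mean Gaussian with variance 1/2.
n_waves, n_samples = 200, 5000
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_waves))
field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_waves)
re = field.real

excess_kurtosis = (re**4).mean() / (re**2).mean() ** 2 - 3.0
print("mean (should be near 0):      ", re.mean())
print("variance (should be near 0.5):", re.var())
print("excess kurtosis (near 0):     ", excess_kurtosis)
```

The near-zero excess kurtosis is the Gaussian signature; the field magnitude correspondingly follows a Rayleigh distribution.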
Davison, Anthony C.; Huser, Raphaël
2015-01-01
Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
CSIR Research Space (South Africa)
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaël; Sun, Ying; Vettori, Sabrina
2015-01-01
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
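The counting exercise behind the three distributions can be sketched directly; the state and particle numbers are arbitrary illustrative choices:

```python
from math import comb

def maxwell_boltzmann(g, n):
    # Distinguishable particles, no occupancy limit: each of the n
    # particles independently picks one of g states.
    return g ** n

def bose_einstein(g, n):
    # Indistinguishable particles, no occupancy limit ("stars and bars").
    return comb(n + g - 1, n)

def fermi_dirac(g, n):
    # Indistinguishable particles, at most one per state.
    return comb(g, n)

g, n = 4, 2   # 4 single-particle states, 2 particles (arbitrary choice)
print(maxwell_boltzmann(g, n), bose_einstein(g, n), fermi_dirac(g, n))  # 16 10 6
```

The same sorting-game counts scale to the thermodynamic limit, where maximizing the number of arrangements yields the familiar occupation-number formulas.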
Illinois travel statistics, 2008
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2009
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...
Cholesterol Facts and Statistics
High Cholesterol Facts and Statistics: Deo R, et al. Heart disease and stroke statistics—2017 update: a report from the American Heart ...
Illinois travel statistics, 2010
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...