WorldWideScience

Sample records for high statistics high

  1. Statistics for High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The lectures emphasize the frequentist approach used for the Dark Matter search and for the Higgs search, discovery, and measurement of its properties. Emphasis is placed on hypothesis testing using the asymptotic-formulae formalism and its derivation, and on the derivation of the trial-factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameter Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: the CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: [Eur. Phys. J. C 73...
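
The asymptotic formalism referenced above admits a compact closed form in the common counting-experiment case. As an illustrative sketch (not part of the record): for a Poisson count n with known expected background b, the discovery test statistic q0 of Cowan, Cranmer, Gross and Vitells gives a significance Z = sqrt(q0). A minimal Python version, with the function name chosen here for illustration:

```python
import math

def discovery_significance(n_obs, b):
    """Asymptotic discovery significance Z = sqrt(q0) for a Poisson
    counting experiment with known background b.

    q0 = 2 * (n * ln(n/b) - (n - b)) for n > b, and 0 otherwise,
    following the asymptotic formulae of Cowan et al. (2011)."""
    if n_obs <= b:
        return 0.0
    q0 = 2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b))
    return math.sqrt(q0)
```

For n = 15 observed over b = 10 expected, this gives Z of roughly 1.47, well below a discovery threshold.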

  2. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

    This thesis studies the performance of statistical learning methods in high-energy physics and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by a lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available from which an algorithm could be built in the classical way. In the second case, a classical algorithm exists but is too slow to meet the time restrictions; it is therefore replaced by a pattern-recognition machine that implements a fast statistical learning method. Even in applications where a classical algorithm had done a good job, statistical learning methods have proven convincing through their remarkable performance. This thesis gives an introduction to statistical learning methods and to their correct application in physics analysis. Their flexibility and high performance are demonstrated with results from high-energy physics and astrophysics, including the development of highly efficient triggers, powerful purification of event samples, and exact reconstruction of hidden event parameters. The presented studies also reveal typical problems in the application of statistical learning methods: they should remain the second choice wherever an algorithm based on prior knowledge exists. Some physics analyses are found in which these methods are not used correctly, leading either to wrong predictions or to poor performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot

  3. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

    This thesis studies the performance of statistical learning methods in high-energy physics and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by a lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available from which an algorithm could be built in the classical way. In the second case, a classical algorithm exists but is too slow to meet the time restrictions; it is therefore replaced by a pattern-recognition machine that implements a fast statistical learning method. Even in applications where a classical algorithm had done a good job, statistical learning methods have proven convincing through their remarkable performance. This thesis gives an introduction to statistical learning methods and to their correct application in physics analysis. Their flexibility and high performance are demonstrated with results from high-energy physics and astrophysics, including the development of highly efficient triggers, powerful purification of event samples, and exact reconstruction of hidden event parameters. The presented studies also reveal typical problems in the application of statistical learning methods: they should remain the second choice wherever an algorithm based on prior knowledge exists. Some physics analyses are found in which these methods are not used correctly, leading either to wrong predictions or to poor performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a

  4. High impact = high statistical standards? Not necessarily so.

    Science.gov (United States)

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with somewhat lower impact factors that have adopted editorial policies to reduce the impact of the limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles on psychological, neuropsychological, and medical issues published in 2011 in four journals with high impact factors (Science, Nature, The New England Journal of Medicine, and The Lancet) and three journals with relatively lower impact factors (Neuropsychology, Journal of Experimental Psychology: Applied, and the American Journal of Public Health). The results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect sizes, prospective power, or model estimation is the prevalent statistical practice in articles published in Nature (89%), followed by articles published in Science (42%). By contrast, in all the other journals, both high and lower impact, most articles report confidence intervals and/or effect-size measures. We interpret these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means of improving statistical practice in journals with high or low impact factors.

  5. High Impact = High Statistical Standards? Not Necessarily So

    Science.gov (United States)

    Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with somewhat lower impact factors that have adopted editorial policies to reduce the impact of the limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles on psychological, neuropsychological, and medical issues published in 2011 in four journals with high impact factors (Science, Nature, The New England Journal of Medicine, and The Lancet) and three journals with relatively lower impact factors (Neuropsychology, Journal of Experimental Psychology: Applied, and the American Journal of Public Health). The results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect sizes, prospective power, or model estimation is the prevalent statistical practice in articles published in Nature (89%), followed by articles published in Science (42%). By contrast, in all the other journals, both high and lower impact, most articles report confidence intervals and/or effect-size measures. We interpret these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means of improving statistical practice in journals with high or low impact factors. PMID:23418533

  6. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise. Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for ha

  7. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We provide a brief introduction to nonadditive entropies (characterized by indices such as q), which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy, and to the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes, and others. In addition, we clarify and illustrate the neat distinction between Lévy distributions and q-exponential ones, a point which occasionally causes confusion in the literature, particularly in the LHC literature.
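
The q-exponential mentioned in the record, e_q(x) = [1 + (1 − q)x]^(1/(1−q)) (with the usual convention that it vanishes when the bracket is non-positive), reduces to the ordinary exponential as q → 1. A short sketch illustrating the limit (this code is not from the article; the name q_exp is ours):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]**(1/(1-q)), with the
    cutoff convention that it is 0 when the bracket is <= 0.
    Reduces to exp(x) in the q -> 1 limit."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```

For q close to 1, q_exp(x, q) approaches math.exp(x), while for q > 1 it develops the heavy power-law tail used to fit transverse-momentum spectra.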

  8. A Statistical Perspective on Highly Accelerated Testing

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Edward V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning
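
The quantitative "demonstration" claim discussed above can be made concrete with the standard zero-failure (success-run) bound: if n units survive a test with no failures, the lower 100·C% confidence bound on per-unit reliability is R = (1 − C)^(1/n). A sketch (the formula is standard; applying it to a HALT stress level additionally assumes a known stress-to-use relationship, which is exactly the lack-of-knowledge issue the report raises):

```python
def reliability_lower_bound(n_units, confidence=0.90, failures=0):
    """Lower confidence bound on reliability from a zero-failure
    ("success run") test: R = (1 - C)**(1/n)."""
    assert failures == 0, "formula applies to zero-failure tests only"
    return (1.0 - confidence) ** (1.0 / n_units)
```

With only 3 units at 90% confidence the bound is about 0.46, illustrating how little a minimal-sample test can actually demonstrate.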

  9. Statistical evidences of absorption at high latitudes

    International Nuclear Information System (INIS)

    Fesenko, B.I.

    1980-01-01

    Evidence is considered that indicates a significant effect of irregular interstellar absorption at high galactic latitudes b. The number density of faint galaxies grows with increasing |b|, even at values of |b| exceeding 50 deg. The effects of the interstellar medium are traced even in the directions of stars and globular clusters with very low colour excesses. A coefficient of absorption, A_B = 0.29 ± 0.05, was estimated from the colours of the bright E-galaxies.

  10. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object-recognition accuracy, as well as to influence the distribution of eye movements and patterns of brain activation. However, the relationships between objects and their scene environments have not yet been systematically quantified. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three levels of analysis: ensemble statistics, which describe the density and spatial distribution of unnamed "things" in the scene; the bag-of-words level, where scenes are described by the list of objects contained within them; and the structural level, where the spatial distribution of, and relationships between, the objects are measured. The utility of each level of description for scene categorization was assessed with linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature) and also best explained human patterns of categorization errors. Although a bag-of-words classifier performed similarly to human observers, it had a markedly different pattern of errors. Certain objects are more useful than others, however, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics
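
The bag-of-words level described above can be sketched with a toy nearest-centroid classifier over object-count vectors (an illustrative reconstruction, not the linear classifiers used in the study; the scene categories and object names below are made up):

```python
from collections import Counter

def centroid(scenes):
    """Average object-count vector over a list of scenes (Counters)."""
    total = Counter()
    for s in scenes:
        total.update(s)
    return {obj: cnt / len(scenes) for obj, cnt in total.items()}

def classify(scene, centroids):
    """Assign a scene to the category whose centroid is nearest
    (squared Euclidean distance over the union of object vocabularies)."""
    def dist(c):
        keys = set(scene) | set(c)
        return sum((scene.get(k, 0) - c.get(k, 0)) ** 2 for k in keys)
    return min(centroids, key=lambda cat: dist(centroids[cat]))
```

A scene is thus reduced to the list of objects it contains, discarding all spatial structure, which is exactly what makes the bag-of-words level cheap but blind to object arrangement.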

  11. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy and astrophysics analysis. After a short motivation for statistical learning, we present the most popular algorithms and discuss several examples from current research in particle physics and astrophysics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  12. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy and astrophysics analysis. After a short motivation for statistical learning, we present the most popular algorithms and discuss several examples from current research in particle physics and astrophysics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  13. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications. Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-sample approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  14. High energy behaviour of particles and unified statistics

    International Nuclear Information System (INIS)

    Chang, Y.

    1984-01-01

    Theories and experiments suggest that particles at high energy appear to obey a new statistics unifying Bose-Einstein and Fermi-Dirac statistics via the gamma distribution. This hypothesis can be obtained from many models and agrees quantitatively with scaling, the multiplicity, large transverse momentum, the mass spectrum, and other data. It may be applied to scattering at high energy, and agrees with experiments and known QED results. The Veneziano model and other theories have implied new statistics, such as the beta distribution and the Polya distribution, which revert to the gamma distribution at high energy. The possible inapplicability of Pauli's exclusion principle within the unified statistics is considered and related to the quark constituents.

  15. Statistical behavior of high doses in medical radiodiagnosis

    International Nuclear Information System (INIS)

    Barboza, Adriana Elisa

    2014-01-01

    The main purpose of this work is to estimate statistically the occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) in diagnostic radiology across different Brazilian states were evaluated. The data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose management sector database of IRD/CNEN-RJ, Brazil. Identifying these states allows the responsible Sanitary Surveillance (VISA) agency to become aware of such events and to work on programs to reduce them. (author)

  16. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of the literature, earlier studies, and analyses and observations of high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework offers a complete and coherent account of statistical reasoning. A statistical reasoning assessment tool was then constructed from the initial framework and administered to ten tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in a second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, confirming the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  17. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants fluctuate randomly with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with the values of the obtained cumulants: in general, the larger the cumulant we measure, the larger the statistical uncertainty that is estimated. The error-weighted averaged cumulants are therefore dependent on the statistics. Despite this effect, however, it is found that the three-sigma rule of thumb is still applicable when the statistics are above one million events. Supported by NSFC (11405088, 11521064, 11647093), the Major State Basic Research Development Program of China (2014CB845402) and the Ministry of Science and Technology (MoST) (2016YFE0104800)
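
The interplay between a measured cumulant and its estimated uncertainty can be explored with a bootstrap, a common way to estimate such uncertainties (this is a generic sketch for the fourth-order cumulant κ4 = m4 − 3·m2², not the paper's exact error-estimation procedure):

```python
import random
import statistics

def fourth_cumulant(sample):
    """Fourth-order cumulant kappa_4 = m4 - 3*m2**2, where m2 and m4
    are the second and fourth central moments of the sample."""
    mu = statistics.fmean(sample)
    m2 = statistics.fmean([(x - mu) ** 2 for x in sample])
    m4 = statistics.fmean([(x - mu) ** 4 for x in sample])
    return m4 - 3.0 * m2 ** 2

def bootstrap_error(sample, n_boot=200, seed=42):
    """Statistical uncertainty of kappa_4 estimated as the standard
    deviation of the cumulant over bootstrap resamples."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resample = rng.choices(sample, k=len(sample))
        reps.append(fourth_cumulant(resample))
    return statistics.stdev(reps)
```

For Gaussian data kappa_4 is zero in expectation, so the bootstrap spread directly visualizes how a finite event sample makes the measured cumulant and its quoted error fluctuate together.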

  18. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  19. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  20. High-Throughput Nanoindentation for Statistical and Spatial Property Determination

    Science.gov (United States)

    Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.

    2018-04-01

    Standard nanoindentation tests are "high throughput" compared with nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be improved significantly. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. Care must be taken, however, to avoid systematic errors in the measurement, including the choice of indentation depth and spacing to avoid overlap of plastic zones, pileup, and the influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain-rate sensitivity must also be considered. A review of these effects is given, with emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique are presented, including the mapping of welds, microstructures, and composites with varying length scales, along with a study of the effect of surface roughness on nominally homogeneous specimens.
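
The depth/spacing constraint mentioned above is often handled with a rule of thumb of roughly 10 times the maximum indent depth between neighboring indents, to keep plastic zones from overlapping. A hypothetical planning helper (the factor of 10 is an assumption and is material dependent; the function name is ours):

```python
def grid_plan(map_width_um, map_height_um, max_depth_nm, spacing_factor=10):
    """Plan an indent grid for property mapping.

    Spacing of ~10x the maximum depth is a common rule of thumb to
    avoid plastic-zone overlap (assumption; material dependent).
    Returns (nx, ny, total number of indents)."""
    spacing_um = spacing_factor * max_depth_nm / 1000.0  # nm -> um
    nx = int(map_width_um // spacing_um) + 1
    ny = int(map_height_um // spacing_um) + 1
    return nx, ny, nx * ny
```

A 100 µm × 100 µm map at 200 nm maximum depth then allows a 51 × 51 grid (2601 indents), which is exactly the regime where the higher testing rates discussed in the record pay off.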

  1. Topics in statistical data analysis for high-energy physics

    International Nuclear Information System (INIS)

    Cowan, G.

    2011-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability to cover not only the frequency of repeatable outcomes but also a degree of belief. In this way we are able to associate a probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular, we look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
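
The Bayesian idea sketched above, attaching a probability directly to a hypothesis, can be illustrated with a two-hypothesis Poisson counting example (illustrative only, not from the lectures; the 50/50 prior is an arbitrary assumption):

```python
import math

def posterior_signal(n_obs, b, s, prior_s=0.5):
    """Posterior probability of the signal-plus-background hypothesis
    (mean s + b) versus background-only (mean b), given an observed
    Poisson count n_obs and a prior probability prior_s for signal."""
    def poisson(n, lam):
        return math.exp(-lam) * lam ** n / math.factorial(n)
    l_sb = poisson(n_obs, s + b) * prior_s
    l_b = poisson(n_obs, b) * (1.0 - prior_s)
    return l_sb / (l_sb + l_b)
```

Observing n = 10 with b = 5 and s = 5 yields a posterior probability of about 0.87 for signal plus background, a statement about the hypothesis itself that frequentist methods cannot make directly.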

  2. Statistical approach for calculating opacities of high-Z plasmas

    International Nuclear Information System (INIS)

    Nishikawa, Takeshi; Nakamura, Shinji; Takabe, Hideaki; Mima, Kunioki

    1992-01-01

    For simulating the X-ray radiation from laser-produced high-Z plasmas, an appropriate atomic model is necessary. Based on the average-ion model, we have used a rather simple atomic model for the opacity calculation in a hydrodynamic code and obtained fairly good agreement with experiment on the X-ray spectra from laser-produced plasmas. We have investigated the accuracy of the atomic model used in the hydrodynamic code. It is found that the transition energies of 4p-4d, 4d-4f, 4p-5d, 4d-5f and 4f-5g, which are important in laser-produced high-Z plasmas, can be given within 15% of the values from a Hartree-Fock-Slater (HFS) calculation, while the oscillator strengths obtained by the HFS calculation vary by a factor of two depending on the charge state. We also propose a statistical method to carry out detailed configuration accounting for the electronic states, using the bound-electron populations calculated with the average-ion model. The statistical method is relatively simple and provides much improvement in calculating the spectral opacities of line radiation when the average-ion model is used to determine the electronic state. (author)

  3. Statistical mechanics of high-density bond percolation

    Science.gov (United States)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are the compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the structure of classical clusters, for which the κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks, ranging from the magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to the study of HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdős-Rényi graph. The application of the method to Euclidean lattices is also discussed.
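
The nested κ-core structure described above can be illustrated with the standard iterative-pruning construction of a k-core (a sketch of the combinatorial object only, not of the Potts-Ising generating-function method; the graph is given as an adjacency dict):

```python
def k_core(adj, k):
    """Return the vertex set of the k-core of a graph: iteratively
    strip vertices with fewer than k neighbours among the survivors,
    until every remaining vertex has at least k remaining neighbours."""
    alive = set(adj)
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            if sum(1 for u in adj[v] if u in alive) < k:
                alive.discard(v)
                changed = True
    return alive
```

Increasing k strips away more of the graph, so the k-cores form the nested sequence that the κ-type HD percolations probe.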

  4. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    Statistical inference for sparse signals or low-rank matrices in high-dimensional settings is of significant interest in a range of contemporary applications. It has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. In this thesis, we consider several problems in including sparse signal recovery (compressed sensing under restricted isometry) and low-rank matrix recovery (matrix recovery via rank-one projections and structured matrix completion). The first part of the thesis discusses compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and low-rank matrix recovery. The analysis relies on a key technical tool which represents points in a polytope by convex combinations of sparse vectors. The technique is elementary while leads to sharp results. It is shown that, in compressed sensing, delta kA 0, delta kA < 1/3 + epsilon, deltak A + thetak,kA < 1 + epsilon, or deltatkA< √(t - 1) / t + epsilon are not sufficient to guarantee the exact recovery of all k-sparse signals for large k. Similar result also holds for matrix recovery. In addition, the conditions delta kA<1/3, deltak A+ thetak,kA<1, delta tkA < √(t - 1)/t and deltarM<1/3, delta rM+ thetar,rM<1, delta trM< √(t - 1)/ t are also shown to be sufficient respectively for stable recovery of approximately sparse signals and low-rank matrices in the noisy case. For the second part of the thesis, we introduce a rank-one projection model for low-rank matrix recovery and propose a constrained nuclear norm minimization method for stable recovery of low-rank matrices in the noisy case. The procedure is adaptive to the rank and robust against small perturbations. Both upper and lower bounds for the estimation accuracy under the Frobenius norm loss are obtained. The proposed estimator is shown to be rate-optimal under certain conditions. The

  5. A High School Statistics Class Investigates the Death Penalty

    Science.gov (United States)

    Brelias, Anastasia

    2015-01-01

    Recommendations for reforming high school mathematics curricula emphasize the importance of engaging students in mathematical investigations of societal issues (CCSSI [Common Core State Standards Initiative] 2010; NCTM [National Council of Teachers of Mathematics] 2000). Proponents argue that these investigations can positively influence students'…

  6. Statistical mechanics of flux lines in high-temperature superconductors

    International Nuclear Information System (INIS)

    Dasgupta, C.

    1992-01-01

The shortness of the low-temperature coherence lengths of high-T_c materials leads to new mechanisms of pinning of flux lines. Lattice-periodic modulation of the order parameter itself acts to pin vortex lines in regions of the unit cell where the order parameter is small. A presentation of flux creep and flux noise at low temperature and magnetic field in terms of the motion of simple metastable defects on flux lines is made, together with a calculation of flux lattice melting. 12 refs

  7. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  8. Statistical and direct decay of high-lying single-particle excitations

    International Nuclear Information System (INIS)

    Gales, S.

    1993-01-01

Transfer reactions induced by hadronic probes at intermediate energies have revealed a rich spectrum of high-lying excitations embedded in the nuclear continuum. The investigation of their decay properties is believed to be a severe test of their microscopic structure as predicted by nuclear models. In addition, the degree of damping of these simple modes in the nuclear continuum can be obtained by means of the measured particle (n, p) decay branching ratios. The neutron and proton decay studies of high-lying single-particle states in heavy nuclei are presented. (author). 13 refs., 9 figs

  9. Statistical emission of complex fragments from highly excited compound nucleus

    International Nuclear Information System (INIS)

    Matsuse, T.

    1991-01-01

A full statistical analysis has been given in terms of the Extended Hauser-Feshbach method. The charge and kinetic energy distributions of the 35Cl + 12C reaction at E_lab = 180, 200 MeV and the 23Na + 24Mg reaction at E_lab = 89 MeV, which form the 47V compound nucleus, are investigated as a prototype of the light-mass system. The measured kinetic energy distributions of the complex fragments are shown to be well reproduced by the Extended Hauser-Feshbach method, so the observed complex fragment production is understood as statistical binary decay from the compound nucleus induced by the heavy-ion reaction. Next, this method is applied to the study of complex fragment production from the 111In compound nucleus, which is formed by the 84Kr + 27Al reaction at E_lab = 890 MeV. (K.A.) 18 refs., 10 figs

  10. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

In the last few decades, the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations…

  11. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

Full Text Available Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, the second leading cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher-order statistics have been an extensive field of research, as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third-moment skewness and fourth-moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered, and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted, and the skewness and kurtosis signals exhibit peaks corresponding to these emboli. Applied to real signals, the detection of microemboli through the skewness and kurtosis signals outperformed detection through standard methods. The sensitivities and specificities reached 78% and 91% for the skewness detector and 80% and 90% for the kurtosis detector, respectively.
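The windowed skewness/kurtosis detection described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the window width, threshold value, and the synthetic energy signal are all assumptions.

```python
import math
import random

def moments(window):
    """Sample skewness and excess kurtosis of one window of energy samples."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    if var == 0:
        return 0.0, 0.0
    m3 = sum((x - mean) ** 3 for x in window) / n
    m4 = sum((x - mean) ** 4 for x in window) / n
    return m3 / var ** 1.5, m4 / var ** 2 - 3.0

def detect(signal, width, threshold):
    """Flag non-overlapping windows whose skewness or kurtosis exceeds a threshold."""
    hits = []
    for start in range(0, len(signal) - width + 1, width):
        skew, kurt = moments(signal[start:start + width])
        if skew > threshold or kurt > threshold:
            hits.append(start)
    return hits

random.seed(1)
energy = [random.gauss(10.0, 1.0) for _ in range(400)]
energy[250] += 25.0   # one high-energy transient, as an embolus would produce
print(detect(energy, 50, 2.5))
```

In this toy signal only the window containing the simulated transient distorts the energy distribution enough to produce skewness and kurtosis peaks above the threshold; embolus-free windows stay near zero, mirroring the behaviour the abstract describes.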

  12. High statistics inclusive phi-meson production at SPS energies

    International Nuclear Information System (INIS)

    Dijkstra, H.B.

    1985-01-01

This thesis describes an experiment studying the inclusive reaction hadron + Be → phi + anything → K+ + K− + anything in 100 GeV/c, 120 GeV/c and 200 GeV/c hadron interactions. A total of 8×10^6 events were recorded using both positively and negatively charged unseparated hadron beams supplied by the CERN SPS. The experiment made use of an intelligent on-line event selection system based on micro-processors (FAMPs) in conjunction with a system of large MWPCs to increase the number of phi-events recorded per unit time. In 32 days of data taking, over 600,000 phi-mesons were recorded onto magnetic tape. The physics motivation for collecting a large-statistics sample of inclusive phi-mesons was the investigation of the inclusive phi-meson production mechanism and phi-spectroscopy. (Auth.)

  13. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb^-1 recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test the homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)

  14. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basal element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest calculating efficiency.
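The variance advantage of a weighted estimator over direct analog Monte Carlo can be illustrated on the simplest reliability model, a single component with an exponentially distributed lifetime. This is a generic importance-sampling sketch, not the paper's estimator; the failure rate, horizon, and biasing rate below are assumptions chosen for illustration.

```python
import math
import random

def direct_mc(lam, horizon, n, rng):
    """Direct (analog) Monte Carlo: fraction of exponential lifetimes below the horizon."""
    fails = sum(1 for _ in range(n) if rng.expovariate(lam) < horizon)
    return fails / n

def weighted_mc(lam, lam_b, horizon, n, rng):
    """Weighted estimator: draw lifetimes from a biased rate lam_b and
    reweight each failing history by the likelihood ratio f(t)/g(t)."""
    total = 0.0
    for _ in range(n):
        t = rng.expovariate(lam_b)
        if t < horizon:
            # ratio of the true exponential density to the biased one at t
            total += (lam / lam_b) * math.exp((lam_b - lam) * t)
    return total / n

rng = random.Random(7)
lam, horizon = 1e-4, 1.0          # true unreliability ~ 1e-4 (illustrative values)
exact = 1 - math.exp(-lam * horizon)
print(exact, direct_mc(lam, horizon, 10_000, rng), weighted_mc(lam, 1.0, horizon, 10_000, rng))
```

With the true unreliability near 1e-4, the direct estimator sees on average only one failure in 10,000 histories, while every biased history contributes a small, nearly constant weight; this is why the weighted estimate is far more stable, echoing the variance comparison in the abstract.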

  15. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    International Nuclear Information System (INIS)

    Adams, T.; Batra, P.; Bugel, Leonard G.; Camilleri, Leslie Loris; Conrad, Janet Marie; Fisher, Peter H.; Formaggio, Joseph Angelo; Karagiorgi, Georgia S.; )

    2009-01-01

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass) to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parameterized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint of 'Beyond the Standard Model' physics

  16. Eulerian and Lagrangian statistics from high resolution numerical simulations of weakly compressible turbulence

    NARCIS (Netherlands)

    Benzi, R.; Biferale, L.; Fisher, R.T.; Lamb, D.Q.; Toschi, F.

    2009-01-01

We report a detailed study of Eulerian and Lagrangian statistics from high-resolution Direct Numerical Simulations of isotropic weakly compressible turbulence. The Reynolds number at the Taylor microscale is estimated to be around 600. Eulerian and Lagrangian statistics are evaluated over a huge data…

  17. Statistical issues in searches for new phenomena in High Energy Physics

    Science.gov (United States)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.

  18. High performance statistical computing with parallel R: applications to biology and climate modelling

    International Nuclear Information System (INIS)

    Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth

    2006-01-01

Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable high-performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines.
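The scatter/compute/reduce pattern behind Parallel R's apply-style routines can be mimicked in a few lines of Python. This sketch is only an analogy, with an assumed thread pool, chunk size, and synthetic data; Parallel R itself parallelizes R routines on HPC architectures.

```python
from concurrent.futures import ThreadPoolExecutor
import statistics

def summarize(chunk):
    """Per-chunk statistical kernel: the kind of routine distributed to workers."""
    return len(chunk), statistics.fmean(chunk), statistics.pstdev(chunk)

data = [float(i % 97) for i in range(100_000)]
chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]

# scatter the chunks to workers, compute local summaries in parallel
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(summarize, chunks))

# reduce: combine the partial results into a global mean
total_n = sum(n for n, _, _ in partials)
global_mean = sum(n * m for n, m, _ in partials) / total_n
print(global_mean)
```

The point of the reduction step is that the global statistic is recovered exactly from per-chunk summaries, so workers never need to share the raw data.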

  19. Statistical behavior of high doses in medical radiodiagnosis; Comportamento estatistico das altas doses em radiodiagnostico medico

    Energy Technology Data Exchange (ETDEWEB)

    Barboza, Adriana Elisa, E-mail: adrianaebarboza@gmail.com, E-mail: elisa@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose management sector database of IRD/CNEN-RJ, Brazil. The identification of these states allows the responsible Sanitary Surveillance (VISA) to become aware of the events and to work with programs to reduce them. (author)

  20. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    Science.gov (United States)

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed-rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
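The paper's distance-measure statistic is not reproduced here, but the flavor of a rank-based paired test can be shown with a plain Wilcoxon-style signed-rank statistic. The expression data below are invented, and this simplified sketch assigns consecutive ranks to ties rather than averaging them.

```python
def signed_rank_statistic(x, y):
    """Wilcoxon-style signed-rank statistic for paired samples:
    rank the absolute differences |x_i - y_i| (zeros dropped),
    then sum the ranks of the positive differences."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    w_plus = 0.0
    for rank, i in enumerate(order, start=1):
        if diffs[i] > 0:
            w_plus += rank
    return w_plus

control = [2.1, 2.0, 1.9, 2.2, 2.0, 2.1]
treated = [2.8, 2.9, 2.5, 3.0, 2.7, 50.0]   # one extreme expression value
print(signed_rank_statistic(treated, control))
```

Because only ranks enter the statistic, the extreme value 50.0 contributes no more than the largest rank, which is exactly the robustness against outliers that the abstract emphasizes.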

  1. Statistics of high-altitude and high-latitude O+ ion outflows observed by Cluster/CIS

    Directory of Open Access Journals (Sweden)

    A. Korth

    2005-07-01

Full Text Available The persistent outflows of O+ ions observed by the Cluster CIS/CODIF instrument were studied statistically in the high-altitude (from 3 up to 11 R_E) and high-latitude (from 70 to ~90 deg invariant latitude, ILAT) polar region. The principal results are: (1) outflowing O+ ions with more than 1 keV are observed above 10 R_E geocentric distance and above 85 deg ILAT; (2) at 6-8 R_E geocentric distance, the latitudinal distribution of O+ ion outflow is consistent with velocity-filter dispersion from a source equatorward of and below the spacecraft (e.g. the cusp/cleft); (3) however, at 8-12 R_E geocentric distance the distribution of O+ outflows cannot be explained by a velocity filter only. The results suggest that additional energization or acceleration processes for outflowing O+ ions occur at high altitudes and high latitudes in the dayside polar region. Keywords. Magnetospheric physics (Magnetospheric configuration and dynamics, Solar wind-magnetosphere interactions

  2. Statistical modeling in phenomenological description of electromagnetic cascade processes produced by high-energy gamma quanta

    International Nuclear Information System (INIS)

    Slowinski, B.

    1987-01-01

    A description of a simple phenomenological model of electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents is given. Within this model spatial structure and fluctuations of ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)

  3. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive…

  4. New method for eliminating the statistical bias in highly turbulent flow measurements

    International Nuclear Information System (INIS)

    Nakao, S.I.; Terao, Y.; Hirata, K.I.; Kitakyushu Industrial Research Institute, Fukuoka, Japan)

    1987-01-01

A simple method was developed for eliminating statistical bias which can be applied to highly turbulent flows with sparse and nonuniform seeding conditions. Unlike the methods proposed so far, a weighting function was determined based on the idea that the statistical bias could be eliminated if the asymmetric form of the probability density function of the velocity data were corrected. Moreover, data more than three standard deviations away from the mean were discarded to remove the apparent turbulent intensity resulting from noise. The present method was applied to data obtained in the wake of a block, which provided local turbulent intensities up to about 120 percent, and it was found to eliminate the statistical bias with high accuracy. 9 references
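The three-standard-deviation discard step is easy to make concrete. The weighting-function part of the method is not reproduced here, and the synthetic velocity sample below is an assumption for illustration only.

```python
import statistics

def debiased_mean(samples, n_sigma=3.0):
    """Discard samples more than n_sigma standard deviations from the mean
    (removing noise-induced apparent turbulence), then re-average.
    Returns the trimmed mean and the number of discarded samples."""
    mu = statistics.fmean(samples)
    sd = statistics.pstdev(samples)
    kept = [v for v in samples if abs(v - mu) <= n_sigma * sd]
    return statistics.fmean(kept), len(samples) - len(kept)

# plausible velocity data near 10 m/s plus a single noise spike
velocities = [10.0 + 0.1 * ((i * 7) % 11 - 5) for i in range(200)] + [95.0]
mean, dropped = debiased_mean(velocities)
print(mean, dropped)
```

A single spurious sample at 95 m/s inflates both the raw mean and the apparent turbulent intensity; the discard step removes only that point and restores a mean near the true 10 m/s.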

  5. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  6. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya [Jadavpur Univ., Calcutta (India)

    1994-07-01

Analysis of target-fragmented ''black'' particles in nuclear emulsion from high energy relativistic interactions initiated by 16O at 2.1 GeV/nucleon and by 12C and 24Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or the non-statistical fluctuations, while found to be independent of projectile mass or incident energy, is dependent on the excitation energy of the target nucleus. (Author).

  7. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya

    1994-01-01

Analysis of target-fragmented ''black'' particles in nuclear emulsion from high energy relativistic interactions initiated by 16O at 2.1 GeV/nucleon and by 12C and 24Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or the non-statistical fluctuations, while found to be independent of projectile mass or incident energy, is dependent on the excitation energy of the target nucleus. (Author)

  8. Excel 2016 in applied statistics for high school students a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2018-01-01

    This textbook is a step-by-step guide for high school, community college, or undergraduate students who are taking a course in applied statistics and wish to learn how to use Excel to solve statistical problems. All of the statistics problems in this book will come from the following fields of study: business, education, psychology, marketing, engineering and advertising. Students will learn how to perform key statistical tests in Excel without being overwhelmed by statistical theory. Each chapter briefly explains a topic and then demonstrates how to use Excel commands and formulas to solve specific statistics problems. This book gives practice in using Excel in two different ways: (1) writing formulas (e.g., confidence interval about the mean, one-group t-test, two-group t-test, correlation) and (2) using Excel’s drop-down formula menus (e.g., simple linear regression, multiple correlations and multiple regression, and one-way ANOVA). Three practice problems are provided at the end of each chapter, along w...

  9. High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.

    Science.gov (United States)

    Algin, Abdullah; Senay, Mustafa

    2012-04-01

    An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. Also, a model to be studied for addressing such an idea could possibly provide us some new consequences about the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work, we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamical limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits the interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful to further study the particle systems with intermediate statistics.

  10. Statistical damage analysis of transverse cracking in high temperature composite laminates

    International Nuclear Information System (INIS)

    Sun Zuo; Daniel, I.M.; Luo, J.J.

    2003-01-01

High temperature polymer composites are receiving special attention because of their potential applications to high speed transport airframe structures and aircraft engine components exposed to elevated temperatures. In this study, a statistical analysis was used to study the progressive transverse cracking in a typical high temperature composite. The mechanical properties of this unidirectional laminate were first characterized both at room and high temperatures. Damage mechanisms of transverse cracking in cross-ply laminates were studied by X-ray radiography at room temperature and by an in-test photography technique at high temperature. Since the tensile strength of the unidirectional laminate in the transverse direction was found to follow a Weibull distribution, a Monte Carlo simulation technique based on experimentally obtained parameters was applied to predict transverse cracking at different temperatures. Experiments and simulation were shown to agree well both at room temperature and at 149 deg. C (the stress-free temperature) in terms of applied stress versus crack density. The probability density function (PDF) of transverse crack spacing considering the statistical strength distribution was also developed, and good agreement with simulation and experimental results is reached. Finally, a generalized master curve that predicts the normalized applied stress versus normalized crack density for various lay-ups and various temperatures was established
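The Weibull-based Monte Carlo step can be sketched as follows. The shape and scale parameters, element count, and stress levels are illustrative assumptions, not the paper's measured values.

```python
import math
import random

def simulate_crack_density(n_elements, stress_levels, sigma0, m, rng):
    """Monte Carlo sketch of progressive transverse cracking: assign each
    transverse-ply element a Weibull(sigma0, m) strength via inverse-CDF
    sampling, then count cracked elements as the applied stress is raised."""
    strengths = [sigma0 * (-math.log(1.0 - rng.random())) ** (1.0 / m)
                 for _ in range(n_elements)]
    return [sum(1 for s in strengths if s <= stress) / n_elements
            for stress in stress_levels]

rng = random.Random(42)
# illustrative stress levels in MPa with an assumed scale of 60 MPa and shape 8
densities = simulate_crack_density(5000, [20, 40, 60, 80], sigma0=60.0, m=8.0, rng=rng)
print(densities)
```

The resulting applied-stress-versus-crack-density curve is necessarily monotone: almost no elements crack well below the Weibull scale, while nearly all have cracked well above it, which is the qualitative shape the simulations in the abstract reproduce.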

  11. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number of parameters. … An example of such a feature is the generic implementation of the Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility…

  12. High-dimensional data: p >> n in mathematical statistics and bio-medical applications

    OpenAIRE

    Van De Geer, Sara A.; Van Houwelingen, Hans C.

    2004-01-01

The workshop 'High-dimensional data: p >> n in mathematical statistics and bio-medical applications' was held at the Lorentz Center in Leiden from 9 to 20 September 2002. This special issue of Bernoulli contains a selection of papers presented at that workshop. The introduction of high-throughput micro-array technology to measure gene-expression levels and the publication of the pioneering paper by Golub et al. (1999) has brought to life a whole new branch of data analysis under the name of...

  13. First high-statistics and high-resolution recoil-ion data from the WITCH retardation spectrometer

    Science.gov (United States)

    Finlay, P.; Breitenfeldt, M.; Porobić, T.; Wursten, E.; Ban, G.; Beck, M.; Couratin, C.; Fabian, X.; Fléchard, X.; Friedag, P.; Glück, F.; Herlert, A.; Knecht, A.; Kozlov, V. Y.; Liénard, E.; Soti, G.; Tandecki, M.; Traykov, E.; Van Gorp, S.; Weinheimer, Ch.; Zákoucký, D.; Severijns, N.

    2016-07-01

The first high-statistics and high-resolution data set for the integrated recoil-ion energy spectrum following the β^+ decay of 35Ar has been collected with the WITCH retardation spectrometer located at CERN-ISOLDE. Over 25 million recoil-ion events were recorded on a large-area multichannel plate (MCP) detector with a time-stamp precision of 2 ns and a position resolution of 0.1 mm, thanks to the newly upgraded data acquisition based on the LPC Caen FASTER protocol. The number of recoil ions was measured for more than 15 different settings of the retardation potential, complemented by dedicated background and half-life measurements. Previously unidentified systematic effects, including an energy-dependent efficiency of the main MCP and a radiation-induced time-dependent background, have been identified and incorporated into the analysis. However, further understanding and treatment of the radiation-induced background requires additional dedicated measurements and remains the current limiting factor in extracting a beta-neutrino angular correlation coefficient for 35Ar decay using the WITCH spectrometer.

  14. Simulation of statistical γ-spectra of highly excited rare earth nuclei

    International Nuclear Information System (INIS)

    Schiller, A.; Munos, G.; Guttormsen, M.; Bergholt, L.; Melby, E.; Rekstad, J.; Siem, S.; Tveter, T.S.

    1997-05-01

The statistical γ-spectra of highly excited even-even rare earth nuclei are simulated by applying the appropriate level density and strength function to a given nucleus. Hindrance effects due to K-conservation are taken into account. Simulations are compared to experimental data from the 163Dy(3He,α)162Dy and 173Yb(3He,α)172Yb reactions. The influence of the K quantum number at higher energies is discussed. 21 refs., 7 figs., 2 tabs

  15. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis…

  16. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was first deployed as a desktop solution and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  17. Statistics for products of traces of high powers of the frobenius class of hyperelliptic curves

    OpenAIRE

    Roditty-Gershon, Edva

    2011-01-01

We study the averages of products of traces of high powers of the Frobenius class of hyperelliptic curves of genus g over a fixed finite field. We show that for increasing genus g, the limiting expectation of these products equals the expectation when the curve varies over the unitary symplectic group USp(2g). We also consider the scaling limit of linear statistics for eigenphases of the Frobenius class of hyperelliptic curves, and show that their first few moments are Gaussian.

  18. THE STATISTICS OF RADIO ASTRONOMICAL POLARIMETRY: BRIGHT SOURCES AND HIGH TIME RESOLUTION

    International Nuclear Information System (INIS)

    Van Straten, W.

    2009-01-01

A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal-broadening phenomenon in single-pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single- and giant-pulse polarimetry typically involves sources with large flux densities and observations with high time resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self-noise is shown to fully explain the excess polarization dispersion previously noted in single-pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes parameters may be used to derive mode-separated pulse profiles without any assumptions about the intrinsic degrees of mode polarization. Finally, utilizing the small-number statistics of the Stokes parameters, it is established that the degree of polarization of an unresolved pulse is fundamentally undefined; therefore, previous claims of highly polarized giant pulses are unsubstantiated.
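For a well-resolved source, the degree of polarization at issue here is the standard function of the Stokes parameters, p = sqrt(Q^2 + U^2 + V^2) / I; the argument in the abstract is that this quantity becomes undefined for unresolved pulses, not that the formula changes. The numerical Stokes values below are arbitrary illustrations.

```python
import math

def degree_of_polarization(I, Q, U, V):
    """Degree of polarization from the Stokes parameters:
    p = sqrt(Q^2 + U^2 + V^2) / I (well defined only for I > 0)."""
    return math.sqrt(Q * Q + U * U + V * V) / I

# fully polarized, partially polarized, and unpolarized examples
print(degree_of_polarization(1.0, 0.6, 0.0, 0.8))   # ≈ 1.0
print(degree_of_polarization(2.0, 0.6, 0.0, 0.8))   # ≈ 0.5
print(degree_of_polarization(1.0, 0.0, 0.0, 0.0))   # 0.0
```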

  19. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than sounds in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
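    The narrow-versus-broad surprise effect described above can be made concrete with the Shannon surprise (negative log-probability density) of a tone under a Gaussian frequency distribution; a physically identical outlier is more surprising under the narrower distribution. The frequencies and widths below are illustrative, not the study's actual stimuli.

```python
import math

def surprise(x, mu, sigma):
    """Shannon surprise: negative log-density of x under N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

mu = 500.0                    # centre frequency in Hz (illustrative)
outlier = 650.0               # the same physical outlier tone in both conditions
narrow, broad = 30.0, 90.0    # standard deviations of the two distributions

print(surprise(outlier, mu, narrow) > surprise(outlier, mu, broad))  # True
```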

  20. Model Accuracy Comparison for High Resolution Insar Coherence Statistics Over Urban Areas

    Science.gov (United States)

    Zhang, Yue; Fu, Kun; Sun, Xian; Xu, Guangluan; Wang, Hongqi

    2016-06-01

    The interferometric coherence map derived from the cross-correlation of two co-registered complex synthetic aperture radar (SAR) images reflects the properties of the imaged targets. In many applications it can act as an independent information source, or provide information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there has been far less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types, yet the properties of the coherence may differ across resolutions and scene types. In this paper, we investigate the coherence statistics of high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, buildings, trees, shadow and roads, are selected as representatives of urban areas. Firstly, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. Experiments on TanDEM-X data show that the Beta model performs better than the other distributions.
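    The model-comparison step described above can be sketched with maximum-likelihood fits of the candidate distributions to coherence samples, scored by log-likelihood. The samples here are synthetic Beta draws standing in for a labelled region of a real coherence map; they are illustrative, not TanDEM-X data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic "coherence" samples in (0, 1); real values would come from
# manually labelled regions of an InSAR coherence map.
samples = rng.beta(a=5.0, b=2.0, size=5000)

candidates = {
    "gaussian": stats.norm,
    "rayleigh": stats.rayleigh,
    "weibull":  stats.weibull_min,
    "beta":     stats.beta,
    "nakagami": stats.nakagami,
}

loglik = {}
for name, dist in candidates.items():
    if name == "beta":
        # Coherence lives on (0, 1), so pin the Beta support.
        params = dist.fit(samples, floc=0, fscale=1)
    else:
        params = dist.fit(samples)
    loglik[name] = np.sum(dist.logpdf(samples, *params))

# The distribution with the largest log-likelihood fits the region best.
best = max(loglik, key=loglik.get)
print("best-fitting model:", best)
```

    In practice one would repeat this per land class and could score with AIC or a Kolmogorov-Smirnov statistic instead of raw log-likelihood.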

  1. MODEL ACCURACY COMPARISON FOR HIGH RESOLUTION INSAR COHERENCE STATISTICS OVER URBAN AREAS

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2016-06-01

    Full Text Available The interferometric coherence map derived from the cross-correlation of two co-registered complex synthetic aperture radar (SAR) images reflects the properties of the imaged targets. In many applications it can act as an independent information source, or provide information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there has been far less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types, yet the properties of the coherence may differ across resolutions and scene types. In this paper, we investigate the coherence statistics of high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, buildings, trees, shadow and roads, are selected as representatives of urban areas. Firstly, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. Experiments on TanDEM-X data show that the Beta model performs better than the other distributions.

  2. Statistical approach to predict compressive strength of high workability slag-cement mortars

    International Nuclear Information System (INIS)

    Memon, N.A.; Memon, N.A.; Sumadi, S.R.

    2009-01-01

    This paper reports an attempt to develop empirical expressions to estimate/predict the compressive strength of high workability slag-cement mortars. Experimental data from 54 mortar mixes were used. The mortars were prepared with slag as cement replacement at levels of 0, 50 and 60%. The flow (workability) was maintained at 136±3%. The numerical and statistical analysis was performed using Microsoft Office Excel 2003. Three empirical mathematical models were developed to estimate/predict the 28-day compressive strength of high workability slag-cement mortars with 0, 50 and 60% slag, with prediction accuracy between 97 and 98%. Finally, a generalized empirical mathematical model was proposed which predicts the 28-day compressive strength of high workability mortars with up to 95% accuracy. (author)
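    An empirical strength model of the kind described can be sketched as an ordinary least-squares fit of strength against slag replacement level. The mix data below are hypothetical placeholders, not the paper's 54-mix data set.

```python
import numpy as np

# Hypothetical (slag %, 28-day compressive strength in MPa) pairs --
# purely illustrative, not the paper's experimental data.
slag = np.array([0, 0, 50, 50, 60, 60], dtype=float)
strength = np.array([52.0, 50.5, 44.2, 43.1, 40.8, 39.9])

# Fit a linear empirical model: strength = b0 + b1 * slag%.
b1, b0 = np.polyfit(slag, strength, 1)
pred = b0 + b1 * slag
r2 = 1 - np.sum((strength - pred)**2) / np.sum((strength - strength.mean())**2)
print(f"strength ≈ {b0:.2f} {b1:+.3f}·slag%,  R² = {r2:.3f}")
```

    A per-level model (one equation each for 0, 50 and 60% slag, as in the paper) would simply repeat this fit on the corresponding subset of mixes.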

  3. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Summary Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative. PMID:19588106

  4. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  5. Infrared maritime target detection using the high order statistic filtering in fractional Fourier domain

    Science.gov (United States)

    Zhou, Anran; Xie, Weixin; Pei, Jihong

    2018-06-01

    Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders and carries richer spatial-frequency information. By combining it with high order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined to make it easier to separate the ship components from the background. Second, a new high order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while the points outside the interval reflect the background, and that the HOSC value for the ship is much larger than that for the sea clutter. Then, the curve's maximal target peak interval is located and extracted by bandpass filtering in the fractional Fourier domain. The HOSC value outside the peak interval decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target region selection method. The results show the proposed method is excellent for detecting maritime targets in high clutter.

  6. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
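    The three parameters mentioned (annual energy saving, cost saving and payback period) follow from standard engineering formulas. The sketch below uses illustrative motor ratings and prices, not the survey data of the cited study.

```python
def motor_savings(p_kw, hours_per_year, eff_std, eff_he, tariff, price_premium):
    """Annual energy saving (kWh), cost saving, and simple payback period (years)
    when replacing a standard-efficiency motor with a high-efficiency one.
    Standard textbook formulas; all inputs here are illustrative."""
    energy_saving = p_kw * hours_per_year * (1.0 / eff_std - 1.0 / eff_he)
    cost_saving = energy_saving * tariff
    payback_years = price_premium / cost_saving
    return energy_saving, cost_saving, payback_years

# Example: 15 kW motor, 6000 h/yr, 88% vs 93% efficiency,
# hypothetical tariff and purchase-price premium.
kwh, cost, payback = motor_savings(15, 6000, 0.88, 0.93,
                                   tariff=0.12, price_premium=400)
print(f"saving: {kwh:.0f} kWh/yr, {cost:.0f} /yr, payback {payback:.2f} yr")
```

    Confidence bounds, as in the cited paper, would then be constructed from the spread of these per-motor values across the sampled installations.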

  7. Statistical study of high-latitude plasma flow during magnetospheric substorms

    Directory of Open Access Journals (Sweden)

    G. Provan

    2004-11-01

    Full Text Available We have utilised the near-global imaging capabilities of the Northern Hemisphere SuperDARN radars to perform a statistical superposed epoch analysis of high-latitude plasma flows during magnetospheric substorms. The study involved 67 substorms, identified using the IMAGE FUV space-borne auroral imager. A substorm co-ordinate system was developed, centred on the magnetic local time and magnetic latitude of substorm onset determined from the auroral images. The plasma flow vectors from all 67 intervals were combined, creating global statistical plasma flow patterns and backscatter occurrence statistics during the substorm growth and expansion phases. The commencement of the substorm growth phase was clearly observed in the radar data 18-20 min before substorm onset, with an increase in the anti-sunward component of the plasma velocity across the dawn sector of the polar cap and a peak in the dawn-to-dusk transpolar voltage. Nightside backscatter moved to lower latitudes as the growth phase progressed. At substorm onset a flow suppression region was observed on the nightside, with fast flows surrounding the suppressed region. The dawn-to-dusk transpolar voltage increased from ~40 kV just before substorm onset to ~75 kV 12 min after onset. The low-latitude return flow started to increase at substorm onset and continued to increase until 8 min after onset. The velocity across the polar cap peaked 12-14 min after onset. This increase in polar-cap flux and the excitation of large-scale plasma flow occurred even though the IMF Bz component was increasing (becoming less negative) during most of this time. This study is the first to statistically demonstrate that nightside reconnection creates magnetic flux and excites high-latitude plasma flow in a similar way to dayside reconnection, and that dayside and nightside reconnection are two separate time-dependent processes.

  8. Large-eddy simulation in a mixing tee junction: High-order turbulent statistics analysis

    International Nuclear Information System (INIS)

    Howard, Richard J.A.; Serre, Eric

    2015-01-01

    Highlights: • Mixing and thermal fluctuations in a junction are studied using large eddy simulation. • Adiabatic and conducting steel wall boundaries are tested. • Wall thermal fluctuations are not the same between the flow and the solid. • Solid thermal fluctuations cannot be predicted from the fluid thermal fluctuations. • High-order turbulent statistics show that the turbulent transport term is important. - Abstract: This study analyses the mixing and thermal fluctuations induced in a mixing tee junction with circular cross-sections when cold water flowing in a pipe is joined by hot water from a branch pipe. This configuration is representative of industrial piping systems in which temperature fluctuations in the fluid may cause thermal fatigue damage on the walls. Implicit large-eddy simulations (LES) are performed for equal inflow rates corresponding to a bulk Reynolds number Re = 39,080. Two different thermal boundary conditions are studied for the pipe walls: an insulating adiabatic boundary and a conducting steel wall boundary. The predicted flow structures show a satisfactory agreement with the literature. The velocity and thermal fields (including high-order statistics) are not affected by the heat transfer with the steel walls. However, predicted thermal fluctuations at the boundary are not the same between the flow and the solid, showing that solid thermal fluctuations cannot be predicted from knowledge of the fluid thermal fluctuations alone. The analysis of high-order turbulent statistics provides a better understanding of the turbulence features. In particular, the budgets of the turbulent kinetic energy and temperature variance allow a comparative analysis of dissipation, production and transport terms. It is found that the turbulent transport term is an important term that acts to balance the production. We therefore use a priori tests to evaluate three different models for the triple correlation

  9. Data analysis in high energy physics. A practical guide to statistical methods

    International Nuclear Information System (INIS)

    Behnke, Olaf; Schoerner-Sadenius, Thomas; Kroeninger, Kevin; Schott, Gregory

    2013-01-01

    This practical guide covers the essential tasks in statistical data analysis encountered in high energy physics and provides comprehensive advice for typical questions and problems. The basic methods for inferring results from data are presented as well as tools for advanced tasks such as improving the signal-to-background ratio, correcting detector effects, determining systematics and many others. Concrete applications are discussed in analysis walkthroughs. Each chapter is supplemented by numerous examples and exercises and by a list of literature and relevant links. The book targets a broad readership at all career levels - from students to senior researchers.

  10. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
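    The first approach above (a permutation test on the distance between mean curves from two treatments) can be sketched as follows, with synthetic growth curves standing in for Biolog PM respiration data; all curve shapes and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def perm_test_mean_curves(group_a, group_b, n_perm=2000, rng=rng):
    """Permutation test on the L2 distance between mean curves of two groups.
    group_a, group_b: arrays of shape (n_replicates, n_timepoints)."""
    observed = np.linalg.norm(group_a.mean(axis=0) - group_b.mean(axis=0))
    pooled = np.vstack([group_a, group_b])
    n_a = len(group_a)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        d = np.linalg.norm(pooled[idx[:n_a]].mean(axis=0)
                           - pooled[idx[n_a:]].mean(axis=0))
        count += d >= observed
    return (count + 1) / (n_perm + 1)   # permutation p-value

# Illustrative logistic growth curves: treatment B grows earlier than A.
t = np.linspace(0, 48, 25)
a = np.array([1 / (1 + np.exp(-(t - 24) / 6)) + rng.normal(0, 0.05, t.size)
              for _ in range(8)])
b = np.array([1 / (1 + np.exp(-(t - 18) / 6)) + rng.normal(0, 0.05, t.size)
              for _ in range(8)])
p = perm_test_mean_curves(a, b)
print("permutation p-value:", p)
```

    Replacing the L2 distance with the absolute difference of areas under the mean curves gives the paper's second test statistic with no other changes.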

  11. High-resolution Statistics of Solar Wind Turbulence at Kinetic Scales Using the Magnetospheric Multiscale Mission

    Energy Technology Data Exchange (ETDEWEB)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.; Maruca, B. A. [University of Delaware, Newark, DE (United States); Fuselier, S. A.; Burch, J. L. [Southwest Research Institute, San Antonio, TX (United States); Phan, T. D. [Space Sciences Laboratory, University of California, Berkeley, CA (United States); Moore, T. E.; Pollock, C. J.; Gershman, D. J. [NASA Goddard Space Flight Center, Greenbelt, MD (United States); Torbert, R. B. [University of New Hampshire, Durham, NH (United States); Russell, C. T.; Strangeway, R. J., E-mail: chasapis@udel.edu [University of California, Los Angeles, CA (United States)

    2017-07-20

    Using data from the Magnetospheric Multiscale (MMS) and Cluster missions obtained in the solar wind, we examine second-order and fourth-order structure functions at varying spatial lags normalized to ion inertial scales. The analysis includes direct two-spacecraft results and single-spacecraft results employing the familiar Taylor frozen-in flow approximation. Several familiar statistical results, including the spectral distribution of energy and the scale-dependent kurtosis, are extended down to unprecedented spatial scales of ∼6 km, approaching electron scales. The Taylor approximation is also confirmed at those small scales, although small deviations are present in the kinetic range. The kurtosis is seen to attain very high values at sub-proton scales, supporting the previously reported suggestion that monofractal behavior may be due to high-frequency plasma waves at kinetic scales.
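    Second- and fourth-order structure functions and the scale-dependent kurtosis used above can be sketched for a one-dimensional signal; Gaussian noise, whose increment kurtosis is close to 3 at every lag, serves as the non-intermittent baseline.

```python
import numpy as np

rng = np.random.default_rng(3)

def structure_function(b, lag, order):
    """Order-n structure function S_n(lag) = <|b(x + lag) - b(x)|^n>."""
    incr = b[lag:] - b[:-lag]
    return np.mean(np.abs(incr)**order)

def scale_dependent_kurtosis(b, lag):
    """Kurtosis of increments: S_4 / S_2^2 (equals 3 for Gaussian increments)."""
    return structure_function(b, lag, 4) / structure_function(b, lag, 2)**2

# Gaussian noise stays near 3 at all lags; an intermittent (turbulent)
# signal shows kurtosis growing toward small lags.
signal = rng.normal(size=200_000)
print("kurtosis at lag 10:", scale_dependent_kurtosis(signal, lag=10))
```

    For spacecraft data the lag would be a spatial separation, obtained either directly from two spacecraft or from a time lag via the Taylor frozen-in flow hypothesis.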

  12. An instrument for the high-statistics measurement of plastic scintillating fibers

    International Nuclear Information System (INIS)

    Buontempo, S.; Ereditato, A.; Marchetti-Stasi, F.; Riccardi, F.; Strolin, P.

    1994-01-01

    Plastic scintillating fibers are today in widespread use in particle physics, mainly for calorimetric and tracking applications. In the case of calorimeters, one has to cope with very massive detectors and a large quantity of scintillating fibers. The CHORUS Collaboration has built a new detector to search for ν μ -ν τ oscillations in the CERN neutrino beam. A crucial task of the detector is performed by the high energy-resolution calorimeter, for whose construction more than 400 000 scintillating plastic fibers have been used. In this paper we report on the design and performance of a new instrument for the high-statistics measurement of fiber properties, in terms of light yield and light attenuation length. The instrument was successfully used to test about 3% of the total number of fibers before the construction of the calorimeter. (orig.)
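    A fiber measurement of the kind the instrument performs is commonly reduced by fitting an exponential attenuation law I(x) = I0·exp(−x/L) to the light yield measured at several positions along the fiber. The amplitudes below are hypothetical, not CHORUS data.

```python
import numpy as np

# Hypothetical light-yield measurements along a fiber
# (position in cm, amplitude in photoelectrons) -- illustrative values only.
x = np.array([20.0, 60.0, 100.0, 140.0, 180.0, 220.0])
y = 12.0 * np.exp(-x / 250.0)

# Fit I(x) = I0 * exp(-x / L) by linear regression on log(I).
slope, intercept = np.polyfit(x, np.log(y), 1)
attenuation_length = -1.0 / slope      # L, in cm
light_yield = np.exp(intercept)        # extrapolated yield I0 at x = 0
print(f"L ≈ {attenuation_length:.0f} cm, I0 ≈ {light_yield:.1f} pe")
```

    With real data one would repeat this fit per fiber and histogram L and I0 over the full production batch to flag outliers.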

  13. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be accurately determined from the measured nano-mechanical properties. Besides evidencing the dominant role of high density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.

  14. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    Energy Technology Data Exchange (ETDEWEB)

    Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain); Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain); Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)

    2013-02-15

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under −30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R L/S ) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticles size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R L/S , while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  15. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    Science.gov (United States)

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with focus on reproducibility. A statistical design of 32 experiments was performed and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams'E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 degrees C was better than 22 degrees C. After thawing, 1h was better than 3h of preculture. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient to quickly determine optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.
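    A 32-run two-level design like the one described can be enumerated directly; the study screened more factors than shown here (so its 32 runs came from a fractional design), but a plain 2^5 full factorial over five of the reported factors also yields 32 runs. Factor names and levels below are paraphrased from the abstract and are illustrative.

```python
from itertools import product

# Five two-level factors (illustrative labels); 2**5 = 32 experimental runs.
factors = {
    "freezing_medium": ["WilliamsE", "modified_UW"],
    "cryoprotectant": ["DMSO", "mixture"],
    "cpa_concentration": ["10%", "20%"],
    "fetal_calf_serum": ["50%", "80%"],
    "thaw_temperature": ["22C", "42C"],
}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))   # 32 runs
```

    Each run dictionary is one cryopreservation condition; the biomarkers (LDHi, AP metabolism) would then be measured per run and analysed for main effects and interactions.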

  16. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    International Nuclear Information System (INIS)

    Durán-Lobato, Matilde; Enguix-González, Alicia; Fernández-Arévalo, Mercedes; Martín-Banderas, Lucía

    2013-01-01

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under –30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R L/S ) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticles size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R L/S , while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  17. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
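    The idea that successively higher order counting statistics respond to multiplication can be sketched with excess central moments of a counting record: they vanish for a Poisson (non-multiplying) source and grow when counts arrive in correlated bursts. The clustered record below is a toy compound-Poisson stand-in for a source-driven fissile assembly, not a model of any real measurement.

```python
import numpy as np

rng = np.random.default_rng(4)

def excess_moments(counts):
    """Second- and third-order excess over Poisson expectations.
    Both are near zero for a purely Poisson counting record."""
    m = counts.mean()
    y2 = counts.var() / m - 1.0                  # Feynman-Y (excess variance)
    y3 = np.mean((counts - m)**3) / m - 1.0      # excess third central moment
    return y2, y3

# Inherent source alone: Poisson counts in fixed gates.
poisson = rng.poisson(4.0, 100_000)

# Toy "multiplying" record: each source event spawns a burst of correlated
# counts (compound Poisson), mimicking fission-chain multiplication.
events = rng.poisson(2.0, 100_000)
clustered = rng.poisson(2.0 * events)

y2_p, y3_p = excess_moments(poisson)
y2_c, y3_c = excess_moments(clustered)
print(f"Feynman-Y: Poisson {y2_p:+.3f}, clustered {y2_c:+.3f}")
```

    The clustered record's large positive excess moments illustrate why such signatures can separate the response to an introduced source from inherent-source background.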

  18. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, statistical variations in the subthreshold MOSFET's high frequency characteristics, defined in terms of gate capacitance and transition frequency, are analysed, and comprehensive analytical models of these variations in terms of their variances are proposed. Major imperfections at the physical level, including random dopant fluctuation and the effects of variations in the MOSFET manufacturing process, have been taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, being precise mathematical expressions in terms of physical-level MOSFET variables. Furthermore, they have been verified at the nanometer level using 65 nm BSIM4-based benchmarks and found to be very accurate, with average percentage errors smaller than 5%. Hence, the analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET based VHF circuits, systems and applications.

  19. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Full Text Available Statistical properties of Joule heating rate, electric field and conductances in the high latitude ionosphere are studied by a unique one-month measurement made by the EISCAT incoherent scatter radar in Tromsø (66.6 cgmlat) from 6 March to 6 April 2006. The data are from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent set of data to study the MLT and Kp dependence of parameters with high temporal and spatial resolution.

    All the parameters show a clear MLT variation, which is different for low and high Kp conditions. Our results indicate that the response of morning sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates in the nightside are statistically observed at 23 MLT, which is the location of the electric Harang discontinuity.

    An important outcome of the paper are the fitted functions for the Joule heating rate as a function of electric field magnitude, separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3. In addition to the squared electric field, the fit includes a linear term to study the possible anticorrelation or correlation between electric field and conductance. In the midday sector, positive correlation is found as well as in the morning sector for the high activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to

  20. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Full Text Available Statistical properties of the Joule heating rate, electric field and conductances in the high-latitude ionosphere are studied using a unique one-month measurement made by the EISCAT incoherent scatter radar in Tromsø (66.6 cgmlat) from 6 March to 6 April 2006. The data are from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent data set for studying the MLT and Kp dependence of the parameters with high temporal and spatial resolution. All the parameters show a clear MLT variation, which differs between low and high Kp conditions. Our results indicate that the response of morning-sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and the electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates on the nightside are statistically observed at 23 MLT, which is the location of the electric Harang discontinuity. An important outcome of the paper is the set of fitted functions for the Joule heating rate as a function of electric field magnitude, given separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3). In addition to the squared electric field, the fit includes a linear term to capture possible anticorrelation or correlation between electric field and conductance. Positive correlation is found in the midday sector, as well as in the morning sector for the high-activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to auroral arcs as a result of ionosphere-magnetosphere coupling, as discussed by Aikio et al. (2004).
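
    The quadratic-plus-linear fit described in the abstract can be sketched as a simple least-squares problem in the basis [E², E]; a negative linear coefficient then signals anticorrelation between electric field and conductance. The data and coefficients below are synthetic assumptions, not the published EISCAT fits.

```python
import numpy as np

# Sketch of the fit described above: Joule heating rate modeled as
# Q = c2*E^2 + c1*E, where the linear term captures correlation or
# anticorrelation between electric field and conductance. All data and
# coefficients are invented for illustration.
rng = np.random.default_rng(1)
E = rng.uniform(0.0, 50.0, 500)           # field magnitude (arbitrary units)
c2_true, c1_true = 0.8, -2.0              # negative c1: anticorrelation
Q = c2_true * E**2 + c1_true * E + rng.normal(0.0, 5.0, E.size)

# Linear least squares in the basis [E^2, E]
A = np.column_stack([E**2, E])
(c2_hat, c1_hat), *_ = np.linalg.lstsq(A, Q, rcond=None)
```

    In practice one such fit would be performed per MLT sector and activity level, and the sign of the recovered linear coefficient read off as the correlation diagnostic.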

  1. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
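
    The core computation that BEAGLE accelerates is the per-site phylogenetic likelihood obtained by Felsenstein's pruning algorithm. The sketch below evaluates one site on a three-taxon star tree under the Jukes-Cantor model; it is a minimal illustration of the arithmetic, not of BEAGLE's API.

```python
import numpy as np

# Sketch of the per-site likelihood (Felsenstein pruning) that libraries
# such as BEAGLE accelerate. Jukes-Cantor model, one site, a three-taxon
# star tree; states and branch lengths are invented for illustration.
def jc_transition(t):
    """Jukes-Cantor 4x4 nucleotide transition-probability matrix P(t)."""
    same = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
    diff = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)
    P = np.full((4, 4), diff)
    np.fill_diagonal(P, same)
    return P

def tip_partial(state):
    """One-hot partial-likelihood vector for an observed tip state."""
    v = np.zeros(4)
    v[state] = 1.0
    return v

# Leaves observe A, A, G (A=0, G=2) with branch lengths 0.1, 0.1, 0.3
tips = [(tip_partial(0), 0.1), (tip_partial(0), 0.1), (tip_partial(2), 0.3)]

# Root partial likelihood: elementwise product over children of P(t) @ tip
root_partial = np.ones(4)
for vec, t in tips:
    root_partial *= jc_transition(t) @ vec

site_likelihood = float(0.25 * root_partial.sum())   # uniform root freqs
```

    A real analysis repeats this matrix-vector work for every site, node and candidate tree, which is why offloading it to GPUs or SIMD units pays off so handsomely.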

  2. Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data

    Science.gov (United States)

    Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.

    2018-01-01

    One of the foundations for high school students in statistics is to be able to read data; presents data in the form of tables and diagrams and its interpretation. The purpose of this study is to describe high school students’ competencies in reading, interpreting and presenting data. Subjects were consisted of male and female students who had high levels of mathematical ability. Collecting data was done in form of task formulation which is analyzed by reducing, presenting and verifying data. Results showed that the students read the data based on explicit explanations on the diagram, such as explaining the points in the diagram as the relation between the x and y axis and determining the simple trend of a graph, including the maximum and minimum point. In interpreting and summarizing the data, both subjects pay attention to general data trends and use them to predict increases or decreases in data. The male estimates the value of the (n+1) of weight data by using the modus of the data, while the females estimate the weigth by using the average. The male tend to do not consider the characteristics of the data, while the female more carefully consider the characteristics of data.

  3. Statistical surrogate models for prediction of high-consequence climate change.

    Energy Technology Data Exchange (ETDEWEB)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  4. Statistics of vacuum breakdown in the high-gradient and low-rate regime

    Science.gov (United States)

    Wuensch, Walter; Degiovanni, Alberto; Calatroni, Sergio; Korsbäck, Anders; Djurabekova, Flyura; Rajamäki, Robin; Giner-Navarro, Jorge

    2017-01-01

    In an increasing number of high-gradient linear accelerator applications, accelerating structures must operate with both high surface electric fields and low breakdown rates. Understanding the statistical properties of breakdown occurrence in such a regime is of practical importance for optimizing accelerator conditioning and operation algorithms, as well as of interest for efforts to understand the physical processes which underlie the breakdown phenomenon. Experimental breakdown data have been collected in two distinct high-gradient experimental set-ups: a prototype linear accelerating structure operated in the Compact Linear Collider Xbox 12 GHz test stands, and a parallel-plate electrode system operated with pulsed DC in the kV range. The collected data are presented, analyzed and compared. The two systems show similar, distinctive, two-part distributions of the number of pulses between breakdowns, with each part corresponding to a specific, constant event rate. The correlation between distance and number of pulses between breakdowns indicates that the two parts of the distribution, and their corresponding event rates, represent independent primary and induced follow-up breakdowns. The similarity of results from pulsed DC to 12 GHz rf indicates a similar vacuum arc triggering mechanism over the range of conditions covered by the experiments.
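
    The two-part distribution described above can be illustrated by sampling a mixture of two constant per-pulse breakdown rates and reading the slow (primary) rate off the tail of the log-survival curve. The rates and mixture fraction below are invented, not the measured CLIC or pulsed-DC values.

```python
import numpy as np

# Illustration of a two-part pulses-between-breakdowns distribution:
# a mixture of a fast induced "follow-up" rate and a slow "primary"
# rate, each constant per pulse (geometric waiting times).
rng = np.random.default_rng(2)
p_followup = 0.3                        # assumed fraction of follow-ups
rate_fast, rate_slow = 1e-2, 1e-4       # per-pulse breakdown probabilities

n = 20_000
is_fast = rng.random(n) < p_followup
waits = np.where(is_fast,
                 rng.geometric(rate_fast, n),
                 rng.geometric(rate_slow, n))

# Empirical survival function: on a log scale it shows two straight
# segments, one per constant event rate.
x = np.sort(waits).astype(float)
surv = 1.0 - np.arange(1, n + 1) / n

# Estimate the slow (primary) rate from the tail, where the fast
# component has died out and the statistics are still decent.
mask = (x > 5.0 / rate_fast) & (surv > 1e-3)
slope, _ = np.polyfit(x[mask], np.log(surv[mask]), 1)
rate_slow_hat = -slope
```

    The kink between the two straight segments of the log-survival curve is the signature that distinguishes the mixed two-rate process from a single Poisson process.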

  5. Matching of experimental and statistical-model thermonuclear reaction rates at high temperatures

    International Nuclear Information System (INIS)

    Newton, J. R.; Longland, R.; Iliadis, C.

    2008-01-01

    We address the problem of extrapolating experimental thermonuclear reaction rates toward high stellar temperatures (T>1 GK) by using statistical model (Hauser-Feshbach) results. Reliable reaction rates at such temperatures are required for studies of advanced stellar burning stages, supernovae, and x-ray bursts. Generally accepted methods are based on the concept of a Gamow peak. We follow recent ideas that emphasized the fundamental shortcomings of the Gamow peak concept for narrow resonances at high stellar temperatures. Our new method defines the effective thermonuclear energy range (ETER) by using the 8th, 50th, and 92nd percentiles of the cumulative distribution of fractional resonant reaction rate contributions. This definition is unambiguous and has a straightforward probability interpretation. The ETER is used to define a temperature at which Hauser-Feshbach rates can be matched to experimental rates. This matching temperature is usually much higher compared to previous estimates that employed the Gamow peak concept. We suggest that an increased matching temperature provides more reliable extrapolated reaction rates since Hauser-Feshbach results are more trustworthy the higher the temperature. Our ideas are applied to 21 (p,γ), (p,α), and (α,γ) reactions on A=20-40 target nuclei. For many of the cases studied here, our extrapolated reaction rates at high temperatures differ significantly from those obtained using the Gamow peak concept.
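
    A minimal sketch of the ETER definition, assuming a handful of invented narrow resonances: sort the fractional rate contributions by energy, form the cumulative distribution, and read off the energies at the 8th, 50th and 92nd percentiles.

```python
import numpy as np

# Sketch of the ETER definition: the 8th, 50th and 92nd percentiles of
# the cumulative distribution of fractional resonant contributions to
# the total rate. Resonance energies and contributions are invented,
# not data for a real A=20-40 nucleus.
E_res = np.array([0.2, 0.5, 0.9, 1.4, 2.0, 2.7])          # energies [MeV]
contrib = np.array([0.01, 0.10, 0.30, 0.35, 0.20, 0.04])  # fractional rates

order = np.argsort(E_res)
cum = np.cumsum(contrib[order] / contrib.sum())

def energy_at_percentile(q):
    """First resonance energy at which the cumulative contribution reaches q."""
    return E_res[order][np.searchsorted(cum, q)]

eter_low = energy_at_percentile(0.08)
eter_med = energy_at_percentile(0.50)
eter_high = energy_at_percentile(0.92)
```

    Unlike the Gamow-peak window, this interval is defined directly from the rate integrand, so it automatically tracks which resonances actually dominate at a given temperature.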

  7. A High-resolution Atlas and Statistical Model of the Vocal Tract from Structural MRI.

    Science.gov (United States)

    Woo, Jonghye; Lee, Junghoon; Murano, Emi Z; Xing, Fangxu; Al-Talib, Meena; Stone, Maureen; Prince, Jerry L

    Magnetic resonance imaging (MRI) is an essential tool in the study of muscle anatomy and functional activity in the tongue. Objective assessment of similarities and differences in tongue structure and function has been performed using unnormalized data, but this is biased by the differences in size, shape, and orientation of the structures. To remedy this, we propose a methodology to build a 3D vocal tract atlas based on structural MRI volumes from twenty normal subjects. We first constructed high-resolution volumes from three orthogonal stacks. We then removed extraneous data so that all 3D volumes contained the same anatomy. We used an unbiased diffeomorphic groupwise registration using a cross-correlation similarity metric. Principal component analysis was applied to the deformation fields to create a statistical model from the atlas. Various evaluations and applications were carried out to show the behaviour and utility of the atlas.

  8. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data, especially by utilizing Gaussian graphical models and independent component analysis.
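
    The Gaussian graphical model approach mentioned above identifies direct metabolite associations via partial correlations, which can be read off the inverse covariance matrix. A minimal sketch with a synthetic three-variable chain (effect sizes are assumptions):

```python
import numpy as np

# Sketch of the Gaussian graphical model idea: direct associations are
# read off the inverse covariance (precision) matrix as partial
# correlations. Synthetic three-metabolite chain A -> B -> C, so A and C
# correlate only through B; effect sizes are invented.
rng = np.random.default_rng(3)
n = 5000
a = rng.normal(size=n)
b = 0.8 * a + rng.normal(size=n)
c = 0.8 * b + rng.normal(size=n)
X = np.column_stack([a, b, c])

prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

marg_ac = np.corrcoef(a, c)[0, 1]   # sizeable marginal correlation
part_ac = partial_corr[0, 2]        # ~0: no direct A-C edge
```

    The marginal A-C correlation is substantial, but the partial correlation vanishes, which is exactly how a GGM prunes indirect pathway-mediated associations from a metabolite network.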

  9. Geant4 electromagnetic physics for high statistic simulation of LHC experiments

    CERN Document Server

    Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L

    2012-01-01

    An overview of the current status of electromagnetic (EM) physics in the Geant4 toolkit is presented. Recent improvements are focused on the performance of large-scale production for the LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy without compromising CPU speed for EM particle transport. New biasing options have been introduced, which are applicable to any EM process. These include algorithms to enhance and suppress processes, force interactions or split secondary particles. It is shown that the performance of the EM sub-package is improved. We also report extensions of the testing suite allowing high-statistics validation of EM physics, including validation of multiple scattering, bremsstrahlung and other models. Cross-checks between standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.

  10. The Statistical Analysis of Relation between Compressive and Tensile/Flexural Strength of High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Kępniak M.

    2016-12-01

    Full Text Available This paper addresses the tensile and flexural strength of HPC (high performance concrete). The aim of the paper is to analyse the efficiency of the models proposed in different codes. In particular, three design procedures are considered: the ACI 318 [1], Eurocode 2 [2] and the Model Code 2010 [3]. The associations between the design tensile strength of concrete obtained from these three codes and compressive strength are compared with experimental results for tensile and flexural strength using statistical tools. Experimental tensile strengths were obtained from the splitting test. Based on this comparison, conclusions are drawn about the fit between the design methods and the test data. The comparison shows that the tensile and flexural strength of HPC depend on more factors than compressive strength alone.

  11. Dissipative Effects on Inertial-Range Statistics at High Reynolds Numbers.

    Science.gov (United States)

    Sinhuber, Michael; Bewley, Gregory P; Bodenschatz, Eberhard

    2017-09-29

    Using the unique capabilities of the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, we report experimental measurements in classical grid turbulence that uncover oscillations of the velocity structure functions in the inertial range. This was made possible by measuring extremely long time series of up to 10^{10} samples of the turbulent fluctuating velocity, corresponding to O(10^{7}) integral length scales. The measurements were conducted in a well-controlled environment at a wide range of high Reynolds numbers from R_{λ}=110 up to R_{λ}=1600, using both traditional hot-wire probes and the nanoscale thermal anemometry probe developed at Princeton University. An implication of the observed oscillations is that dissipation influences the inertial-range statistics of turbulent flows at scales significantly larger than predicted by current models and theories.
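
    Velocity structure functions of the kind analyzed above are simple moment estimates of velocity increments. The sketch below computes S₂(r) on a Brownian surrogate signal (for which S₂ ∝ r, rather than the Kolmogorov r^{2/3} of inertial-range turbulence); it illustrates the estimator, not the experimental data.

```python
import numpy as np

# Sketch of the velocity structure-function estimator
# S_n(r) = <(u(x+r) - u(x))^n>. A Brownian surrogate signal stands in
# for the hot-wire record, so S_2(r) ~ r^1 here; Kolmogorov turbulence
# would give r^{2/3} in the inertial range.
rng = np.random.default_rng(4)
u = np.cumsum(rng.normal(size=100_000))

def structure_function(u, r, order):
    """n-th order structure function at separation r (in samples)."""
    du = u[r:] - u[:-r]
    return np.mean(du**order)

seps = np.array([1, 2, 4, 8, 16, 32])
S2 = np.array([structure_function(u, r, 2) for r in seps])
zeta2, _ = np.polyfit(np.log(seps), np.log(S2), 1)   # scaling exponent
```

    Detecting the subtle oscillations reported in the paper requires far longer records than this, which is why the 10^{10}-sample time series were decisive.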

  12. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics as well as researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  13. Statistical dynamic image reconstruction in state-of-the-art high-resolution PET

    International Nuclear Information System (INIS)

    Rahmim, Arman; Cheng, J-C; Blinder, Stephan; Camborde, Maurie-Laure; Sossi, Vesna

    2005-01-01

    Modern high-resolution PET is now more than ever in need of scrutiny into the nature and limitations of the imaging modality itself as well as image reconstruction techniques. In this work, we have reviewed, analysed and addressed the following three considerations within the particular context of state-of-the-art dynamic PET imaging: (i) the typical average number of events per line-of-response (LOR) is now (much) less than unity; (ii) due to the physical and biological decay of the activity distribution, one requires robust and efficient reconstruction algorithms applicable to a wide range of statistics; and (iii) the computational demands in dynamic imaging are much greater (i.e., more frames to be stored and reconstructed). Within the framework of statistical image reconstruction, we have argued theoretically and shown experimentally that the sinogram non-negativity constraint (when using the delayed-coincidence and/or scatter-subtraction techniques) is especially expected to result in an overestimation bias. Subsequently, two schemes are considered: (a) subtraction techniques in which an image non-negativity constraint has been imposed and (b) implementation of random and scatter estimates inside the reconstruction algorithms, thus enabling direct processing of Poisson-distributed prompts. Both techniques are able to remove the aforementioned bias, while the latter, being better conditioned theoretically, is able to exhibit superior noise characteristics. We have also elaborated upon and verified the applicability of the accelerated list-mode image reconstruction method as a powerful solution for accurate, robust and efficient dynamic reconstructions of high-resolution data (as well as a number of additional benefits in the context of state-of-the-art PET).
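
    Scheme (b) above, keeping the randoms-and-scatter mean inside the forward model so that Poisson-distributed prompts are processed directly, can be sketched with a toy one-dimensional ML-EM loop. The system matrix and counts are synthetic; this is an illustration of the update rule, not the authors' list-mode implementation.

```python
import numpy as np

# Toy ML-EM with the mean randoms/scatter term r kept inside the forward
# model, so Poisson prompts are processed directly and the image stays
# non-negative without any sinogram subtraction.
rng = np.random.default_rng(5)
A = rng.random((40, 10)) * 5.0        # system matrix: 40 LORs, 10 voxels
x_true = rng.uniform(1.0, 5.0, 10)    # true activity
r = np.full(40, 0.5)                  # known mean randoms + scatter per LOR
y = rng.poisson(A @ x_true + r)       # Poisson-distributed prompts

x = np.ones(10)                       # non-negative initial image
sens = A.sum(axis=0)                  # sensitivity image A^T 1
for _ in range(200):
    expected = A @ x + r              # forward model includes r
    x *= (A.T @ (y / expected)) / sens

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

    The multiplicative update preserves non-negativity by construction, which is why folding r into the forward model avoids the bias that subtracting delayed coincidences from low-count sinograms introduces.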

  14. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.

  15. Integration of statistical modeling and high-content microscopy to systematically investigate cell-substrate interactions.

    Science.gov (United States)

    Chen, Wen Li Kelly; Likhitpanichkul, Morakot; Ho, Anthony; Simmons, Craig A

    2010-03-01

    Cell-substrate interactions are multifaceted, involving the integration of various physical and biochemical signals. The interactions among these microenvironmental factors cannot be facilely elucidated and quantified by conventional experimentation, and necessitate multifactorial strategies. Here we describe an approach that integrates statistical design and analysis of experiments with automated microscopy to systematically investigate the combinatorial effects of substrate-derived stimuli (substrate stiffness and matrix protein concentration) on mesenchymal stem cell (MSC) spreading, proliferation and osteogenic differentiation. C3H10T1/2 cells were grown on type I collagen- or fibronectin-coated polyacrylamide hydrogels with tunable mechanical properties. Experimental conditions, which were defined according to central composite design, consisted of specific permutations of substrate stiffness (3-144 kPa) and adhesion protein concentration (7-520 microg/mL). Spreading area, BrdU incorporation and Runx2 nuclear translocation were quantified using high-content microscopy and modeled as mathematical functions of substrate stiffness and protein concentration. The resulting response surfaces revealed distinct patterns of protein-specific, substrate stiffness-dependent modulation of MSC proliferation and differentiation, demonstrating the advantage of statistical modeling in the detection and description of higher-order cellular responses. In a broader context, this approach can be adapted to study other types of cell-material interactions and can facilitate the efficient screening and optimization of substrate properties for applications involving cell-material interfaces. Copyright 2009 Elsevier Ltd. All rights reserved.

  16. Waste generated in high-rise buildings construction: a quantification model based on statistical multiple regression.

    Science.gov (United States)

    Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana

    2015-05-01

    Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of the design process and production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model related the dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data resulted in an adjusted R² value of 0.694, meaning that it explains approximately 69% of the variation in waste generation for similar construction projects. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
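
    The model-building step described above is ordinary multiple regression judged by the adjusted R². A sketch on synthetic data for eighteen buildings (predictor names and effect sizes are assumptions, not the study's variables):

```python
import numpy as np

# Ordinary least squares on a few invented design/production predictors,
# with R^2 adjusted for the number of predictors as the figure of merit.
rng = np.random.default_rng(6)
n = 18                                    # eighteen buildings, as in the study
floor_area = rng.uniform(5e3, 3e4, n)     # m^2 (assumed predictor)
n_floors = rng.integers(8, 30, n).astype(float)
waste = 0.05 * floor_area + 40.0 * n_floors + rng.normal(0.0, 150.0, n)

X = np.column_stack([np.ones(n), floor_area, n_floors])
beta, *_ = np.linalg.lstsq(X, waste, rcond=None)
resid = waste - X @ beta

ss_res = float(resid @ resid)
ss_tot = float(((waste - waste.mean())**2).sum())
r2 = 1.0 - ss_res / ss_tot
p = X.shape[1] - 1                        # predictors, excluding intercept
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
```

    The adjusted R² penalizes each added predictor, which matters with only eighteen observations: a raw R² can always be inflated by adding variables, while the adjusted value only rises when a predictor genuinely helps.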

  17. Statistical characteristics of transient enclosure voltage in ultra-high-voltage gas-insulated switchgear

    Science.gov (United States)

    Cai, Yuanji; Guan, Yonggang; Liu, Weidong

    2017-06-01

    Transient enclosure voltage (TEV), which is a phenomenon induced by the inner dielectric breakdown of SF6 during disconnector operations in a gas-insulated switchgear (GIS), may cause issues relating to shock hazard and electromagnetic interference to secondary equipment. This is a critical factor regarding the electromagnetic compatibility of ultra-high-voltage (UHV) substations. In this paper, the statistical characteristics of TEV at UHV level are collected from field experiments, and are analyzed and compared to those from a repeated strike process. The TEV waveforms during disconnector operations are recorded by a self-developed measurement system first. Then, statistical characteristics, such as the pulse number, duration of pulses, frequency components, magnitude and single pulse duration, are extracted. The transmission line theory is introduced to analyze the TEV and is validated by the experimental results. Finally, the relationship between the TEV and the repeated strike process is analyzed. This proves that the pulse voltage of the TEV is proportional to the corresponding breakdown voltage. The results contribute to the definition of the standard testing waveform of the TEV, and can aid the protection of electronic devices in substations by minimizing the threat of this phenomenon.

  18. Effectiveness of mouse minute virus inactivation by high temperature short time treatment technology: a statistical assessment.

    Science.gov (United States)

    Murphy, Marie; Quesada, Guillermo Miro; Chen, Dayue

    2011-11-01

    Viral contamination of mammalian cell cultures in GMP manufacturing facilities represents a serious safety threat to the biopharmaceutical industry. Such adverse events usually require facility shutdown for cleaning/decontamination, and thus result in significant loss of production and/or delay of product development. High temperature short time (HTST) treatment of culture media has been considered an effective method to protect GMP facilities from viral contaminations. The log reduction factor (LRF) has been commonly used to measure the effectiveness of HTST treatment for viral inactivation. However, in order to prevent viral contaminations, HTST treatment must inactivate all infectious viruses (100%) in the medium batch, since a single virus is sufficient to cause contamination. Therefore, the LRF may not be the most appropriate indicator for measuring the effectiveness of HTST in preventing viral contaminations. We report here the use of the probability of achieving complete (100%) virus inactivation to assess the effectiveness of HTST treatment. Using mouse minute virus (MMV) as a model virus, we have demonstrated that the effectiveness of HTST treatment depends strongly upon the level of viral contaminants, in addition to treatment temperature and duration. We believe that the statistical method described in this report can provide more accurate information about the power and potential limitations of technologies such as HTST in our shared quest to mitigate the risk of viral contamination in manufacturing facilities. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
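
    The report's statistical point can be made concrete with a Poisson survivor model: for a given log reduction factor, the probability of complete inactivation falls off sharply with the initial viral load. The function below is a hedged sketch of that relationship with illustrative numbers, not the paper's MMV data.

```python
import math

# If the number of virions surviving HTST is Poisson with mean
# N0 / 10**LRF, then the probability of complete (100%) inactivation is
# exp(-mean), which depends strongly on the initial load N0 and not on
# the log reduction factor alone.
def p_complete_inactivation(n0, lrf):
    """P(zero infectious virions survive) under a Poisson survivor model."""
    return math.exp(-n0 / 10**lrf)

p_low_load = p_complete_inactivation(1e2, 6)   # light contamination
p_high_load = p_complete_inactivation(1e8, 6)  # heavy contamination, same LRF
```

    With the same six-log reduction, a batch carrying 10² virions is almost certainly sterilized, while one carrying 10⁸ virions almost certainly is not, which is exactly why LRF alone can mislead.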

  19. On the efficiency of high-energy particle identification statistical methods

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1982-01-01

    An attempt is made to analyze statistical methods for making decisions on high-energy particle identification. The Bayesian approach is shown to provide the most complete account of the primary information discriminating between particles of various types. It does not impose rigid requirements on the form of the probability density function and accounts for a priori information, in contrast to the Neyman-Pearson approach, the minimax technique and heuristic rules for constructing decision limits in the range of a specially chosen parameter. Methods based on the concept of the nearest neighbourhood are shown to be the most effective among local methods of probability density estimation. Probability distances between the training-sample classes are proposed as a criterion for selecting the optimal parameters of a high-energy particle detector. The proposed method and the software built around it are tested on the problem of identifying cosmic-radiation hadrons by means of transition radiation detectors (the ''PION'' experiment)
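
    The two ingredients highlighted above, the Bayes decision rule and nearest-neighbour density estimation, can be combined in a few lines. The sketch below classifies a one-dimensional detector response into two particle classes; the distributions, priors and k are invented for illustration.

```python
import numpy as np

# Bayes decision rule with class-conditional densities estimated from
# k-nearest-neighbour distances. One-dimensional synthetic "detector
# response" for two particle classes.
rng = np.random.default_rng(7)
sig = rng.normal(0.0, 1.0, 1000)      # e.g. pion-like responses (assumed)
bkg = rng.normal(3.0, 1.0, 1000)      # e.g. proton-like responses (assumed)

def knn_density(x, sample, k=25):
    """kNN density estimate: k / (2 * N * distance to the k-th neighbour)."""
    d = np.sort(np.abs(sample - x))
    return k / (2.0 * sample.size * d[k - 1])

def classify(x, prior_sig=0.5):
    """Bayes rule: choose the class with the larger prior * likelihood."""
    post_sig = prior_sig * knn_density(x, sig)
    post_bkg = (1.0 - prior_sig) * knn_density(x, bkg)
    return "signal" if post_sig > post_bkg else "background"

label_near_sig = classify(0.0)
label_near_bkg = classify(3.0)
```

    Because the kNN estimate makes no parametric assumption about the density shape, the same rule applies unchanged to skewed or multimodal detector responses, which is the flexibility the abstract attributes to the Bayesian approach.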

  20. Statistics of Deep Convection in the Congo Basin Derived From High-Resolution Simulations.

    Science.gov (United States)

    White, B.; Stier, P.; Kipling, Z.; Gryspeerdt, E.; Taylor, S.

    2016-12-01

    Convection transports moisture, momentum, heat and aerosols through the troposphere, and so the temporal variability of convection is a major driver of global weather and climate. The Congo basin is home to some of the most intense convective activity on the planet and is under strong seasonal influence of biomass burning aerosol. However, deep convection in the Congo basin remains understudied compared to other tropical storm regions, especially when compared to the neighbouring, relatively well-understood West African climate system. We use the WRF model to perform a high-resolution, cloud-system resolving simulation to investigate convective storm systems in the Congo. Our setup pushes the boundaries of current computational resources, using a 1 km grid length over a domain covering millions of square kilometres and for a time period of one month. This allows us to draw statistical conclusions on the nature of the simulated storm systems. Comparing data from satellite observations and the model enables us to quantify the diurnal variability of deep convection in the Congo basin. This approach allows us to evaluate our simulations despite the lack of in-situ observational data, and provides a more comprehensive analysis of the diurnal cycle than has previously been shown. Further, we show that high-resolution convection-permitting simulations performed over near-seasonal timescales can be used in conjunction with satellite observations as an effective tool to evaluate new convection parameterisations.

  1. Cluster survey of the high-altitude cusp properties: a three-year statistical study

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2004-09-01

    Full Text Available The global characteristics of the high-altitude cusp and its surrounding regions are investigated using a three-year statistical survey based on data obtained by the Cluster spacecraft. The analysis involves an elaborate orbit-sampling methodology that uses a model field and takes into account the actual solar wind conditions and level of geomagnetic activity. The spatial distribution of the magnetic field and various plasma parameters in the vicinity of the low magnetic field exterior cusp are determined and it is found that: (1) the magnetic field distribution shows the presence of an intermediate region between the magnetosheath and the magnetosphere: the exterior cusp; (2) this region is characterized by the presence of dense plasma of magnetosheath origin; a comparison with the Tsyganenko (1996) magnetic field model shows that it is diamagnetic in nature; (3) the spatial distributions show that three distinct boundaries with the lobes, the dayside plasma sheet and the magnetosheath surround the exterior cusp; (4) the external boundary with the magnetosheath has a sharp bulk velocity gradient, as well as a density decrease and temperature increase as one goes from the magnetosheath to the exterior cusp; (5) while the two inner boundaries form a funnel, the external boundary shows no clear indentation; (6) the plasma and magnetic pressure distributions suggest that the exterior cusp is in equilibrium with its surroundings in a statistical sense; and (7) a preliminary analysis of the bulk flow distributions suggests that the exterior cusp is stagnant under northward IMF conditions but convective under southward IMF conditions.

  2. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

    High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comité Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better for predicting the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of the fly ash substitution rate was initially higher than that of the w/b ratio but decreased over time. © 2014 Elsevier Ltd. All rights reserved.

  3. A statistical study of high-altitude electric fields measured on the Viking satellite

    International Nuclear Information System (INIS)

    Lindqvist, P.A.; Marklund, G.T.

    1990-01-01

    Characteristics of high-altitude data from the Viking electric field instrument are presented in a statistical study based on 109 Viking orbits. The study focuses in particular on the signatures of, and relationships between, various parameters measured by the electric field instrument, such as the parallel and transverse (to B) components of the electric field and the electric field variability. A major goal of the Viking mission was to investigate the occurrence and properties of parallel electric fields and their role in the auroral acceleration process. The results in this paper on the altitude distribution of the electric field variability confirm earlier findings on the distribution of small-scale electric fields and indicate the presence of parallel fields up to about 11,000 km altitude. The directly measured parallel electric field is also investigated in some detail. It is in general directed upward with an average value of 1 mV/m, but depends on, for example, altitude and plasma density. Possible sources of error in the measurement of the parallel field are also considered and accounted for

  4. Statistical properties of highly excited quantum eigenstates of a strongly chaotic system

    International Nuclear Information System (INIS)

    Aurich, R.; Steiner, F.

    1992-06-01

    Statistical properties of highly excited quantal eigenstates are studied for the free motion (geodesic flow) on a compact surface of constant negative curvature (hyperbolic octagon) which represents a strongly chaotic system (K-system). The eigenstates are expanded in a circular-wave basis, and it turns out that the expansion coefficients behave as Gaussian pseudo-random numbers. It is shown that this property leads to a Gaussian amplitude distribution P(ψ) in the semiclassical limit, i.e. the wavefunctions behave as Gaussian random functions. This behaviour, which should hold for chaotic systems in general, is nicely confirmed for eigenstates lying 10,000 states above the ground state, thus probing the semiclassical limit. In addition, the autocorrelation function and the path-correlation function are calculated and compared with a crude semiclassical Bessel-function approximation. Agreement with the semiclassical prediction is only found if a local averaging is performed over roughly 1000 de Broglie wavelengths. On smaller scales, the eigenstates show much more structure than predicted by the first semiclassical approximation. (orig.)

  5. Statistical list-mode image reconstruction for the high resolution research tomograph

    International Nuclear Information System (INIS)

    Rahmim, A; Lenox, M; Reader, A J; Michel, C; Burbar, Z; Ruth, T J; Sossi, V

    2004-01-01

    We have investigated statistical list-mode reconstruction applicable to a depth-encoding high resolution research tomograph. An image non-negativity constraint has been employed in the reconstructions and is shown to effectively remove the overestimation bias introduced by the sinogram non-negativity constraint. We have furthermore implemented a convergent subsetized (CS) list-mode reconstruction algorithm, based on previous work (Hsiao et al 2002 Conf. Rec. SPIE Med. Imaging 4684 10-19; Hsiao et al 2002 Conf. Rec. IEEE Int. Symp. Biomed. Imaging 409-12) on convergent histogram OSEM reconstruction. We have demonstrated that the first step of the convergent algorithm is exactly equivalent (unlike the histogram-mode case) to the regular subsetized list-mode EM algorithm, while the second and final step takes the form of additive updates in image space. We have shown that in terms of contrast, noise as well as FWHM width behaviour, the CS algorithm is robust and does not result in limit cycles. A hybrid algorithm based on the ordinary and the convergent algorithms is also proposed, and is shown to combine the advantages of the two algorithms (i.e. it is able to reach a higher image quality in fewer iterations while maintaining the convergent behaviour), making the hybrid approach a good alternative to the ordinary subsetized list-mode EM algorithm
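The multiplicative EM update at the heart of such list-mode reconstruction can be sketched in a few lines; image non-negativity is preserved automatically because the image is only ever multiplied by non-negative ratios. The tiny system below is invented for illustration and has none of the tomograph's geometry, subsetization or convergence machinery:

```python
import numpy as np

def listmode_em(events, A, x0, n_iter=10):
    """Plain (single-subset) list-mode EM.
    events: detected LOR index per event; A: system matrix (LORs x voxels);
    x0: non-negative initial image."""
    sens = A.sum(axis=0)                 # per-voxel sensitivity
    x = x0.astype(float).copy()
    for _ in range(n_iter):
        back = np.zeros_like(x)
        for i in events:                 # loop over individual events
            fproj = A[i] @ x             # forward-project current image
            if fproj > 0:
                back += A[i] / fproj     # backproject the data/model ratio
        x = x / sens * back              # multiplicative update: stays >= 0
    return x

# Trivial 2-voxel, 2-LOR system: counts [2, 1] are recovered exactly.
A = np.eye(2)
img = listmode_em(events=[0, 0, 1], A=A, x0=np.ones(2))
```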

  6. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, H.; Waara, M.; Arvelius, S.; Yamauchi, M.; Lundin, R. [Inst. of Space Physics, Kiruna (Sweden); Marghitu, O. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Inst. for Space Sciences, Bucharest (Romania); Bouhram, M. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); CETP-CNRS, Saint-Maur (France); Hobara, Y. [Inst. of Space Physics, Kiruna (Sweden); Univ. of Sheffield, Sheffield (United Kingdom); Reme, H.; Sauvaud, J.A.; Dandouras, I. [Centre d' Etude Spatiale des Rayonnements, Toulouse (France); Balogh, A. [Imperial Coll. of Science, Technology and Medicine, London (United Kingdom); Kistler, L.M. [Univ. of New Hampshire, Durham (United States); Klecker, B. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Carlson, C.W. [Space Science Lab., Univ. of California, Berkeley (United States); Bavassano-Cattaneo, M.B. [Ist. di Fisica dello Spazio Interplanetario, Roma (Italy); Korth, A. [Max-Planck-Inst. fuer Sonnensystemforschung, Katlenburg-Lindau (Germany)

    2006-07-01

    The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap are reported. Moment data for both hydrogen ions (H{sup +}) and oxygen ions (O{sup +}) from 3 years (2001-2003) of spring orbits (January to May) have been used. The altitudes covered were mainly in the range 5-12 R{sub E} geocentric distance. It was found that O{sup +} is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O{sup +} parallel bulk velocity increases with altitude, in particular for the lowest observed altitude intervals. O{sup +} parallel bulk velocities in excess of 60 km s{sup -1} were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O{sup +} the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocity of the H{sup +} and O{sup +} was found to typically be close to the same throughout the observation interval when the H{sup +} bulk velocity was calculated for all pitch-angles. When the H{sup +} bulk velocity was calculated for upward moving particles only, the H{sup +} parallel bulk velocity was typically higher than that of O{sup +}. The parallel bulk velocity is close to the same for a wide range of relative abundances of the two ion species, including when the O{sup +} ions dominate. The thermal velocity of O{sup +} was always well below that of H{sup +}. Thus perpendicular energization that is more effective for O{sup +} takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur.
The results presented constrain the models of perpendicular heating and parallel

  8. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Directory of Open Access Journals (Sweden)

    H. Nilsson

    2006-05-01

    Full Text Available The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap is reported. Moment data for both hydrogen ions (H+ and oxygen ions (O+ from 3 years (2001-2003 of spring orbits (January to May have been used. The altitudes covered were mainly in the range 5–12 RE geocentric distance. It was found that O+ is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O+ parallel bulk velocity increases with altitude in particular for the lowest observed altitude intervals. O+ parallel bulk velocities in excess of 60 km s-1 were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O+ the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocity of the H+ and O+ was found to typically be close to the same throughout the observation interval when the H+ bulk velocity was calculated for all pitch-angles. When the H+ bulk velocity was calculated for upward moving particles only the H+ parallel bulk velocity was typically higher than that of O+. The parallel bulk velocity is close to the same for a wide range of relative abundance of the two ion species, including when the O+ ions dominates. The thermal velocity of O+ was always well below that of H+. Thus perpendicular energization that is more effective for O+ takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur. The results presented constrain the models of perpendicular heating and parallel acceleration. In particular centrifugal acceleration of the outflowing ions, which may

  9. Task-based statistical image reconstruction for high-quality cone-beam CT

    Science.gov (United States)

    Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.

    2017-11-01

    Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms that encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR, viz. penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved a uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability by up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a
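In the CBCT literature the local detectability index d′ is computed from task-specific templates via the local MTF and noise-power spectrum; underneath sits the familiar separation-over-spread definition, which can be sketched numerically (the Gaussian test-statistic samples below are invented for illustration):

```python
import numpy as np

def detectability_index(mu_sig, mu_bkg, var_sig, var_bkg):
    """Separation of an observer test statistic between signal-present and
    signal-absent cases, normalized by its average spread."""
    return abs(mu_sig - mu_bkg) / np.sqrt(0.5 * (var_sig + var_bkg))

# Empirical version from simulated test-statistic samples:
rng = np.random.default_rng(5)
t_bkg = rng.normal(0.0, 1.0, 50_000)   # signal absent
t_sig = rng.normal(2.0, 1.0, 50_000)   # signal present
d_prime = detectability_index(t_sig.mean(), t_bkg.mean(),
                              t_sig.var(), t_bkg.var())
```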

  10. A statistical study towards high-mass BGPS clumps with the MALT90 survey

    Science.gov (United States)

    Liu, Xiao-Lan; Xu, Jin-Long; Ning, Chang-Chun; Zhang, Chuan-Peng; Liu, Xiao-Tao

    2018-01-01

    In this work, we perform a statistical investigation towards 50 high-mass clumps using data from the Bolocam Galactic Plane Survey (BGPS) and the Millimetre Astronomy Legacy Team 90-GHz survey (MALT90). Eleven dense molecular lines (N2H+(1-0), HNC(1-0), HCO+(1-0), HCN(1-0), HN13C(1-0), H13CO+(1-0), C2H(1-0), HC3N(10-9), SiO(2-1), 13CS(2-1) and HNCO(4(0,4)-3(0,3))) are detected. N2H+ and HNC are shown to be good tracers of clumps in various evolutionary stages, since they are detected in all the fields. The detection rates of N-bearing molecules decrease as the clumps evolve, but those of O-bearing species increase with evolution. Furthermore, the abundance ratios [N2H+]/[HCO+] and log([HC3N]/[HCO+]) decline with log([HCO+]) as two linear functions, respectively. This suggests that N2H+ and HC3N transform to HCO+ as the clumps evolve. We also find that C2H is the most abundant molecule, with an abundance of order 10^-8. In addition, three new infall candidates, G010.214-00.324, G011.121-00.128 and G012.215-00.118(a), are discovered to have large-scale infall motions, with infall rates of order 10^-3 M⊙ yr^-1.

  11. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and the best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  12. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  13. High frequency statistical energy analysis applied to fluid filled pipe systems

    NARCIS (Netherlands)

    Beek, P.J.G. van; Smeulers, J.P.M.

    2013-01-01

    In pipe systems, carrying gas with high velocities, broadband turbulent pulsations can be generated causing strong vibrations and fatigue failure, called Acoustic Fatigue. This occurs at valves with high pressure differences (i.e. chokes), relief valves and obstructions in the flow, such as sharp

  14. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    Science.gov (United States)

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  15. A Statist Political Economy and High Demand for Education in South Korea

    Directory of Open Access Journals (Sweden)

    Ki Su Kim

    1999-06-01

    Full Text Available In the 1998 academic year, 84 percent of South Korea's high school "leavers" entered a university or college, while almost all children went on to high school. That is to say, South Korea is now moving into a new age of universal higher education. Even so, competition for university entrance remains intense. What is interesting here is South Koreans' unusually high demand for education. In this article, I criticize the existing cultural and socio-economic interpretations of the phenomenon. Instead, I explore a new interpretation by critically referring to the recent political economy debate on South Korea's state-society/market relationship. In my interpretation, the unusually high demand for education is largely due to the powerful South Korean state's losing flexibility in the management of its "developmental" policies. For this, I blame the traditional "personalist ethic" which still prevails as the

  16. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  17. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

    Statistical linear time... beam. We can approximate one of the sinc factors using the Dirichlet kernel to facilitate computation of the integral in (6) as follows: |sinc(WB... plotted in Figure 4. The resultant autocorrelation can then be found by substituting (18) into (28). The Python code used to generate Figures 1-4 is found

  18. Estimating annual high-flow statistics and monthly and seasonal low-flow statistics for ungaged sites on streams in Alaska and conterminous basins in Canada

    Science.gov (United States)

    Wiley, Jeffrey B.; Curran, Janet H.

    2003-01-01

    Methods for estimating daily mean flow-duration statistics for seven regions in Alaska and low-flow frequencies for one region, southeastern Alaska, were developed from daily mean discharges for streamflow-gaging stations in Alaska and conterminous basins in Canada. The 15-, 10-, 9-, 8-, 7-, 6-, 5-, 4-, 3-, 2-, and 1-percent duration flows were computed for the October-through-September water year for 222 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the individual months of July, August, and September for 226 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the season July-through-September for 65 stations in southeastern Alaska. The 7-day, 10-year and 7-day, 2-year low-flow frequencies for the season July-through-September were computed for 65 stations for most of southeastern Alaska. Low-flow analyses were limited to particular months or seasons in order to omit winter low flows, when ice effects reduce the quality of the records and validity of statistical assumptions. Regression equations for estimating the selected high-flow and low-flow statistics for the selected months and seasons for ungaged sites were developed from an ordinary-least-squares regression model using basin characteristics as independent variables. Drainage area and precipitation were significant explanatory variables for high flows, and drainage area, precipitation, mean basin elevation, and area of glaciers were significant explanatory variables for low flows. The estimating equations can be used at ungaged sites in Alaska and conterminous basins in Canada where streamflow regulation, streamflow diversion, urbanization, and natural damming and releasing of water do not affect the streamflow data for the given month or season. 
Standard errors of estimate ranged from 15 to 56 percent for high-duration flow
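The duration and low-flow statistics named in the record reduce to simple operations on the daily mean discharge series; a sketch with a synthetic record (the percentile and moving-window conventions are the standard definitions, the data are made up):

```python
import numpy as np

def duration_flow(daily_q, p_exceed):
    """Flow equalled or exceeded p_exceed percent of the time, i.e. the
    (100 - p_exceed)th percentile of the daily mean discharge record."""
    return np.percentile(daily_q, 100.0 - p_exceed)

def seven_day_low(daily_q):
    """Minimum 7-day mean flow; combined with a frequency analysis over many
    years this yields statistics such as the 7-day, 10-year low flow."""
    return np.convolve(daily_q, np.ones(7) / 7.0, mode="valid").min()

daily_q = np.arange(1.0, 101.0)       # synthetic 100-day record
q10 = duration_flow(daily_q, 10)      # flow exceeded 10 percent of the time
q7min = seven_day_low(daily_q)        # lowest 7-day mean in the record
```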

  19. The Nonlinear Statistics of High-contrast Patches in Natural Images

    DEFF Research Database (Denmark)

    Lee, Ann; Pedersen, Kim Steenstrup; Mumford, David

    2003-01-01

    described. In this study, we explore the space of data points representing the values of 3 × 3 high-contrast patches from optical and 3D range images. We find that the distribution of data is extremely sparse with the majority of the data points concentrated in clusters and non-linear low...

  20. High blood levels of persistent organic pollutants are statistically correlated with smoking

    DEFF Research Database (Denmark)

    Deutch, Bente; Hansen, Jens C.

    1999-01-01

    , smoking and intake of traditional Inuit food. Multiple linear regression analyses showed highly significant positive associations between the mothers' smoking status (never, previous, present) and plasma concentrations of all the studied organic pollutants both in maternal blood and umbilical cord blood...

  1. Gauge invariant lattice quantum field theory: Implications for statistical properties of high frequency financial markets

    Science.gov (United States)

    Dupoyet, B.; Fiebig, H. R.; Musgrove, D. P.

    2010-01-01

    We report on initial studies of a quantum field theory defined on a lattice with multi-ladder geometry and the dilation group as a local gauge symmetry. The model is relevant in the cross-disciplinary area of econophysics. A corresponding proposal by Ilinski aimed at gauge modeling in non-equilibrium pricing is implemented in a numerical simulation. We arrive at a probability distribution of relative gains which matches the high frequency historical data of the NASDAQ stock exchange index.

  2. On understanding crosstalk in the face of small, quantized, signals highly smeared by Poisson statistics

    International Nuclear Information System (INIS)

    Lincoln, D.; Hsieh, F.; Li, H.

    1995-01-01

    As detectors become smaller and more densely packed, signals become smaller and crosstalk between adjacent channels generally increases. Since it is often appropriate to use the distribution of signals in adjacent channels to make a useful measurement, it is imperative that inter-channel crosstalk be well understood. In this paper we shall describe the manner in which Poissonian fluctuations can give counter-intuitive results and offer some methods for extracting the desired information from the highly smeared, observed distributions. (orig.)
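The counter-intuitive behaviour the record describes is easy to reproduce: when the signal is a small Poisson-distributed number of quanta, the per-event crosstalk ratio is dominated by quantization (most events show exactly zero leaked quanta) even though the ensemble average recovers the true fraction. A self-contained toy simulation (the mean signal and crosstalk fraction are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

mu, xtalk = 5.0, 0.05              # mean quanta per event; true crosstalk fraction
n = rng.poisson(mu, 100_000)       # quantized signal in the hit channel
leaked = rng.binomial(n, xtalk)    # each quantum leaks independently

# Per-event ratios are wildly smeared: most are exactly zero ...
ratios = leaked[n > 0] / n[n > 0]
frac_zero = (ratios == 0).mean()

# ... yet the ensemble estimate recovers the true crosstalk fraction.
estimate = leaked.sum() / n.sum()
```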

  3. Number projected statistics and the pairing correlations at high excitation energies

    International Nuclear Information System (INIS)

    Esebbag, C.; Egido, J.L.

    1993-01-01

    We analyze the use of particle-number projected statistics (PNPS) as an effective way to include the quantum and statistical fluctuations, associated with the pairing degree of freedom, that are left out in finite-temperature mean-field theories. As a numerical application, the exactly solvable degenerate model is worked out. In particular, we find that the sharp temperature-induced superfluid-normal phase transition predicted in the mean-field approximations is washed out in the PNPS. Some approximations, as well as the Landau prescription for including statistical fluctuations, are also discussed. We find that the Landau prescription provides a reasonable approximation to the PNPS. (orig.)

  4. A New Statistical Approach to Characterize Chemical-Elicited Behavioral Effects in High-Throughput Studies Using Zebrafish.

    Directory of Open Access Journals (Sweden)

    Guozhu Zhang

    Full Text Available Zebrafish have become an important alternative model for characterizing chemical bioactivity, partly due to the efficiency at which systematic, high-dimensional data can be generated. However, these new data present analytical challenges associated with scale and diversity. We developed a novel, robust statistical approach to characterize chemical-elicited effects in behavioral data from high-throughput screening (HTS) of all 1,060 Toxicity Forecaster (ToxCast™) chemicals across 5 concentrations at 120 hours post-fertilization (hpf). Taking advantage of the immense scale of data for a global view, we show that this new approach reduces bias introduced by extreme values yet allows for diverse response patterns that confound the application of traditional statistics. We have also shown that, as a summary measure of response for local tests of chemical-associated behavioral effects, it achieves a significant reduction in coefficient of variation compared to many traditional statistical modeling methods. This effective increase in signal-to-noise ratio augments statistical power and is observed across experimental periods (light/dark conditions) that display varied distributional response patterns. Finally, we integrated results with data from concomitant developmental endpoint measurements to show that appropriate statistical handling of HTS behavioral data can add important biological context that informs mechanistic hypotheses.

  5. Statistical Identification of Composed Visual Features Indicating High Likelihood of Grasp Success

    DEFF Research Database (Denmark)

    Thomsen, Mikkel Tang; Bodenhagen, Leon; Krüger, Norbert

    2013-01-01

    configurations of three 3D surface features that predict grasping actions with a high success probability. The strategy is based on first computing spatial relations between visual entities and secondly, exploring the cross-space of this relational feature space and grasping actions. The data foundation...... for identifying such indicative feature constellations is generated in a simulated environment wherein visual features are extracted and a large number of grasping actions are evaluated through dynamic simulation. Based on the identified feature constellations, we validate by applying the acquired knowledge...

  6. Application of the non-extensive statistical approach to high energy particle collisions

    Science.gov (United States)

    Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Ürmössy, Károly

    2017-06-01

    In high-energy collisions the number of created particles is far less than the thermodynamic limit, especially in small colliding systems (e.g. proton-proton). Therefore final-state effects and fluctuations in the one-particle energy distribution are appreciable. As a consequence, the characterization of identified hadron spectra with the Boltzmann-Gibbs thermodynamical approach is insufficient [1]. Instead, particle spectra measured in high-energy collisions can be described very well with Tsallis-Pareto distributions, derived from non-extensive thermodynamics [2, 3]. Using the Tsallis q-entropy formula, a generalization of the Boltzmann-Gibbs entropy, we interpret the microscopic physics by analysing the Tsallis q and T parameters. In this paper we give a quick overview of these parameters, analyzing identified hadron spectra from recent years in a wide center-of-mass energy range. We demonstrate that the fitted Tsallis parameters depend on the center-of-mass energy and the particle species. Our findings are described well by a QCD-inspired evolution ansatz. Based on this comprehensive study, apart from the evolution, both the mesonic and the baryonic components are found to be non-extensive (q > 1), besides the mass-ordered hierarchy observed in the parameter T.
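A minimal sketch of fitting a Tsallis-Pareto form to a transverse-momentum spectrum (one common convention for the distribution; the normalization, the pion mass, and the synthetic "measured" data below are illustrative assumptions, not the analysis of the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

M_PION = 0.1396  # GeV, charged-pion mass (assumed particle species)

def tsallis(pt, C, q, T):
    """Tsallis-Pareto spectrum: C * (1 + (q-1) m_T / T)^(-1/(q-1))."""
    mt = np.sqrt(pt**2 + M_PION**2)
    return C * (1.0 + (q - 1.0) * mt / T) ** (-1.0 / (q - 1.0))

def log_tsallis(pt, logC, q, T):
    """Log of the spectrum; fitting in log space weights all pT bins equally."""
    mt = np.sqrt(pt**2 + M_PION**2)
    return logC - np.log1p((q - 1.0) * mt / T) / (q - 1.0)

# Synthetic spectrum with LHC-like parameters and 2% point-to-point scatter.
pt = np.linspace(0.5, 8.0, 40)
rng = np.random.default_rng(1)
data = tsallis(pt, 1.0, 1.1, 0.12) * rng.normal(1.0, 0.02, pt.size)

popt, _ = curve_fit(log_tsallis, pt, np.log(data), p0=(0.0, 1.05, 0.15))
logC_fit, q_fit, T_fit = popt
print(q_fit, T_fit)
```

The fitted q > 1 quantifies the deviation from the Boltzmann-Gibbs exponential, and T plays the role of an effective temperature; scanning such fits across collision energies and species is the kind of parameter study the abstract describes.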

  7. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm^-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  8. Statistical study of overvoltages by maneuvering in switches in high voltage using EMTP-RV

    International Nuclear Information System (INIS)

    Dominguez Herrera, Diego Armando

    2013-01-01

    The transient overvoltages produced by switch maneuvering are studied statistically, by varying the sequential closing times of switches in networks above 230 kV. The study is performed over typical time delays and standard-deviation ranges, using the tool EMTP-RV (ElectroMagnetic Transient Program, Restructured Version). A conceptual framework is developed for the electromagnetic transients produced by maneuvering three-phase switches installed at nominal voltages higher than 230 kV. The methodology established for statistical studies of overvoltages due to switch maneuvering is reviewed and evaluated by simulating two fictitious cases in EMTP-RV [es

  9. Geo-statistical model of Rainfall erosivity by using high temporal resolution precipitation data in Europe

    Science.gov (United States)

    Panagos, Panos; Ballabio, Cristiano; Borrelli, Pasquale; Meusburger, Katrin; Alewell, Christine

    2015-04-01

    Rainfall erosivity (R-factor) is among the 6 input factors in estimating soil erosion risk with the empirical Revised Universal Soil Loss Equation (RUSLE). The R-factor is a driving force for soil erosion modelling and can potentially be used in flood risk assessments, landslide susceptibility, post-fire damage assessment, application of agricultural management practices and climate change modelling. Rainfall erosivity is extremely difficult to model at large scale (national, European) due to the lack of high temporal resolution precipitation data covering long time series. In most cases, the R-factor is estimated with empirical equations which take into account precipitation volume. The Rainfall Erosivity Database on the European Scale (REDES) is the output of an extensive collection of high resolution precipitation data in the 28 Member States of the European Union plus Switzerland, carried out during 2013-2014 in collaboration with national meteorological/environmental services. Due to the different temporal resolutions of the data (5, 10, 15, 30, 60 minutes), conversion equations have been applied to homogenise the database at a 30-minute interval. The 1,541 stations included in REDES have been interpolated using a Gaussian Process Regression (GPR) model, using as covariates climatic data (monthly precipitation, monthly temperature, wettest/driest month) from the WorldClim database, a Digital Elevation Model and latitude/longitude. GPR was selected among the candidate models (GAM, Regression Kriging) due to its best performance both in cross validation (R2=0.63) and in fitting the dataset (R2=0.72). The highest uncertainty was found in north-western Scotland, northern Sweden and Finland due to the limited number of stations in REDES. In highlands such as the Alpine arch and the Pyrenees, the diversity of environmental features likewise led to relatively high uncertainty.
The rainfall erosivity map of Europe available at 500m resolution plus the standard error
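The interpolation step described above can be sketched with a minimal Gaussian Process Regression in plain NumPy (an RBF kernel on a 1D toy field; the real REDES model regresses on WorldClim covariates, elevation and coordinates, which are not reproduced here):

```python
import numpy as np

def gpr_predict(X, y, Xs, length=1.0, sig2=1.0, noise=1e-6):
    """Minimal GPR with an RBF kernel: posterior mean at test points Xs,
    given training inputs X and targets y. A toy stand-in for the
    covariate-based GPR used for REDES."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return sig2 * np.exp(-0.5 * d2 / length**2)
    K = k(X, X) + noise * np.eye(len(X))   # noisy training covariance
    alpha = np.linalg.solve(K, y)          # K^-1 y
    return k(Xs, X) @ alpha                # posterior mean

# 1D toy: 30 "stations" sampling a smooth erosivity-like field sin(x).
rng = np.random.default_rng(2)
X = rng.uniform(0, 10, 30)[:, None]
y = np.sin(X[:, 0]) + 0.01 * rng.normal(size=30)
Xs = np.linspace(0, 10, 5)[:, None]
pred = gpr_predict(X, y, Xs, length=1.5, noise=1e-2)
print(pred)
```

The posterior mean interpolates the station values and reverts toward the prior away from data, which is also why sparsely instrumented regions (northern Scandinavia in the abstract) carry the largest uncertainty.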

  10. The Statistics and Mathematics of High Dimension Low Sample Size Asymptotics.

    Science.gov (United States)

    Shen, Dan; Shen, Haipeng; Zhu, Hongtu; Marron, J S

    2016-10-01

    The aim of this paper is to establish several deep theoretical properties of principal component analysis for multiple-component spike covariance models. Our new results reveal an asymptotic conical structure in critical sample eigendirections under the spike models with distinguishable (or indistinguishable) eigenvalues, when the sample size and/or the number of variables (or dimension) tend to infinity. The consistency of the sample eigenvectors relative to their population counterparts is determined by the ratio between the dimension and the product of the sample size with the spike size. When this ratio converges to a nonzero constant, the sample eigenvector converges to a cone, with a certain angle to its corresponding population eigenvector. In the High Dimension, Low Sample Size case, the angle between the sample eigenvector and its population counterpart converges to a limiting distribution. Several generalizations of the multi-spike covariance models are also explored, and additional theoretical results are presented.
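The conical-structure result can be seen numerically in a single-spike model: with population covariance diag(spike, 1, ..., 1), the angle between the leading sample eigenvector and the population spike direction is governed by the ratio of the dimension to (sample size × spike size). A small sketch with illustrative parameter choices (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def leading_angle(d, n, spike):
    """Angle (degrees) between the leading sample eigenvector and the
    true spike direction e1, for covariance diag(spike, 1, ..., 1)."""
    X = rng.normal(size=(n, d))
    X[:, 0] *= np.sqrt(spike)          # inject the spike along e1
    S = X.T @ X / n                    # sample covariance (zero-mean data)
    _, V = np.linalg.eigh(S)
    v = V[:, -1]                       # leading sample eigenvector
    return np.degrees(np.arccos(min(1.0, abs(v[0]))))

# Small d/(n*spike): the sample eigenvector is nearly consistent.
small_ratio = leading_angle(d=50, n=100, spike=200.0)    # ratio ~ 0.0025
# Large d/(n*spike): it converges to a cone at a substantial angle.
large_ratio = leading_angle(d=2000, n=20, spike=30.0)    # ratio ~ 3.3
print(small_ratio, large_ratio)
```

When the ratio is small the angle is a few degrees; when it is order one or larger the sample eigenvector sits on a wide cone around its population counterpart, mirroring the consistency criterion stated in the abstract.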

  11. High blood levels of persistent organic pollutants are statistically correlated with smoking

    DEFF Research Database (Denmark)

    Deutch, Bente; Hansen, Jens C.

    1999-01-01

    Persistent Organic Pollutants (11 pesticides and 14 PCB congeners) and heavy metals (Cd, Cu, Hg, Pb, Se, and Zn) were determined in 175 pregnant women and 160 newborn infants (umbilical cord blood) from Disko Bay, Greenland, 1994-96. Among these, 135 women filled out questionnaires about drinking, smoking and intake of traditional Inuit food. Multiple linear regression analyses showed highly significant positive associations between the mothers' smoking status (never, previous, present) and plasma concentrations of all the studied organic pollutants, both in maternal blood and umbilical cord blood. Traditional food, and not the tobacco, is known to be the source of the contaminants, but smoking may influence the enzymatic turnover of toxic substances.

  12. On Statistical Modeling of Sequencing Noise in High Depth Data to Assess Tumor Evolution

    Science.gov (United States)

    Rabadan, Raul; Bhanot, Gyan; Marsilio, Sonia; Chiorazzi, Nicholas; Pasqualucci, Laura; Khiabanian, Hossein

    2017-12-01

    One cause of cancer mortality is tumor evolution to therapy-resistant disease. First line therapy often targets the dominant clone, and drug resistance can emerge from preexisting clones that gain fitness through therapy-induced natural selection. Such mutations may be identified using targeted sequencing assays by analysis of noise in high-depth data. Here, we develop a comprehensive, unbiased model for sequencing error background. We find that noise in sufficiently deep DNA sequencing data can be approximated by aggregating negative binomial distributions. Mutations with frequencies above noise may have prognostic value. We evaluate our model with simulated exponentially expanded populations as well as data from cell line and patient sample dilution experiments, demonstrating its utility in prognosticating tumor progression. Our results may have the potential to identify significant mutations that can cause recurrence. These results are relevant in the pretreatment clinical setting to determine appropriate therapy and prepare for potential recurrence pretreatment.
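The flavour of such a noise model can be sketched with a single negative binomial background (the paper aggregates several such distributions; the error rate and dispersion below are hypothetical values for illustration):

```python
import numpy as np
from scipy.stats import nbinom

def nb_pvalue(observed, depth, err_rate=1e-3, r=5.0):
    """P(count >= observed) under a negative binomial sequencing-noise
    model with mean depth*err_rate and dispersion r. SciPy parameterises
    the NB by (n, p) with mean n*(1-p)/p, so p = r/(r + mean)."""
    mean = depth * err_rate
    p = r / (r + mean)
    return nbinom.sf(observed - 1, r, p)

# At 10,000x depth the expected error count is 10 reads: a 40-read
# variant stands far above the noise, while a 12-read one does not.
print(nb_pvalue(40, 10_000), nb_pvalue(12, 10_000))
```

Calls well above the fitted noise background are the candidate low-frequency, preexisting clones; overdispersion (variance above the Poisson mean) is what the negative binomial captures relative to a naive Poisson error model.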

  13. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  14. Polarizing a stored proton beam by spin flip? - A high statistic reanalysis

    International Nuclear Information System (INIS)

    Oellers, Dieter

    2011-01-01

    Prompted by recent, conflicting calculations, we have carried out a measurement of the spin flip cross section in low-energy electron-proton scattering. The experiment uses the cooling electron beam at COSY as an electron target. A reanalysis of the data leads to reduced statistical errors, resulting in an upper limit for the spin flip cross section that is smaller by a factor of 4. The measured cross sections are too small for spin flip to be a viable tool for polarizing a stored beam.

  15. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model, from the N-body point of view, is used in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation for nonzero temperature. The Thomas-Fermi equation is solved with the code ''Golem'' written in Fortran V (Univac), which also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (author) [es

  16. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model, from the N-body point of view, is used in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation for nonzero temperature. The Thomas-Fermi equation is solved with the code GOLEM written in FORTRAN V (UNIVAC), which also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (Author) 24 refs

  17. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By asymptotically tight bound we mean that the high-SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
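The "asymptotically tight" behaviour can be checked by Monte Carlo for Rayleigh fading (a toy numerical check, not the closed-form framework of the paper): the gap between the exact second-order capacity statistic E[C²] and its high-SNR approximation, obtained by dropping the "+1" inside the logarithm, shrinks as the SNR grows.

```python
import numpy as np

rng = np.random.default_rng(3)
g = rng.exponential(1.0, 1_000_000)  # Rayleigh fading: exponential power gains

def capacity_moment(snr_db, n):
    """Exact n-th moment of C = log2(1 + SNR*g), by Monte Carlo."""
    snr = 10 ** (snr_db / 10)
    return np.mean(np.log2(1 + snr * g) ** n)

def asymptotic_moment(snr_db, n):
    """High-SNR approximation: drop the '+1' inside the log."""
    snr = 10 ** (snr_db / 10)
    return np.mean(np.log2(snr * g) ** n)

# The gap between exact and asymptotic second-order statistics
# closes as the SNR grows.
gaps = [abs(capacity_moment(s, 2) - asymptotic_moment(s, 2))
        for s in (10, 20, 30)]
print(gaps)
```

The same numerical comparison can be repeated for other fading laws (e.g. gamma-distributed gains for Nakagami-m) to observe the constant log-domain capacity offsets between environments that the abstract mentions.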

  18. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit, or two-beam, interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in first-order coherence. Different from the first-order case, in high-order optical coherence it is the statistical behavior of the optical phase that plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  19. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics for the verification of simulated Monte Carlo samples, i.e., for checking whether they have the same distribution as data measured in a particle detector. The Kolmogorov-Smirnov, χ2, and Anderson-Darling tests are the techniques most commonly used to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One way of testing homogeneity is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and the power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
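For the unweighted case these homogeneity tests are available off the shelf; a minimal SciPy sketch on synthetic "MC" and "data" samples (the weighted EDF generalizations proposed in the paper are not part of SciPy):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# "Monte Carlo" sample vs "measured data": same underlying distribution.
mc = rng.normal(0.0, 1.0, 5000)
data = rng.normal(0.0, 1.0, 3000)

ks = stats.ks_2samp(mc, data)            # Kolmogorov-Smirnov, two-sample
ad = stats.anderson_ksamp([mc, data])    # Anderson-Darling, k-sample
print(ks.pvalue)

# A slightly shifted sample is flagged as non-homogeneous.
shifted = rng.normal(0.3, 1.0, 3000)
ks_bad = stats.ks_2samp(mc, shifted)
print(ks_bad.pvalue)
```

With homogeneous samples the p-value is uniformly distributed (no evidence against homogeneity), while even a 0.3σ mean shift at these sample sizes is rejected overwhelmingly; this large-sample sensitivity is why careful type-I error and power studies matter.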

  20. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to simultaneously monitor the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers...... in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results performed on a hapten-antibody assay, where complete characterization of the biochemical...

  1. High-statistics measurement of the η →3 π0 decay at the Mainz Microtron

    Science.gov (United States)

    Prakhov, S.; Abt, S.; Achenbach, P.; Adlarson, P.; Afzal, F.; Aguar-Bartolomé, P.; Ahmed, Z.; Ahrens, J.; Annand, J. R. M.; Arends, H. J.; Bantawa, K.; Bashkanov, M.; Beck, R.; Biroth, M.; Borisov, N. S.; Braghieri, A.; Briscoe, W. J.; Cherepnya, S.; Cividini, F.; Collicott, C.; Costanza, S.; Denig, A.; Dieterle, M.; Downie, E. J.; Drexler, P.; Ferretti Bondy, M. I.; Fil'kov, L. V.; Fix, A.; Gardner, S.; Garni, S.; Glazier, D. I.; Gorodnov, I.; Gradl, W.; Gurevich, G. M.; Hamill, C. B.; Heijkenskjöld, L.; Hornidge, D.; Huber, G. M.; Käser, A.; Kashevarov, V. L.; Kay, S.; Keshelashvili, I.; Kondratiev, R.; Korolija, M.; Krusche, B.; Lazarev, A.; Lisin, V.; Livingston, K.; Lutterer, S.; MacGregor, I. J. D.; Manley, D. M.; Martel, P. P.; McGeorge, J. C.; Middleton, D. G.; Miskimen, R.; Mornacchi, E.; Mushkarenkov, A.; Neganov, A.; Neiser, A.; Oberle, M.; Ostrick, M.; Otte, P. B.; Paudyal, D.; Pedroni, P.; Polonski, A.; Ron, G.; Rostomyan, T.; Sarty, A.; Sfienti, C.; Sokhoyan, V.; Spieker, K.; Steffen, O.; Strakovsky, I. I.; Strandberg, B.; Strub, Th.; Supek, I.; Thiel, A.; Thiel, M.; Thomas, A.; Unverzagt, M.; Usov, Yu. A.; Wagner, S.; Walford, N. K.; Watts, D. P.; Werthmüller, D.; Wettig, J.; Witthauer, L.; Wolfes, M.; Zana, L. A.; A2 Collaboration at MAMI

    2018-06-01

    The largest statistics to date, 7 × 10^6 η → 3π^0 decays, based on 6.2 × 10^7 η mesons produced in the γp → ηp reaction, has been accumulated by the A2 Collaboration at the Mainz Microtron, MAMI. It allowed a detailed study of the η → 3π^0 dynamics beyond its conventional parametrization with just the quadratic slope parameter α and enabled, for the first time, a measurement of the second-order term and a better understanding of the cusp structure in the neutral decay. The present data are also compared to recent theoretical calculations that predict a nonlinear dependence along the quadratic distance from the Dalitz-plot center.

  2. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    Science.gov (United States)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus contain large uncertainty due to the limitations of each data set. In this study, to quantify the 2015 amount of open biomass burning, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations, and considered three sub-category sources that use different activity data. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using the fire radiative power (FRP), considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto 3 km × 3 km grids to obtain a high-resolution emission inventory. Our results showed that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The developed high-resolution open biomass burning emission inventory can support air quality modeling and policy-making for pollution control.

  3. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    Science.gov (United States)

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
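The censoring step itself is simple to sketch (a simplified framewise-displacement criterion in the spirit of "scrubbing"; the 0.5 mm threshold and the 50 mm head-radius conversion for rotations are common choices in the literature, not necessarily those of the paper):

```python
import numpy as np

def censor_volumes(motion_params, threshold=0.5):
    """Framewise displacement (FD) from 6 realignment parameters per
    volume (3 translations in mm, 3 rotations in radians, projected onto
    a 50 mm sphere). Volumes with FD above `threshold` mm are withheld
    from GLM estimation."""
    deriv = np.abs(np.diff(motion_params, axis=0))
    deriv[:, 3:] *= 50.0                        # rotations -> arc length in mm
    fd = np.concatenate([[0.0], deriv.sum(axis=1)])
    return fd < threshold                       # True = volume kept

# Toy run: 8 volumes, one sudden 1 mm head jerk at volume 4.
params = np.zeros((8, 6))
params[4:, 0] = 1.0
keep = censor_volumes(params)
print(keep)
```

In practice the boolean mask is used to drop the flagged rows from the design matrix and data before fitting the GLM, rather than regressing motion out.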

  4. A Profile of Romanian Highly Educated Eco-Consumers Interested in Product Recycling A Statistical Approach

    Directory of Open Access Journals (Sweden)

    Simionescu Mihaela

    2014-07-01

    Full Text Available The objective of this research is to create a profile of the Romanian eco-consumer with a university education. The profile is not limited to information regarding the environmental and economic benefits of recycling, but focuses on ecological behaviour. A detailed statistical analysis was carried out on a large representative sample of respondents with secondary and university education. The tendency toward practical eco-behaviour is indeed more pronounced among people with a university education. For people over 30, the chance of being aware of the significance of the recycling symbols on packages decreases, with the lowest chance among people aged over 50. Respondents interested in environmental protection buy products with ecological symbols. However, those who already know the meaning of these symbols do not buy this type of product for ecological reasons, even if they are interested in environmental protection. This research also offers an extensive description of its results, giving respondents an opportunity to learn more about the meaning of the recycling symbols, and those results provide a guideline for consumers. The study thus achieves two main goals: an ecological one (eco-consumers were identified and ordinary consumers were drawn toward ecological behaviour) and an economic one (resource allocation will be more efficient and marketers will be able to address eco-consumers with specific characteristics).

  5. Statistical fission parameters for nuclei at high excitation and angular momenta

    International Nuclear Information System (INIS)

    Blann, M.; Komoto, T.A.

    1982-01-01

    Experimental fusion/fission excitation functions are analyzed by the statistical model with modified rotating liquid drop model barriers and with single particle level densities modeled for deformation for ground state (a_ν) and saddle point nuclei (a_f). Values are estimated for the errors in rotating liquid drop model barriers for the different systems analyzed. These results are found to correlate well with the trends predicted by the finite range model of Krappe, Nix, and Sierk, although the discrepancies seem to be approximately 1 MeV greater than the finite range model predictions over the limited range tested. The a priori values calculated for a_f and a_ν are within ±2% of optimum free parameter values. Analyses for barrier decrements explore the importance of collective enhancement on level densities and of nuclear deformation in calculating transmission coefficients. A calculation is performed for the 97Rh nucleus for which a first order angular momentum scaling is used for the J = 0 finite range corrections. An excellent fit is found for the fission excitation function in this approach. Results are compared in which rotating liquid drop model barriers are decremented by a constant energy, or alternatively multiplied by a constant factor. Either parametrization is shown to be capable of satisfactorily reproducing the data although their J = 0 extrapolated values differ markedly from one another. This underscores the dangers inherent in arbitrary barrier extrapolations

  6. Statistical physics of fracture: scientific discovery through high-performance computing

    International Nuclear Information System (INIS)

    Kumar, Phani; Nukala, V V; Simunovic, Srdan; Mills, Richard T

    2006-01-01

    The paper presents the state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, simulations using the traditional algorithms were limited to small system sizes. We have developed two algorithms: a multiple sparse Cholesky downdating scheme for simulating 2D random fuse model systems, and a block-circulant preconditioner for simulating 3D random fuse model systems. Using these algorithms, we were able to simulate fracture of the largest ever lattice system sizes (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of Cray-XT3 and IBM Blue-Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, which is a significant computational achievement. These largest ever numerical simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on the mean fracture strength, and finally the scaling of crack surface roughness

  7. High Resolution 3D Experimental Investigation of Flow Structures and Turbulence Statistics in the Viscous and Buffer Layer

    Science.gov (United States)

    Sheng, Jian; Malkiel, Edwin; Katz, Joseph

    2006-11-01

    Digital Holographic Microscopy is implemented to perform 3D velocity measurement in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Reτ=1,400. The measurements are performed at a resolution of ˜1μm over a sample volume of 1.5x2x1.5mm (x^+=50, y^+=60, z^+=50), sufficient for resolving buffer layer structures and for measuring the instantaneous wall shear stress distributions from velocity gradients in the sublayer. The data provide detailed statistics on the spatial distribution of both wall shear stress components along with the characteristic flow structures, including streamwise counter-rotating vortex pairs, multiple streamwise vortices, and rare hairpins. Conditional sampling identifies characteristic length scales of 70 wall units in the spanwise and 10 wall units in the wall-normal direction. In regions of high stress, the conditionally averaged flow consists of a stagnation-like sweeping motion induced by a counter-rotating pair of streamwise vortices. Regions with low stress are associated with ejection motion, also generated by pairs of counter-rotating vortices. Statistics on the local strain and the geometric alignment between strain and vorticity show that the high-shear-generating vortices are inclined at 45° to the streamwise direction, indicating that the vortices are being stretched. On-going analysis examines statistics of helicity and strain and the impact of near-wall structures.

  8. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its
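
The trend-plus-residual decomposition of step 2 can be caricatured in a few lines. In this sketch the trend is a one-covariate least-squares regression across units, and the area-to-point kriging of the residual is replaced by uniform spreading within the unit (a deliberate simplification); all numbers are invented. The coherence property — fine-scale cells average back to the reported unit density — survives the simplification:

```python
# per-unit wild boar densities (animals/km^2) and a mean environmental
# covariate per unit (e.g. forest cover, %); all values invented
unit_density = [2.0, 5.0, 3.0]
unit_cov = [10.0, 40.0, 25.0]

# step 2a: trend component via ordinary least squares on the covariate
n = len(unit_cov)
mx = sum(unit_cov) / n
my = sum(unit_density) / n
b = sum((x - mx) * (y - my) for x, y in zip(unit_cov, unit_density)) \
    / sum((x - mx) ** 2 for x in unit_cov)
a = my - b * mx

# step 2b: residual component = unit value minus trend at the unit scale
residual = [d - (a + b * c) for d, c in zip(unit_density, unit_cov)]

# step 2c: evaluate the trend on fine-scale cells inside unit 0 and add the
# unit residual (spread uniformly here, standing in for area-to-point kriging)
cell_cov = [8.0, 12.0, 10.0]
cell_density = [a + b * c + residual[0] for c in cell_cov]
```

Because the residual carries whatever the trend misses, the cell values inside a unit average back to the unit's reported density whenever the cell covariates average to the unit covariate; area-to-point kriging preserves this mass balance while also smoothing residuals across unit borders.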

  9. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    Science.gov (United States)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  10. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (Available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
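
The gPCA statistic is, roughly, the fraction of top-principal-component variance that survives when the component is guided by the batch indicator: values near 1 mean the dominant variation is batch-driven. A small pure-Python sketch on tiny invented data (the published method also adds a permutation test for significance, omitted here):

```python
import math

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def top_eigvec(S, iters=500):
    """Leading eigenvector of a small symmetric matrix by power iteration."""
    v = [1.0] * len(S)
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in range(len(v))) for i in range(len(S))]
        nrm = math.sqrt(sum(x * x for x in w))
        v = [x / nrm for x in w]
    return v

# 6 samples x 3 probes, two batches of 3; batch 1 shifted by ~+3 (invented)
X = [[3.1, 2.9, 3.0], [2.9, 3.1, 3.0], [3.0, 3.0, 3.1],
     [0.1, -0.1, 0.0], [-0.1, 0.1, 0.0], [0.0, 0.0, -0.1]]
Y = [[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]]  # batch indicator

# column-centre X
means = [sum(col) / len(col) for col in zip(*X)]
Xc = [[x - m for x, m in zip(row, means)] for row in X]

def proj_var(v):
    """Sum of squared projections of the centred data onto direction v."""
    return sum(sum(x * vi for x, vi in zip(row, v)) ** 2 for row in Xc)

v_u = top_eigvec(matmul(transpose(Xc), Xc))        # ordinary PCA direction
YtX = matmul(transpose(Y), Xc)
v_g = top_eigvec(matmul(transpose(YtX), YtX))      # batch-guided direction
delta = proj_var(v_g) / proj_var(v_u)              # near 1 => batch-driven
```

With the strong batch shift built into `X`, `delta` comes out close to 1; with randomized batch labels it would drop, which is the intuition the permutation test formalizes.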

  11. A multi-scale and model approach to estimate future tidal high water statistics in the southern German Bight

    Science.gov (United States)

    Hein, H.; Mai, S.; Mayer, B.; Pohlmann, T.; Barjenbruch, U.

    2012-04-01

    The interactions of tides, external surges, storm surges and waves, with an additional role of the coastal bathymetry, define the probability of extreme water levels at the coast. Probabilistic analysis and also process-based numerical models allow the estimation of future states. From the physical point of view, both deterministic processes and stochastic residuals are the fundamentals of high water statistics. This study uses a so-called model chain to reproduce historic statistics of tidal high water levels (Thw) as well as to predict future high water statistics. The results of the numerical models are post-processed by a stochastic analysis. Recent studies show that nonstationary parametric approaches are required for future extrapolation of extreme Thw. With the presented methods a better prediction of time-dependent parameter sets seems possible. The investigation region of this study is the southern German Bight. The model chain is the representation of a downscaling process, which starts with an emissions scenario. Regional atmospheric and ocean models refine the results of global climate models. The concept of downscaling was chosen to resolve the coastal topography sufficiently. The North Sea and estuaries are modeled with the three-dimensional HAMburg Shelf Ocean Model. The simulation period spans 150 years (1950 - 2100). Results of four different hindcast runs and also of one future prediction run are validated. Based on multi-scale analysis and the theory of entropy, we analyze whether any significant periodicities are represented numerically. Results show that even hindcasting the Thw climate of the last 60 years with a model chain is a challenging task. For example, an additional modeling activity must be the inclusion of tides into regional climate ocean models. It is found that the statistics of climate variables derived from model results differ from the statistics derived from measurements. E.g. there are considerable shifts in
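
The post-processing idea — deterministic model output plus a nonstationary parametric extreme-value fit — can be sketched with annual maxima, a linear trend in the location parameter, and a Gumbel fit of the detrended residuals by the method of moments. The water levels below are invented, and the real study's choices (distribution family, covariates, fitting method) may differ:

```python
import math
import statistics

# hypothetical annual maximum tidal high waters (cm above datum), 1961-1970
years = list(range(1961, 1971))
maxima = [612.0, 618.0, 607.0, 625.0, 630.0, 622.0, 635.0, 628.0, 641.0, 638.0]

# 1) nonstationarity: linear trend in the location of the annual maxima
my = statistics.mean(years)
mm = statistics.mean(maxima)
b = sum((y - my) * (m - mm) for y, m in zip(years, maxima)) \
    / sum((y - my) ** 2 for y in years)
a = mm - b * my
resid = [m - (a + b * y) for y, m in zip(years, maxima)]

# 2) Gumbel fit of the detrended maxima by the method of moments
beta = statistics.stdev(resid) * math.sqrt(6) / math.pi
mu = statistics.mean(resid) - 0.5772 * beta  # Euler-Mascheroni constant

def return_level(year, T):
    """Thw level exceeded on average once every T years, in a given year."""
    return (a + b * year) + mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

With a positive trend `b`, the 100-year return level drifts upward over the simulation period, which is exactly the time-dependent parameter behaviour the abstract argues a stationary fit would miss.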

  12. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Zwanziger, P.

    2000-01-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In their factory located in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) is producing disc-type devices with ceramic encapsulation in the high-end range for the world market. These elements have to fulfill special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This task can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes this trade-off. In this paper, the requirements on the irradiation company Mediscan GmbH, from the point of view of the semiconductor manufacturer, are described. The actual strategy for controlling the irradiation results to fulfill these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored using statistical process control (SPC) techniques includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing this process the other way around, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes. (author)
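
The abstract does not spell out which SPC charts are applied; a common choice for one-at-a-time process readings is the Shewhart individuals chart, sketched below on invented dose readings (the 1.128 divisor is the standard d2 constant for moving ranges of two):

```python
# per-batch dose readings (kGy) from routine dosimetry -- invented numbers,
# with one deliberately out-of-control batch
doses = [40.1, 39.8, 40.3, 40.0, 39.9, 40.2, 40.1, 40.0,
         39.8, 40.2, 45.0, 40.1]

# Shewhart individuals chart: sigma estimated from the average moving range
moving_ranges = [abs(y - x) for x, y in zip(doses, doses[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
center = sum(doses) / len(doses)
sigma = mr_bar / 1.128          # d2 constant for subgroups of size 2
ucl = center + 3 * sigma        # upper control limit
lcl = center - 3 * sigma        # lower control limit

out_of_control = [i for i, d in enumerate(doses) if not lcl <= d <= ucl]
```

A point outside the 3-sigma limits (here the 45.0 kGy batch) signals a special cause — e.g. a conveyor-speed or beam-current excursion — rather than ordinary process noise.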

  13. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    Directory of Open Access Journals (Sweden)

    Tamara eMelmer

    2013-04-01

    Full Text Available The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e., the spectral image properties in vertical, horizontal and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art and fine art) were similar across cultures.
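
The paper measures slopes of log-log Fourier power spectra of 2D images; the underlying computation can be shown in 1D. The sketch below builds a signal with an exact 1/f² power spectrum, computes a naive DFT (pure Python, so only suitable for short signals), and recovers the slope of -2 by least squares:

```python
import math

N, KMAX = 64, 16

# synthetic signal whose Fourier amplitudes fall off as 1/k (power ~ 1/k^2)
signal = [sum(math.cos(2 * math.pi * k * t / N) / k
              for k in range(1, KMAX + 1))
          for t in range(N)]

def power(sig, k):
    """Power of the k-th DFT bin via a naive O(N) projection."""
    n = len(sig)
    re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(sig))
    im = sum(-s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(sig))
    return re * re + im * im

ks = range(1, KMAX + 1)
logk = [math.log(k) for k in ks]
logp = [math.log(power(signal, k)) for k in ks]

# least-squares slope of log-power vs log-frequency
mk = sum(logk) / len(logk)
mp = sum(logp) / len(logp)
slope = sum((x - mk) * (y - mp) for x, y in zip(logk, logp)) \
        / sum((x - mk) ** 2 for x in logk)
```

Natural scenes and artworks show such a roughly linear log-log fall-off in 2D as well; for text the fitted slope differs between the low- and high-frequency parts of the spectrum, which is the paper's central observation.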

  14. Statistical correlations for thermophysical properties of Supercritical Argon (SCAR) used in cooling of futuristic High Temperature Superconducting (HTS) cables

    Energy Technology Data Exchange (ETDEWEB)

    Kalsia, Mohit [School of Mechanical Engineering, Lovely Professional University, Phagwara, 144 401 (India); Dondapati, Raja Sekhar, E-mail: drsekhar@ieee.org [School of Mechanical Engineering, Lovely Professional University, Phagwara, 144 401 (India); Usurumarti, Preeti Rao [Department of Mechanical Engineering, PVK Institute of Technology, Anantpur, 515 001 (India)

    2017-05-15

    Highlights: • The developed correlations can be integrated into thermohydraulic analysis of HTS cables. • This work also explains the phenomenon of flow with less pumping power and maximum heat transfer in HTS cables. • Pumping power required to circulate the SCAR for cooling of HTS cables would be significantly lower. • For Hg-based high temperature superconductors (T{sub c} > 134 K), SCAR is found to be a suitable coolant. - Abstract: High Temperature Superconducting (HTS) cables are emerging as an alternative to conventional cables in efficient power transmission. However, these HTS cables require cooling below the critical temperature of the superconductors used to transmit larger currents. With the invention of high temperature superconductors whose critical temperatures are up to 134 K (Hg based), it is a great challenge to identify a suitable coolant which can carry away the heating load on the superconductors. In order to meet this challenge, an attempt has been made in the present work to propose supercritical Argon (SCAR) as the alternative to cool the HTS cables. Further, statistical correlations have been developed for thermophysical properties such as density, viscosity, specific heat and thermal conductivity of SCAR. In addition, the accuracy of the developed correlations is established with the help of a few statistical parameters and validated against the standard database available in the literature. These temperature-dependent, accurate correlations are useful in predicting the pressure drop and heat transfer behaviour in HTS cables using numerical or computational techniques. In recent times, with the sophistication of computer technology, solving various complex transport equations along with turbulence models has become popular, and hence the developed correlations would benefit the technological community. It is observed that, with a decrease in pressure, the density and viscosity decrease, whereas the thermal conductivity and specific

  15. Statistical correlations for thermophysical properties of Supercritical Argon (SCAR) used in cooling of futuristic High Temperature Superconducting (HTS) cables

    International Nuclear Information System (INIS)

    Kalsia, Mohit; Dondapati, Raja Sekhar; Usurumarti, Preeti Rao

    2017-01-01

    Highlights: • The developed correlations can be integrated into thermohydraulic analysis of HTS cables. • This work also explains the phenomenon of flow with less pumping power and maximum heat transfer in HTS cables. • Pumping power required to circulate the SCAR for cooling of HTS cables would be significantly lower. • For Hg-based high temperature superconductors (T_c > 134 K), SCAR is found to be a suitable coolant. - Abstract: High Temperature Superconducting (HTS) cables are emerging as an alternative to conventional cables in efficient power transmission. However, these HTS cables require cooling below the critical temperature of the superconductors used to transmit larger currents. With the invention of high temperature superconductors whose critical temperatures are up to 134 K (Hg based), it is a great challenge to identify a suitable coolant which can carry away the heating load on the superconductors. In order to meet this challenge, an attempt has been made in the present work to propose supercritical Argon (SCAR) as the alternative to cool the HTS cables. Further, statistical correlations have been developed for thermophysical properties such as density, viscosity, specific heat and thermal conductivity of SCAR. In addition, the accuracy of the developed correlations is established with the help of a few statistical parameters and validated against the standard database available in the literature. These temperature-dependent, accurate correlations are useful in predicting the pressure drop and heat transfer behaviour in HTS cables using numerical or computational techniques. In recent times, with the sophistication of computer technology, solving various complex transport equations along with turbulence models has become popular, and hence the developed correlations would benefit the technological community. It is observed that, with a decrease in pressure, the density and viscosity decrease, whereas the thermal conductivity and specific heat

  16. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    Science.gov (United States)

    Melmer, Tamara; Amirshahi, Seyed A.; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2013-01-01

    The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e., the spectral image properties in vertical, horizontal, and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale-invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian, and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art, and fine art) were similar across cultures. PMID:23554592

  17. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    Science.gov (United States)

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positive findings across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated null distributions that included those large clusters, and therefore, by construction, rejected the large clusters as false positives at the nominal
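
The nonparametric route the abstract describes can be reproduced in a toy 1D setting: simulate smooth Gaussian "fields", build the null distribution of the largest suprathreshold cluster, and check the empirical familywise error rate against the nominal 5%. All sizes, smoothing widths, and thresholds below are arbitrary choices for illustration:

```python
import random

random.seed(1)

N = 256      # "voxels" per 1D field
W = 5        # moving-average smoothing width
THR = 2.0    # voxel-level z threshold

def smooth_field():
    """Unit-variance Gaussian noise smoothed by a moving average."""
    noise = [random.gauss(0.0, 1.0) for _ in range(N + W - 1)]
    scale = W ** 0.5  # restores unit variance after averaging
    return [scale * sum(noise[i:i + W]) / W for i in range(N)]

def max_cluster(field):
    """Size of the largest run of contiguous suprathreshold voxels."""
    best = cur = 0
    for v in field:
        cur = cur + 1 if v > THR else 0
        best = max(best, cur)
    return best

# nonparametric null distribution of the maximum cluster extent
null = sorted(max_cluster(smooth_field()) for _ in range(500))
k95 = null[int(0.95 * len(null))]  # cluster-extent threshold at nominal 5%

# empirical familywise error rate on fresh null fields
hits = sum(max_cluster(smooth_field()) > k95 for _ in range(500))
fwer = hits / 500
```

Because the threshold is read off the simulated null distribution itself, rare very large clusters are absorbed into that distribution rather than mistaken for signal, which is the "by construction" control the abstract attributes to nonparametric methods.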

  18. Statistical analysis of the limitation of half integer resonances on the available momentum acceptance of the High Energy Photon Source

    Energy Technology Data Exchange (ETDEWEB)

    Jiao, Yi, E-mail: jiaoyi@ihep.ac.cn; Duan, Zhe

    2017-01-01

    In a diffraction-limited storage ring, half integer resonances can have strong effects on the beam dynamics, associated with the large detuning terms from the strong focusing and strong sextupoles as required for an ultralow emittance. In this study, the limitation of half integer resonances on the available momentum acceptance (MA) was statistically analyzed based on one design of the High Energy Photon Source (HEPS). It was found that the probability of MA reduction due to crossing of half integer resonances is closely correlated with the level of beta beats at the nominal tunes, but independent of the error sources. The analysis indicated that for the presented HEPS lattice design, the rms amplitude of beta beats should be kept below 1.5% horizontally and 2.5% vertically to reach a small MA reduction probability of about 1%.

  19. The development of mini project interactive media on junior statistical materials (developmental research in junior high school)

    Science.gov (United States)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-05-01

    Assessment is an integral part of the learning process. The process and the result should be in line, with regard to measuring the ability of learners. Authentic assessment refers to a form of assessment that measures competence in attitudes, knowledge, and skills. In fact, many teachers, including mathematics teachers who have implemented the 2013 curriculum, feel confused and find it difficult to master the use of authentic assessment instruments. Therefore, it is necessary to design an authentic assessment instrument with interactive mini-project media that teachers can adopt in their assessment. The type of this research is developmental research, following the 4D development model, which consists of four stages: define, design, develop and disseminate. The research purpose is to create a valid mini project interactive media on statistical materials in junior high school. The instrument was judged valid by experts, with scores of 3.1 for the construction aspect, 3.2 for the presentation aspect, 3.25 for the content aspect, and 2.9 for the didactic aspect. The research produced interactive mini-project media on statistical materials using Adobe Flash, which can help teachers and students achieve the learning objectives.

  20. cMiCE a high resolution animal PET using continuous LSO with a statistics based positioning scheme

    CERN Document Server

    Joung Jin Hun; Lewellen, T K

    2002-01-01

    Objective: Detector designs for small animal scanners are currently dominated by discrete crystal implementations. However, given the small crystal cross-sections required to obtain very high resolution, discrete designs are typically expensive, have low packing fraction, reduced light collection, and are labor intensive to build. To overcome these limitations we have investigated the feasibility of using a continuous miniature crystal element (cMiCE) detector module for high resolution small animal PET applications. Methods: The detector module consists of a single continuous slab of LSO, 25x25 mm² in exposed cross-section and 4 mm thick, coupled directly to a PS-PMT (Hamamatsu R5900-00-C12). The large-area surfaces of the crystal were polished and painted with TiO2, and the short surfaces were left unpolished and painted black. Further, a new statistics-based positioning (SBP) algorithm has been implemented to address linearity and edge-effect artifacts that are inherent in conventional Anger sty...

  1. Mesoscale modeling of smoke transport over Central Africa: influences of trade winds, subtropical high, ITCZ and vertical statistics

    Science.gov (United States)

    Yang, Z.; Wang, J.; Hyer, E. J.; Ichoku, C. M.

    2012-12-01

    A fully-coupled meteorology-chemistry-aerosol model, the Weather Research and Forecasting model with Chemistry (WRF-Chem), is used to simulate the transport of smoke aerosol over Central Africa during February 2008. The smoke emission used in this study is specified from the Fire Locating and Modeling of Burning Emissions (FLAMBE) database derived from Moderate Resolution Imaging Spectroradiometer (MODIS) fire products. Model performance is evaluated using MODIS true color images, measured Aerosol Optical Depth (AOD) from space-borne MODIS (550 nm) and ground-based AERONET (500 nm), and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) level 1 and 2 products. The simulated smoke transport is in good agreement with the validation data. Analysis of three smoke events shows that, near the surface, smoke is confined to a narrow belt between the Equator and 10°N by the interplay of trade winds, the subtropical high, and the ITCZ. At the 700 hPa level, smoke expands farther meridionally. Topography blocks smoke transport to the southeast of the study area, because of the high mountains located near the Great Rift Valley region. The simulation with an injection height of 650 m is consistent with CALIOP measurements. The particular phenomenon of aerosol above cloud is studied statistically from CALIOP observations; the total percentage of aerosol above cloud is about 5%.

  2. Current and high-β sheets in CIR streams: statistics and interaction with the HCS and the magnetosphere

    Science.gov (United States)

    Potapov, A. S.

    2018-04-01

    Thirty events of CIR streams (corotating interaction regions between fast and slow solar wind) were analyzed in order to study statistically the plasma structure within the CIR shear zones and to examine the interaction of the CIRs with the heliospheric current sheet (HCS) and the Earth's magnetosphere. The occurrence of current layers and high-beta plasma sheets in the CIR structure has been estimated. It was found that, on average, each of the CIR streams had four current layers in its structure with a current density of more than 0.12 A/m2 and about one and a half high-beta plasma regions with a beta value of more than five. We then traced how, and how often, the high-speed stream associated with the CIR can catch up with the heliospheric current sheet and connect to it. The interface of every fourth CIR stream coincided in time, within an hour, with the HCS, but in two thirds of cases the CIR connection with the HCS was completely absent. One event of simultaneous observation of a CIR stream in front of the magnetosphere by the ACE satellite in the vicinity of the L1 libration point and by the Wind satellite in the distant geomagnetic tail was considered in detail. Measurements of the components of the interplanetary magnetic field and plasma parameters showed that the overall structure of the stream is conserved. Moreover, some details of the fine structure are also transferred through the magnetosphere. In particular, the so-called "magnetic hole" almost does not change its shape when moving from the L1 point to a neighborhood of the L2 point.

  3. Statistical Optimization of Medium Compositions for High Cell Mass and Exopolysaccharide Production by Lactobacillus plantarum ATCC 8014

    Directory of Open Access Journals (Sweden)

    Nor Zalina Othman

    2018-03-01

    Full Text Available Background and Objective: Lactobacillus plantarum ATCC 8014 is known as a good producer of water-soluble exopolysaccharide. Therefore, the aim of this study is to optimize the medium composition concurrently for high cell mass and exopolysaccharide production by Lactobacillus plantarum ATCC 8014. Since both are useful for food and pharmaceutical applications, and since most studies typically focus on one outcome only, the optimization was carried out using molasses as a cheaper carbon source. Material and Methods: The main medium components known to have a significant effect on cell mass and EPS production were selected as variables and statistically optimized based on a Box-Behnken design at shake-flask level. The optimal medium for cell mass and exopolysaccharide production was composed of (in g l-1): molasses, 40; yeast extract, 16.8; phosphate, 2.72; sodium acetate, 3.98. The model was found to be significant and was subsequently validated through growth kinetics studies in un-optimized and optimized medium in shake-flask cultivation. Results and Conclusion: The maximum cell mass and exopolysaccharide in the new optimized medium were 4.40 g l-1 and 4.37 g l-1, respectively, after 44 h of cultivation. As a result, cell mass and exopolysaccharide production increased up to 4.5 and 16.5 times, respectively, and a maximal exopolysaccharide yield of 1.19 per gram of cells was obtained when molasses was used as the carbon source. In conclusion, molasses has the potential to be a cheap carbon source for the cultivation of Lactobacillus plantarum ATCC 8014 concurrently for high cell mass and exopolysaccharide production. Conflict of interest: The authors declare no conflict of interest.
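
The Box-Behnken layout used for the four medium components can be generated mechanically: every pair of factors is run at its four ±1 corners while the remaining factors sit at their centre level, plus replicated centre points. A sketch (the centre and step values in the decoding are invented, not the study's levels):

```python
from itertools import combinations, product

def box_behnken(k, center_runs=3):
    """Coded Box-Behnken design for k factors (levels -1, 0, +1)."""
    runs = []
    for i, j in combinations(range(k), 2):
        for si, sj in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = si, sj
            runs.append(row)
    runs.extend([0] * k for _ in range(center_runs))
    return runs

# 4 factors: molasses, yeast extract, phosphate, sodium acetate
design = box_behnken(4)

# decode coded levels to concentrations (g/l); centre and step invented here
center = [40.0, 15.0, 2.5, 4.0]
step = [10.0, 5.0, 0.5, 1.0]
real_design = [[c + x * s for x, c, s in zip(row, center, step)]
               for row in design]
```

For four factors this yields the classical 27-run design (6 factor pairs × 4 corners + 3 centre runs), each run then being cultivated and fitted to a quadratic response-surface model.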

  4. Systematic Analysis of the Non-Extensive Statistical Approach in High Energy Particle Collisions—Experiment vs. Theory †

    Directory of Open Access Journals (Sweden)

    Gábor Bíró

    2017-02-01

    Full Text Available The analysis of high-energy particle collisions is an excellent testbed for the non-extensive statistical approach. In these reactions we are far from the thermodynamical limit. In small colliding systems, such as electron-positron or nuclear collisions, the number of particles is several orders of magnitude smaller than the Avogadro number; therefore, finite-size and fluctuation effects strongly influence the final-state one-particle energy distributions. Due to their simple characterization, the description of the identified hadron spectra with the Boltzmann–Gibbs thermodynamical approach is insufficient. These spectra can be described very well with Tsallis–Pareto distributions instead, derived from non-extensive thermodynamics. Using the q-entropy formula, we interpret the microscopic physics in terms of the Tsallis q and T parameters. In this paper we give a view on these parameters, analyzing identified hadron spectra from recent years in a wide center-of-mass energy range. We demonstrate that the fitted Tsallis parameters depend on the center-of-mass energy and on the particle species (mass). Our findings are described well by a QCD (Quantum Chromodynamics) inspired parton evolution ansatz. Based on this comprehensive study, apart from the evolution, both mesonic and baryonic components are found to be non-extensive (q > 1), besides the mass-ordered hierarchy observed in the parameter T. We also study and compare in detail the theory-obtained parameters for the case of the PYTHIA8 Monte Carlo generator, perturbative QCD and quark coalescence models.
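
The Tsallis-Pareto form used for such spectra, f(pT) = A (1 + (q-1) pT/T)^(-1/(q-1)), and the extraction of (q, T) can be sketched with a crude grid search on synthetic data. The normalization A is held fixed for brevity; real fits float it and use measured spectra with uncertainties:

```python
def tsallis(pt, q, T, A=100.0):
    """Tsallis-Pareto spectrum; reduces to a Boltzmann exponential as q -> 1."""
    return A * (1.0 + (q - 1.0) * pt / T) ** (-1.0 / (q - 1.0))

# synthetic "measured" transverse-momentum spectrum with q=1.1, T=0.16 GeV
pts = [0.5 * i for i in range(1, 11)]
data = [tsallis(p, 1.1, 0.16) for p in pts]

def sse(q, T):
    """Sum of squared residuals between model and the synthetic spectrum."""
    return sum((tsallis(p, q, T) - d) ** 2 for p, d in zip(pts, data))

# grid search over q in [1.01, 1.29] and T in [0.100, 0.295] GeV
best_q, best_T = min(((q / 100.0, t / 1000.0)
                      for q in range(101, 130)
                      for t in range(100, 300, 5)),
                     key=lambda qt: sse(*qt))
```

Repeating such fits across collision energies and hadron species is what produces the q and T trends the paper discusses (q > 1 throughout, with T ordered by particle mass).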

  5. High diversity of beta-lactamases in the General Hospital Vienna verified by whole genome sequencing and statistical analysis.

    Science.gov (United States)

    Barišić, Ivan; Mitteregger, Dieter; Hirschl, Alexander M; Noehammer, Christa; Wiesinger-Mayr, Herbert

    2014-10-01

    The detailed analysis of antibiotic resistance mechanisms is essential for understanding the underlying evolutionary processes, the implementation of appropriate intervention strategies and to guarantee efficient treatment options. In the present study, 110 β-lactam-resistant, clinical isolates of Enterobacteriaceae sampled in 2011 in one of Europe's largest hospitals, the General Hospital Vienna, were screened for the presence of 31 β-lactamase genes. Twenty of those isolates were selected for whole genome sequencing (WGS). In addition, the number of β-lactamase genes was estimated using biostatistical models. The carbapenemase genes blaKPC-2, blaKPC-3, and blaVIM-4 were identified in carbapenem-resistant and intermediate susceptible isolates, blaOXA-72 in an extended-spectrum β-lactamase (ESBL)-positive one. Furthermore, the observed high prevalence of the acquired blaDHA-1 and blaCMY AmpC β-lactamase genes (70%) in phenotypically AmpC-positive isolates is alarming due to their capability to become carbapenem-resistant upon changes in membrane permeability. The statistical analyses revealed that approximately 55% of all β-lactamase genes present in the General Hospital Vienna were detected by this study. In summary, this work gives a very detailed picture on the disseminated β-lactamases and other resistance genes in one of Europe's largest hospitals. Copyright © 2014 Elsevier B.V. All rights reserved.
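The abstract does not specify which biostatistical model produced the "approximately 55% detected" estimate. One common way to estimate how many variants remain unseen is an abundance-based richness estimator such as Chao1, sketched here purely as an illustration; the detection counts are invented.

```python
def chao1(counts):
    """Chao1 richness estimate from per-variant detection counts."""
    s_obs = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)  # variants seen exactly once
    f2 = sum(1 for c in counts if c == 2)  # variants seen exactly twice
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected fallback
    return s_obs + f1 * f1 / (2.0 * f2)

# hypothetical detection counts of beta-lactamase gene variants
counts = [5, 3, 1, 1, 2, 1, 8, 2, 1, 1]
estimate = chao1(counts)          # estimated total richness
coverage = len(counts) / estimate # fraction of variants detected
```

Many singletons relative to doubletons push the estimate up, i.e. toward the conclusion that a sizable fraction of circulating gene variants was not yet sampled.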

  6. cMiCE: a high resolution animal PET using continuous LSO with a statistics based positioning scheme

    International Nuclear Information System (INIS)

    Joung Jinhun; Miyaoka, R.S.; Lewellen, T.K.

    2002-01-01

    Objective: Detector designs for small animal scanners are currently dominated by discrete crystal implementations. However, given the small crystal cross-sections required to obtain very high resolution, discrete designs are typically expensive, have low packing fraction and reduced light collection, and are labor intensive to build. To overcome these limitations we have investigated the feasibility of using a continuous miniature crystal element (cMiCE) detector module for high resolution small animal PET applications. Methods: The detector module consists of a single continuous slab of LSO, 25x25 mm² in exposed cross-section and 4 mm thick, coupled directly to a PS-PMT (Hamamatsu R5900-00-C12). The large area surfaces of the crystal were polished and painted with TiO2, and the short surfaces were left unpolished and painted black. Further, a new statistics based positioning (SBP) algorithm has been implemented to address the linearity and edge effect artifacts that are inherent in conventional Anger style positioning schemes. To characterize the light response function (LRF) of the detector, data were collected on a coarse grid using a highly collimated coincidence setup. The LRF was then estimated using cubic spline interpolation. Detector performance has been evaluated for both SBP and Anger based decoding using measured data and Monte Carlo simulations. Results: Using the SBP scheme, edge artifacts were successfully handled. Simulation results show that the useful field of view (UFOV) was extended to ∼22x22 mm² with an average point spread function of ∼0.5 mm full width at half maximum (FWHM PSF). For the same detector with Anger decoding, the UFOV was ∼16x16 mm² with an average FWHM PSF of ∼0.9 mm. Experimental results yielded similar differences in FOV and resolution performance. The FWHM PSF for the SBP and Anger based methods was 1.4 and 2.0 mm, respectively, uncorrected for source size, with a 1 mm diameter point source. Conclusion

  7. Estimating summary statistics for electronic health record laboratory data for use in high-throughput phenotyping algorithms

    Science.gov (United States)

    Elhadad, N.; Claassen, J.; Perotte, R.; Goldstein, A.; Hripcsak, G.

    2018-01-01

    We study the question of how to represent or summarize raw laboratory data taken from an electronic health record (EHR) using parametric model selection to reduce or cope with biases induced through clinical care. It has been previously demonstrated that the health care process (Hripcsak and Albers, 2012, 2013), as defined by measurement context (Hripcsak and Albers, 2013; Albers et al., 2012) and measurement patterns (Albers and Hripcsak, 2010, 2012), can influence how EHR data are distributed statistically (Kohane and Weber, 2013; Pivovarov et al., 2014). We construct an algorithm, PopKLD, which is based on information criterion model selection (Burnham and Anderson, 2002; Claeskens and Hjort, 2008), is intended to reduce and cope with health care process biases and to produce an intuitively understandable continuous summary. The PopKLD algorithm can be automated and is designed to be applicable in high-throughput settings; for example, the output of the PopKLD algorithm can be used as input for phenotyping algorithms. Moreover, we develop the PopKLD-CAT algorithm that transforms the continuous PopKLD summary into a categorical summary useful for applications that require categorical data such as topic modeling. We evaluate our methodology in two ways. First, we apply the method to laboratory data collected in two different health care contexts, primary versus intensive care. We show that the PopKLD preserves known physiologic features in the data that are lost when summarizing the data using more common laboratory data summaries such as mean and standard deviation. Second, for three disease-laboratory measurement pairs, we perform a phenotyping task: we use the PopKLD and PopKLD-CAT algorithms to define high and low values of the laboratory variable that are used for defining a disease state. We then compare the relationship between the PopKLD-CAT summary disease predictions and the same predictions using empirically estimated mean and standard deviation to a

  8. Estimating summary statistics for electronic health record laboratory data for use in high-throughput phenotyping algorithms.

    Science.gov (United States)

    Albers, D J; Elhadad, N; Claassen, J; Perotte, R; Goldstein, A; Hripcsak, G

    2018-02-01

    We study the question of how to represent or summarize raw laboratory data taken from an electronic health record (EHR) using parametric model selection to reduce or cope with biases induced through clinical care. It has been previously demonstrated that the health care process (Hripcsak and Albers, 2012, 2013), as defined by measurement context (Hripcsak and Albers, 2013; Albers et al., 2012) and measurement patterns (Albers and Hripcsak, 2010, 2012), can influence how EHR data are distributed statistically (Kohane and Weber, 2013; Pivovarov et al., 2014). We construct an algorithm, PopKLD, which is based on information criterion model selection (Burnham and Anderson, 2002; Claeskens and Hjort, 2008), is intended to reduce and cope with health care process biases and to produce an intuitively understandable continuous summary. The PopKLD algorithm can be automated and is designed to be applicable in high-throughput settings; for example, the output of the PopKLD algorithm can be used as input for phenotyping algorithms. Moreover, we develop the PopKLD-CAT algorithm that transforms the continuous PopKLD summary into a categorical summary useful for applications that require categorical data such as topic modeling. We evaluate our methodology in two ways. First, we apply the method to laboratory data collected in two different health care contexts, primary versus intensive care. We show that the PopKLD preserves known physiologic features in the data that are lost when summarizing the data using more common laboratory data summaries such as mean and standard deviation. Second, for three disease-laboratory measurement pairs, we perform a phenotyping task: we use the PopKLD and PopKLD-CAT algorithms to define high and low values of the laboratory variable that are used for defining a disease state. We then compare the relationship between the PopKLD-CAT summary disease predictions and the same predictions using empirically estimated mean and standard deviation to a
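PopKLD itself is not reproduced here, but its core ingredient, information-criterion selection among candidate parametric summaries of a laboratory variable, can be sketched as follows. The normal vs. log-normal candidate pair, AIC as the criterion, and the synthetic data are all illustrative assumptions, not the authors' model set.

```python
import numpy as np

rng = np.random.default_rng(0)
# right-skewed values, loosely mimicking a lab measurement distribution
data = rng.lognormal(mean=1.0, sigma=0.5, size=2000)

def aic_normal(x):
    """AIC of a normal fit (2 parameters, MLE mean/std)."""
    mu, sd = x.mean(), x.std()
    ll = np.sum(-0.5 * np.log(2 * np.pi * sd**2) - (x - mu)**2 / (2 * sd**2))
    return 4 - 2 * ll

def aic_lognormal(x):
    """AIC of a log-normal fit (2 parameters, MLE on log values)."""
    lx = np.log(x)
    mu, sd = lx.mean(), lx.std()
    ll = np.sum(-lx - 0.5 * np.log(2 * np.pi * sd**2)
                - (lx - mu)**2 / (2 * sd**2))
    return 4 - 2 * ll

scores = {"normal": aic_normal(data), "lognormal": aic_lognormal(data)}
best = min(scores, key=scores.get)  # lowest AIC wins
```

The parameters of the winning family then serve as the continuous summary of the variable, playing the role that the PopKLD output plays for downstream phenotyping algorithms.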

  9. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    Science.gov (United States)

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al., 2009; DiPietro,…

  10. High-resolution marine flood modelling coupling overflow and overtopping processes: framing the hazard based on historical and statistical approaches

    Science.gov (United States)

    Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo

    2018-01-01

    A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution with explicit buildings, urban structures such as sea front walls and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of the available hard and soft data and conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore conditions scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagation inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential hazard evolution. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in flooding, such as +13.5 % in water volume propagating inland or +11.3 % in affected surface area. In some areas, flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty of the results.
Considering a 100-year scenario with mean sea level rise (0.6 m), hazard

  11. High Energy $\

    CERN Multimedia

    2002-01-01

    This experiment is a high statistics exposure of BEBC filled with hydrogen to both ν and ν̄ beams. The principal physics aims are: a) the study of the production of charmed mesons and baryons using fully constrained events; b) the study of neutral current interactions on the free proton; c) measurement of the cross-sections for production of exclusive final state N* and Δ resonances; d) studies of hadronic final states in charged and neutral current reactions; e) measurement of inclusive charged current cross-sections and structure functions. The neutrino flux is determined by monitoring the flux of muons in the neutrino shield. The Internal Picket Fence and External Muon Identifier of BEBC are essential parts of the experiment. High resolution cameras are used to search for visible decays of short-lived particles.

  12. Statistical removal of background signals from high-throughput 1H NMR line-broadening ligand-affinity screens

    International Nuclear Information System (INIS)

    Worley, Bradley; Sisco, Nicholas J.; Powers, Robert

    2015-01-01

    NMR ligand-affinity screens are vital to drug discovery; they are routinely used to screen fragment-based libraries and to verify chemical leads from high-throughput assays and virtual screens. NMR ligand-affinity screens are also a highly informative first step towards identifying functional epitopes of unknown proteins, as well as elucidating the biochemical functions of protein–ligand interactions at their binding interfaces. While simple one-dimensional 1H NMR experiments are capable of indicating binding through a change in ligand line shape, they are plagued by broad, ill-defined background signals from protein 1H resonances. We present an uncomplicated method for subtraction of protein background in high-throughput ligand-based affinity screens, and show that its performance is maximized when phase-scatter correction is applied prior to subtraction
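The paper's exact subtraction and phase-scatter correction procedure is not given in this abstract; as a hedged sketch of the general idea, the snippet below scales a reference protein spectrum to the observed mixture by least squares and subtracts it, leaving the ligand signal. The toy spectra are invented.

```python
import numpy as np

def subtract_background(mixture, background):
    """Least-squares scale the reference background, then subtract it."""
    scale = np.dot(mixture, background) / np.dot(background, background)
    return mixture - scale * background

# toy 1D spectra: broad protein background plus a sharp ligand feature
background = np.array([1.0, 1.0, 1.0, 1.0])
ligand = np.array([1.0, -1.0, 0.0, 0.0])
observed = 2.0 * background + ligand
residual = subtract_background(observed, background)
```

In a real screen the spectra would first be phase- and baseline-corrected; the least-squares scale absorbs concentration and receiver-gain differences between the reference and mixture acquisitions.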

  13. Use Of Statistical Tools To Evaluate The Reductive Dechlorination Of High Levels Of TCE In Microcosm Studies

    Science.gov (United States)

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...

  14. Science Achievement and Occupational Career/Technical Education Coursetaking in High School: The Class of 2005. Statistics in Brief. NCES 2010-021

    Science.gov (United States)

    Levesque, Karen; Wun, Jolene; Green, Caitlin

    2010-01-01

    The definition of CTE (career/technical education) used by the National Center for Education Statistics (NCES) includes, at the high school level, family and consumer sciences education, general labor market preparation, and occupational education (Bradby and Hoachlander 1999; Bradby and Hudson 2007). Most researchers focus on occupational…

  15. Factors That Explain the Attitude towards Statistics in High-School Students: Empirical Evidence at Technological Study Center of the Sea in Veracruz, Mexico

    Science.gov (United States)

    Rojas-Kramer, Carlos; Limón-Suárez, Enrique; Moreno-García, Elena; García-Santillán, Arturo

    2018-01-01

    The aim of this paper was to analyze attitude towards statistics in high-school students using the SATS scale designed by Auzmendi (1992). The sample was 200 students from the sixth semester of the afternoon shift, who were enrolled in technical careers from the Technological Study Center of the Sea (Centro de Estudios Tecnológicos del Mar 07…

  16. A method for analyzing low statistics high resolution spectra from 210Pb in underground coal miners from Brazil

    International Nuclear Information System (INIS)

    Dantas, A.L.A.; Dantas, B.M.; Lipsztein, J.L.; Spitz, H.B.

    2006-01-01

    A survey conducted by the IRD-CNEN determined that some workers from an underground coal mine in the south of Brazil were exposed to elevated airborne concentrations of 222 Rn. Because inhalation of high airborne concentrations of 222 Rn can lead to an increase of 210 Pb in bone, in vivo measurements of 210 Pb in the skeleton were performed in selected underground workers from this mine. Measurements were performed using an array of high-resolution germanium detectors positioned around the head and knee to detect the low abundant 46.5 keV photon emitted by 210 Pb. The gamma-ray spectra were analyzed using a moving median smoothing function to detect the presence of a photopeak at 46.5 keV. The minimum detectable activity of 210 Pb in the skeleton using this methodology was 50 Bq. (author)
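A moving-median smoother of the kind mentioned can be written in a few lines. The window width, reflected edge handling, and the toy spectrum below are assumptions for illustration, not the authors' parameters.

```python
import numpy as np

def moving_median(y, window=5):
    """Moving-median smoother; odd window, edges handled by reflection."""
    pad = window // 2
    yp = np.pad(y, pad, mode="reflect")
    return np.array([np.median(yp[i:i + window]) for i in range(len(y))])

# a single-channel noise spike is removed, a broad peak survives
spectrum = np.zeros(21)
spectrum[10] = 100.0   # isolated spike (e.g. electronic noise)
spectrum[4:9] = 50.0   # 5-channel-wide "photopeak"
smoothed = moving_median(spectrum, window=5)
```

This is why a median (rather than mean) filter suits low-statistics spectra: isolated outlier channels are rejected outright, while any structure wider than half the window, such as a genuine 46.5 keV photopeak, is preserved.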

  17. Network based on statistical multiplexing for event selection and event builder systems in high energy physics experiments

    International Nuclear Information System (INIS)

    Calvet, D.

    2000-03-01

    Systems for on-line event selection in future high energy physics experiments will use advanced distributed computing techniques and will need high speed networks. After a brief description of projects at the Large Hadron Collider, the architectures initially proposed for the Trigger and Data AcQuisition (T/DAQ) systems of the ATLAS and CMS experiments are presented and analyzed. A new architecture for the ATLAS T/DAQ is introduced. Candidate network technologies for this system are described. This thesis focuses on ATM. A variety of network structures and topologies suited to partial and full event building are investigated. The need for efficient networking is shown. Optimization techniques for high speed messaging and their implementation on ATM components are described. Small scale demonstrator systems consisting of up to 48 computers (∼1:20 of the final level 2 trigger) connected via ATM are described. Performance results are presented. Extrapolation of measurements and evaluation of needs leads to a proposal of an implementation for the main network of the ATLAS T/DAQ system. (author)

  18. High-order harmonics measured by the photon statistics of the infrared driving-field exiting the atomic medium.

    Science.gov (United States)

    Tsatrafyllis, N; Kominis, I K; Gonoskov, I A; Tzallas, P

    2017-04-27

    High-order harmonics in the extreme-ultraviolet spectral range, resulting from the strong-field laser-atom interaction, have been used in a broad range of fascinating applications in all states of matter. In the majority of these studies the harmonic generation process is described using semi-classical theories which treat the electromagnetic field of the driving laser pulse classically without taking into account its quantum nature. In addition, for the measurement of the generated harmonics, all the experiments require diagnostics in the extreme-ultraviolet spectral region. Here by treating the driving laser field quantum mechanically we reveal the quantum-optical nature of the high-order harmonic generation process by measuring the photon number distribution of the infrared light exiting the harmonic generation medium. It is found that the high-order harmonics are imprinted in the photon number distribution of the infrared light and can be recorded without the need of a spectrometer in the extreme-ultraviolet.

  19. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2012-01-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework

  20. Crop Yield Predictions - High Resolution Statistical Model for Intra-season Forecasts Applied to Corn in the US

    Science.gov (United States)

    Cai, Y.

    2017-12-01

    Accurately forecasting crop yields has broad implications for economic trading, food production monitoring, and global food security. However, the variation of environmental variables presents challenges to model yields accurately, especially when the lack of highly accurate measurements creates difficulties in creating models that can succeed across space and time. In 2016, we developed a sequence of machine-learning based models forecasting end-of-season corn yields for the US at both the county and national levels. We combined machine learning algorithms in a hierarchical way, and used an understanding of physiological processes in temporal feature selection, to achieve high precision in our intra-season forecasts, including in very anomalous seasons. During the live run, we predicted the national corn yield within 1.40% of the final USDA number as early as August. In the backtesting of the 2000-2015 period, our model predicts national yield within 2.69% of the actual yield on average already by mid-August. At the county level, our model predicts 77% of the variation in final yield using data through the beginning of August and improves to 80% by the beginning of October, with the percentage of counties predicted within 10% of the average yield increasing from 68% to 73%. Further, the lowest errors are in the most significant producing regions, resulting in very high precision national-level forecasts. In addition, we identify the changes of important variables throughout the season, specifically early-season land surface temperature, and mid-season land surface temperature and vegetation index. For the 2017 season, we feed 2016 data to the training set, together with additional geospatial data sources, aiming to make the current model even more precise. We will show how our 2017 US corn yield forecasts converges in time, which factors affect the yield the most, as well as present our plans for 2018 model adjustments.

  1. Counting statistics of transport through Coulomb blockade nanostructures: High-order cumulants and non-Markovian effects

    DEFF Research Database (Denmark)

    Flindt, Christian; Novotny, Tomás; Braggio, Alessandro

    2010-01-01

    Recent experimental progress has made it possible to detect in real-time single electrons tunneling through Coulomb blockade nanostructures, thereby allowing for precise measurements of the statistical distribution of the number of transferred charges, the so-called full counting statistics...... interactions. Our recursive method can treat systems with many states as well as non-Markovian dynamics. We illustrate our approach with three examples of current experimental relevance: bunching transport through a two-level quantum dot, transport through a nanoelectromechanical system with dynamical Franck...

  2. Statistical analysis for discrimination of prompt gamma ray peak induced by high energy neutron: Monte Carlo simulation study

    International Nuclear Information System (INIS)

    Do-Kun Yoon; Joo-Young Jung; Tae Suk Suh; Seong-Min Han

    2015-01-01

    The purpose of this research is a statistical analysis for the discrimination of prompt gamma ray peaks induced by 14.1 MeV neutrons in spectra obtained from Monte Carlo simulation. For the simulation, 18 detector materials were used to simulate neutron capture reaction spectra. The discrimination of nine prompt gamma ray peaks was performed for the simulation of each detector material. We present several comparison indexes of energy resolution performance depending on the detector material, using the simulation and statistics for prompt gamma activation analysis. (author)

  3. Error statistics in a high-speed fibreoptic communication line with a phase shift of odd bits

    International Nuclear Information System (INIS)

    Shapiro, Elena G

    2009-01-01

    The propagation of optical pulses through a fibreoptic communication line with a phase shift of odd bits is directly numerically simulated. It is shown that simple analytic expressions approximate the error probability well. The phase shift of odd bits in the initial sequence is shown statistically to decrease the error probability in the communication line significantly. (fibreoptic communication lines)

  4. Statistical mechanics of light elements at high pressure. VII. A perturbative free energy for arbitrary mixtures of H and He

    International Nuclear Information System (INIS)

    Hubbard, W.B.; Dewitt, H.E.

    1985-01-01

    A model free energy is presented which accurately represents results from 45 high-precision Monte Carlo calculations of the thermodynamics of hydrogen-helium mixtures at pressures of astrophysical and planetophysical interest. The free energy is calculated using free-electron perturbation theory (dielectric function theory), and is an extension of the expression given in an earlier paper in this series. However, it fits the Monte Carlo results more accurately, and is valid for the full range of compositions from pure hydrogen to pure helium. Using the new free energy, the phase diagram of mixtures of liquid metallic hydrogen and helium is calculated and compared with earlier results. Sample results for mixing volumes are also presented, and the new free energy expression is used to compute a theoretical Jovian adiabat and compare the adiabat with results from three-dimensional Thomas-Fermi-Dirac theory. The present theory gives slightly higher densities at pressures of about 10 megabars. 20 references

  5. Statistical mechanics of light elements at high pressure. VII - A perturbative free energy for arbitrary mixtures of H and He

    Science.gov (United States)

    Hubbard, W. B.; Dewitt, H. E.

    1985-01-01

    A model free energy is presented which accurately represents results from 45 high-precision Monte Carlo calculations of the thermodynamics of hydrogen-helium mixtures at pressures of astrophysical and planetophysical interest. The free energy is calculated using free-electron perturbation theory (dielectric function theory), and is an extension of the expression given in an earlier paper in this series. However, it fits the Monte Carlo results more accurately, and is valid for the full range of compositions from pure hydrogen to pure helium. Using the new free energy, the phase diagram of mixtures of liquid metallic hydrogen and helium is calculated and compared with earlier results. Sample results for mixing volumes are also presented, and the new free energy expression is used to compute a theoretical Jovian adiabat and compare the adiabat with results from three-dimensional Thomas-Fermi-Dirac theory. The present theory gives slightly higher densities at pressures of about 10 megabars.

  6. Low and High Frequency Models of Response Statistics of a Cylindrical Orthogrid Vehicle Panel to Acoustic Excitation

    Science.gov (United States)

    Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent

    2010-01-01

    This presentation further develops the orthogrid vehicle panel work, employing Hybrid Module capabilities to assess both low/mid-frequency and high-frequency models in the VA One simulation environment. The response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket-mode response, but at considerable analysis expense (time and resources); (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the periodic subsystem formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.

  7. Hyperglycemia (High Blood Glucose)

    Medline Plus


  8. High speed data acquisition

    International Nuclear Information System (INIS)

    Cooper, P.S.

    1997-07-01

    A general introduction to high speed data acquisition system techniques in modern particle physics experiments is given. Examples are drawn from the SELEX (E781) high statistics charmed baryon production and decay experiment now taking data at Fermilab.

  9. A statistical approach to rank multiple priorities in Environmental Epidemiology: an example from high-risk areas in Sardinia, Italy

    Directory of Open Access Journals (Sweden)

    Dolores Catelan

    2008-11-01

    Full Text Available In Environmental Epidemiology, long lists of relative risk estimates from exposed populations are compared to a reference to scrutinize the dataset for extremes. Here, inference on disease profiles for given areas, or for fixed disease population signatures, is of interest, and summaries can be obtained by averaging over areas or diseases. We have developed a multivariate hierarchical Bayesian approach to estimate posterior rank distributions, and we show how to produce league tables of ranks with credibility intervals useful for addressing the above-mentioned inferential problems. Applying the procedure to a real dataset from the report “Environment and Health in Sardinia” (Italy, we selected 18 areas characterized by high environmental pressure from industrial, mining or military activities, investigated for 29 causes of death among male residents. Ranking diseases highlighted the increased burdens of neoplastic (cancerous and non-neoplastic respiratory diseases in the heavily polluted area of Portoscuso. The ranks averaged by disease over areas showed lung cancer among the three highest positions.

  10. Dark Matter Profiles in Dwarf Galaxies: A Statistical Sample Using High-Resolution Hα Velocity Fields from PCWI

    Science.gov (United States)

    Relatores, Nicole C.; Newman, Andrew B.; Simon, Joshua D.; Ellis, Richard; Truong, Phuongmai N.; Blitz, Leo

    2018-01-01

    We present high quality Hα velocity fields for a sample of nearby dwarf galaxies (log M/M⊙ = 8.4-9.8) obtained as part of the Dark Matter in Dwarf Galaxies survey. The purpose of the survey is to investigate the cusp-core discrepancy by quantifying the variation of the inner slope of the dark matter distributions of 26 dwarf galaxies, which were selected as likely to have regular kinematics. The data were obtained with the Palomar Cosmic Web Imager, located on the Hale 5m telescope. We extract rotation curves from the velocity fields and use optical and infrared photometry to model the stellar mass distribution. We model the total mass distribution as the sum of a generalized Navarro-Frenk-White dark matter halo along with the stellar and gaseous components. We present the distribution of inner dark matter density profile slopes derived from this analysis. For a subset of galaxies, we compare our results to an independent analysis based on CO observations. In future work, we will compare the scatter in inner density slopes, as well as their correlations with galaxy properties, to theoretical predictions for dark matter core creation via supernovae feedback.
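The generalized NFW mass model named in this abstract can be sketched numerically: a density profile with free inner slope γ, and the enclosed mass obtained by integration, from which a circular velocity v(r) = sqrt(G·M(<r)/r) would be compared to the Hα rotation curve. The numerical values and integration settings below are arbitrary placeholders, not the survey's fits.

```python
import numpy as np

def gnfw_density(r, rho0, rs, gamma):
    """Generalized NFW: rho ~ r^-gamma inside rs, r^-3 far outside."""
    x = r / rs
    return rho0 / (x**gamma * (1.0 + x)**(3.0 - gamma))

def enclosed_mass(r, rho0, rs, gamma, n=20001):
    """M(<r) by trapezoidal integration of 4*pi*r'^2*rho(r')."""
    rr = np.linspace(1e-8, r, n)
    f = 4.0 * np.pi * rr**2 * gnfw_density(rr, rho0, rs, gamma)
    dr = rr[1] - rr[0]
    return float(np.sum((f[:-1] + f[1:]) * 0.5 * dr))
```

For gamma = 1 this reduces to the standard NFW halo, whose enclosed mass is analytic, M(<r) = 4*pi*rho0*rs^3*(ln(1+x) - x/(1+x)) with x = r/rs; a cored halo (gamma near 0) versus a cusp (gamma near 1) is exactly the distinction the survey's inner-slope distribution probes.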

  11. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry.

    Science.gov (United States)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo

    2018-06-05

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  12. A high-statistics measurement of the p̄p → n̄n charge-exchange reaction at 875 MeV/c

    International Nuclear Information System (INIS)

    Lamanna, M.; Ahmidouch, A.; Birsa, R.; Bradamante, F.; Bressan, A.; Bressani, T.; Dalla Torre-Colautti, S.; Giorgi, M.; Heer, E.; Hess, R.; Kunne, R.A.; Lechanoine-Le Luc, C.; Martin, A.; Mascarini, C.; Masoni, A.; Penzo, A.; Rapin, D.; Schiavon, P.; Tessarotto, F.

    1995-01-01

    A new measurement of the differential cross section and of the analysing power A0n of the charge-exchange reaction p̄p → n̄n at 875 MeV/c is presented. The A0n data cover the entire angular range and constitute a considerable improvement over previously published data, in both the forward and the backward hemisphere. The cross-section data cover only the backward region, but are unique at this energy. A careful study of the long-term drifts of the apparatus has allowed us to fully exploit the good statistics of the data.

  13. A statistical analysis of SEEDS and other high-contrast exoplanet surveys: massive planets or low-mass brown dwarfs?

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Timothy D.; Spiegel, David S. [Institute for Advanced Study, Princeton, NJ (United States); McElwain, Michael W.; Grady, C. A. [Exoplanets and Stellar Astrophysics Laboratory, Goddard Space Flight Center, Greenbelt, MD (United States); Turner, Edwin L. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ (United States); Mede, Kyle; Kuzuhara, Masayuki [University of Tokyo, Tokyo (Japan); Schlieder, Joshua E.; Brandner, W.; Feldt, M. [Max Planck Institute for Astronomy, Heidelberg (Germany); Wisniewski, John P. [HL Dodge Department of Physics and Astronomy, University of Oklahoma, Norman, OK (United States); Abe, L. [Laboratoire Hippolyte Fizeau, Nice (France); Biller, B. [University of Edinburgh, Edinburgh, Scotland (United Kingdom); Carson, J. [College of Charleston, Charleston, SC (United States); Currie, T. [Department of Astronomy and Astrophysics, University of Toronto, Toronto, ON (Canada); Egner, S.; Golota, T.; Guyon, O. [Subaru Telescope, Hilo, Hawai' i (United States); Goto, M. [Universitäts-Sternwarte München, Munich (Germany); Hashimoto, J. [National Astronomical Observatory of Japan, Tokyo (Japan); and others

    2014-10-20

    We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ∼60 M_J brown dwarf companions in the Pleiades, PZ Tel B, and CD–35 2722B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ∼30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ∼5 M_J, with a single power-law distribution. We find that p(M, a) ∝ M^(−0.65 ± 0.60) a^(−0.85 ± 0.39) (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M_J companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.

  14. Conversion electrons from high-statistics β-decay measurements with the 8π spectrometer at TRIUMF-ISAC

    Science.gov (United States)

    Garrett, P. E.; Jigmeddorj, B.; Radich, A. J.; Andreoiu, C.; Ball, G. C.; Bangay, J. C.; Bianco, L.; Bildstein, V.; Chagnon-Lessard, S.; Cross, D. S.; Demand, G. A.; Diaz Varela, A.; Dunlop, R.; Finlay, P.; Garnsworthy, A. B.; Green, K. L.; Hackman, G.; Hadinia, B.; Leach, K. G.; Michetti-Wilson, J.; Orce, J. N.; Rajabali, M. M.; Rand, E. T.; Starosta, K.; Sumithrarachchi, C.; Svensson, C. E.; Triambak, S.; Wang, Z. M.; Williams, S. J.; Wood, J. L.; Wong, J.; Yates, S. W.; Zganjar, E. F.

    2016-09-01

    The 8π spectrometer, located at TRIUMF-ISAC, was the world's most powerful spectrometer dedicated to β-decay studies until its decommissioning in early 2014 for replacement with the GRIFFIN array. An integral part of the 8π spectrometer was the Pentagonal Array for Conversion Electron Spectroscopy (PACES), consisting of 5 Si(Li) detectors used for charged-particle detection. PACES enabled both γ-e⁻ and e⁻-e⁻ coincidence measurements, which were crucial for increasing the sensitivity to discrete e⁻ lines in the presence of large backgrounds. Examples from a 124Cs decay experiment, where the data were vital for the expansion of the 124Cs decay scheme, are shown. With sufficient statistics, measurements of conversion coefficients can be used to extract the E0 components of Jπ → Jπ transitions for J ≠ 0, which is demonstrated for data obtained in 110In→110Cd decay. With knowledge of the shapes of the states involved, as obtained, for example, from the use of Kumar-Cline shape invariants, the mixing of the states can be extracted.

  15. High productivity chromatography refolding process for Hepatitis B Virus X (HBx) protein guided by statistical design of experiment studies.

    Science.gov (United States)

    Basu, Anindya; Leong, Susanna Su Jan

    2012-02-03

    The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacteria host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate their direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), which was 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effect of different bioprocess parameters like the effect of buffer exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence helping to resolve validation and speed-to-market challenges faced by the biopharmaceutical industry today.

  16. Influence of ASIR (Adaptive Statistical Iterative Reconstruction) variation on computed tomography image noise at high voltage

    International Nuclear Information System (INIS)

    Mendes, L.M.M.; Pereira, W.B.R.; Vieira, J.G.; Lamounier, C.S.; Gonçalves, D.A.; Carvalho, G.N.P.; Santana, P.C.; Oliveira, P.M.C.; Reis, L.P.

    2017-01-01

    Computed tomography equipment used in diagnostic practice has advanced considerably, directly influencing the radiation levels delivered to patients. Techniques must therefore be optimized to comply with the ALARA (As Low As Reasonably Achievable) principle of radioprotection. The relationship between ASIR (Adaptive Statistical Iterative Reconstruction) and image noise was studied. Central images of a homogeneous water phantom were obtained over a 20 mm scan length using a General Electric LightSpeed VCT 64-channel tomograph in helical acquisitions with a rotation time of 0.5 seconds, pitch of 0.984:1, and slice thickness of 0.625 mm. With these parameters held constant, the voltage was set to two distinct values, 120 and 140 kV, using Automatic Exposure Control (AEC) for the tube current, which ranged from 50 to 675 mA (120 kV) and from 50 to 610 mA (140 kV), the minimum and maximum values allowed for each voltage. Image noise was determined with the free software ImageJ. The analysis compared the percentage variation of image noise against a baseline ASIR value of 10%, concluding that noise varies by approximately 50% when compared to ASIR at 100%, at both voltages. Dose evaluation is required in future studies to better exploit the relationship between dose and image quality.
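
The noise metric in phantom studies of this kind is typically the standard deviation of pixel values inside a region of interest (ROI), which is what ImageJ reports. A minimal stand-alone sketch (pure Python; the ROI pixel values below are hypothetical, not data from this study):

```python
import math

def roi_noise(pixels: list[list[float]]) -> float:
    """Image noise as the standard deviation of pixel (HU) values in a ROI."""
    values = [v for row in pixels for v in row]
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return math.sqrt(var)

def noise_variation_pct(noise: float, baseline_noise: float) -> float:
    """Percentage change in noise relative to a baseline (e.g. ASIR at 10%)."""
    return 100.0 * (noise - baseline_noise) / baseline_noise

# Hypothetical ROIs from a water phantom at two ASIR settings.
roi_asir_10 = [[2.0, -2.0], [2.0, -2.0]]    # std = 2.0
roi_asir_100 = [[1.0, -1.0], [1.0, -1.0]]   # std = 1.0
print(noise_variation_pct(roi_noise(roi_asir_100), roi_noise(roi_asir_10)))  # -50.0
```

The toy numbers are chosen so the noise reduction comes out at 50%, mirroring the magnitude of variation the abstract reports between ASIR settings.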

  17. Development of an Urban High-Resolution Air Temperature Forecast System for Local Weather Information Services Based on Statistical Downscaling

    Directory of Open Access Journals (Sweden)

    Chaeyeon Yi

    2018-04-01

    The Korean peninsula has complex and diverse weather phenomena, and the Korea Meteorological Administration has been working on various numerical models to produce better forecasting data. The Unified Model Local Data Assimilation and Prediction System is a limited-area model with a horizontal resolution of 1.5 km, used for local-scale weather forecasts on the Korean peninsula. However, numerically predicting the detailed temperature characteristics of urban spaces, in which surface characteristics change rapidly over small areas, requires a city temperature prediction model with higher spatial resolution. As an alternative, a building-scale temperature model was developed that resolves air temperature at 25 m over the Seoul area. The spatial information was processed using statistical methods such as linear regression models and machine learning. Comparing the estimated air temperatures against summer observational data showed that the machine-learning approach improved accuracy. In addition, horizontal and vertical characteristics of the urban space were better represented, and the air temperature was better resolved spatially. This air temperature information can be used to manage the response to heat waves and tropical nights in administrative districts of urban areas.

  18. High PRF high current switch

    Science.gov (United States)

    Moran, Stuart L.; Hutcherson, R. Kenneth

    1990-03-27

    A triggerable, high voltage, high current, spark gap switch for use in pulsed power systems. The device comprises a pair of electrodes in a high-pressure hydrogen environment and is triggered by introducing an arc between one electrode and a trigger pin. Unusually high repetition rates may be obtained by undervolting the switch, i.e., operating the trigger at voltages much below the self-breakdown voltage of the device.

  19. Screening of Ganoderma strains with high polysaccharides and ganoderic acid contents and optimization of the fermentation medium by statistical methods.

    Science.gov (United States)

    Wei, Zhen-hua; Duan, Ying-yi; Qian, Yong-qing; Guo, Xiao-feng; Li, Yan-jun; Jin, Shi-he; Zhou, Zhong-Xin; Shan, Sheng-yan; Wang, Chun-ru; Chen, Xue-Jiao; Zheng, Yuguo; Zhong, Jian-Jiang

    2014-09-01

    Polysaccharides and ganoderic acids (GAs) are the major bioactive constituents of Ganoderma species. However, commercialization of their production has been limited by low yields in submerged culture of Ganoderma, despite improvements made in recent years. In this work, twelve Ganoderma strains were screened for efficient production of polysaccharides and GAs, and Ganoderma lucidum 5.26 (GL 5.26), which had never before been reported in a fermentation process, was found to be the most efficient among the tested strains. The fermentation medium was then optimized for GL 5.26 by statistical methods. First, glucose and yeast extract were found to be the optimum carbon and nitrogen sources according to single-factor tests. Ferric sulfate was found to have a significant effect on GL 5.26 biomass production according to the results of a Plackett-Burman design. The concentrations of glucose, yeast extract and ferric sulfate were further optimized by response surface methodology. The optimum medium composition was 55 g/L of glucose, 14 g/L of yeast extract, and 0.3 g/L of ferric sulfate, with other medium components unchanged. The optimized medium was tested in a 10-L bioreactor, and the production of biomass, IPS, total GAs and GA-T was enhanced by 85%, 27%, 49% and 93%, respectively, compared to the initial medium. The fermentation process was scaled up to a 300-L bioreactor, where it showed good IPS (3.6 g/L) and GAs (670 mg/L) production. The biomass was 23.9 g/L in the 300-L bioreactor, the highest biomass production at pilot scale. According to this study, the strain GL 5.26 showed good fermentation properties with the optimized medium, and might be a candidate industrial strain after further process optimization and scale-up study.

  20. Built-Up Area Detection from High-Resolution Satellite Images Using Multi-Scale Wavelet Transform and Local Spatial Statistics

    Science.gov (United States)

    Chen, Y.; Zhang, Y.; Gao, J.; Yuan, Y.; Lv, Z.

    2018-04-01

    Recently, built-up area detection from high-resolution satellite images (HRSI) has attracted increasing attention because HRSI can provide more detailed object information. In this paper, multi-resolution wavelet transform and a local spatial autocorrelation statistic are introduced to model the spatial patterns of built-up areas. First, the input image is decomposed into high- and low-frequency subbands by wavelet transform at three levels. Then the high-frequency detail information in three directions (horizontal, vertical and diagonal) is extracted, followed by a maximization operation to integrate the information from all directions. Afterward, a cross-scale operation is implemented to fuse the different levels of information. Finally, a local spatial autocorrelation statistic is introduced to enhance the saliency of built-up features, and an adaptive threshold algorithm is used to achieve the detection of built-up areas. Experiments are conducted on ZY-3 and QuickBird panchromatic satellite images, and the results show that the proposed method is very effective for built-up area detection.
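
The detail-extraction-and-fusion step can be sketched with a single-level 2D Haar transform (pure Python, toy data). This is only a stand-in for the paper's three-level wavelet decomposition and spatial autocorrelation statistic; the image values and normalization are assumptions for illustration:

```python
# Minimal sketch of the high-frequency extraction idea: one level of a 2D
# Haar transform over non-overlapping 2x2 blocks, then a per-block max over
# the three detail directions (horizontal, vertical, diagonal). The paper
# additionally fuses three wavelet levels and applies a local spatial
# autocorrelation statistic plus adaptive thresholding on top of this.

def haar_details(img):
    """For each non-overlapping 2x2 block, return max(|LH|, |HL|, |HH|)."""
    h, w = len(img), len(img[0])
    out = []
    for i in range(0, h - 1, 2):
        row = []
        for j in range(0, w - 1, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            lh = (a + b - c - d) / 2.0  # horizontal detail
            hl = (a - b + c - d) / 2.0  # vertical detail
            hh = (a - b - c + d) / 2.0  # diagonal detail
            row.append(max(abs(lh), abs(hl), abs(hh)))
        out.append(row)
    return out

# Toy 4x4 "image": flat background next to a sharp vertical edge.
img = [
    [10, 10, 10, 90],
    [10, 10, 10, 90],
    [10, 10, 10, 90],
    [10, 10, 10, 90],
]
print(haar_details(img))  # left blocks ~0; right blocks respond to the edge
```

Flat regions produce zero detail response, while edges (dense in built-up areas) produce large responses, which is the saliency cue the method exploits.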

  1. KiDS-450: cosmological constraints from weak lensing peak statistics - I. Inference from analytical prediction of high signal-to-noise ratio convergence peaks

    Science.gov (United States)

    Shan, HuanYuan; Liu, Xiangkun; Hildebrandt, Hendrik; Pan, Chuzhong; Martinet, Nicolas; Fan, Zuhui; Schneider, Peter; Asgari, Marika; Harnois-Déraps, Joachim; Hoekstra, Henk; Wright, Angus; Dietrich, Jörg P.; Erben, Thomas; Getman, Fedor; Grado, Aniello; Heymans, Catherine; Klaes, Dominik; Kuijken, Konrad; Merten, Julian; Puddu, Emanuella; Radovich, Mario; Wang, Qiao

    2018-02-01

    This paper is the first of a series of papers constraining cosmological parameters with weak lensing peak statistics using ∼450 deg² of imaging data from the Kilo Degree Survey (KiDS-450). We measure high signal-to-noise ratio (SNR: ν) weak lensing convergence peaks in the range 3 < ν < 5, and employ theoretical models to derive expected values. These models are validated using a suite of simulations. We take into account two major systematic effects: the boost factor and the effect of baryons on the mass-concentration relation of dark matter haloes. In addition, we investigate the impacts of other potential astrophysical systematics, including the projection effects of large-scale structures, intrinsic galaxy alignments, and residual measurement uncertainties in the shear and redshift calibration. Assuming a flat Λ cold dark matter model, we find constraints of S_8 = σ_8(Ω_m/0.3)^0.5 = 0.746^{+0.046}_{-0.107} along the degeneracy direction of the cosmic shear analysis, and Σ_8 = σ_8(Ω_m/0.3)^0.38 = 0.696^{+0.048}_{-0.050} along the derived degeneracy direction of our high-SNR peak statistics. The difference between the power indices of S_8 and Σ_8 indicates that combining cosmic shear with peak statistics has the potential to break the degeneracy between σ_8 and Ω_m. Our results are consistent with the cosmic shear tomographic correlation analysis of the same data set and ∼2σ lower than the Planck 2016 results.
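
The two summary parameters are simple power-law combinations of σ_8 and Ω_m, differing only in the exponent. A quick illustration (the σ_8 and Ω_m values below are made up, not the KiDS-450 posterior):

```python
def s_param(sigma8: float, omega_m: float, alpha: float) -> float:
    """Lensing amplitude parameter: sigma8 * (Omega_m / 0.3)**alpha."""
    return sigma8 * (omega_m / 0.3) ** alpha

# Hypothetical cosmology, for illustration only.
sigma8, omega_m = 0.75, 0.3
print(s_param(sigma8, omega_m, 0.5))    # S_8    (cosmic-shear direction)
print(s_param(sigma8, omega_m, 0.38))   # Sigma_8 (peak-statistics direction)

# At the pivot Omega_m = 0.3 both reduce to sigma8 itself; away from the
# pivot the two exponents pick out different directions in the
# (sigma8, Omega_m) plane, which is why combining the two probes can
# break the degeneracy.
print(s_param(0.85, 0.25, 0.5), s_param(0.85, 0.25, 0.38))
```
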

  2. A statistical approach to the use of control entropy identifies differences in constraints of gait in highly trained versus untrained runners.

    Science.gov (United States)

    Parshad, Rana D; McGregor, Stephen J; Busa, Michael A; Skufca, Joseph D; Bollt, Erik

    2012-01-01

    Control entropy (CE) is a complexity analysis suitable for dynamic, non-stationary conditions, which allows inference of the control effort of the dynamical system generating a signal. These characteristics make CE a time-varying quantity well suited to the dynamic physiological responses associated with running. Using high-resolution accelerometry (HRA) signals, we evaluate constraints of running gait in two different groups of runners: highly trained collegiate runners and untrained runners. To this end, we further develop the control entropy (CE) statistic to allow for group analysis, comparing the non-linear characteristics of movement patterns in highly trained runners with those of untrained runners to gain insight into gaits that are optimal for running. Specifically, CE yields response time series for individuals that describe the control effort; the group analysis developed here applies Karhunen-Loeve (KL) analysis to these time series and compares the resulting group response shapes with a Hotelling T² test. We find that differences in the shape of the CE response exist within groups, between axes, for untrained runners (vertical vs. anterior-posterior and mediolateral vs. anterior-posterior) and for trained runners (mediolateral vs. anterior-posterior). Shape differences also exist between groups by axis (vertical vs. mediolateral). Further, CE as a whole was higher in each axis for trained vs. untrained runners. These results indicate that the approach can provide unique insight into the differing constraints on running gait in highly trained and untrained runners under dynamic conditions, and the final point indicates that trained runners are less constrained than untrained runners across all running speeds.

  3. A Statistical study of the Doppler spectral width of high-latitude ionospheric F-region echoes recorded with SuperDARN coherent HF radars

    Directory of Open Access Journals (Sweden)

    J.-P. Villain

    2002-11-01

    The HF radars of the Super Dual Auroral Radar Network (SuperDARN) provide measurements of the E × B drift of ionospheric plasma over extended regions of the high-latitude ionosphere. We have conducted a statistical study of the associated Doppler spectral width of ionospheric F-region echoes. The study was conducted with all available radars from the Northern Hemisphere for two specific periods of time. Period 1 corresponds to the winter months of 1994, while period 2 covers October 1996 to March 1997. The distributions of data points and average spectral width are presented as a function of magnetic latitude and magnetic local time. The databases are very consistent and exhibit the same features. The most striking features are: a region of very high spectral width, collocated with the ionospheric LLBL/cusp/mantle region; and an oval-shaped region of high spectral width, whose equatorward boundary matches the poleward limit of the Holzworth and Meng auroral oval. A simulation has been conducted to evaluate the geometrical and instrumental effects on the spectral width. It shows that these effects cannot account for the observed spectral features. It is then concluded that these specific spectral width characteristics are the signature of ionospheric/magnetospheric coupling phenomena. Key words. Ionosphere (auroral ionosphere; ionosphere-magnetosphere interactions; ionospheric irregularities)

  4. Image quality of low-dose CCTA in obese patients: impact of high-definition computed tomography and adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Gebhard, Cathérine; Fuchs, Tobias A; Fiechter, Michael; Stehli, Julia; Stähli, Barbara E; Gaemperli, Oliver; Kaufmann, Philipp A

    2013-10-01

    The accuracy of coronary computed tomography angiography (CCTA) in obese persons is compromised by increased image noise. We investigated CCTA image quality acquired on a high-definition 64-slice CT scanner using modern adaptive statistical iterative reconstruction (ASIR). Seventy overweight and obese patients (24 males; mean age 57 years, mean body mass index 33 kg/m²) were studied with clinically indicated contrast-enhanced CCTA. Thirty-five patients underwent a standard-definition protocol with filtered backprojection reconstruction (SD-FBP), while 35 patients matched for gender, age, body mass index and coronary artery calcifications underwent a novel high-definition protocol with ASIR (HD-ASIR). Segment-by-segment image quality was assessed using a four-point scale (1 = excellent, 2 = good, 3 = moderate, 4 = non-diagnostic) and revealed better scores for HD-ASIR compared to SD-FBP (1.5 ± 0.43 vs. 1.8 ± 0.48). HD-ASIR also allowed visualization of more distal vessel segments, with smaller visualized vessel diameters than the 1.4 ± 0.4 mm achieved with SD-FBP, and yielded higher vessel attenuation (388.3 ± 109.6 versus 350.6 ± 90.3 Hounsfield Units, HU, for HD-ASIR vs. SD-FBP, respectively). Compared to a standard-definition backprojection protocol (SD-FBP), a newer high-definition scan protocol in combination with ASIR (HD-ASIR) incrementally improved image quality and visualization of distal coronary artery segments in overweight and obese individuals, without increasing image noise and radiation dose.

  6. Highly efficient high temperature electrolysis

    DEFF Research Database (Denmark)

    Hauch, Anne; Ebbesen, Sune; Jensen, Søren Højgaard

    2008-01-01

    High temperature electrolysis of water and steam may provide an efficient, cost-effective and environmentally friendly production of H2 using electricity produced from sustainable, non-fossil energy sources. To achieve cost-competitive electrolysis cells that are both high performing, i.e. have minimum internal cell resistance, and long-term stable, it is critical to develop electrode materials that are optimal for steam electrolysis. In this article, electrolysis cells for electrolysis of water or steam at temperatures above 200 °C for production of H2 are reviewed. High temperature electrolysis is favourable from a thermodynamic point of view, because a part of the required energy can be supplied as thermal heat, and the activation barrier is lowered, increasing the H2 production rate. Only two types of cells operating at high temperature (above 200 °C) have been described.

  7. High Line

    DEFF Research Database (Denmark)

    Kiib, Hans

    2015-01-01

    At just over 10 meters above street level, the High Line extends three kilometers through three districts of southwestern Manhattan in New York. It consists of a simple steel construction, and previously served as an elevated rail line connection between Penn Station on 34th Street and the many … The High Line project has been carried out as part of an open conversion strategy. The result is a remarkable urban architectural project, which works as a catalyst for the urban development of western Manhattan. The greater project includes the restoration and reuse of many old industrial buildings …

  8. Seeing beyond statistics: Examining the potential for disjuncture between legislation, policy and practice in meeting the needs of highly able Scottish students

    Directory of Open Access Journals (Sweden)

    Niamh Stack

    2015-03-01

    The question of how best to identify and provide for gifted students has a long and contentious history internationally. In contrast to other countries, where there are specialist programmes and in some cases specialist teachers for gifted pupils, Scotland has chosen to adopt an inclusive approach to provision for these students and has created a legislative and curricular framework that in theory provides a strong structure for meeting their educational and developmental needs. While there are significant benefits to this approach, care must be taken to ensure that, within the space between intention and practice, the needs of these learners have been explicitly identified, considered and met. Each year the Scottish Government conducts a census to collect data from all publicly funded schools in Scotland. In accordance with Scottish legislation, this process gathers data on pupils identified as requiring additional support for their learning, including highly able pupils. However, there are anomalies within these data: for example, there are unusual and unexplained discrepancies between the proportions of pupils identified as highly able in different geographical contexts. The purpose of the present study was therefore to examine the potential causes of these anomalies and to assess the implications for the identification of, and provision for, highly able pupils in Scotland. Thirteen structured telephone interviews were conducted with Local Education Authority personnel across Scotland. These interviews aimed to get behind the statistics and examine how highly able pupils are identified, and provided for, in practice. Several interesting issues emerged from the interviews that may begin to explain the anomalies and help us better understand everyday practice. The results, while encouraging, suggest that there is a need for teachers, educational psychologists, schools and authorities to ensure that the …

  9. High Class

    Science.gov (United States)

    Waldecker, Mark

    2005-01-01

    Education administrators make buying decisions for furniture based on many factors. Cost, durability, functionality, safety and aesthetics represent just a few. Those issues will always be important, but the role furniture plays in creating positive, high-performance learning environments has gained greater recognition in recent years.

  10. High Turbulence

    CERN Multimedia

    EuHIT, Collaboration

    2015-01-01

    As a member of the EuHIT (European High-Performance Infrastructures in Turbulence) consortium, CERN is participating in fundamental research on turbulence phenomena. To this end, the Laboratory provides European researchers with a cryogenic research infrastructure, where the first tests have just been performed.

  11. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures for evaluating the statistical significance of local trend scores have limited its application to high-throughput time series data, e.g., data from studies based on next-generation sequencing technology. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with a delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and by permutation are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data, where we found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
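
The slow permutation baseline that the approximation replaces can be sketched as follows (pure Python; the score here is a simplified sign-match local trend score with no time delay, not eLSA's exact statistic, and the series are made up):

```python
import random

def trend(series):
    """Discretize a series into local trends: +1 up, -1 down, 0 flat."""
    return [(b > a) - (b < a) for b, a in zip(series[1:], series)]

def local_trend_score(x, y):
    """Maximal sum of trend agreements over any contiguous window,
    i.e. a max-subarray over elementwise trend products (no delay)."""
    best = cur = 0
    for a, b in zip(trend(x), trend(y)):
        cur = max(0, cur + a * b)
        best = max(best, cur)
    return best

def permutation_p_value(x, y, n_perm=1000, rng=random.Random(0)):
    """P(permuted score >= observed): the slow baseline the paper's
    Markov-chain approximation is designed to replace."""
    observed = local_trend_score(x, y)
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if local_trend_score(x, y) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # pseudocount keeps p > 0

x = [1, 2, 3, 4, 3, 2, 1, 2, 3, 4, 5, 4]
y = [0, 1, 2, 3, 2, 1, 0, 1, 2, 3, 4, 3]  # co-changes with x
print(local_trend_score(x, y), permutation_p_value(x, y))
```

Each pairwise p-value costs `n_perm` re-scores here, which is exactly why a closed-form approximation is needed for all-versus-all comparisons across thousands of series.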

  12. Empirical-statistical downscaling of reanalysis data to high-resolution air temperature and specific humidity above a glacier surface (Cordillera Blanca, Peru)

    Science.gov (United States)

    Hofer, Marlis; MöLg, Thomas; Marzeion, Ben; Kaser, Georg

    2010-06-01

    Recently initiated observation networks in the Cordillera Blanca (Peru) provide temporally high-resolution, yet short-term, atmospheric data. The aim of this study is to extend the existing time series into the past. We present an empirical-statistical downscaling (ESD) model that links 6-hourly National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) reanalysis data to air temperature and specific humidity, measured at the tropical glacier Artesonraju (northern Cordillera Blanca). The ESD modeling procedure includes combined empirical orthogonal function and multiple regression analyses and a double cross-validation scheme for model evaluation. Apart from the selection of predictor fields, the modeling procedure is automated and does not include subjective choices. We assess the ESD model sensitivity to the predictor choice using both single-field and mixed-field predictors. Statistical transfer functions are derived individually for different months and times of day. The forecast skill largely depends on month and time of day, ranging from 0 to 0.8. The mixed-field predictors perform better than the single-field predictors. The ESD model shows added value, at all time scales, against simpler reference models (e.g., the direct use of reanalysis grid point values). The ESD model forecast 1960-2008 clearly reflects interannual variability related to the El Niño/Southern Oscillation but is sensitive to the chosen predictor type.
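
The regression core of such an ESD model can be sketched in a few lines (pure Python; the predictor values and station temperatures below are synthetic, and the real model additionally compresses the predictor fields with EOFs and evaluates skill by double cross-validation):

```python
# Minimal sketch of the regression step in empirical-statistical downscaling:
# fit local air temperature as a linear function of large-scale predictors
# (e.g. reanalysis values at nearby grid points), via ordinary least squares.

def fit_linear(X, y):
    """OLS via normal equations, solved with a tiny Gaussian elimination."""
    Xa = [[1.0] + list(row) for row in X]          # prepend intercept column
    k = len(Xa[0])
    A = [[sum(r[i] * r[j] for r in Xa) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(Xa, y)) for i in range(k)]
    for col in range(k):                           # forward elimination
        p = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                               # back substitution
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

def predict(beta, row):
    return beta[0] + sum(w * v for w, v in zip(beta[1:], row))

# Synthetic "reanalysis" predictors and a station temperature depending
# linearly on them: T = 2 + 0.5*p1 - 1.5*p2.
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (3, 2)]
y = [2 + 0.5 * p1 - 1.5 * p2 for p1, p2 in X]
beta = fit_linear(X, y)
print([round(w, 6) for w in beta])  # recovers [2.0, 0.5, -1.5]
```

Fitting such transfer functions separately per month and time of day, as the abstract describes, amounts to repeating this fit on the corresponding data subsets.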

  13. Highly dominating, highly authoritarian personalities.

    Science.gov (United States)

    Altemeyer, Bob

    2004-08-01

    The author considered the small part of the population whose members score highly on both the Social Dominance Orientation scale and the Right-Wing Authoritarianism scale. Studies of these High SDO-High RWAs, culled from samples of nearly 4000 Canadian university students and over 2600 of their parents and reported in the present article, reveal that these dominating authoritarians are among the most prejudiced persons in society. Furthermore, they seem to combine the worst elements of each kind of personality, being power-hungry, unsupportive of equality, manipulative, and amoral, as social dominators are in general, while also being religiously ethnocentric and dogmatic, as right-wing authoritarians tend to be. The author suggested that, although they are small in number, such persons can have considerable impact on society because they are well-positioned to become the leaders of prejudiced right-wing political movements.

  14. Statistical characterization of high-to-medium frequency mesoscale gravity waves by lidar-measured vertical winds and temperatures in the MLT

    Science.gov (United States)

    Lu, Xian; Chu, Xinzhao; Li, Haoyu; Chen, Cao; Smith, John A.; Vadas, Sharon L.

    2017-09-01

    We present the first statistical study of gravity waves with periods of 0.3-2.5 h that are persistent and dominant in the vertical winds measured with the University of Colorado STAR Na Doppler lidar in Boulder, CO (40.1°N, 105.2°W). The probability density functions of the wave amplitudes in temperature and vertical wind, ratios of these two amplitudes, phase differences between them, and vertical wavelengths are derived directly from the observations. The intrinsic period and horizontal wavelength of each wave are inferred from its vertical wavelength, amplitude ratio, and a designated eddy viscosity by applying the gravity wave polarization and dispersion relations. The amplitude ratios are positively correlated with the ground-based periods with a coefficient of 0.76. The phase differences between the vertical winds and temperatures (φW -φT) follow a Gaussian distribution with a mean of 84.2° and a standard deviation of 26.7°, a much larger standard deviation than that predicted for non-dissipative waves (~3.3°). The deviations of the observed phase differences from their predicted values for non-dissipative waves may indicate wave dissipation. The shorter-vertical-wavelength waves tend to have larger phase difference deviations, implying that the dissipative effects are more significant for shorter waves. The majority of these waves have vertical wavelengths ranging from 5 to 40 km with a mean and standard deviation of 18.6 and 7.2 km, respectively. For waves with similar periods, multiple peaks in the vertical wavelengths are identified frequently, and the ones peaking in the vertical wind are statistically longer than those peaking in the temperature. The horizontal wavelengths range mostly from 50 to 500 km with a mean and median of 180 and 125 km, respectively. Therefore, these waves are mesoscale waves with high-to-medium frequencies. Since they have recently become resolvable in high-resolution general circulation models (GCMs), this statistical study provides an important

  15. CFD simulation of CO_2 sorption on K_2CO_3 solid sorbent in novel high flux circulating-turbulent fluidized bed riser: Parametric statistical experimental design study

    International Nuclear Information System (INIS)

    Thummakul, Theeranan; Gidaspow, Dimitri; Piumsomboon, Pornpote; Chalermsinsuwan, Benjapon

    2017-01-01

    Highlights: • Circulating-turbulent fluidization was shown to be advantageous for CO_2 sorption. • The novel regime was shown to capture CO_2 better than the conventional regimes. • A uniform solid particle distribution was observed in the novel fluidization regime. • The process system continuity had a stronger effect than the process system mixing. • A parametric experimental design analysis was performed to identify the significant factors. - Abstract: In this study, a high flux circulating-turbulent fluidized bed (CTFB) riser was confirmed to be advantageous for carbon dioxide (CO_2) sorption on a potassium carbonate solid sorbent. The effect of various parameters on the CO_2 removal level was evaluated using a statistical experimental design. The most appropriate fluidization regime was found to occur between the turbulent and fast fluidization regimes, and it was shown to capture CO_2 more efficiently than conventional fluidization regimes. The highest CO_2 sorption level was 93.4% under optimized CTFB operating conditions. The important parameters for CO_2 capture were the inlet gas velocity and the interactions between the CO_2 concentration and the inlet gas velocity and water vapor concentration. The CTFB regime had a high and uniform solid particle distribution in both the axial and radial system directions and could transport the solid sorbent to the regeneration reactor. In addition, the process system continuity had a stronger effect on the CO_2 removal level in the system than the process system mixing.

  16. Classification of the medicinal plants of the genus Atractylodes using high-performance liquid chromatography with diode array and tandem mass spectrometry detection combined with multivariate statistical analysis.

    Science.gov (United States)

    Cho, Hyun-Deok; Kim, Unyong; Suh, Joon Hyuk; Eom, Han Young; Kim, Junghyun; Lee, Seul Gi; Choi, Yong Seok; Han, Sang Beom

    2016-04-01

    Analytical methods using high-performance liquid chromatography with diode array and tandem mass spectrometry detection were developed for the discrimination of the rhizomes of four Atractylodes medicinal plants: A. japonica, A. macrocephala, A. chinensis, and A. lancea. A quantitative study was performed, selecting five bioactive components, including atractylenolide I, II, III, eudesma-4(14),7(11)-dien-8-one and atractylodin, on twenty-six Atractylodes samples of various origins. Sample extraction was optimized to sonication with 80% methanol for 40 min at room temperature. High-performance liquid chromatography with diode array detection was established using a C18 column with a water/acetonitrile gradient system at a flow rate of 1.0 mL/min, and the detection wavelength was set at 236 nm. Liquid chromatography with tandem mass spectrometry was applied to certify the reliability of the quantitative results. The developed methods were validated by ensuring specificity, linearity, limit of quantification, accuracy, precision, recovery, robustness, and stability. Results showed that cangzhu contained higher amounts of atractylenolide I and atractylodin than baizhu, and especially atractylodin contents showed the greatest variation between baizhu and cangzhu. Multivariate statistical analysis, such as principal component analysis and hierarchical cluster analysis, were also employed for further classification of the Atractylodes plants. The established method was suitable for quality control of the Atractylodes plants. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
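
The multivariate step (autoscaling, principal component analysis, and hierarchical cluster analysis) can be sketched as follows. The five "compound" columns and the peak-area values are invented for illustration and do not reproduce the paper's measurements.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical peak areas for 5 marker compounds in two plant groups
# ("baizhu"-like: low atractylodin; "cangzhu"-like: high atractylodin).
baizhu = rng.normal([1.0, 0.8, 1.2, 0.5, 0.2], 0.05, size=(10, 5))
cangzhu = rng.normal([1.8, 0.7, 1.1, 0.6, 1.5], 0.05, size=(10, 5))
data = np.vstack([baizhu, cangzhu])

# Autoscale (unit variance per compound), then PCA via SVD.
z = (data - data.mean(0)) / data.std(0)
U, S, Vt = np.linalg.svd(z, full_matrices=False)
scores = U[:, :2] * S[:2]            # PC1/PC2 scores, e.g. for a score plot

# Hierarchical cluster analysis (Ward linkage) on the autoscaled data.
labels = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
```

With well-separated marker contents, the dendrogram cut at two clusters recovers the two groups, mirroring how PCA/HCA separated baizhu from cangzhu in the paper.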

  17. Using iterative cluster merging with improved gap statistics to perform online phenotype discovery in the context of high-throughput RNAi screens

    Directory of Open Access Journals (Sweden)

    Sun Youxian

    2008-06-01

    Full Text Available Abstract Background The recent emergence of high-throughput automated image acquisition technologies has forever changed how cell biologists collect and analyze data. Historically, the interpretation of cellular phenotypes in different experimental conditions has been dependent upon the expert opinions of well-trained biologists. Such qualitative analysis is particularly effective in detecting subtle, but important, deviations in phenotypes. However, while the rapid and continuing development of automated microscope-based technologies now facilitates the acquisition of trillions of cells in thousands of diverse experimental conditions, such as in the context of RNA interference (RNAi) or small-molecule screens, the massive size of these datasets precludes human analysis. Thus, the development of automated methods which aim to identify novel and biologically relevant phenotypes online is one of the major challenges in high-throughput image-based screening. Ideally, phenotype discovery methods should be designed to utilize prior/existing information and tackle three challenging tasks, i.e. restoring pre-defined biologically meaningful phenotypes, differentiating novel phenotypes from known ones and distinguishing novel phenotypes from each other. Arbitrarily extracted information causes biased analysis, while combining the complete existing datasets with each new image is intractable in high-throughput screens. Results Here we present the design and implementation of a novel and robust online phenotype discovery method with broad applicability that can be used in diverse experimental contexts, especially high-throughput RNAi screens. This method features phenotype modelling and iterative cluster merging using improved gap statistics. A Gaussian Mixture Model (GMM) is employed to estimate the distribution of each existing phenotype, and then used as reference distribution in gap statistics. 
This method is broadly applicable to a number of different types of
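
The gap statistic at the core of the merging procedure can be sketched in its basic Tibshirani et al. form (the paper uses an improved variant with phenotype-model reference distributions); the data, cluster counts, and the minimal k-means below are synthetic illustrations.

```python
import numpy as np

def kmeans(data, k, n_iter=50, seed=0):
    """Minimal k-means with farthest-point initialization."""
    rng = np.random.default_rng(seed)
    centers = [data[rng.integers(len(data))]]
    for _ in range(k - 1):
        d = np.min([((data - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(data[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = ((data[:, None] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([data[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

def gap(data, k, n_ref=10, seed=0):
    """Gap statistic: E[log W_k] under a uniform reference minus log W_k."""
    def log_wk(pts):
        centers, labels = kmeans(pts, k, seed=seed)
        return np.log(((pts - centers[labels]) ** 2).sum())
    rng = np.random.default_rng(seed)
    lo, hi = data.min(0), data.max(0)
    refs = [log_wk(rng.uniform(lo, hi, data.shape)) for _ in range(n_ref)]
    return np.mean(refs) - log_wk(data)

rng = np.random.default_rng(1)
# Two well-separated synthetic "phenotype" clusters in a 2-D feature space.
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
gaps = {k: gap(data, k) for k in (1, 2, 3)}
```

The k at which the gap stops improving markedly suggests the number of distinct phenotypes; the improved gap statistic refines this decision rule for the cluster-merging loop.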

  18. Investigation of the Effects of High-Intensity, Intermittent Exercise and Unanticipation on Trunk and Lower Limb Biomechanics During a Side-Cutting Maneuver Using Statistical Parametric Mapping.

    Science.gov (United States)

    Whyte, Enda F; Richter, Chris; O'Connor, Siobhan; Moran, Kieran A

    2018-06-01

    Whyte, EF, Richter, C, O'Connor, S, and Moran, KA. Investigation of the effects of high-intensity, intermittent exercise and unanticipation on trunk and lower limb biomechanics during a side-cutting maneuver using statistical parametric mapping. J Strength Cond Res 32(6): 1583-1593, 2018-Anterior cruciate ligament (ACL) injuries frequently occur during side-cutting maneuvers when fatigued or reacting to the sporting environment. Trunk and hip biomechanics are proposed to influence ACL loading during these activities. However, the effects of fatigue and unanticipation on the biomechanics of the kinetic chain may be limited by traditional discrete point analysis. We recruited 28 male, varsity, Gaelic footballers (21.7 ± 2.2 years; 178.7 ± 14.6 cm; 81.8 ± 11.4 kg) to perform anticipated and unanticipated side-cutting maneuvers before and after a high-intensity, intermittent exercise protocol (HIIP). Statistical parametric mapping (repeated-measures analysis of variance) identified differences in phases of trunk and stance leg biomechanics during weight acceptance. Unanticipation resulted in less trunk flexion (p < 0.001) and greater side flexion away from the direction of cut (p < 0.001). This led to smaller (internal) knee flexor and greater (internal) knee extensor (p = 0.002-0.007), hip adductor (p = 0.005), and hip external rotator (p = 0.007) moments. The HIIP resulted in increased trunk flexion (p < 0.001) and side flexion away from the direction of cut (p = 0.038), resulting in smaller (internal) knee extensor moments (p = 0.006). One interaction effect was noted demonstrating greater hip extensor moments in the unanticipated condition post-HIIP (p = 0.025). Results demonstrate that unanticipation resulted in trunk kinematics considered an ACL injury risk factor. A subsequent increase in frontal and transverse plane hip loading and sagittal plane knee loading was observed, which may increase ACL strain. Conversely, HIIP-induced trunk kinematic alterations

  19. Comparison of past and future Mediterranean high and low extremes of precipitation and river flow projected using different statistical downscaling methods

    Directory of Open Access Journals (Sweden)

    P. Quintana-Seguí

    2011-05-01

    Full Text Available The extremes of precipitation and river flow obtained using three different statistical downscaling methods applied to the same regional climate simulation have been compared. The methods compared are the anomaly method, quantile mapping and weather typing. The hydrological model used in the study is distributed and it is applied to the Mediterranean basins of France. The study shows that both the quantile mapping and weather typing methods are able to reproduce the high and low precipitation extremes in the region of interest. The study also shows that when the hydrological model is forced with these downscaled data, there are important differences in the outputs. This shows that the model amplifies the differences and that the downscaling of other atmospheric variables might be very relevant when simulating river discharges. In terms of river flow, the anomaly method, which is very simple, performs better than expected. The methods produce qualitatively similar future scenarios of the extremes of river flow. However, quantitatively, there are still significant differences between them for each individual gauging station. According to these scenarios, it is expected that in the middle of the 21st century (2035–2064), the monthly low flows will have diminished almost everywhere in the region of our study by as much as 20 %. Regarding high flows, there will be important increases in the area of the Cévennes, which is already seriously affected by flash-floods. For some gauging stations in this area, the frequency of what was a 10-yr return flood at the end of the 20th century is expected to increase, with such return floods then occurring every two years in the middle of the 21st century. Similarly, the 10-yr return floods at that time are expected to carry 100 % more water than the 10-yr return floods experienced at the end of the 20th century. In the northern part of the Rhône basin, these extremes will be reduced.
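
Of the three downscaling methods, quantile mapping is the easiest to sketch: each model value is mapped onto the observed distribution by matching empirical quantiles over a reference period. The gamma-distributed "precipitation" and the linear model bias below are invented for illustration.

```python
import numpy as np

def quantile_map(model, obs_ref, model_ref):
    """Map model values onto the observed distribution by matching
    empirical quantiles (a standard bias-correction/downscaling step)."""
    ranks = np.searchsorted(np.sort(model_ref), model, side="right") / len(model_ref)
    ranks = np.clip(ranks, 1e-6, 1 - 1e-6)
    return np.quantile(obs_ref, ranks)

rng = np.random.default_rng(0)
obs_ref = rng.gamma(2.0, 2.0, 5000)        # "observed" reference climatology
model_ref = 0.5 * obs_ref + 1.0            # biased model over the same period
future = 0.5 * rng.gamma(2.0, 2.0, 5000) + 1.0   # biased "future" simulation
corrected = quantile_map(future, obs_ref, model_ref)
```

Because the mapping is fitted on the whole distribution, it corrects the tails (and hence extremes) rather than just the mean, which is why it competes well with the simpler anomaly method here.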

  20. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method

    Science.gov (United States)

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC. PMID:29618847
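
The two ingredients named above, a skewed-triangle layer-moisture PDF and a (single-try) Metropolis sampler, can be sketched as follows. The triangle parameters, proposal width, and chain length are illustrative, not values from the paper.

```python
import numpy as np

def triangle_pdf(x, lo=0.0, mode=0.3, hi=1.0):
    """Skewed-triangle density on [lo, hi] with peak at `mode`."""
    x = np.asarray(x, dtype=float)
    up = 2 * (x - lo) / ((hi - lo) * (mode - lo))
    down = 2 * (hi - x) / ((hi - lo) * (hi - mode))
    return np.where((x >= lo) & (x <= hi), np.where(x <= mode, up, down), 0.0)

def metropolis(pdf, x0, n, step=0.2, seed=0):
    """Random-walk Metropolis chain targeting an (unnormalized) density."""
    rng = np.random.default_rng(seed)
    x, px, samples = x0, pdf(x0), []
    for _ in range(n):
        xp = x + step * rng.standard_normal()
        pxp = pdf(xp)
        # Accept with probability min(1, pi(x')/pi(x)); no gradient needed,
        # so the chain can jump into any region of non-zero probability.
        if rng.random() < pxp / max(px, 1e-300):
            x, px = xp, pxp
        samples.append(x)
    return np.array(samples)

chain = metropolis(triangle_pdf, 0.5, 20000)
burn = chain[2000:]          # discard burn-in
```

Because acceptance needs only density ratios, a chain started from a clear (subsaturated) state can still jump into cloudy regions of the posterior, which is the property the article highlights over linearized assimilation.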

  1. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 1: Method

    Science.gov (United States)

    Norris, Peter M.; Da Silva, Arlindo M.

    2016-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.

  2. High energy

    International Nuclear Information System (INIS)

    Bonner, B.E.; Roberts, J.B. Jr.

    1993-01-01

    We report here on progress made for the period from December 1, 1992 (the date of submission of our latest progress report) to November 30, 1993 for DOE Grant No. DE-FG05-92ER40717. The new results from the SMC experiment have generated a buzz of theoretical activity. Our involvement with the D0 experiment and the upgrade has increased substantially during the past two years so that we now have six people heavily committed and making what can only be described as a large and disproportionate impact on D0 physics output. Some of the new developments made here at Rice in Neural Network and Probability Density Estimation techniques for data analysis promise to have applications both in D0 and beyond. We report a number of new results from our high-p_T jet photoproduction experiment. In addition we have been working on KTeV, albeit without having adequate funding for this work. Progress on the theoretical front has been nothing short of amazing, as is reported herein. In a grand lecture tour during this sabbatical year, Paul Stevenson has already reported his breakthroughs at ten institutions, including CERN, Oxford, Cambridge, Rutherford Lab, Imperial College, and Durham University. The group at Rice University has had an exceptionally productive year and we are justifiably proud of the progress which is reported here.

  3. Error correction and statistical analyses for intra-host comparisons of feline immunodeficiency virus diversity from high-throughput sequencing data.

    Science.gov (United States)

    Liu, Yang; Chiaromonte, Francesca; Ross, Howard; Malhotra, Raunaq; Elleder, Daniel; Poss, Mary

    2015-06-30

    in G to A substitutions, but found no evidence for this host defense strategy. Our error correction approach for minor allele frequencies (more sensitive and computationally efficient than other algorithms) and our statistical treatment of variation (ANOVA) were critical for effective use of high-throughput sequencing data in understanding viral diversity. We found that co-infection with PLV shifts FIV diversity from bone marrow to lymph node and spleen.

  4. Statistical reconstruction for cone-beam CT with a post-artifact-correction noise model: application to high-quality head imaging

    International Nuclear Information System (INIS)

    Dang, H; Stayman, J W; Sisniega, A; Xu, J; Zbijewski, W; Siewerdsen, J H; Wang, X; Foos, D H; Aygun, N; Koliatsos, V E

    2015-01-01

    Non-contrast CT reliably detects fresh blood in the brain and is the current front-line imaging modality for intracranial hemorrhage such as that occurring in acute traumatic brain injury (contrast ∼40–80 HU, size > 1 mm). We are developing flat-panel detector (FPD) cone-beam CT (CBCT) to facilitate such diagnosis in a low-cost, mobile platform suitable for point-of-care deployment. Such a system may offer benefits in the ICU, urgent care/concussion clinic, ambulance, and sports and military theatres. However, current FPD-CBCT systems face significant challenges that confound low-contrast, soft-tissue imaging. Artifact correction can overcome major sources of bias in FPD-CBCT but imparts noise amplification in filtered backprojection (FBP). Model-based reconstruction improves soft-tissue image quality compared to FBP by leveraging a high-fidelity forward model and image regularization. In this work, we develop a novel penalized weighted least-squares (PWLS) image reconstruction method with a noise model that includes accurate modeling of the noise characteristics associated with the two dominant artifact corrections (scatter and beam-hardening) in CBCT and utilizes modified weights to compensate for noise amplification imparted by each correction. Experiments included real data acquired on an FPD-CBCT test-bench and an anthropomorphic head phantom emulating intra-parenchymal hemorrhage. The proposed PWLS method demonstrated superior noise-resolution tradeoffs in comparison to FBP and PWLS with conventional weights (viz. at matched 0.50 mm spatial resolution, CNR = 11.9 compared to CNR = 5.6 and CNR = 9.9, respectively) and substantially reduced image noise especially in challenging regions such as the skull base. The results support the hypothesis that with high-fidelity artifact correction and statistical reconstruction using an accurate post-artifact-correction noise model, FPD-CBCT can achieve image quality allowing reliable detection of
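
The PWLS idea, a data-fidelity term weighted by inverse noise variance plus a roughness penalty, can be sketched on a toy 1-D problem. The "projector", spatially varying noise model, and penalty weight below are invented stand-ins, not the authors' CBCT forward model or their post-artifact-correction weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-D analogue of reconstruction: y = A x + noise, where rows of A
# are overlapping "ray" averages of the unknown profile x.
n = 60
x_true = np.zeros(n)
x_true[20:35] = 1.0                                # a "lesion" on flat background
A = np.zeros((n, n))
for i in range(n):
    A[i, max(0, i - 2):i + 3] = 1.0 / 5.0          # 5-sample boxcar "projector"
sigma = 0.02 + 0.1 * np.linspace(0, 1, n)          # spatially varying noise level
y = A @ x_true + sigma * rng.standard_normal(n)

# PWLS: weights are inverse noise variance. After artifact corrections, the
# variance model changes and the weights would be modified accordingly.
W = np.diag(1.0 / sigma**2)
D = (np.eye(n) - np.eye(n, k=1))[:-1]              # first-difference roughness penalty
beta = 0.5
x_hat = np.linalg.solve(A.T @ W @ A + beta * D.T @ D, A.T @ W @ y)
```

The closed-form solve works only for this tiny quadratic-penalty toy; real CBCT-scale PWLS is solved iteratively, and the paper's contribution is the noise model behind the weights, not the optimizer.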

  5. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results

    Science.gov (United States)

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational–Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by

  6. Monte Carlo Bayesian Inference on a Statistical Model of Sub-gridcolumn Moisture Variability Using High-resolution Cloud Observations . Part II; Sensitivity Tests and Results

    Science.gov (United States)

    da Silva, Arlindo M.; Norris, Peter M.

    2013-01-01

    Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.

  7. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 2: Sensitivity Tests and Results

    Science.gov (United States)

    Norris, Peter M.; da Silva, Arlindo M.

    2016-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by

  8. Hemispheric Differences in White Matter Microstructure between Two Profiles of Children with High Intelligence Quotient vs. Controls: A Tract-Based Spatial Statistics Study

    Science.gov (United States)

    Nusbaum, Fanny; Hannoun, Salem; Kocevar, Gabriel; Stamile, Claudio; Fourneret, Pierre; Revol, Olivier; Sappey-Marinier, Dominique

    2017-01-01

    Objectives: The main goal of this study was to investigate and compare the neural substrate of two children's profiles of high intelligence quotient (HIQ). Methods: Two groups of HIQ children were included with either a homogeneous (Hom-HIQ: n = 20) or a heterogeneous IQ profile (Het-HIQ: n = 24) as defined by a significant difference between verbal comprehension index and perceptual reasoning index. Diffusion tensor imaging was used to assess white matter (WM) microstructure while tract-based spatial statistics (TBSS) analysis was performed to detect and localize WM regional differences in fractional anisotropy (FA), mean diffusivity, axial (AD), and radial diffusivities. Quantitative measurements were performed on 48 regions and 21 fiber-bundles of WM. Results: Hom-HIQ children presented higher FA than Het-HIQ children in widespread WM regions including central structures, and associative intra-hemispheric WM fasciculi. AD was also greater in numerous WM regions of Total-HIQ, Hom-HIQ, and Het-HIQ groups when compared to the Control group. Hom-HIQ and Het-HIQ groups also differed by their hemispheric lateralization in AD differences compared to Controls. Het-HIQ and Hom-HIQ groups showed a lateralization ratio (left/right) of 1.38 and 0.78, respectively. Conclusions: These findings suggest that both inter- and intra-hemispheric WM integrity are enhanced in HIQ children and that neural substrate differs between Hom-HIQ and Het-HIQ. The left hemispheric lateralization of Het-HIQ children is concordant with their higher verbal index while the relative right hemispheric lateralization of Hom-HIQ children is concordant with their global brain processing and adaptation capacities as evidenced by their homogeneous IQ. PMID:28420955

  9. Application of the kurtosis statistic to the evaluation of the risk of hearing loss in workers exposed to high-level complex noise.

    Science.gov (United States)

    Zhao, Yi-Ming; Qiu, Wei; Zeng, Lin; Chen, Shan-Song; Cheng, Xiao-Ru; Davis, Robert I; Hamernik, Roger P

    2010-08-01

    To develop dose-response relations for two groups of industrial workers exposed to Gaussian or non-Gaussian (complex) types of continuous noise and to investigate what role, if any, the kurtosis statistic can play in the evaluation of industrial noise-induced hearing loss (NIHL). Audiometric and noise exposure data were acquired on a population (N = 195) of screened workers from a textile manufacturing plant and a metal fabrication facility located in Henan province of China. Thirty-two of the subjects were exposed to non-Gaussian (non-G) noise and 163 were exposed to a Gaussian (G) continuous noise. Each subject was given a general physical and an otologic examination. Hearing threshold levels (0.5-8.0 kHz) were age adjusted (ISO 1999) and the prevalence of NIHL at 3, 4, or 6 kHz was determined. The kurtosis metric, which is sensitive to the peak and temporal characteristics of a noise, was introduced into the calculation of the cumulative noise exposure metric. Using the prevalence of hearing loss and the cumulative noise exposure metric, a dose-response relation for the G and non-G noise-exposed groups was constructed. An analysis of the noise environments in the two plants showed that the noise exposures in the textile plant were of a Gaussian type with an Leq(A)8hr that varied from 96 to 105 dB whereas the exposures in the metal fabrication facility with an Leq(A)8hr = 95 dB were of a non-G type containing high levels (up to 125 dB peak SPL) of impact noise. The kurtosis statistic was used to quantify the deviation of the non-G noise environment from the Gaussian. The dose-response relation for the non-G noise-exposed subjects showed a higher prevalence of hearing loss for a comparable cumulative noise exposure than did the G noise-exposed subjects. By introducing the kurtosis variable into the temporal component of the cumulative noise exposure calculation, the two dose-response curves could be made to overlap, essentially yielding an equivalent noise
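
The kurtosis metric itself is straightforward to compute. The sketch below contrasts a Gaussian noise record with a synthetic "complex" record containing sparse high-level impacts (both invented); it does not reproduce the paper's kurtosis-adjusted cumulative-noise-exposure formula.

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis beta = E[(x - mu)^4] / sigma^4 (Gaussian: ~3)."""
    x = np.asarray(x, float)
    z = x - x.mean()
    return np.mean(z**4) / np.mean(z**2) ** 2

rng = np.random.default_rng(0)
gaussian_noise = rng.standard_normal(100_000)

# "Complex" noise: comparable continuous energy, plus sparse high-peak impacts.
impacts = np.zeros(100_000)
idx = rng.choice(100_000, 200, replace=False)
impacts[idx] = 20 * rng.standard_normal(200)
complex_noise = gaussian_noise * 0.9 + impacts

k_g = kurtosis(gaussian_noise)   # near 3 for Gaussian noise
k_c = kurtosis(complex_noise)    # far above 3 when impacts are present
```

The paper folds a kurtosis term into the temporal component of the cumulative noise exposure; the point of this sketch is only that the statistic is near 3 for Gaussian noise and grows sharply when high-level impact noise is present, even at similar energy.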

  10. Treatment of high-level radioactive wastes. Statistic and thematic analysis of works state in the world, in Russia and in Ukraine

    International Nuclear Information System (INIS)

    Trofimenko, A.P.; Pisanko, Zh.I.

    2000-01-01

    The INIS information database was used to study the problem of high-level radioactive waste management. The main directions of work were determined from the subject content of the key words most frequently assigned to particular publications. The problems of site selection for waste storage, the ecological and social aspects of waste management, and the role of particular radionuclides in it were studied. The dynamics of work development over time, and the contributions of different countries, in particular Russia and Ukraine, to this problem were considered

  11. Infodemiological data of high-school drop-out related web searches in Canada correlating with real-world statistical data in the period 2004–2012

    Directory of Open Access Journals (Sweden)

    Anna Siri

    2016-12-01

    Examining the data broken down by gender, the correlations were higher and statistically significant in males than in females. The GT-based data for drop-out were best modeled by an ARMA(1,0) model. Considering the cross-correlations of the Canadian regions, all were statistically significant at lag 0, apart from New Brunswick, Newfoundland and Labrador, and Prince Edward Island. A number of cross-correlations were also statistically significant at lag −1 (namely, Alberta, Manitoba, New Brunswick and Saskatchewan).
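    The lag-0 and lag −1 cross-correlations described above can be illustrated with a minimal sketch (pure Python, with made-up series rather than the Google Trends or drop-out data):

```python
def xcorr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Toy example: y is x delayed by one step, so the correlation peaks at lag 1
x = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0, 5.0, 3.0]
y = [0.0] + x[:-1]
print(round(xcorr(x, y, 1), 6))  # 1.0: perfect correlation at the matching lag
```

    A significant correlation at a negative lag, as reported for several provinces, indicates that one series leads the other by that many time steps.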

  12. Permafrost Distribution along the Qinghai-Tibet Engineering Corridor, China Using High-Resolution Statistical Mapping and Modeling Integrated with Remote Sensing and GIS

    Directory of Open Access Journals (Sweden)

    Fujun Niu

    2018-02-01

    Full Text Available Permafrost distribution in the Qinghai-Tibet Engineering Corridor (QTEC) is of growing interest due to the increase in infrastructure development in this remote area. Empirical models of mountain permafrost distribution have been established based on field-sampled data, as a tool for regional-scale assessments of its distribution. This kind of model approach had never been applied to a large portion of this engineering corridor. In the present study, the methodology is applied to map permafrost distribution throughout the QTEC. After spatial modelling of the mean annual air temperature distribution from MODIS-LST and DEM data, and using high-resolution satellite images to interpret land surface types, a permafrost probability index was obtained. The evaluation results indicate that the model has an acceptable performance. Conditions highly favorable to permafrost presence (≥70%) are predicted for 60.3% of the study area, indicating a discontinuous permafrost distribution in the QTEC. This map is useful for infrastructure development along the QTEC. In the future, local ground-truth observations will be required to confirm permafrost presence in favorable areas and to monitor permafrost evolution under the influence of climate change.

  13. Statistical analysis of causes of death (2005-2010) in villages of Simav Plain, Turkey, with high arsenic levels in drinking water supplies.

    Science.gov (United States)

    Gunduz, Orhan; Bakar, Coskun; Simsek, Celalettin; Baba, Alper; Elci, Alper; Gurleyuk, Hakan; Mutlu, Merdiye; Cakir, Ayse

    2015-01-01

    The purpose of this research was to compare the causes of death in 5 villages situated in Simav Plain, Turkey, during 2005-2010, where different arsenic levels were detected in drinking water supplies. Since groundwater in Simav Plain had arsenic concentrations that ranged between 7.1 and 833.9 ppb, a two-phase research program was formulated. In the first phase, public health surveys were conducted with 1,003 villagers to determine the distribution of diseases. In the second phase, verbal autopsy surveys and official death records were used to investigate the causes of death. In total, 402 death cases were found in the study area, where cardiovascular system diseases (44%) and cancers (15.2%) were the major causes. Cancers of the lung (44.3%), prostate (9.8%), colon (9.8%), and stomach (8.2%) were comparably higher in villages with high arsenic levels in drinking water supplies. Furthermore, the majority of cases of liver, bladder, and stomach cancers were observed in villages with high arsenic levels.

  14. Statistical Analysis of Past Catalytic Data on Oxidative Methane Coupling for New Insights into the Composition of High-Performance Catalysts

    Czech Academy of Sciences Publication Activity Database

    Zavyalova, U.; Holeňa, Martin; Schlögl, R.; Baerns, M.

    2011-01-01

    Roč. 3, č. 12 (2011), s. 1935-1947 ISSN 1867-3880 Institutional research plan: CEZ:AV0Z10300504 Keywords : catalyst development * heterogeneous catalysis * methane * oxidative coupling * catalyst composition * statistical analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 5.207, year: 2011

  15. A Monte Carlo Simulation Comparing the Statistical Precision of Two High-Stakes Teacher Evaluation Methods: A Value-Added Model and a Composite Measure

    Science.gov (United States)

    Spencer, Bryden

    2016-01-01

    Value-added models are a class of growth models used in education to assign responsibility for student growth to teachers or schools. For value-added models to be used fairly, sufficient statistical precision is necessary for accurate teacher classification. Previous research indicated precision below practical limits. An alternative approach has…

  16. Statistical mechanics of light elements at high pressure. V Three-dimensional Thomas-Fermi-Dirac theory. [relevant to Jovian planetary interiors

    Science.gov (United States)

    Macfarlane, J. J.; Hubbard, W. B.

    1983-01-01

    A numerical technique for solving the Thomas-Fermi-Dirac (TFD) equation in three dimensions, for an array of ions obeying periodic boundary conditions, is presented. The technique is then used to calculate deviations from ideal mixing for an alloy of hydrogen and helium at zero temperature and high pressures. Results are compared with alternative models which apply perturbation theory to the calculation of the electron distribution, based upon the assumption of weak response of the electron gas to the ions. The TFD theory, which permits strong electron response, always predicts smaller deviations from ideal mixing than would be predicted by perturbation theory. The results indicate that predicted phase separation curves for hydrogen-helium alloys under conditions prevailing in the metallic zones of Jupiter and Saturn are very model dependent.

  17. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    International Nuclear Information System (INIS)

    Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin

    2015-01-01

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites
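    As a sketch of the Weibull analysis (the estimator and synthetic failure times below are illustrative assumptions, not the authors' procedure): for a Weibull distribution, ln(−ln(1−F)) is linear in ln t with slope equal to the shape parameter, so the shape can be estimated by median-rank regression on ordered failure times.

```python
import math, random

def weibull_shape(times):
    """Estimate the Weibull shape parameter by median-rank regression:
    ln(-ln(1 - F_i)) vs ln(t_i) is a line whose slope is the shape."""
    ts = sorted(times)
    n = len(ts)
    xs, ys = [], []
    for i, t in enumerate(ts, start=1):
        f = (i - 0.3) / (n + 0.4)          # Bernard's median-rank estimate
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(1)
# Synthetic failure times with shape 0.8 (a shape < 1, as in the paper's
# reported 0.7-0.9 range, implies a decreasing hazard rate)
sample = [random.weibullvariate(100.0, 0.8) for _ in range(2000)]
print(round(weibull_shape(sample), 2))  # close to the generating shape 0.8
```

    A shape parameter below 1, as reported here, is the signature of defect-driven "infant mortality" failures rather than wear-out.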

  18. A Novel High Performance Liquid Chromatographic Method for Determination of Nystatin in Pharmaceutical Formulations by Box-Behnken Statistical Experiment Design.

    Science.gov (United States)

    Shokraneh, Farnaz; Asgharian, Ramin; Abdollahpour, Assem; Ramin, Mehdi; Montaseri, Ali; Mahboubi, Arash

    2015-01-01

    In this study a novel high performance liquid chromatographic method for the assay of nystatin in oral and vaginal tablets was optimized and validated using a Box-Behnken experimental design. The method was performed in the isocratic mode on a RP-18 column (30 °C) using a mobile phase consisting of an ammonium acetate 0.05 M buffer/methanol mixture (30:70) and a flow-rate of 1.0 mL/min. The specificity, linearity, precision, accuracy, LOD and LOQ of the method were validated. The method was linear over the range of 5-500 µg/mL with an acceptable correlation coefficient (r(2) = 0.9996). The method's limits of detection (LOD) and quantification (LOQ) were 0.01 and 0.025 µg/mL, respectively. The results indicate that this validated method can be used as an alternative method for the assay of nystatin.
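    For context, a Box-Behnken design for k factors runs every pair of factors at their ±1 levels with the remaining factors held at the center, plus replicated center points, thereby avoiding extreme corner combinations. A minimal generator in coded units (the factor count and number of center runs here are hypothetical; the paper's actual design matrix is not given in the abstract):

```python
from itertools import combinations

def box_behnken(k, center_runs=3):
    """Box-Behnken design in coded units (-1, 0, +1): all +/-1 combinations
    for each factor pair with the other factors at 0, plus center points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([0] * k for _ in range(center_runs))
    return runs

design = box_behnken(3)
print(len(design))  # 15: 12 edge-midpoint runs + 3 center points
```

    Each response (e.g. peak area or resolution) is then fitted with a quadratic model over these runs to locate the optimal factor settings.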

  19. High-Pressure Combustion Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL's High-Pressure Combustion Research Facility in Morgantown, WV, researchers can investigate new high-pressure, high-temperature hydrogen turbine combustion...

  20. A high-statistics measurement of transverse spin effects in dihadron production from muon-proton semi-inclusive deep-inelastic scattering

    OpenAIRE

    Adolph, C.; Akhunzyanov, R.; Alekseev, M. G.; Alexandrov, Y.; Alexeev, G. D.; Amoroso, A.; Andrieux, V.; Anosov, V.; Austregesilo, A.; Badelek, B.; Balestra, F.; Barth, J.; Baum, G.; Beck, R.; Bedfer, Y.

    2014-01-01

    A measurement of the azimuthal asymmetry in dihadron production in deep-inelastic scattering of muons on transversely polarised proton (NH3) targets is presented. They provide independent access to the transversity distribution functions through the measurement of the Collins asymmetry in single hadron production. The data were taken in the year 2010 with the COMPASS spectrometer using a 160 GeV/c muon beam of the CERN SPS, increasing by a factor of about four the overall statistics with resp...

  1. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada.

    Science.gov (United States)

    Smylie, Janet; Firestone, Michelle

    Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations.

  2. A high-statistics measurement of transverse spin effects in dihadron production from muon-proton semi-inclusive deep-inelastic scattering

    CERN Document Server

    Adolph, C; Alekseev, M G; Alexandrov, Yu; Alexeev, G D; Amoroso, A; Andrieux, V; Anosov, V; Austregesilo, A; Badelek, B; Balestra, F; Barth, J; Baum, G; Beck, R; Bedfer, Y; Berlin, A; Bernhard, J; Bertini, R; Bicker, K; Bieling, J; Birsa, R; Bisplinghoff, J; Bodlak, M; Boer, M; Bordalo, P; Bradamante, F; Braun, C; Bravar, A; Bressan, A; Buchele, M; Burtin, E; Capozza, L; Chiosso, M; Chung, S U; Cicuttin, A; Crespo, M L; Curiel, Q; Dalla Torre, S; Dasgupta, S S; Dasgupta, S; Denisov, O Yu; Donskov, S V; Doshita, N; Duic, V; Dunnweber, W; Dziewiecki, M; Efremov, A; Elia, C; Eversheim, P.D; Eyrich, W; Faessler, M; Ferrero, A; Filin, A; Finger, M; Finger jr, M; Fischer, H; Franco, C; du Fresne von Hohenesche, N; Friedrich, J M; Frolov, V; Garfagnini, R; Gautheron, F; Gavrichtchouk, O P; Gerassimov, S; Geyer, R; Giorgi, M; Gnesi, I; Gobbo, B; Goertz, S; Gorzellik, M; Grabmuller, S; Grasso, A; Grube, B; Guskov, A; Guthorl, T; Haas, F; von Harrach, D; Hahne, D; Hashimoto, R; Heinsius, F H; Herrmann, F; Hinterberger, F; Hoppner, Ch; Horikawa, N; d'Hose, N; Huber, S; Ishimoto, S; Ivanov, A; Ivanshin, Yu; Iwata, T; Jahn, R; Jary, V; Jasinski, P; Joerg, P; Joosten, R; Kabuss, E; Kang, D; Ketzer, B; Khaustov, G V; Khokhlov, Yu A; Kisselev, Yu; Klein, F; Klimaszewski, K; Koivuniemi, J H; Kolosov, V N; Kondo, K; Konigsmann, K; Konorov, I; Konstantinov, V F; Kotzinian, A M; Kouznetsov, O; Kral, Z; Kramer, M; Kroumchtein, Z V; Kuchinski, N; Kunne, F; Kurek, K; Kurjata, R P; Lednev, A A; Lehmann, A; Levorato, S; Lichtenstadt, J; Maggiora, A; Magnon, A; Makke, N; Mallot, G K; Marchand, C; Martin, A; Marzec, J; Matousek, J; Matsuda, H; Matsuda, T; Meshcheryakov, G; Meyer, W; Michigami, T; Mikhailov, Yu. 
V; Miyachi, Y; Nagaytsev, A; Nagel, T; Nerling, F; Neubert, S; Neyret, D; Nikolaenko, V I; Novy, J; Nowak, W D; Nunes, A S; Orlov, I; Olshevsky, A G; Ostrick, M; Panknin, R; Panzieri, D; Parsamyan, B; Paul, S; Pesek, M; Peshekhonov, D; Piragino, G; Platchkov, S; Pochodzalla, J; Polak, J; Polyakov, V A; Pretz, J; Quaresma, M; Quintans, C; Ramos, S; Reicherz, G; Rocco, E; Rodionov, V; Rondio, E; Rychter, A; Rossiyskaya, N S; Ryabchikov, D I; Samoylenko, V D; Sandacz, A; Sarkar, S; Savin, I A; Sbrizzai, G; Schiavon, P; Schill, C; Schluter, T; Schmidt, A; Schmidt, K; Schmieden, H; Schonning, K; Schopferer, S; Schott, M; Shevchenko, O Yu; Silva, L; Sinha, L; Sirtl, S; Slunecka, M; Sosio, S; Sozzi, F; Srnka, A; Steiger, L; Stolarski, M; Sulc, M; Sulej, R; Suzuki, H; Szabeleski, A; Szameitat, T; Sznajder, P; Takekawa, S; Ter Wolbeek, J; Tessaro, S; Tessarotto, F; Thibaud, F; Uhl, S; Uman, I; Vandenbroucke, M; Virius, M; Vondra, J; Wang, L; Weisrock, T; Wilfert, M; Windmolders, R; Wislicki, W; Wollny, H; Zaremba, K; Zavertyaev, M; Zemlyanichkina, E; Ziembicki, M

    2014-01-01

    A measurement of the azimuthal asymmetry in dihadron production in deep-inelastic scattering of muons on transversely polarised proton (NH$_{3}$) targets is presented. They provide independent access to the transversity distribution functions through the measurement of the Collins asymmetry in single hadron production. The data were taken in the year $2010$ with the COMPASS spectrometer using a $160\,\mbox{GeV}/c$ muon beam of the CERN SPS, increasing by a factor of about three the available statistics of the previously published data taken in the year $2007$. The measured sizeable asymmetry is in good agreement with the published data. An approximate equality of the Collins asymmetry and the dihadron asymmetry is observed, suggesting a common physical mechanism in the underlying fragmentation.

  3. A high-statistics measurement of transverse spin effects in dihadron production from muon–proton semi-inclusive deep-inelastic scattering

    Directory of Open Access Journals (Sweden)

    C. Adolph

    2014-09-01

    Full Text Available A measurement of the azimuthal asymmetry in dihadron production in deep-inelastic scattering of muons on transversely polarised proton (NH3) targets is presented. They provide independent access to the transversity distribution functions through the measurement of the Collins asymmetry in single hadron production. The data were taken in the year 2010 with the COMPASS spectrometer using a 160 GeV/c muon beam of the CERN SPS, increasing by a factor of about four the overall statistics with respect to the previously published data taken in the year 2007. The measured sizeable asymmetry is in good agreement with the published data. An approximate equality of the Collins asymmetry and the dihadron asymmetry is observed, suggesting a common physical mechanism in the underlying fragmentation.

  4. Treatment simulations with a statistical deformable motion model to evaluate margins for multiple targets in radiotherapy for high-risk prostate cancer

    International Nuclear Information System (INIS)

    Thörnqvist, Sara; Hysing, Liv B.; Zolnay, Andras G.; Söhn, Matthias; Hoogeman, Mischa S.; Muren, Ludvig P.; Bentzen, Lise; Heijmen, Ben J.M.

    2013-01-01

    Background and purpose: Deformation and correlated target motion remain challenges for margin recipes in radiotherapy (RT). This study presents a statistical deformable motion model for multiple targets and applies it to margin evaluations for locally advanced prostate cancer, i.e. RT of the prostate (CTV-p), seminal vesicles (CTV-sv) and pelvic lymph nodes (CTV-ln). Material and methods: The 19 patients included in this study all had 7–10 repeat CT scans available that were rigidly aligned with the planning CT scan using intra-prostatic implanted markers, followed by deformable registrations. The displacement vectors from the deformable registrations were used to create patient-specific statistical motion models. The models were applied in treatment simulations to determine probabilities for adequate target coverage, e.g. by establishing distributions of the accumulated dose to 99% of the target volumes (D99) for various CTV–PTV expansions in the planning CTs. Results: The method allowed for estimation of the expected accumulated dose and its variance for different DVH parameters for each patient. Simulations of inter-fractional motion resulted in 7, 10, and 18 patients with an average D99 > 95% of the prescribed dose for CTV-p expansions of 3 mm, 4 mm and 5 mm, respectively. For CTV-sv and CTV-ln, expansions of 3 mm, 5 mm and 7 mm resulted in 1, 11 and 15 vs. 8, 18 and 18 patients, respectively, with an average D99 > 95% of the prescription. Conclusions: Treatment simulations of target motion revealed large individual differences in accumulated dose, mainly for CTV-sv, which demanded the largest margins, whereas those required for CTV-p and CTV-ln were comparable.
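    The D99 parameter used in these simulations is a near-minimum dose: the highest dose still received by at least 99% of the target volume, i.e. roughly the 1st percentile of the per-voxel accumulated dose. A minimal rank-based sketch with illustrative voxel doses (not the study's data):

```python
def d99(voxel_doses):
    """Highest dose received by at least 99% of the voxels
    (a simple rank-based estimate of the D99 DVH parameter)."""
    ds = sorted(voxel_doses)
    i = int(0.01 * len(ds))  # at most 1% of voxels may lie below ds[i]
    return ds[i]

# A target where 99 of 100 voxels receive 96 (in % of prescription)
# and one cold voxel receives 80:
doses = [96.0] * 99 + [80.0]
print(d99(doses))                 # 96.0
print(d99(doses) > 95.0)          # True: meets a D99 > 95% coverage criterion
```

    In the study, a CTV–PTV expansion was judged adequate when the average D99 over the simulated treatment courses exceeded 95% of the prescribed dose.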

  5. Statistics of strain rates and surface density function in a flame-resolved high-fidelity simulation of a turbulent premixed bluff body burner

    Science.gov (United States)

    Sandeep, Anurag; Proch, Fabian; Kempf, Andreas M.; Chakraborty, Nilanjan

    2018-06-01

    The statistical behavior of the surface density function (SDF, the magnitude of the reaction progress variable gradient) and the strain rates, which govern the evolution of the SDF, have been analyzed using a three-dimensional flame-resolved simulation database of a turbulent lean premixed methane-air flame in a bluff-body configuration. It has been found that the turbulence intensity increases with the distance from the burner, changing the flame curvature distribution and increasing the probability of negative curvature in the downstream direction. The curvature dependences of the dilatation rate ∇·u and the displacement speed Sd give rise to variations of these quantities in the axial direction. These variations affect the nature of the alignment between the progress variable gradient and the local principal strain rates, which in turn affects the mean flame normal strain rate, which assumes positive values close to the burner but increasingly becomes negative as the effect of turbulence increases with the axial distance from the burner exit. The axial distance dependences of the curvature and displacement speed also induce a considerable variation in the mean value of the curvature stretch. The axial distance dependences of the dilatation rate and flame normal strain rate govern the behavior of the flame tangential strain rate, and its mean value increases in the downstream direction. The current analysis indicates that the statistical behaviors of the different strain rates and the displacement speed, and their curvature dependences, need to be included in the modeling of flame surface density and scalar dissipation rate in order to accurately capture their local behaviors.
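    The SDF is the magnitude of the progress-variable gradient, |∇c|. A minimal central-difference sketch on a 2-D field (the actual analysis is three-dimensional and flame-resolved; the grid and field below are illustrative):

```python
def sdf(c, dx=1.0):
    """Surface density function |grad c| for a 2-D progress-variable field,
    using second-order central differences at interior grid points."""
    ny, nx = len(c), len(c[0])
    out = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            dcdx = (c[j][i + 1] - c[j][i - 1]) / (2 * dx)
            dcdy = (c[j + 1][i] - c[j - 1][i]) / (2 * dx)
            out[j][i] = (dcdx ** 2 + dcdy ** 2) ** 0.5
    return out

# Planar-flame surrogate: c ramps linearly in x, so |grad c| = 0.5 inside
c = [[0.5 * i for i in range(4)] for _ in range(4)]
print(sdf(c)[1][1])  # 0.5
```

    In a real flame the SDF peaks in a thin layer around the flame surface, and its transport involves exactly the strain-rate and displacement-speed statistics discussed in the abstract.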

  6. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

    In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...

  7. Characterizing and understanding the climatic determinism of high- to low-frequency variations in precipitation in northwestern France using a coupled wavelet multiresolution/statistical downscaling approach

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Hannah, David; Lavers, David; Fossa, Manuel; Laignel, Benoit; Debret, Maxime

    2017-04-01

    Geophysical signals oscillate over several time-scales that explain different amounts of their overall variability and may be related to different physical processes. Characterizing and understanding such variabilities in hydrological variations and investigating their determinism is one important issue in a context of climate change, as these variabilities can occasionally be superimposed on a long-term trend possibly due to climate change. It is also important to refine our understanding of time-scale dependent linkages between large-scale climatic variations and hydrological responses on the regional or local scale. Here we investigate such links by conducting a wavelet multiresolution statistical downscaling approach of precipitation in northwestern France (Seine river catchment) over 1950-2016, using sea level pressure (SLP) and sea surface temperature (SST) as indicators of atmospheric and oceanic circulations, respectively. Previous results demonstrated that including multiresolution decomposition in a statistical downscaling model (within a so-called multiresolution ESD model) using SLP as large-scale predictor greatly improved simulation of low-frequency, i.e. interannual to interdecadal, fluctuations observed in precipitation. Building on these results, continuous wavelet transform of simulated precipitation using multiresolution ESD confirmed the good performance of the model in explaining variability at all time-scales. A sensitivity analysis of the model to the choice of the scale and wavelet function used was also tested. It appeared that whatever the wavelet used, the model performed similarly. The spatial patterns of SLP found as the best predictors for all time-scales, which resulted from the wavelet decomposition, revealed different structures according to time-scale, showing possible different determinisms. More particularly, some low-frequency components (3.2-yr and 19.3-yr) showed a much more wide-spread spatial extension across the Atlantic
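    The multiresolution idea can be illustrated with the simplest wavelet, the Haar transform (the study reports that the model is largely insensitive to the wavelet choice; the series below is made up): each level splits the signal into a coarser approximation and a detail component, and the components at the different levels serve as separate time-scale predictors.

```python
def haar_step(x):
    """One level of the orthonormal Haar transform: returns (approximation,
    detail); applying it recursively to the approximation yields the
    multiresolution components used as time-scale predictors."""
    r2 = 2 ** 0.5
    approx = [(a + b) / r2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / r2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a1, d1 = haar_step(x)    # scale-1 detail: highest-frequency variability
a2, d2 = haar_step(a1)   # coarser scales: low-frequency analogues
# The transform is orthonormal, so the signal energy is preserved:
print(round(sum(v * v for v in x), 6) ==
      round(sum(v * v for v in a2) + sum(v * v for v in d2) +
            sum(v * v for v in d1), 6))  # True
```

    In a multiresolution ESD setting, a separate statistical downscaling relation can then be fitted between each time-scale component of the predictor field and the corresponding component of precipitation.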

  8. High-resolution brain SPECT imaging in attention deficit hyperactivity disorder children without comorbidity: quantitative analysis using statistical parametric mapping(SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Oh, Eun Young; Chung, Young Ki; Hwang, Isaac; Lee, Jae Sung

    2002-01-01

    We examined the abnormalities of regional cerebral blood flow (rCBF) in children with attention deficit hyperactivity disorder (ADHD) without comorbidity using the statistical parametric mapping (SPM) method. Patients not compatible with the DSM-IV diagnostic criteria of ADHD and with a normal rCBF pattern on visual analysis were used as normal control children. Tc-99m ECD brain SPECT was performed on 75 patients (M:F=64:11, 10.0±2.5y) meeting the DSM-IV diagnostic criteria of ADHD and 13 normal control children (M:F=9:4, 10.3±4.1y). Using the SPM method, we compared the patient group's SPECT images with those of the 13 control subjects and measured the extent of the area with significant hypoperfusion (p<0.01) in 34 predefined cerebral regions. Only one area, in the left temporal lobe, showed significant hypoperfusion in ADHD patients without comorbidity (n=75) compared with control subjects (n=13) (p<0.01, extent threshold=16). rCBF of the left temporal area was decreased in the ADHD group without comorbidity such as tics, compared with the control group

  9. High-resolution brain SPECT imaging in attention deficit hyperactivity disorder children without comorbidity: quantitative analysis using statistical parametric mapping(SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Oh, Eun Young [Ajou University School of Medicine, Suwon (Korea, Republic of); Chung, Young Ki; Hwang, Isaac; Lee, Jae Sung [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    We examined the abnormalities of regional cerebral blood flow (rCBF) in children with attention deficit hyperactivity disorder (ADHD) without comorbidity using the statistical parametric mapping (SPM) method. Patients not compatible with the DSM-IV diagnostic criteria of ADHD and with a normal rCBF pattern on visual analysis were used as normal control children. Tc-99m ECD brain SPECT was performed on 75 patients (M:F=64:11, 10.0±2.5y) meeting the DSM-IV diagnostic criteria of ADHD and 13 normal control children (M:F=9:4, 10.3±4.1y). Using the SPM method, we compared the patient group's SPECT images with those of the 13 control subjects and measured the extent of the area with significant hypoperfusion (p<0.01) in 34 predefined cerebral regions. Only one area, in the left temporal lobe, showed significant hypoperfusion in ADHD patients without comorbidity (n=75) compared with control subjects (n=13) (p<0.01, extent threshold=16). rCBF of the left temporal area was decreased in the ADHD group without comorbidity such as tics, compared with the control group.

  10. High Performance Networks for High Impact Science

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Mary A.; Bair, Raymond A.

    2003-02-13

    This workshop was the first major activity in developing a strategic plan for high-performance networking in the Office of Science. Held August 13 through 15, 2002, it brought together a selection of end users, especially representing the emerging, high-visibility initiatives, and network visionaries to identify opportunities and begin defining the path forward.

  11. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available Hyperglycemia is the technical term for high blood glucose (blood sugar).

  12. High Blood Pressure

    Science.gov (United States)

    140/90 or higher is high blood pressure. Between 120 and 139 for the top number is prehypertension. Prehypertension means you may end up with high blood pressure, unless you take steps to prevent it.

  13. High Blood Pressure Facts

    Science.gov (United States)

    CDC fact sheets related to high blood pressure: High Blood Pressure, Pulmonary Hypertension, Heart Disease Signs ...

  14. High Blood Pressure (Hypertension)

    Science.gov (United States)

    Nearly 1 in 3 American adults has ... weight. How Will I Know if I Have High Blood Pressure? High blood pressure is a silent problem — you ...

  15. Decay properties of high-lying single-particles modes

    Science.gov (United States)

    Beaumel, D.; Fortier, S.; Galès, S.; Guillot, J.; Langevin-Joliot, H.; Laurent, H.; Maison, J. M.; Vernotte, J.; Bordewijck, J.; Brandenburg, S.; Krasznahorkay, A.; Crawley, G. M.; Massolo, C. P.; Renteria, M.; Khendriche, A.

    1996-02-01

    The neutron decay of high-lying single-particle states in 64Ni, 90Zr, 120Sn and 208Pb excited by means of the (α, 3He) reaction has been investigated at 120 MeV incident energy using the multidetector EDEN. The characteristics of this reaction are studied using inclusive spectra and angular correlation analysis. The structures located between 11 and 15 MeV in 91Zr, and between 8 and 12 MeV excitation energy in 209Pb, display large departures from a pure statistical decay. The corresponding non-statistical branching ratios are compared with the results of two theoretical calculations.

  16. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  17. High energy physics

    International Nuclear Information System (INIS)

    Kernan, A.; Shen, B.C.; Ma, E.

    1997-01-01

    Hadron collider studies will focus on: (i) the search for the top quark with the newly installed D0 detector at the Fermilab Tevatron collider, (ii) the upgrade of the D0 detector to match the new main injector luminosity and (iii) R&D on silicon microstrip tracking devices for the SSC. High statistics studies of Z0 decay will continue with the OPAL detector at LEP. These studies will include a direct measurement of Z decay to neutrinos, and the search for Higgs and heavy quark decays of the Z. Preparations for the Large Scintillation Neutrino Detector (LSND) to measure neutrino oscillations at LAMPF will focus on data acquisition and testing of photomultiplier tubes. In the theoretical area E. Ma will concentrate on mass-generating radiative mechanisms for light quarks and leptons in renormalizable gauge field theories. J. Wudka's program includes a detailed investigation of the magnetic-flip approach to the solar neutrino

  18. High pressure, high current, low inductance, high reliability sealed terminals

    Science.gov (United States)

    Hsu, John S [Oak Ridge, TN; McKeever, John W [Oak Ridge, TN

    2010-03-23

    The invention is a terminal assembly having a casing with at least one delivery tapered-cone conductor and at least one return tapered-cone conductor routed there-through. The delivery and return tapered-cone conductors are electrically isolated from each other and positioned in the annuluses of ordered concentric cones at an off-normal angle. The tapered-cone conductors can serve as AC phase conductors and DC link conductors. The center core has at least one service conduit of gate signal leads, diagnostic signal wires, and refrigerant tubing routed there-through. A seal material is in direct contact with the casing inner surface, the tapered-cone conductors, and the service conduits, thereby hermetically filling the interstitial space in the casing interior core and center core. The assembly provides simultaneous high-current, high-pressure, low-inductance, and high-reliability service.

  19. Space Based Infrared System High (SBIRS High)

    Science.gov (United States)

    2015-12-01

    Elements (five SMGTs) for the S2E2 Mobile Ground System. SBIRS Block Buy (GEO 5-6): the GEO 5-6 Tech Refresh (TR) Engineering Change Proposal was... Selected Acquisition Report (SAR) RCS: DD-A&T(Q&A)823-210, Space Based Infrared System High (SBIRS High), as of FY 2017 President's Budget; Defense Acquisition Management Information Retrieval (DAMIR), December 2015 SAR, March 23, 2016.

  20. Bumball: Highly Engaging, Highly Inclusive, and Highly Entertaining

    Science.gov (United States)

    Hall, Amber; Barney, David; Wilkinson, Carol

    2014-01-01

    Physical educators are always looking for new and exciting games and activities in which students can participate. This article describes Bumball, a high-intensity game that provides the opportunity for students to use many common game skills, such as hand-eye coordination, passing to a target, running, playing defense, and getting to an open…

  1. High temperature high vacuum creep testing facilities

    International Nuclear Information System (INIS)

    Matta, M.K.

    1985-01-01

    Creep is the term used to describe time-dependent plastic flow of metals under conditions of constant load or stress at constant high temperature. Creep is an important consideration for materials operating under stress at high temperatures for long times, such as cladding materials, pressure vessels, steam turbines, boilers, etc. These two creep machines measure the creep of materials and alloys at high temperature under high vacuum at constant stress. The two chart recorders attached to the system register time and temperature versus strain during the test. This report consists of three chapters: chapter I is the introduction, chapter II is the technical description of the creep machines, and chapter III discusses some experimental data on the creep behaviour of helium-implanted stainless steel. 13 fig., 3 tab
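The report presents measured creep curves rather than a model, but steady-state (secondary) creep under constant stress and temperature is commonly summarized by a Norton-Arrhenius power law. The sketch below is illustrative only; the constants `A`, `n` and `Q` are hypothetical placeholders, not values from this report.

```python
import math

def norton_creep_rate(stress, temp_k, A=1e-10, n=5.0, Q=300e3, R=8.314):
    """Steady-state creep rate from a Norton-Arrhenius law:
    d(strain)/dt = A * stress**n * exp(-Q / (R * T)).
    A, n and Q are material constants; the defaults are hypothetical."""
    return A * stress**n * math.exp(-Q / (R * temp_k))

# At constant stress, creep accelerates strongly with temperature:
rate_900k = norton_creep_rate(100.0, 900.0)   # stress in MPa (illustrative)
rate_1000k = norton_creep_rate(100.0, 1000.0)
assert rate_1000k > rate_900k
```

This is why creep testing is done at several fixed temperatures: the Arrhenius factor dominates the measured strain rate.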

  2. Why high energy physics

    International Nuclear Information System (INIS)

    Diddens, A.N.; Van de Walle, R.T.

    1981-01-01

    An argument is presented for high energy physics from the point of view of its practitioners. Three different angles are presented: the cultural and scientific significance of practising high energy physics, the potential applications of its results and discoveries, and the technical spin-offs from the techniques and methods used in high energy physics. (C.F.)

  3. Hypertension (High Blood Pressure)

    Science.gov (United States)

    KidsHealth / For Teens / Hypertension (High Blood Pressure) ... Why Is High Blood Pressure Bad? High blood pressure means a person's heart ...

  4. High-power, high-efficiency FELs

    International Nuclear Information System (INIS)

    Sessler, A.M.

    1989-04-01

    High power, high efficiency FELs require tapering, as the particles lose energy, so as to maintain resonance between the electromagnetic wave and the particles. They also require focusing of the particles (usually done with curved pole faces) and focusing of the electromagnetic wave (i.e., optical guiding). In addition, one must avoid transverse beam instabilities (primarily resistive wall) and longitudinal instabilities (i.e., sidebands). 18 refs., 7 figs., 3 tabs
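The tapering requirement follows from the FEL resonance condition λ = (λ_u / 2γ²)(1 + K²/2): as the particles lose energy (γ drops), the undulator parameter K must be reduced along the wiggler to keep the radiated wavelength fixed. A minimal numerical sketch; all parameter values are illustrative, not from this paper:

```python
import math

def resonant_wavelength(lambda_u, gamma, K):
    """FEL resonance: lambda = (lambda_u / (2 * gamma**2)) * (1 + K**2 / 2)."""
    return lambda_u / (2.0 * gamma**2) * (1.0 + K**2 / 2.0)

def tapered_K(lambda_u, gamma, lambda_target):
    """Undulator parameter K that keeps lambda_target resonant after the
    beam energy (gamma) has dropped; tapering reduces K along the wiggler."""
    return math.sqrt(2.0 * (2.0 * gamma**2 * lambda_target / lambda_u - 1.0))

lam_u = 0.05      # 5 cm wiggler period (illustrative)
gamma0 = 100.0    # initial beam energy in rest-mass units (illustrative)
K0 = 1.0          # initial undulator parameter (illustrative)
lam = resonant_wavelength(lam_u, gamma0, K0)

# After the particles lose 2% of their energy, stay resonant by lowering K:
K1 = tapered_K(lam_u, 0.98 * gamma0, lam)
assert K1 < K0
```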

  5. High statistics study of ω0 production

    International Nuclear Information System (INIS)

    Shaevitz, M.H.; Abolins, M.A.; Dankowych, J.A.

    1974-01-01

    Results from a study of π⁻p → ω⁰n at 6.0 GeV/c based on 28,000 events from a charged and neutral spectrometer are reported. Background under the ω⁰ is only 7 percent, a large improvement over deuterium bubble chamber work. Density matrix elements, projected cross sections and effective trajectories for natural and unnatural exchanges are presented

  6. High Temperature, High Power Piezoelectric Composite Transducers

    Science.gov (United States)

    Lee, Hyeong Jae; Zhang, Shujun; Bar-Cohen, Yoseph; Sherrit, Stewart

    2014-01-01

    Piezoelectric composites are a class of functional materials consisting of piezoelectric active materials and non-piezoelectric passive polymers, mechanically attached together to form different connectivities. These composites have several advantages compared to conventional piezoelectric ceramics and polymers, including improved electromechanical properties, mechanical flexibility and the ability to tailor properties by using several different connectivity patterns. These advantages have led to the improvement of overall transducer performance, such as transducer sensitivity and bandwidth, resulting in rapid implementation of piezoelectric composites in medical imaging ultrasounds and other acoustic transducers. Recently, new piezoelectric composite transducers have been developed with optimized composite components that have improved thermal stability and mechanical quality factors, making them promising candidates for high temperature, high power transducer applications, such as therapeutic ultrasound, high power ultrasonic wirebonding, high temperature non-destructive testing, and downhole energy harvesting. This paper will present recent developments of piezoelectric composite technology for high temperature and high power applications. The concerns and limitations of using piezoelectric composites will also be discussed, and the expected future research directions will be outlined. PMID:25111242

  7. Highly Accreting Quasars at High Redshift

    Directory of Open Access Journals (Sweden)

    Mary L. Martínez-Aldama

    2018-01-01

    Full Text Available We present preliminary results of a spectroscopic analysis for a sample of type 1 highly accreting quasars (L/LEdd ~ 1.0) at high redshift, z ~ 2–3. The quasars were observed with the OSIRIS spectrograph on the GTC 10.4 m telescope located at the Observatorio del Roque de los Muchachos in La Palma. The highly accreting quasars were identified using the 4D Eigenvector 1 formalism, which is able to organize type 1 quasars over a broad range of redshift and luminosity. The kinematic and physical properties of the broad line region have been derived by fitting the profiles of strong UV emission lines such as AlIIIλ1860, SiIII]λ1892 and CIII]λ1909. The majority of our sources show strong blueshifts in the high-ionization lines and high Eddington ratios, which are related to the production of outflows. The importance of highly accreting quasars goes beyond a detailed understanding of their physics: their extreme Eddington ratio makes them candidate standard candles for cosmological studies.

  8. Highly Accreting Quasars at High Redshift

    Science.gov (United States)

    Martínez-Aldama, Mary L.; Del Olmo, Ascensión; Marziani, Paola; Sulentic, Jack W.; Negrete, C. Alenka; Dultzin, Deborah; Perea, Jaime; D'Onofrio, Mauro

    2017-12-01

    We present preliminary results of a spectroscopic analysis for a sample of type 1 highly accreting quasars (L/LEdd > 0.2) at high redshift, z ~ 2–3. The quasars were observed with the OSIRIS spectrograph on the GTC 10.4 m telescope located at the Observatorio del Roque de los Muchachos in La Palma. The highly accreting quasars were identified using the 4D Eigenvector 1 formalism, which is able to organize type 1 quasars over a broad range of redshift and luminosity. The kinematic and physical properties of the broad line region have been derived by fitting the profiles of strong UV emission lines such as AlIII, SiIII] and CIII]. The majority of our sources show strong blueshifts in the high-ionization lines and high Eddington ratios, which are related to the production of outflows. The importance of highly accreting quasars goes beyond a detailed understanding of their physics: their extreme Eddington ratio makes them candidate standard candles for cosmological studies.
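Both quasar records select sources by Eddington ratio L/LEdd. For reference, the Eddington luminosity for ionized hydrogen scales linearly with black-hole mass; the sketch below uses the standard textbook expression, not code or values from these papers:

```python
def eddington_luminosity(m_bh_solar):
    """Eddington luminosity for ionized hydrogen, in erg/s:
    L_Edd ~ 1.26e38 * (M / M_sun)."""
    return 1.26e38 * m_bh_solar

def eddington_ratio(luminosity, m_bh_solar):
    """L / L_Edd, the quantity used to select highly accreting quasars."""
    return luminosity / eddington_luminosity(m_bh_solar)

# A 1e9 solar-mass black hole radiating 1.26e47 erg/s sits exactly at the
# Eddington limit (illustrative numbers):
r = eddington_ratio(1.26e47, 1e9)
assert abs(r - 1.0) < 1e-9
```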

  9. Software Applications on the Peregrine System | High-Performance Computing

    Science.gov (United States)

    General Algebraic Modeling System (GAMS), Statistics and analysis: high-level modeling system for mathematical programming. Gurobi Optimizer, Statistics and analysis: solver for mathematical programming. LAMMPS, Chemistry and ... reactivities, and vibrational, electronic and NMR spectra. R Statistical Computing Environment, Statistics and ...

  10. Decay modes of high-lying excitations in nuclei

    International Nuclear Information System (INIS)

    Gales, S.

    1993-01-01

    Inelastic, charge-exchange and transfer reactions induced by hadronic probes at intermediate energies have revealed a rich spectrum of new high-lying modes embedded in the nuclear continuum. The investigation of their decay properties is believed to be a severe test of their microscopic structure as predicted by nuclear models. In addition the degree of damping of these simple modes in the nuclear continuum can be obtained by means of the measured branching ratios to the various decay channels as compared to statistical model calculations. As illustrative examples the decay modes of high-spin single-particle states and isovector resonances are discussed. (author) 23 refs.; 14 figs

  11. High assurance SPIRAL

    Science.gov (United States)

    Franchetti, Franz; Sandryhaila, Aliaksei; Johnson, Jeremy R.

    2014-06-01

    In this paper we introduce High Assurance SPIRAL to solve the last mile problem for the synthesis of high assurance implementations of controllers for vehicular systems that are executed in today's and future embedded and high performance embedded system processors. High Assurance SPIRAL is a scalable methodology to translate a high level specification of a high assurance controller into a highly resource-efficient, platform-adapted, verified control software implementation for a given platform in a language like C or C++. High Assurance SPIRAL proves that the implementation is equivalent to the specification written in the control engineer's domain language. Our approach scales to problems involving floating-point calculations and provides highly optimized synthesized code. It is possible to estimate the available headroom to enable assurance/performance trade-offs under real-time constraints, and to synthesize multiple implementation variants to make attacks harder. At the core of High Assurance SPIRAL is the Hybrid Control Operator Language (HCOL), which leverages advanced mathematical constructs expressing the controller specification to provide high quality translation capabilities. Combined with a verified/certified compiler, High Assurance SPIRAL provides a comprehensive solution to the efficient synthesis of verifiable high assurance controllers. We demonstrate High Assurance SPIRAL's capability by co-synthesizing proofs and implementations for attack detection and sensor spoofing algorithms and deploy the code as ROS nodes on the Landshark unmanned ground vehicle and on a Synthetic Car in a real-time simulator.

  12. High concentration agglomerate dynamics at high temperatures.

    Science.gov (United States)

    Heine, M C; Pratsinis, S E

    2006-11-21

    The dynamics of agglomerate aerosols are investigated at high solids concentrations that are typical in industrial scale manufacture of fine particles (precursor mole fraction larger than 10 mol %). In particular, formation and growth of fumed silica at such concentrations by chemical reaction, coagulation, and sintering is simulated at nonisothermal conditions and compared to limited experimental data and commercial product specifications. Using recent chemical kinetics for silica formation by SiCl4 hydrolysis and neglecting aerosol polydispersity, the evolution of the diameter of primary particles (specific surface area, SSA), hard- and soft-agglomerates, along with agglomerate effective volume fraction (volume occupied by agglomerate) is investigated. Classic Smoluchowski theory is fundamentally limited for description of soft-agglomerate Brownian coagulation at high solids concentrations. In fact, these high concentrations affect little the primary particle diameter (or SSA) but dominate the soft-agglomerate diameter, structure, and volume fraction, leading to gelation consistent with experimental data. This indicates that restructuring and fragmentation should affect product particle characteristics during high-temperature synthesis of nanostructured particles at high concentrations in aerosol flow reactors.
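The abstract notes that classic Smoluchowski theory fails at high solids concentrations; the dilute-limit baseline it departs from is coagulation with a constant collision kernel, which has a closed-form solution for the total number density. A sketch of that baseline (kernel and concentration values are illustrative, not from the paper):

```python
def smoluchowski_N(t, N0, beta):
    """Total number density under dN/dt = -0.5 * beta * N**2 (constant
    collision kernel, monodisperse): N(t) = N0 / (1 + 0.5 * beta * N0 * t)."""
    return N0 / (1.0 + 0.5 * beta * N0 * t)

N0 = 1e18     # initial particle concentration, 1/m^3 (illustrative)
beta = 1e-15  # constant Brownian collision kernel, m^3/s (illustrative)

# Characteristic coagulation time: N halves after t_half = 2 / (beta * N0),
# so higher concentrations coagulate proportionally faster.
t_half = 2.0 / (beta * N0)
assert abs(smoluchowski_N(t_half, N0, beta) - N0 / 2.0) < 1e3
```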

  13. High-frequency, high-intensity photoionization

    Science.gov (United States)

    Reiss, H. R.

    1996-02-01

    Two analytical methods for computing ionization by high-frequency fields are compared. Predicted ionization rates compare well, but energy predictions for the onset of ionization differ radically. The difference is shown to arise from the use of a transformation in one of the methods that alters the zero from which energy is measured. This alteration leads to an apparent energy threshold for ionization that can, especially in the stabilization regime, differ strongly from the laboratory measurement. It is concluded that channel closings in intense-field ionization can occur at high as well as low frequencies. It is also found that the stabilization phenomenon at high frequencies, very prominent for hydrogen, is absent in a short-range potential.

  14. Western Canada: high prices, high activity

    International Nuclear Information System (INIS)

    Savidant, S

    2000-01-01

    The forces responsible for the high drilling and exploration activity in Western Canada (recent high prices, excess pipeline capacity, and the promise of as yet undiscovered natural gas resources) are discussed. Supply and demand signposts, among them weather impacts, political responses by governments, the high demand for rigs and services, the intense competition for land, and the scarcity of qualified human resources, are reviewed. The geological potential of Western Canada, the implications of falling average pool sizes, and the industry's ability to catch up to increasing declines are explored. The disappearance of easy large discoveries and the rising development costs involved in smaller, more complex, hence more expensive pools are assessed, and the Canadian equity and capital markets are reviewed. The predicted likely outcome of all the above factors is fewer players, increasing expectation of higher returns, and more discipline among the remaining players

  15. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: basic concepts and the meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, theory of solutions, statistical thermodynamics of interfaces, statistical thermodynamics of high-molecule systems, and quantum statistics
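As a pointer to the material in the early chapters, Maxwell-Boltzmann statistics assigns each energy level an occupation probability proportional to exp(-E/kT), normalized by the partition function. A minimal sketch (level energies in units of kT, values illustrative):

```python
import math

def boltzmann_probs(energies, kT=1.0):
    """Maxwell-Boltzmann occupation probabilities p_i = exp(-E_i/kT) / Z,
    where Z = sum_i exp(-E_i/kT) is the canonical partition function."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

probs = boltzmann_probs([0.0, 1.0, 2.0])  # three levels, energies in kT units
assert abs(sum(probs) - 1.0) < 1e-12      # probabilities normalize
assert probs[0] > probs[1] > probs[2]     # lower levels are more populated
```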

  16. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... symptoms include the following: High blood glucose High levels of sugar in the urine Frequent urination Increased ... you should check and what your blood glucose levels should be. Checking your blood and then treating ...

  17. Hyperglycemia (High Blood Glucose)

    Medline Plus


  18. Hyperglycemia (High Blood Glucose)

    Medline Plus


  19. Hyperglycemia (High Blood Glucose)

    Medline Plus


  20. Hyperglycemia (High Blood Glucose)

    Medline Plus


  1. Hyperglycemia (High Blood Glucose)

    Medline Plus


  2. Hyperglycemia (High Blood Glucose)

    Medline Plus


  3. Hyperglycemia (High Blood Glucose)

    Medline Plus


  4. Hyperglycemia (High Blood Glucose)

    Medline Plus


  5. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... symptoms include the following: High blood glucose High levels of sugar in the urine Frequent urination Increased ...

  6. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... the technical term for high blood glucose (blood sugar). High blood glucose happens when the body has ...

  7. Hyperglycemia (High Blood Glucose)

    Medline Plus


  8. Supersymmetry at high temperatures

    International Nuclear Information System (INIS)

    Das, A.; Kaku, M.

    1978-01-01

    We investigate the properties of Green's functions in a spontaneously broken supersymmetric model at high temperatures. We show that, even at high temperatures, we do not get restoration of supersymmetry, at least in the one-loop approximation

  9. High blood sugar

    Science.gov (United States)

    ... Alternative Names Hyperglycemia - self care; High blood glucose - self care; Diabetes - high blood sugar References American Diabetes Association. Standards of medical care in diabetes - 2017: 4. Lifestyle management and 6. Glycemic targets. Diabetes Care . 2017;40( ...

  10. Hyperglycemia (High Blood Glucose)

    Medline Plus


  11. Hyperglycemia (High Blood Glucose)

    Medline Plus


  12. High blood pressure - children

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/007696.htm High blood pressure - children To use the sharing features on this page, please enable JavaScript. High blood pressure (hypertension) is an increase in the force of ...

  13. Preventing High Blood Pressure

    Science.gov (United States)

    ... Heart Disease Cholesterol Salt Million Hearts® WISEWOMAN Preventing High Blood Pressure: Healthy Living Habits ... meal and snack options can help you avoid high blood pressure and its complications. Be sure to eat plenty ...

  14. High blood pressure - infants

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/007329.htm High blood pressure - infants To use the sharing features on this page, please enable JavaScript. High blood pressure (hypertension) is an increase in the force of ...

  15. High blood pressure medications

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/007484.htm High blood pressure medicines To use the sharing features on this page, please enable JavaScript. Treating high blood pressure will help prevent problems such as heart disease, ...

  16. High speed, High resolution terahertz spectrometers

    International Nuclear Information System (INIS)

    Kim, Youngchan; Yee, Dae Su; Yi, Miwoo; Ahn, Jaewook

    2008-01-01

    A variety of sources and methods have been developed for terahertz spectroscopy during almost two decades. Terahertz time domain spectroscopy (THz TDS) has attracted particular attention as a basic measurement method in the fields of THz science and technology. Recently, asynchronous optical sampling (AOS) THz TDS has been demonstrated, featuring rapid data acquisition and a high spectral resolution. Also, terahertz frequency comb spectroscopy (TFCS) possesses attractive features for high precision terahertz spectroscopy. In this presentation, we report on these two types of terahertz spectrometer. Our high speed, high resolution terahertz spectrometer is demonstrated using two mode-locked femtosecond lasers with slightly different repetition frequencies, without a mechanical delay stage. The repetition frequencies of the two femtosecond lasers are stabilized by use of two phase-locked loops sharing the same reference oscillator. The time resolution of our terahertz spectrometer is measured using the cross correlation method to be 270 fs. AOS THz TDS is presented in Fig. 1, which shows a time domain waveform rapidly acquired on a 10 ns time window. The inset shows a zoom into the signal with a 100 ps time window. The spectrum obtained by fast Fourier transformation (FFT) of the time domain waveform has a frequency resolution of 100 MHz. The dependence of the signal to noise ratio (SNR) on the measurement time is also investigated
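The timing arithmetic behind AOS can be sketched in a few lines: with repetition rates f_rep and f_rep + Δf, successive pulse pairs slip by Δf/(f_rep·(f_rep+Δf)) per period, so a full delay window of 1/f_rep is scanned with no mechanical stage, and an FFT over that window resolves f_rep in frequency. The numbers below reproduce the 10 ns window and 100 MHz resolution quoted in the abstract; the 10 kHz offset Δf is an illustrative assumption, not a value from the paper.

```python
def aos_parameters(f_rep, delta_f):
    """Timing of asynchronous optical sampling (AOS) with two mode-locked
    lasers at repetition rates f_rep and f_rep + delta_f (both in Hz).
    Returns (scan window in s, effective time step per pulse pair in s,
    FFT frequency resolution in Hz)."""
    window = 1.0 / f_rep
    step = delta_f / (f_rep * (f_rep + delta_f))  # inter-pulse walk-off
    resolution = f_rep  # an FFT over one full window resolves 1/window
    return window, step, resolution

window, step, resolution = aos_parameters(f_rep=100e6, delta_f=10e3)
assert abs(window - 10e-9) < 1e-15    # 10 ns time window, as in the abstract
assert abs(resolution - 100e6) < 1.0  # 100 MHz frequency resolution
```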

  17. High current and high power superconducting rectifiers

    International Nuclear Information System (INIS)

    Kate, H.H.J. ten; Bunk, P.B.; Klundert, L.J.M. van de; Britton, R.B.

    1981-01-01

    Results on three experimental superconducting rectifiers are reported. Two of them are 1 kA low-frequency flux pumps, one thermally and one magnetically switched. The third is a low-current, high-frequency, magnetically switched rectifier which can use the mains directly. (author)

  18. High energy neutron radiography

    International Nuclear Information System (INIS)

    Gavron, A.; Morley, K.; Morris, C.; Seestrom, S.; Ullmann, J.; Yates, G.; Zumbro, J.

    1996-01-01

    High-energy spallation neutron sources are now being considered in the US and elsewhere as a replacement for neutron beams produced by reactors. High-energy, high-intensity neutron beams, produced by unmoderated spallation sources, open potential new vistas of neutron radiography. The authors discuss the basic advantages and disadvantages of high-energy neutron radiography, and consider some experimental results obtained at the Weapons Neutron Research (WNR) facility at Los Alamos

  19. High-pressure microbiology

    National Research Council Canada - National Science Library

    Michiels, Chris; Bartlett, Douglas Hoyt; Aertsen, Abram

    2008-01-01

    1. High Hydrostatic Pressure Effects in the Biosphere: from Molecules to Microbiology, Filip Meersman and Karel Heremans. 2. Effects...

  20. High resolution, high speed ultrahigh vacuum microscopy

    International Nuclear Information System (INIS)

    Poppa, Helmut

    2004-01-01

    The history and future of transmission electron microscopy (TEM) is discussed as it refers to the eventual development of instruments and techniques applicable to the real time in situ investigation of surface processes with high resolution. To reach this objective, it was necessary to transform conventional high resolution instruments so that an ultrahigh vacuum (UHV) environment at the sample site was created, that access to the sample by various in situ sample modification procedures was provided, and that in situ sample exchanges with other integrated surface analytical systems became possible. Furthermore, high resolution image acquisition systems had to be developed to take advantage of the high speed imaging capabilities of projection imaging microscopes. These changes to conventional electron microscopy and its uses were slowly realized in a few international laboratories over a period of almost 40 years by a relatively small number of researchers crucially interested in advancing the state of the art of electron microscopy and its applications to diverse areas of interest; often concentrating on the nucleation, growth, and properties of thin films on well defined material surfaces. A part of this review is dedicated to the recognition of the major contributions to surface and thin film science by these pioneers. Finally, some of the important current developments in aberration corrected electron optics and eventual adaptations to in situ UHV microscopy are discussed. As a result of all the path breaking developments that have led to today's highly sophisticated UHV-TEM systems, integrated fundamental studies are now possible that combine many traditional surface science approaches. Combined investigations to date have involved in situ and ex situ surface microscopies such as scanning tunneling microscopy/atomic force microscopy, scanning Auger microscopy, and photoemission electron microscopy, and area-integrating techniques such as x-ray photoelectron

  1. Journalism Beyond High School.

    Science.gov (United States)

    Turner, Sally

    2001-01-01

    Discusses the shift from high school journalism to college journalism for students. Describes the role of the high school journalism advisor in that process. Offers checklists for getting to know a college publication. Outlines ways high school journalism teachers can take advantage of journalism resources available at local colleges and…

  2. Evaluating High School IT

    Science.gov (United States)

    Thompson, Brett A.

    2004-01-01

    Since its inception in 1997, Cisco's curriculum has entered thousands of high schools across the U.S. and around the world for two reasons: (1) Cisco has a large portion of the computer networking market, and thus has the resources for and interest in developing high school academies; and (2) high school curriculum development teams recognize the…

  3. Early College High Schools

    Science.gov (United States)

    Dessoff, Alan

    2011-01-01

    For at-risk students who stand little chance of going to college, or even finishing high school, a growing number of districts have found a solution: Give them an early start in college while they still are in high school. The early college high school (ECHS) movement that began with funding from the Bill and Melinda Gates Foundation 10 years ago…

  4. High performance systems

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, M.B. [comp.

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  5. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
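One topic the book covers, the normal approximation to the binomial, can be illustrated in a few lines. This is the standard continuity-corrected approximation, not an excerpt from the book:

```python
import math

def binom_cdf_normal(n, p, k):
    """P(X <= k) for X ~ Binomial(n, p), via the normal approximation with
    continuity correction: Phi((k + 0.5 - n*p) / sqrt(n*p*(1 - p)))."""
    mu = n * p
    sigma = math.sqrt(n * p * (1.0 - p))
    z = (k + 0.5 - mu) / sigma
    # Phi(z) expressed through the error function available in the stdlib:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# For 100 fair coin flips, P(X <= 50) is a bit above one half (it includes
# half of the continuity-corrected mass at exactly 50):
approx = binom_cdf_normal(100, 0.5, 50)
assert 0.5 < approx < 0.6
```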

  6. High energy physics research

    International Nuclear Information System (INIS)

    Piroue, P.A.

    1992-10-01

    The goal of this research is to understand the fundamental constituents of matter and their interactions. At this time, the following activities are underway: e⁺e⁻ interactions and Z⁰ physics at CERN; studies to upgrade the L3 detector at LHC; very high statistics charm physics at Fermilab; search for the H particle at BNL; search for the fifth force; rare kaon decay experiments at BNL; study of B-meson physics at hadron colliders; e⁺e⁻ pair creation by light at SLAC; R&D related to SSC experiments and the GEM detector; and theoretical research in elementary particle physics and cosmology. The main additions to the activities described in detail in the original grant proposal are (1) an experiment at SLAC (E-144) to study strong-field QED effects in e-laser and γ-laser collisions, and (2) a search for the H particle at BNL (E-188). The R&D efforts for the GEM detector have also considerably expanded. In this paper we give a brief status report for each activity currently under way

  7. High energy physics

    International Nuclear Information System (INIS)

    1992-01-01

    The Counter Group continues to work on data analysis for Fermilab Experiment E653. Altogether, they expect several thousand reconstructed charm events and approximately 25 B pair events, of which 12 have been observed thus far. Preparations continue for Fermilab Experiment E781, a high statistics study of charm baryon production. In the Theory Group, Cutkosky and collaborators study hadron phenomenology and non-perturbative QCD calculations. Levine has a long-standing program in computational QED to obtain improved theoretical values for g-2 of the electron. Wolfenstein, Li, and their collaborators have worked on areas of weak interaction phenomenology that may yield insights beyond the standard model, e.g. CP violation and non-zero neutrino masses. Holman has been concerned with phase transitions in gauge theories relevant to cosmological problems. During 1991 most of the group effort was concentrated on the L3 experiment at CERN. Highlights of the results from the analysis of the Z⁰ resonance include (a) a measurement of the strong coupling constant αs for b quarks, (b) a precision measurement of the average lifetime of B hadrons, and (c) a direct determination of the number of light neutrino families from the reaction e⁺e⁻ → ν̄νγ. We also began a major upgrade of the L3 luminosity monitor by replacing the PWC chambers with a Si strip system in front of the BGO calorimeters. Finally, we have continued our SSC R&D work on BaF₂ by joining the GEM collaboration

  8. Photoproduction at high energy and high intensity

    CERN Multimedia

    2002-01-01

    The photon beam used for this programme is tagged and provides a large flux up to very high energies (150-200 GeV). It is also hadron-free, since it is obtained by a two-step conversion method. A spectrometer is designed to exploit this beam and to perform a programme of photoproduction with a high level of sensitivity (5-50 events/picobarn). Priority will be given to the study of processes exhibiting the point-like behaviour of the photon, especially deep inelastic Compton scattering. The spectrometer has two magnets. Charged tracks are measured by MWPC's located only in field-free regions. Three calorimeters provide a large coverage for identifying and measuring electrons and photons. An iron filter downstream identifies muons. Most of the equipment already exists, reused from previous experiments.

  9. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  10. Decay properties of high-lying single-particles modes

    Energy Technology Data Exchange (ETDEWEB)

    Beaumel, D. [Institut de Physique Nucleaire, 91 - Orsay (France); Fortier, S. [Institut de Physique Nucleaire, 91 - Orsay (France); Gales, S. [Institut de Physique Nucleaire, 91 - Orsay (France); Guillot, J. [Institut de Physique Nucleaire, 91 - Orsay (France); Langevin-Joliot, H. [Institut de Physique Nucleaire, 91 - Orsay (France); Laurent, H. [Institut de Physique Nucleaire, 91 -Orsay (France); Maison, J.M. [Institut de Physique Nucleaire, 91 - Orsay (France); Vernotte, J. [Institut de Physique Nucleaire, 91 - Orsay (France); Bordewijck, J. [Kernfysisch Versneller Instituut, 9747 Groningen (Netherlands); Brandenburg, S. [Kernfysisch Versneller Instituut, 9747 Groningen (Netherlands); Krasznahorkay, A. [Kernfysisch Versneller Instituut, 9747 Groningen (Netherlands); Crawley, G.M. [NSCL, Michigan State University, East Lansing, MI 48824 (United States); Massolo, C.P. [Universitad Nacional de La Plata, 1900 La Plata (Argentina); Renteria, M. [Universitad Nacional de La Plata, 1900 La Plata (Argentina); Khendriche, A. [University of Tizi-Ouzou, Tizi-Ouzou (Algeria)

    1996-03-18

    The neutron decay of high-lying single-particle states in {sup 64}Ni, {sup 90}Zr, {sup 120}Sn and {sup 208}Pb excited by means of the ({alpha},{sup 3}He) reaction has been investigated at 120 MeV incident energy using the multidetector EDEN. The characteristics of this reaction are studied using inclusive spectra and angular correlation analysis. The structures located between 11 and 15 MeV in {sup 91}Zr, and between 8 and 12 MeV excitation energy in {sup 209}Pb, display large departures from a pure statistical decay. The corresponding non-statistical branching ratios are compared with the results of two theoretical calculations. (orig.).

  11. Technological Aspects: High Voltage

    International Nuclear Information System (INIS)

    Faircloth, D C

    2013-01-01

    This paper covers the theory and technological aspects of high-voltage design for ion sources. Electric field strengths are critical to understanding high-voltage breakdown. The equations governing electric fields and the techniques to solve them are discussed. The fundamental physics of high-voltage breakdown and electrical discharges are outlined. Different types of electrical discharges are catalogued and their behaviour in environments ranging from air to vacuum are detailed. The importance of surfaces is discussed. The principles of designing electrodes and insulators are introduced. The use of high-voltage platforms and their relation to system design are discussed. The use of commercially available high-voltage technology such as connectors, feedthroughs and cables are considered. Different power supply technologies and their procurement are briefly outlined. High-voltage safety, electric shocks and system design rules are covered. (author)

  12. Technological Aspects: High Voltage

    CERN Document Server

    Faircloth, D.C.

    2013-12-16

    This paper covers the theory and technological aspects of high-voltage design for ion sources. Electric field strengths are critical to understanding high-voltage breakdown. The equations governing electric fields and the techniques to solve them are discussed. The fundamental physics of high-voltage breakdown and electrical discharges are outlined. Different types of electrical discharges are catalogued and their behaviour in environments ranging from air to vacuum are detailed. The importance of surfaces is discussed. The principles of designing electrodes and insulators are introduced. The use of high-voltage platforms and their relation to system design are discussed. The use of commercially available high-voltage technology such as connectors, feedthroughs and cables are considered. Different power supply technologies and their procurement are briefly outlined. High-voltage safety, electric shocks and system design rules are covered.

  13. High-dimensional covariance estimation with high-dimensional data

    CERN Document Server

    Pourahmadi, Mohsen

    2013-01-01

    Methods for estimating sparse and large covariance matrices Covariance and correlation matrices play fundamental roles in every aspect of the analysis of multivariate data collected from a variety of fields including business and economics, health care, engineering, and environmental and physical sciences. High-Dimensional Covariance Estimation provides accessible and comprehensive coverage of the classical and modern approaches for estimating covariance matrices as well as their applications to the rapidly developing areas lying at the intersection of statistics and mac
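The core difficulty the blurb alludes to is that when the number of features exceeds the number of samples, the sample covariance matrix is singular. A minimal sketch of one standard remedy, shrinkage toward a scaled identity target, is shown below; this is a generic illustration, not a method taken from the book, and the fixed `alpha` is a hypothetical shrinkage intensity (data-driven choices such as Ledoit-Wolf estimation exist):

```python
import numpy as np

def shrinkage_covariance(X, alpha=0.1):
    """Shrink the sample covariance toward a scaled identity target.

    X: (n_samples, n_features) data matrix; alpha in [0, 1] is the
    shrinkage intensity (fixed here for illustration).
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n            # sample covariance (biased normalization)
    mu = np.trace(S) / p         # average variance across features
    target = mu * np.eye(p)      # well-conditioned identity-like target
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))   # p >> n: the raw sample covariance is singular
S_shrunk = shrinkage_covariance(X, alpha=0.2)
# The shrunken estimate is positive definite even though p > n.
print(np.linalg.eigvalsh(S_shrunk).min() > 0)
```

Because the target contributes `alpha * mu` to every eigenvalue, the blended estimate is invertible and usable in downstream multivariate analysis where the raw sample covariance would fail.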

  14. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    Can prefabrication contribute to the development of high performance homes? To answer this question, this chapter defines high performance in more broadly inclusive terms, acknowledging the technical, architectural, social and economic conditions under which energy consumption and production occur....... Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  15. High temperature refrigerator

    International Nuclear Information System (INIS)

    Steyert, W.A. Jr.

    1978-01-01

    A high temperature magnetic refrigerator is described which uses a Stirling-like cycle in which rotating magnetic working material is heated in zero field and adiabatically magnetized, cooled in high field, then adiabatically demagnetized. During this cycle the working material is in heat exchange with a pumped fluid which absorbs heat from a low temperature heat source and deposits heat in a high temperature reservoir. The magnetic refrigeration cycle operates at an efficiency of 70% of the Carnot limit
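The "70% of Carnot" figure can be made concrete with the standard Carnot coefficient of performance for a refrigerator, COP = T_cold / (T_hot - T_cold). The operating temperatures below are purely illustrative and are not taken from the patent abstract:

```python
def carnot_cop(t_cold, t_hot):
    """Carnot coefficient of performance for a refrigerator (temperatures in kelvin)."""
    return t_cold / (t_hot - t_cold)

# Illustrative operating points only (not from the abstract):
t_cold, t_hot = 270.0, 320.0
cop_ideal = carnot_cop(t_cold, t_hot)   # 270 / 50 = 5.4
cop_actual = 0.70 * cop_ideal           # cycle at 70% of the Carnot limit
print(round(cop_ideal, 2), round(cop_actual, 2))  # 5.4 3.78
```

Even at 70% of the ideal limit, the cycle moves several joules of heat per joule of work at these temperatures, which is why magnetic cycles operating near Carnot are attractive.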

  16. High-Risk List

    Science.gov (United States)

    2017-01-01

    economy. The World Bank has said that "corruption creates an unfavorable business environment by undermining the operation efficiency of firms and..." ... "Bank Began as 'Ponzi Scheme,'" 11/27/2012. Independent Joint Anti-Corruption Monitoring and Evaluation Committee, Unfinished Business: The Follow... High-Risk Area 7: Oversight; High-Risk Area 8: Strategy and Planning; Conclusion. High-Risk List, January 11, 2017, Executive Summary

  17. Athletes at High Altitude.

    Science.gov (United States)

    Khodaee, Morteza; Grothe, Heather L; Seyfert, Jonathan H; VanBaak, Karin

    2016-01-01

    Athletes at different skill levels perform strenuous physical activity at high altitude for a variety of reasons. Multiple team and endurance events are held at high altitude and may place athletes at increased risk of developing acute high altitude illness (AHAI). Training at high altitude has long been a routine part of preparation for some high-level athletes. There is a general belief that altitude training improves athletic performance for competitive and recreational athletes. A review of relevant publications between 1980 and 2015 was completed using PubMed and Google Scholar. Clinical review. Level 3. AHAI is a relatively uncommon and potentially serious condition among travelers to altitudes above 2500 m. The broad term AHAI includes several syndromes such as acute mountain sickness (AMS), high altitude pulmonary edema (HAPE), and high altitude cerebral edema (HACE). Athletes may be at higher risk of developing AHAI due to faster ascent and more vigorous exertion compared with nonathletes. Evidence regarding the effects of altitude training on athletic performance is weak. The natural "live high, train low" altitude training strategy may provide the best protocol for enhancing endurance performance in elite and subelite athletes. High altitude sports are generally safe for recreational athletes, but they should be aware of their individual risks. Individualized and appropriate acclimatization is an essential component of injury and illness prevention.

  18. High voltage engineering

    CERN Document Server

    Rizk, Farouk AM

    2014-01-01

    Inspired by a new revival of worldwide interest in extra-high-voltage (EHV) and ultra-high-voltage (UHV) transmission, High Voltage Engineering merges the latest research with the extensive experience of the best in the field to deliver a comprehensive treatment of electrical insulation systems for the next generation of utility engineers and electric power professionals. The book offers extensive coverage of the physical basis of high-voltage engineering, from insulation stress and strength to lightning attachment and protection and beyond. Presenting information critical to the design, selec

  19. High enthalpy gas dynamics

    CERN Document Server

    Rathakrishnan, Ethirajan

    2014-01-01

    This is an introductory-level textbook which explains the elements of high temperature and high-speed gas dynamics. Written in a clear and easy-to-follow style, the author covers all the latest developments in the field, including basic thermodynamic principles, compressible flow regimes and wave propagation, in one volume. It covers theoretical modeling of high enthalpy flows, with particular focus on problems in internal and external gas-dynamic flows of interest in the fields of rocket propulsion and hypersonic aerodynamics. High enthalpy gas dynamics is a compulsory course for aerospace engine

  20. High blood cholesterol levels

    Science.gov (United States)

    Cholesterol - high; Lipid disorders; Hyperlipoproteinemia; Hyperlipidemia; Dyslipidemia; Hypercholesterolemia ... There are many types of cholesterol. The ones talked about most are: ... lipoprotein (HDL) cholesterol -- often called "good" cholesterol ...

  1. High voltage systems

    International Nuclear Information System (INIS)

    Martin, M.

    1991-01-01

    Industrial processes usually require electrical power. This power is used to drive motors, to heat materials, or in electrochemical processes. Often a plant's power requirements call for the electric power to be delivered at high voltage. In this paper, high voltage is considered to be any voltage over 600 V. This voltage could be as high as 138,000 V for some very large facilities. The characteristics of this voltage and the enormous amounts of power being transmitted necessitate special safety considerations. Safety must be considered during the four activities associated with a high voltage electrical system. These activities are: Design; Installation; Operation; and Maintenance

  2. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO₂ emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  3. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... around 4:00 a.m. to 5:00 a.m.). What are the Symptoms of Hyperglycemia? The signs and symptoms include the following: High blood glucose High levels of sugar in the urine Frequent urination Increased ...

  4. High energy hadron scattering

    International Nuclear Information System (INIS)

    Johnson, R.C.

    1980-01-01

    High energy and small momentum transfer 2 → 2 hadronic scattering processes are described in the physical framework of particle exchange. Particle production in high energy collisions is considered, with emphasis on the features of inclusive reactions, though with some remarks on exclusive processes. (U.K.)

  5. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Hyperglycemia (High Blood Glucose) Hyperglycemia is the technical term for high ...

  6. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Test Lower Your Risk Healthy Eating Overweight Smoking High Blood Pressure Physical Activity High Blood Glucose My Health Advisor ... Chat Closed engagement en -- Have Type 2 Diabetes? - 2017-03-lwt2d-en.html Have Type 2 Diabetes? ...

  7. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Research & Practice Ways to Give Close Are You at Risk? Home Prevention Diagnosing Diabetes and Learning About Prediabetes Type 2 Diabetes Risk Test Lower Your Risk Healthy Eating Overweight Smoking High Blood Pressure Physical Activity High Blood Glucose ...

  8. High-Conflict Divorce.

    Science.gov (United States)

    Johnston, Janet R.

    1994-01-01

    Reviews available research studies of high-conflict divorce and its effects on children. Factors believed to contribute to high-conflict divorce are explored, and a model of their interrelationships is proposed. Dispute resolution, intervention, and prevention programs are discussed, and implications for social policy are outlined. (SLD)

  9. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... You at Risk? Home Prevention Diagnosing Diabetes and Learning About Prediabetes Type 2 Diabetes Risk Test Lower Your Risk Healthy Eating Overweight Smoking High Blood Pressure Physical Activity High Blood Glucose My Health Advisor Tools To Know Your Risk Alert Day Diabetes Basics ...

  10. High Blood Pressure (Hypertension)

    Science.gov (United States)

    ... other risk factors, like diabetes, you may need treatment. How does high blood pressure affect pregnant women? A few women will get ... HIV, Birth Control Heart Health for Women Pregnancy Menopause More Women's Health ... High Blood Pressure--Medicines to Help You Women and Diabetes Heart ...

  11. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Type 2 Diabetes Risk Test Lower Your Risk Healthy Eating Overweight Smoking High Blood Pressure Physical Activity High ... Holiday Meal Planning What Can I Eat? Making Healthy Food Choices Diabetes ... Tips Eating Out Quick Meal Ideas Snacks Nutrient Content Claims ...

  12. The high energy galaxy

    International Nuclear Information System (INIS)

    Cesarsky, C.J.

    1986-08-01

    The galaxy is host to a wide variety of high energy events. I review here recent results on large scale galactic phenomena: cosmic-ray origin and confinement, the connexion to ultra high energy gamma-ray emission from X-ray binaries, gamma ray and synchrotron emission in interstellar space, galactic soft and hard X-ray emission

  13. Highly Skilled Migrants

    DEFF Research Database (Denmark)

    Hvidt, Martin

    2016-01-01

    . It is pointed out that while the system facilitated speedy entry to the job market, the lack of inclusion in the Gulf economies of the migrants, the lack of long-term prospects of residing in the countries and the highly asymmetric power balance between sponsor and migrant, provides few incentives...... for the highly skilled migrants to fully contribute to the Gulf economies....

  14. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Risk Healthy Eating Overweight Smoking High Blood Pressure Physical Activity High Blood Glucose My Health Advisor Tools ... Complications DKA (Ketoacidosis) & Ketones Kidney Disease ... than planned or exercised less than planned. You have stress from an illness, such as a cold or flu. You have ...

  15. Fixing High Schools

    Science.gov (United States)

    Perkins-Gough, Deborah

    2005-01-01

    Reports from national education organizations in the US indicate the sorry state of high schools in the country that are accused of failing to adequately prepare their graduates for college or for the workforce, highlighting what is a serious problem in light of the troubled state of the US economy. The need to improve high schools is urgent and…

  16. High coking value pitch

    Science.gov (United States)

    Miller, Douglas J.; Chang, Ching-Feng; Lewis, Irwin C.; Lewis, Richard T.

    2014-06-10

    A high coking value pitch, prepared from coal tar distillate, that has a low softening point and a high carbon value while containing substantially no quinoline insolubles is disclosed. The pitch can be used as an impregnant or binder for producing carbon and graphite articles.

  17. High Gravity (g) Combustion

    National Research Council Canada - National Science Library

    Zelina, Joseph

    2006-01-01

    .... The Ultra-Compact Combustor (UCC), a novel design based on trapped-vortex combustor (TVC) work that uses high swirl in a circumferential cavity to enhance reaction rates via high cavity g-loading on the order of 3000 g's...

  18. Very high energy colliders

    International Nuclear Information System (INIS)

    Richter, B.

    1986-03-01

    The luminosity and energy requirements are considered for both proton colliders and electron-positron colliders. Some of the basic design equations for high energy linear electron colliders are summarized, as well as design constraints. A few examples are given of parameters for very high energy machines. 4 refs., 6 figs
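One of the basic design equations referred to here is the instantaneous luminosity of head-on colliding Gaussian beams, L = f N² / (4π σ_x σ_y). The sketch below evaluates it with invented round numbers chosen only for illustration, not parameters from the report:

```python
import math

def luminosity(n_per_bunch, f_coll, sigma_x, sigma_y):
    """Instantaneous luminosity for head-on Gaussian beams with equal
    bunch populations: L = f * N^2 / (4 * pi * sigma_x * sigma_y).
    Beam sizes in cm give L in cm^-2 s^-1."""
    return f_coll * n_per_bunch**2 / (4 * math.pi * sigma_x * sigma_y)

# Illustrative numbers only:
N = 5e10              # particles per bunch
f = 1e4               # collision frequency [Hz]
sx = sy = 1e-4        # transverse beam sizes at the interaction point [cm]
print(f"L = {luminosity(N, f, sx, sy):.2e} cm^-2 s^-1")
```

The quadratic dependence on N and inverse dependence on the spot size are what drive the design constraints discussed in the report: raising luminosity means denser bunches or tighter focusing at the interaction point.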

  19. High-pressure apparatus

    NARCIS (Netherlands)

    Schepdael, van L.J.M.; Bartels, P.V.; Berg, van den R.W.

    1999-01-01

    The invention relates to a high-pressure device (1) having a cylindrical high-pressure vessel (3) and prestressing means in order to exert an axial pressure on the vessel. The vessel (3) can have been formed from a number of layers of composite material, such as glass, carbon or aramid fibers, which

  20. High-pressure crystallography

    Science.gov (United States)

    Katrusiak, A.

    2008-01-01

    The history and development of high-pressure crystallography are briefly described and examples of structural transformations in compressed compounds are given. The review is focused on the diamond-anvil cell, celebrating its 50th anniversary this year, the principles of its operation and the impact it has had on high-pressure X-ray diffraction.

  1. Proxmox high availability

    CERN Document Server

    Cheng, Simon MC

    2014-01-01

    If you want to know the secrets of virtualization and how to implement high availability on your services, this is the book for you. For those of you who are already using Proxmox, this book offers you the chance to build a high availability cluster with a distributed filesystem to further protect your system from failure.

  2. A method for screening active components from Chinese herbs by cell membrane chromatography-offline-high performance liquid chromatography/mass spectrometry and an online statistical tool for data processing.

    Science.gov (United States)

    Cao, Yan; Wang, Shaozhan; Li, Yinghua; Chen, Xiaofei; Chen, Langdong; Wang, Dongyao; Zhu, Zhenyu; Yuan, Yongfang; Lv, Diya

    2018-03-09

    Cell membrane chromatography (CMC) has been successfully applied to screen bioactive compounds from Chinese herbs for many years, and some offline and online two-dimensional (2D) CMC-high performance liquid chromatography (HPLC) hyphenated systems have been established to perform screening assays. However, the requirement of sample preparation steps for the second-dimensional analysis in offline systems and the need for an interface device and technical expertise in the online system limit their extensive use. In the present study, an offline 2D CMC-HPLC analysis combined with the XCMS (various forms of chromatography coupled to mass spectrometry) Online statistical tool for data processing was established. First, our previously reported online 2D screening system was used to analyze three Chinese herbs that were reported to have potential anti-inflammatory effects, and two binding components were identified. By contrast, the proposed offline 2D screening method with XCMS Online analysis was applied, and three more ingredients were discovered in addition to the two compounds revealed by the online system. Then, cross-validation of the three compounds was performed, and they were confirmed to be included in the online data as well, but were not identified there because of their low concentrations and lack of credible statistical approaches. Last, pharmacological experiments showed that these five ingredients could inhibit IL-6 release and IL-6 gene expression on LPS-induced RAW cells in a dose-dependent manner. Compared with previous 2D CMC screening systems, this newly developed offline 2D method needs no sample preparation steps for the second-dimensional analysis, and it is sensitive, efficient, and convenient. It will be applicable in identifying active components from Chinese herbs and practical in discovery of lead compounds derived from herbs. Copyright © 2018 Elsevier B.V. All rights reserved.
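The data-processing step described here, comparing second-dimensional feature lists to find ingredients present in the extract but not in the background, can be sketched as a simple (m/z, retention time) matching loop. All numbers, tolerances, and feature values below are invented for illustration and are not data from the study or the XCMS Online tool:

```python
def match(feature, reference, mz_tol=0.01, rt_tol=0.2):
    """True if an (m/z, rt) feature matches any reference feature within tolerance."""
    mz, rt = feature
    return any(abs(mz - m) <= mz_tol and abs(rt - r) <= rt_tol
               for m, r in reference)

# Hypothetical feature lists from an extract run and a blank run:
extract = [(255.066, 4.1), (301.071, 6.3), (431.098, 7.9)]
blank = [(255.070, 4.2)]   # background feature also seen in the blank

# Keep features unique to the extract as screening candidates.
candidates = [f for f in extract if not match(f, blank)]
print(candidates)
```

Real tools apply the same idea with retention-time alignment and statistical filtering across replicates, which is what allowed the low-concentration compounds missed by the online system to surface here.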

  3. Feasibility Study of Using Gemstone Spectral Imaging (GSI) and Adaptive Statistical Iterative Reconstruction (ASIR) for Reducing Radiation and Iodine Contrast Dose in Abdominal CT Patients with High BMI Values.

    Science.gov (United States)

    Zhu, Zheng; Zhao, Xin-ming; Zhao, Yan-feng; Wang, Xiao-yi; Zhou, Chun-wu

    2015-01-01

    To prospectively investigate the effect of using Gemstone Spectral Imaging (GSI) and adaptive statistical iterative reconstruction (ASIR) for reducing radiation and iodine contrast dose in abdominal CT patients with high BMI values. 26 patients (weight > 65kg and BMI ≥ 22) underwent abdominal CT using GSI mode with 300mgI/kg contrast material as study group (group A). Another 21 patients (weight ≤ 65kg and BMI ≥ 22) were scanned with a conventional 120 kVp tube voltage for noise index (NI) of 11 with 450mgI/kg contrast material as control group (group B). GSI images were reconstructed at 60keV with 50%ASIR and the conventional 120kVp images were reconstructed with FBP reconstruction. The CT values, standard deviation (SD), signal-noise-ratio (SNR), contrast-noise-ratio (CNR) of 26 landmarks were quantitatively measured and image quality qualitatively assessed using statistical analysis. As for the quantitative analysis, the difference of CNR between groups A and B was all significant except for the mesenteric vein. The SNR in group A was higher than B except the mesenteric artery and splenic artery. As for the qualitative analysis, all images had diagnostic quality and the agreement for image quality assessment between the reviewers was substantial (kappa = 0.684). CT dose index (CTDI) values for non-enhanced, arterial phase and portal phase in group A were decreased by 49.04%, 40.51% and 40.54% compared with group B (P = 0.000), respectively. The total dose and the injection rate for the contrast material were reduced by 14.40% and 14.95% in A compared with B. The use of GSI and ASIR provides similar enhancement in vessels and image quality with reduced radiation dose and contrast dose, compared with the use of conventional scan protocol.
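The SNR and CNR figures of merit used in this comparison are commonly computed from region-of-interest (ROI) statistics as mean/SD and (ROI mean − background mean)/background SD. A minimal sketch with hypothetical Hounsfield-unit measurements (not values from the paper, whose exact definitions may differ):

```python
def snr(roi_mean, roi_sd):
    """Signal-to-noise ratio of a region of interest."""
    return roi_mean / roi_sd

def cnr(roi_mean, bg_mean, bg_sd):
    """Contrast-to-noise ratio: ROI contrast over background noise."""
    return (roi_mean - bg_mean) / bg_sd

# Hypothetical HU measurements for a vessel ROI against muscle background:
vessel_mean, vessel_sd = 250.0, 12.0
muscle_mean, muscle_sd = 55.0, 10.0
print(round(snr(vessel_mean, vessel_sd), 1))                  # 20.8
print(round(cnr(vessel_mean, muscle_mean, muscle_sd), 1))     # 19.5
```

Lower-noise reconstructions such as ASIR reduce the SD terms, which is how similar CNR can be maintained at reduced radiation and contrast dose.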

  4. Feasibility Study of Using Gemstone Spectral Imaging (GSI) and Adaptive Statistical Iterative Reconstruction (ASIR) for Reducing Radiation and Iodine Contrast Dose in Abdominal CT Patients with High BMI Values.

    Directory of Open Access Journals (Sweden)

    Zheng Zhu

    Full Text Available To prospectively investigate the effect of using Gemstone Spectral Imaging (GSI) and adaptive statistical iterative reconstruction (ASIR) for reducing radiation and iodine contrast dose in abdominal CT patients with high BMI values. 26 patients (weight > 65kg and BMI ≥ 22) underwent abdominal CT using GSI mode with 300mgI/kg contrast material as study group (group A). Another 21 patients (weight ≤ 65kg and BMI ≥ 22) were scanned with a conventional 120 kVp tube voltage for noise index (NI) of 11 with 450mgI/kg contrast material as control group (group B). GSI images were reconstructed at 60keV with 50%ASIR and the conventional 120kVp images were reconstructed with FBP reconstruction. The CT values, standard deviation (SD), signal-noise-ratio (SNR), contrast-noise-ratio (CNR) of 26 landmarks were quantitatively measured and image quality qualitatively assessed using statistical analysis. As for the quantitative analysis, the difference of CNR between groups A and B was all significant except for the mesenteric vein. The SNR in group A was higher than B except the mesenteric artery and splenic artery. As for the qualitative analysis, all images had diagnostic quality and the agreement for image quality assessment between the reviewers was substantial (kappa = 0.684). CT dose index (CTDI) values for non-enhanced, arterial phase and portal phase in group A were decreased by 49.04%, 40.51% and 40.54% compared with group B (P = 0.000), respectively. The total dose and the injection rate for the contrast material were reduced by 14.40% and 14.95% in A compared with B. The use of GSI and ASIR provides similar enhancement in vessels and image quality with reduced radiation dose and contrast dose, compared with the use of conventional scan protocol.

  5. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  6. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  7. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO₂ emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  8. High Performance Marine Vessels

    CERN Document Server

    Yun, Liang

    2012-01-01

    High Performance Marine Vessels (HPMVs) range from fast ferries to the latest high speed Navy craft, including competition power boats and hydroplanes, hydrofoils, hovercraft, catamarans and other multi-hull craft. High Performance Marine Vessels covers the main concepts of HPMVs and discusses historical background, design features, services that have been successful and not so successful, and some sample data for the range of HPMVs to date. Included is a comparison of all HPMV craft and the differences between them, along with descriptions of performance (hydrodynamics and aerodynamics). Readers will find a comprehensive overview of the design, development and building of HPMVs. In summary, this book: Focuses on technology at the aero-marine interface Covers the full range of high performance marine vessel concepts Explains the historical development of various HPMVs Discusses ferries, racing and pleasure craft, as well as utility and military missions High Performance Marine Vessels is an ideal book for student...

  9. High altitude illness

    Science.gov (United States)

    Hartman-Ksycińska, Anna; Kluz-Zawadzka, Jolanta; Lewandowski, Bogumił

    High-altitude illness results from prolonged high-altitude exposure of unacclimatized individuals. It presents as acute mountain sickness (AMS), which, if untreated, can progress to potentially life-threatening high-altitude pulmonary oedema and high-altitude cerebral oedema. The medical problems are caused by hypobaric hypoxia stimulating release of hypoxia-inducible factor (HIF). As a result, impairment of the central nervous system, circulation and respiratory system occurs. The most important elements of AMS treatment are acclimatization, halting further ascent, rest or beginning descent, oxygen supplementation, pharmacological intervention and, if available, a portable hyperbaric chamber. Because of the popularity of high-mountain sports and tourism, better education of the population at risk is essential.

  10. Multidimensional high harmonic spectroscopy

    International Nuclear Information System (INIS)

    Bruner, Barry D; Soifer, Hadas; Shafir, Dror; Dudovich, Nirit; Serbinenko, Valeria; Smirnova, Olga

    2015-01-01

    High harmonic generation (HHG) has opened up a new frontier in ultrafast science where attosecond time resolution and Angstrom spatial resolution are accessible in a single measurement. However, reconstructing the dynamics under study is limited by the multiple degrees of freedom involved in strong field interactions. In this paper we describe a new class of measurement schemes for resolving attosecond dynamics, integrating perturbative nonlinear optics with strong-field physics. These approaches serve as a basis for multidimensional high harmonic spectroscopy. Specifically, we show that multidimensional high harmonic spectroscopy can measure tunnel ionization dynamics with high precision, and resolves the interference between multiple ionization channels. In addition, we show how multidimensional HHG can function as a type of lock-in amplifier measurement. Similar to multi-dimensional approaches in nonlinear optical spectroscopy that have resolved correlated femtosecond dynamics, multi-dimensional high harmonic spectroscopy reveals the underlying complex dynamics behind attosecond scale phenomena. (paper)

  11. Editor's Choice - High Annual Hospital Volume is Associated with Decreased in Hospital Mortality and Complication Rates Following Treatment of Abdominal Aortic Aneurysms: Secondary Data Analysis of the Nationwide German DRG Statistics from 2005 to 2013.

    Science.gov (United States)

    Trenner, Matthias; Kuehnl, Andreas; Salvermoser, Michael; Reutersberg, Benedikt; Geisbuesch, Sarah; Schmid, Volker; Eckstein, Hans-Henning

    2018-02-01

    The aim of this study was to analyse the association between annual hospital procedural volume and post-operative outcomes following repair of abdominal aortic aneurysms (AAA) in Germany. Data were extracted from nationwide Diagnosis Related Group (DRG) statistics provided by the German Federal Statistical Office. Cases with a diagnosis of AAA (ICD-10 GM I71.3, I71.4) and procedure codes for endovascular aortic repair (EVAR; OPS 5-38a.1*) or open aortic repair (OAR; OPS 5-38.45, 5-38.47) treated between 2005 and 2013 were included. Hospitals were empirically grouped into quartiles depending on the overall annual volume of AAA procedures. A multilevel multivariable regression model was applied to adjust for sex, medical risk, type of procedure, and type of admission. The primary outcome was in-hospital mortality. Secondary outcomes were complications, use of blood products, and length of stay (LOS). The association between AAA volume and in-hospital mortality was also estimated as a function of continuous volume. A total of 96,426 cases, of which 11,795 (12.6%) presented as ruptured (r)AAA, were treated in >700 hospitals (annual median: 501). Crude in-hospital mortality was 3.3% after intact (i)AAA repair (OAR 5.3%; EVAR 1.7%). Volume was inversely associated with mortality after OAR and EVAR. Complication rates, LOS, and use of blood products were lower in high volume hospitals. After rAAA repair, crude mortality was 40.4% (OAR 43.2%; EVAR 27.4%). An inverse association between mortality and volume was shown for rAAA repair; the same holds for the use of blood products. When considering volume as a continuous variable, an annual caseload of 75-100 elective cases was associated with the lowest mortality risk. In-hospital mortality and complication rates following AAA repair are inversely associated with annual hospital volume. The use of blood products and the LOS are lower in high volume hospitals. A minimum annual case threshold for AAA procedures might improve
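
The grouping step described above can be illustrated with a small sketch. The numbers below are synthetic stand-ins (the raw DRG data are not public), and only crude quartile mortality is computed; the study itself additionally adjusted for sex, medical risk, procedure type and admission type with a multilevel regression.

```python
from statistics import quantiles
from collections import defaultdict

# Hypothetical (annual_volume, deaths, cases) per hospital.
hospitals = [(12, 3, 40), (30, 4, 90), (80, 3, 240), (150, 4, 450)]

# Empirical quartile cut points of annual AAA volume, as in the grouping step.
volumes = [v for v, _, _ in hospitals]
q1, q2, q3 = quantiles(volumes, n=4)

def quartile(v):
    """Assign a hospital to a volume quartile (1 = lowest volume)."""
    return 1 if v <= q1 else 2 if v <= q2 else 3 if v <= q3 else 4

# Crude in-hospital mortality per volume quartile (no risk adjustment).
agg = defaultdict(lambda: [0, 0])
for vol, deaths, cases in hospitals:
    agg[quartile(vol)][0] += deaths
    agg[quartile(vol)][1] += cases
mortality = {q: d / c for q, (d, c) in agg.items()}
```

With these made-up figures the lowest-volume quartile shows the highest crude mortality, mirroring the direction of the reported association.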

  12. High current high accuracy IGBT pulse generator

    International Nuclear Information System (INIS)

    Nesterov, V.V.; Donaldson, A.R.

    1995-05-01

    A solid state pulse generator capable of delivering high current triangular or trapezoidal pulses into an inductive load has been developed at SLAC. Energy stored in a capacitor bank of the pulse generator is switched to the load through a pair of insulated gate bipolar transistors (IGBT). The circuit can then recover the remaining energy and transfer it back to the capacitor bank without reversing the capacitor voltage. A third IGBT device is employed to control the initial charge to the capacitor bank, a command charging technique, and to compensate for pulse-to-pulse power losses. The rack mounted pulse generator contains a 525 μF capacitor bank. It can deliver 500 A at 900 V into inductive loads up to 3 mH. The current amplitude and discharge time are controlled to 0.02% accuracy by a precision controller through the SLAC central computer system. This pulse generator drives a series pair of extraction dipoles.
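
The quoted ratings can be sanity-checked with back-of-the-envelope arithmetic. The values are taken from the abstract; the ideal-ramp numbers assume a pure inductor and ignore circuit resistance and capacitor voltage droop, so they are order-of-magnitude estimates only.

```python
C = 525e-6      # capacitor bank, farads
V = 900.0       # charge voltage, volts
L = 3e-3        # maximum load inductance, henries
I_peak = 500.0  # rated peak current, amperes

# Energy stored in the fully charged bank: E = C*V^2 / 2
stored_energy = 0.5 * C * V**2      # ~212.6 J

# Ideal current ramp into a pure inductor: di/dt = V / L
ramp_rate = V / L                   # 300 000 A/s, i.e. 300 A/ms
time_to_peak = I_peak / ramp_rate   # ~1.67 ms to reach 500 A

print(f"{stored_energy:.1f} J stored, {ramp_rate/1e3:.0f} A/ms, "
      f"{time_to_peak*1e3:.2f} ms to 500 A")
```

The millisecond-scale ramp time is consistent with the triangular/trapezoidal pulse shapes the generator is designed to produce.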

  13. Psoriasis and high blood pressure.

    Science.gov (United States)

    Salihbegovic, Eldina Malkic; Hadzigrahic, Nermina; Suljagic, Edin; Kurtalic, Nermina; Sadic, Sena; Zejcirovic, Alema; Mujacic, Almina

    2015-02-01

    Psoriasis is a chronic skin ailment that can be associated with an increased occurrence of other illnesses, including high blood pressure. A prospective study was conducted of 70 patients affected by psoriasis, of both genders, older than 18 years. The average age was 47.14 (SD ±15.41) years; there were 36 men (51.43%) and 34 women (48.57%). The average duration of psoriasis was 15.52 (SD ±12.54) years. The frequency of high blood pressure in those affected by psoriasis was 54.28%. The average age of the patients with psoriasis and high blood pressure was 53.79 years (SD ±14.15), and their average duration of psoriasis was 17.19 years (SD ±13.51). The average PASI score was 16.65. Increase in PASI score and high blood pressure were statistically highly correlated (r=0.36, p=0.0001). Psoriasis was related to high blood pressure, and there was a correlation between the severity of psoriasis and high blood pressure.
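
The reported association (r=0.36, p=0.0001) is a correlation coefficient between PASI score and blood pressure. As a reminder of what that statistic measures, here is a minimal pure-Python Pearson correlation on made-up values; the study's actual patient data are not public, so the numbers below are illustrative only.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up PASI scores and systolic pressures for illustration only.
pasi = [4.0, 8.5, 12.0, 16.5, 21.0, 30.0]
systolic = [118, 125, 122, 138, 135, 150]
r = pearson_r(pasi, systolic)  # positive, reflecting a rising joint trend
```

A value of r near 0 would indicate no linear association; 0.36 indicates a modest positive one, which the study's p-value marks as statistically significant at this sample size.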

  14. High redshift quasars and high metallicities

    Science.gov (United States)

    Ferland, Gary J.

    1997-01-01

    A large-scale code called Cloudy was designed to simulate non-equilibrium plasmas and predict their spectra. The goal was to apply it to studies of galactic and extragalactic emission line objects in order to reliably deduce abundances and luminosities. Quasars are of particular interest because they are the most luminous objects in the universe and the highest redshift objects that can be observed spectroscopically, and their emission lines can reveal the composition of the interstellar medium (ISM) of the universe when it was well under a billion years old. The lines are produced by warm (approximately 10^4 K) gas with moderate to low density (n ≤ 10^12 cm^-3). Cloudy has been extended to include approximately 10^4 resonance lines from the 495 possible stages of ionization of the lightest 30 elements, an extension that required several steps. The charge transfer database was expanded to complete the needed reactions between hydrogen and the first four ions and fit all reactions with a common approximation. Radiative recombination rate coefficients were derived for recombination from all closed shells, where this process should dominate. Analytical fits to Opacity Project (OP) and other recent photoionization cross sections were produced. Finally, rescaled OP oscillator strengths were used to compile a complete set of data for 5971 resonance lines. The major discovery has been that high redshift quasars have very high metallicities and there is strong evidence that the quasar phenomenon is associated with the birth of massive elliptical galaxies.

  15. High speed atom source

    International Nuclear Information System (INIS)

    Hoshino, Hitoshi.

    1990-01-01

    In a high speed atom source, since the speed of the ions and the electrons is not identical, mixing the ionic rays with electron rays gives an insufficient neutralizing effect, and high speed atomic rays at high density cannot be obtained. In view of the above, a speed control means is disposed to equalize the speed of the ions forming the ionic rays and the speed of the electrons forming the electron rays. Further, the incident angle of the electron rays and/or ionic rays to a magnet or an electrode is made variable. As a result, the relative speed between the ions and the electrons in the processing direction is reduced to zero, the probability of association between ions and electrons due to the Coulomb force is increased, and the neutralizing efficiency is improved, so that fine, high density, high speed atomic rays are easily obtained. Further, by varying the incident angle, a track is formed that gives ideal mixing depending on the energy of the neutralized ionic rays. Since the high speed atomic rays have such high density, they can easily be irradiated onto a minute region of the specimen. (N.H.)

  16. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  17. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees

  18. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products

  19. High performance germanium MOSFETs

    Energy Technology Data Exchange (ETDEWEB)

    Saraswat, Krishna [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)]. E-mail: saraswat@stanford.edu; Chui, Chi On [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Krishnamohan, Tejas [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Kim, Donghyun [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Nayfeh, Ammar [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Pethe, Abhijit [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States)

    2006-12-15

    Ge is a very promising material as a future channel material for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become mainstream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeO_xN_y) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (~2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices.

  20. High performance germanium MOSFETs

    International Nuclear Information System (INIS)

    Saraswat, Krishna; Chui, Chi On; Krishnamohan, Tejas; Kim, Donghyun; Nayfeh, Ammar; Pethe, Abhijit

    2006-01-01

    Ge is a very promising material as a future channel material for nanoscale MOSFETs due to its high mobility and thus a higher source injection velocity, which translates into higher drive current and smaller gate delay. However, for Ge to become mainstream, surface passivation and heterogeneous integration of crystalline Ge layers on Si must be achieved. We have demonstrated growth of fully relaxed smooth single crystal Ge layers on Si using a novel multi-step growth and hydrogen anneal process without any graded buffer SiGe layer. Surface passivation of Ge has been achieved with its native oxynitride (GeO_xN_y) and high-permittivity (high-k) metal oxides of Al, Zr and Hf. High mobility MOSFETs have been demonstrated in bulk Ge with high-k gate dielectrics and metal gates. However, due to their smaller bandgap and higher dielectric constant, most high mobility materials suffer from large band-to-band tunneling (BTBT) leakage currents and worse short channel effects. We present novel Si and Ge based heterostructure MOSFETs, which can significantly reduce the BTBT leakage currents while retaining high channel mobility, making them suitable for scaling into the sub-15 nm regime. Through full band Monte Carlo, Poisson-Schrodinger and detailed BTBT simulations we show a dramatic reduction in BTBT and excellent electrostatic control of the channel, while maintaining very high drive currents in these highly scaled heterostructure DGFETs. Heterostructure MOSFETs with varying strained-Ge or SiGe thickness, Si cap thickness and Ge percentage were fabricated on bulk Si and SOI substrates. The ultra-thin (~2 nm) strained-Ge channel heterostructure MOSFETs exhibited >4x mobility enhancements over bulk Si devices and >10x BTBT reduction over surface channel strained SiGe devices.

  1. High current ion sources

    International Nuclear Information System (INIS)

    Brown, I.G.

    1989-06-01

    The concept of a high current ion source is both relative and evolutionary. Within the domain of one particular kind of ion source technology a current of microamperes might be 'high', while in another area a current of 10 amperes could be 'low'. Even within the domain of a single ion source type, what is considered high current performance today is routinely eclipsed by better performance and higher current output within a short period of time. Within their fields of application, there are a large number of kinds of ion sources that can justifiably be called high current. Thus, as a very limited example only, PIGs, Freeman sources, ECR sources, duoplasmatrons, field emission sources, and a great many more all have their high current variants. High current ion beams of gaseous and metallic species can be generated in a number of different ways. Ion sources of the kind developed at various laboratories around the world for the production of intense neutral beams for controlled fusion experiments are used to form large area proton and deuteron beams of many tens of amperes, and this technology can be used for other applications also. There has been significant progress in recent years in the use of microwave ion sources for high current ion beam generation, and this method is likely to find wide application in various different fields. Finally, high current beams of metal ions can be produced using metal vapor vacuum arc ion source technology. After a brief consideration of high current ion source design concepts, these three particular methods are reviewed in this paper

  2. High flying physics

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Cosmic ray physicists have always had to aim high. In the constant search for interactions produced as close as possible to where the immensely energetic primary particles enter the earth's atmosphere from outer space, they have installed experiments on high mountain peaks and flown detectors aloft in balloons. In these studies, there have been periodic sightings of remarkable configurations of secondary particles. These events, many of which bear exotic names like Centauro, Andromeda, Texas Lone Star, etc., frequently defy explanation in terms of conventional physics ideas and give a glimpse of what may lie beyond the behaviour seen so far under laboratory conditions

  3. High Field Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-12-15

    A Workshop was held in Frascati at the end of September under the title 'Generation of High Fields for Particle Acceleration to Very High Energies'. It was organized by the CERN Accelerator School, the European Committee for Future Accelerators (ECFA) and the Italian INFN and was a further stage in the exploratory moves towards new techniques of acceleration. Such techniques might become necessary to respond to the needs of high energy physics some decades from now when the application of conventional techniques will probably have reached their limits.

  4. High voltage test techniques

    CERN Document Server

    Kind, Dieter

    2001-01-01

    The second edition of High Voltage Test Techniques has been completely revised. The present revision takes into account the latest international developments in High Voltage and Measurement technology, making it an essential reference for engineers in the testing field. High Voltage Technology belongs to the traditional area of Electrical Engineering. However, this is not to say that the area has stood still. New insulating materials, computing methods and voltage levels repeatedly pose new problems or open up methods of solution; electromagnetic compatibility (EMC) or components and systems al

  5. High-density lipoprotein cholesterol: How High

    Directory of Open Access Journals (Sweden)

    G Rajagopal

    2012-01-01

    Full Text Available The high-density lipoprotein cholesterol (HDL-C) is considered anti-atherogenic "good" cholesterol. It is involved in reverse transport of lipids. Epidemiological studies have found an inverse relationship between HDL-C and coronary heart disease (CHD) risk. When grouped according to HDL-C, subjects having HDL-C more than 60 mg/dL had a lower risk of CHD than those having HDL-C of 40-60 mg/dL, who in turn had a lower risk than those who had HDL-C less than 40 mg/dL. No upper limit for the beneficial effect of HDL-C on CHD risk has been identified. The goals of treating patients with low HDL-C have not been firmly established. Though many drugs are known to improve HDL-C concentration, statins are proven to improve CHD risk and mortality. Cholesteryl ester transfer protein (CETP) is involved in metabolism of HDL-C, and its inhibitors are actively being screened for clinical utility. However, a final answer on CETP inhibitors is still awaited.
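
The three risk groups quoted above (>60, 40-60 and <40 mg/dL) amount to a simple threshold rule. A hypothetical helper encoding that grouping is sketched below, for illustration of the classification only and not as clinical guidance; the band labels are our own shorthand, not terms from the abstract.

```python
def hdl_risk_band(hdl_mg_dl: float) -> str:
    """Relative CHD-risk band by HDL-C (mg/dL), per the grouping in the abstract."""
    if hdl_mg_dl > 60:
        return "lowest risk"
    if hdl_mg_dl >= 40:
        return "intermediate risk"
    return "highest risk"
```

Note that a value of exactly 60 mg/dL falls in the middle band, since the abstract's lowest-risk group is "more than 60 mg/dL".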

  6. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... blood glucose High levels of sugar in the urine Frequent urination Increased thirst Part of managing your ... glucose is above 240 mg/dl, check your urine for ketones. If you have ketones, do not ...

  7. High-Speed Photography

    International Nuclear Information System (INIS)

    Paisley, D.L.; Schelev, M.Y.

    1998-01-01

    The applications of high-speed photography to a diverse set of subjects including inertial confinement fusion, laser surgical procedures, communications, automotive airbags, lightning etc. are briefly discussed. (AIP) copyright 1998 Society of Photo-Optical Instrumentation Engineers

  8. High-tech entrepreneurship

    DEFF Research Database (Denmark)

    Bernasconi, Michel; Harris, Simon; Mønsted, Mette

    High-tech businesses form a crucial part of entrepreneurial activity - in some ways representing very typical examples of entrepreneurship, yet in some ways representing quite different challenges. The uncertainty in innovation and advanced technology makes it difficult to use conventional economic...... focuses on the blend of theory and practice needed to inform advanced entrepreneurship students of the specifics of high-tech start-ups. Key topics covered include: uncertainty and innovation; entrepreneurial finance; marketing technological innovations; and high-tech incubation management.......Edited by a multi-national team, it draws together leading writers and researchers from across Europe, and is therefore a must-read for all those involved in advanced entrepreneurship with specific interests in high-tech start-ups....

  9. High Blood Pressure

    Science.gov (United States)

    ... kidney disease, diabetes, or metabolic syndrome ... Unhealthy lifestyle habits can increase the risk of high blood pressure. These habits include: Unhealthy eating patterns, such as eating too much sodium ...

  10. High-power klystrons

    Science.gov (United States)

    Siambis, John G.; True, Richard B.; Symons, R. S.

    1994-05-01

    Novel emerging applications in advanced linear collider accelerators, ionospheric and atmospheric sensing and modification and a wide spectrum of industrial processing applications, have resulted in microwave tube requirements that call for further development of high power klystrons in the range from S-band to X-band. In the present paper we review recent progress in high power klystron development and discuss some of the issues and scaling laws for successful design. We also discuss recent progress in electron guns with potential grading electrodes for high voltage with short and long pulse operation via computer simulations obtained from the code DEMEOS, as well as preliminary experimental results. We present designs for high power beam collectors.

  11. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... blood and then treating high blood glucose early will help you avoid problems associated with hyperglycemia. How ...

  12. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available

  13. High blood pressure - adults

    Science.gov (United States)

    ... pressure is found. This is called essential hypertension. High blood pressure that is caused by another medical condition or medicine you are taking is called secondary hypertension. Secondary hypertension may be due to: Chronic ...

  14. High potassium level

    Science.gov (United States)

    ... level is very high, or if you have danger signs, such as changes in an ECG. Emergency ... Seifter JL. Potassium disorders. In: Goldman L, Schafer AI, eds. Goldman-Cecil Medicine. 25th ed. Philadelphia, PA: ...

  15. Responsive design high performance

    CERN Document Server

    Els, Dewald

    2015-01-01

    This book is ideal for developers who have experience in developing websites or possess minor knowledge of how responsive websites work. No experience of high-level website development or performance tweaking is required.

  16. High Blood Pressure

    Science.gov (United States)

    ... factors Diabetes High blood pressure Family history Obesity Race/ethnicity Full list of causes and risk factors ...

  17. High energy astrophysics

    International Nuclear Information System (INIS)

    Engel, A.R.

    1979-01-01

    High energy astrophysical research carried out at the Blackett Laboratory, Imperial College, London is reviewed. Work considered includes cosmic ray particle detection, x-ray astronomy, gamma-ray astronomy, gamma and x-ray bursts. (U.K.)

  18. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available

  19. High-Definition Medicine.

    Science.gov (United States)

    Torkamani, Ali; Andersen, Kristian G; Steinhubl, Steven R; Topol, Eric J

    2017-08-24

    The foundation for a new era of data-driven medicine has been set by recent technological advances that enable the assessment and management of human health at an unprecedented level of resolution: what we refer to as high-definition medicine. Our ability to assess human health in high definition is enabled, in part, by advances in DNA sequencing, physiological and environmental monitoring, advanced imaging, and behavioral tracking. Our ability to understand and act upon these observations at equally high precision is driven by advances in genome editing, cellular reprogramming, tissue engineering, and information technologies, especially artificial intelligence. In this review, we will examine the core disciplines that enable high-definition medicine and project how these technologies will alter the future of medicine. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... for Association Events Messaging Tools Recruiting Advocates Local Market Planning Training Webinars News & Events Advocacy News Call ... Care > Blood Glucose Testing Share: Print Page Text Size: A A A Listen En Español Hyperglycemia (High ...

  1. High Velocity Gas Gun

    Science.gov (United States)

    1988-01-01

    A video tape related to orbital debris research is presented. The video tape covers the process of loading a High Velocity Gas Gun and firing it into a mounted metal plate. The process is then repeated in slow motion.

  2. High resolution solar observations

    International Nuclear Information System (INIS)

    Title, A.

    1985-01-01

    Currently there is a world-wide effort to develop optical technology required for large diffraction limited telescopes that must operate with high optical fluxes. These developments can be used to significantly improve high resolution solar telescopes both on the ground and in space. When looking at the problem of high resolution observations it is essential to keep in mind that a diffraction limited telescope is an interferometer. Even a 30 cm aperture telescope, which is small for high resolution observations, is a big interferometer. Meter class and above diffraction limited telescopes can be expected to be very unforgiving of inattention to details. Unfortunately, even when an earth based telescope has perfect optics there are still problems with the quality of its optical path. The optical path includes not only the interior of the telescope, but also the immediate interface between the telescope and the atmosphere, and finally the atmosphere itself

  3. Landforms of High Mountains

    Directory of Open Access Journals (Sweden)

    Derek A. McDougall

    2016-05-01

    Full Text Available Reviewed: Landforms of High Mountains. By Alexander Stahr and Ewald Langenscheidt. Heidelberg, Germany: Springer, 2015. viii + 158 pp. US$ 129.99. Also available as an e-book. ISBN 978-3-642-53714-1.

  4. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... and Learning About Prediabetes Type 2 Diabetes Risk Test Lower Your Risk Healthy Eating Overweight Smoking High ... You at Risk? Diagnosis Lower Your Risk Risk Test Alert Day Prediabetes My Health Advisor Tools to ...

  5. Highly stretchable carbon aerogels.

    Science.gov (United States)

    Guo, Fan; Jiang, Yanqiu; Xu, Zhen; Xiao, Youhua; Fang, Bo; Liu, Yingjun; Gao, Weiwei; Zhao, Pei; Wang, Hongtao; Gao, Chao

    2018-02-28

    Carbon aerogels demonstrate wide applications for their ultralow density, rich porosity, and multifunctionalities. Their compressive elasticity has been achieved by different carbons. However, reversibly high stretchability of neat carbon aerogels is still a great challenge owing to their extremely dilute brittle interconnections and poorly ductile cells. Here we report highly stretchable neat carbon aerogels with a retractable 200% elongation through hierarchical synergistic assembly. The hierarchical buckled structures and synergistic reinforcement between graphene and carbon nanotubes enable a temperature-invariable, recoverable stretching elasticity with small energy dissipation (~0.1, 100% strain) and high fatigue resistance of more than 10⁶ cycles. The ultralight carbon aerogels with both stretchability and compressibility were designed as strain sensors for logic identification of sophisticated shape conversions. Our methodology paves the way to highly stretchable carbon and neat inorganic materials with extensive applications in aerospace, smart robots, and wearable devices.

  6. High Performance Macromolecular Material

    National Research Council Canada - National Science Library

    Forest, M

    2002-01-01

    .... In essence, most commercial high-performance polymers are processed through fiber spinning, following Nature and spider silk, which is still pound-for-pound the toughest liquid crystalline polymer...

  7. Physics and high technology

    International Nuclear Information System (INIS)

    Shao Liqin; Ma Junru.

    1992-01-01

    At present, the development of high technology has opened a new chapter in the world's history of science and technology. This review describes the great impact of physics on high technology in six different fields (energy technology, new materials, information technology, biotechnology, space technology, and ocean technology). It is shown that the new concepts and methods created in physics, and the special conditions and measurements established for physics research, not only deepen our knowledge of nature but also point out new directions for engineering and technology. The achievements of physics have been applied more and more to high technology, while the development of high technology has opened up new research areas and raised many novel, important problems for physics. It is therefore important to strengthen research on these major problems in physics

  8. High temperature battery. Hochtemperaturbatterie

    Energy Technology Data Exchange (ETDEWEB)

    Bulling, M.

    1992-06-04

    To prevent heat losses of a high temperature battery, it is proposed to design the incoming current leads, in the area where they penetrate the double-walled insulating housing, as thermal throttles, in particular spiral ones.

  9. High Energy Materials

    Indian Academy of Sciences (India)

    IAS Admin

    Propellants used in rockets, pyrotechnics used in festivities, explosives used for .... In World War II, Wernher von Braun designed the. V-2 rockets which were ... A. Solid Propellants. A solid propellant is made from low or diluted high explosives.

  10. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... your blood and then treating high blood glucose early will help you avoid problems associated with hyperglycemia. ... to detect hyperglycemia so you can treat it early — before it gets worse. If you're new ...

  11. High Plains Aquifer

    Data.gov (United States)

    Kansas Data Access and Support Center — These digital maps contain information on the altitude of the base, the extent, and the 1991 potentiometric surface (i.e. altitude of the water table) of the High...

  12. High Resolution Elevation Contours

    Data.gov (United States)

    Minnesota Department of Natural Resources — This dataset contains contours generated from high resolution data sources such as LiDAR. Generally speaking this data is 2 foot or less contour interval.

  13. Berkeley High-Resolution Ball

    International Nuclear Information System (INIS)

    Diamond, R.M.

    1984-10-01

    Criteria for a high-resolution γ-ray system are discussed. Desirable properties are high resolution, good response function, and moderate solid angle so as to achieve not only double- but triple-coincidences with good statistics. The Berkeley High-Resolution Ball involved the first use of bismuth germanate (BGO) as an anti-Compton shield for Ge detectors. The resulting compact shield permitted rather close packing of 21 detectors around a target. In addition, a small central BGO ball gives the total γ-ray energy and multiplicity, as well as the angular pattern of the γ rays. The 21-detector array is nearly complete, and the central ball has been designed, but not yet constructed. First results taken with 9 detector modules are shown for the nucleus ¹⁵⁶Er. The complex decay scheme indicates a transition from collective rotation (prolate shape) to single-particle states (possibly oblate) near spin 30ℏ, and has other interesting features

  14. HIGH-ALTITUDE ILLNESS

    Directory of Open Access Journals (Sweden)

    Dwitya Elvira

    2015-05-01

    Full Text Available High-altitude illness (HAI) is a group of pulmonary and cerebral symptoms that occur in people ascending to high altitude for the first time. HAI comprises acute mountain sickness (AMS), high-altitude cerebral edema (HACE) and high-altitude pulmonary edema (HAPE). The objective of this review is to help physicians and travelers understand the risks, signs, symptoms, and treatment of high-altitude illness. The condition has drawn increasing attention with the rising popularity of extreme sports (high-mountain climbing, skiing and snowboarding) and the ease and availability of modern travel, which expose millions of people to the dangers of HAI. In Pherice, Nepal (altitude 4343 m), 43% of climbers experienced symptoms of AMS. In a study conducted at Colorado ski resorts, Honigman described an AMS incidence of 22% at altitudes of 1850 m to 2750 m, while Dean reported symptoms in 42% of visitors at 3000 m. Acclimatization is one preventive measure that can be taken before an ascent, alongside medications such as acetazolamide, dexamethasone, phosphodiesterase inhibitors, and ginkgo biloba. Keywords: high-altitude illness, acute mountain sickness, cerebral edema, pulmonary edema

  15. High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Traian Oneţ

    2009-01-01

    Full Text Available The paper presents the latest studies and research accomplished in Cluj-Napoca related to high performance concrete, high strength concrete and self-compacting concrete. The purpose of this paper is to point out the advantages and disadvantages of using each particular concrete type. Two concrete recipes are presented, one for concrete used in rigid road pavements and another for self-compacting concrete.

  16. Clustering at high redshifts

    International Nuclear Information System (INIS)

    Shaver, P.A.

    1986-01-01

    Evidence for clustering of and with high-redshift QSOs is discussed. QSOs of different redshifts show no clustering, but QSOs of similar redshifts appear to be clustered on a scale comparable to that of galaxies at the present epoch. In addition, spectroscopic studies of close pairs of QSOs indicate that QSOs are surrounded by a relatively high density of absorbing matter, possibly clusters of galaxies

  17. ANL high resolution injector

    International Nuclear Information System (INIS)

    Minehara, E.; Kutschera, W.; Hartog, P.D.; Billquist, P.

    1985-01-01

    The ANL (Argonne National Laboratory) high-resolution injector has been installed to obtain higher mass resolution and higher preacceleration, and to utilize effectively the full mass range of ATLAS (Argonne Tandem Linac Accelerator System). Preliminary results of the first beam test are reported briefly. The design and performance, in particular a high-mass-resolution magnet with aberration compensation, are discussed. 7 refs., 5 figs., 2 tabs

  18. High voltage engineering fundamentals

    CERN Document Server

    Kuffel, E; Hammond, P

    1984-01-01

    Provides a comprehensive treatment of high voltage engineering fundamentals at the introductory and intermediate levels. It covers: techniques used for generation and measurement of high direct, alternating and surge voltages for general application in industrial testing and selected special examples found in basic research; analytical and numerical calculation of electrostatic fields in simple practical insulation system; basic ionisation and decay processes in gases and breakdown mechanisms of gaseous, liquid and solid dielectrics; partial discharges and modern discharge detectors; and over

  19. High performance polymeric foams

    International Nuclear Information System (INIS)

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-01-01

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylenenaphtalate). Two different methods have been used to prepare the foam samples: high temperature expansion and two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed by using scanning electron microscopy

  20. Converting high boiling hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Terrisse, H; DuFour, L

    1929-02-12

    A process is given for converting high boiling hydrocarbons into low boiling hydrocarbons, characterized in that the high boiling hydrocarbons are heated to 200 to 500°C in the presence of ferrous chloride and of such gases as hydrogen, water gas, and the like under a pressure of from 5 to 40 kilograms per square centimeter. Desulfurization of the hydrocarbons occurs simultaneously.

  1. High-pressure tritium

    International Nuclear Information System (INIS)

    Coffin, D.O.

    1976-01-01

    Some solutions to problems of compressing and containing tritium gas to 200 MPa at 700 K are discussed. The principal emphasis is on commercial compressors and high-pressure equipment that can be easily modified by the researcher for safe use with tritium. Experience with metal bellows and diaphragm compressors has been favorable. Selection of materials, fittings, and gauges for high-pressure tritium work is also reviewed briefly

  2. High energy positron imaging

    International Nuclear Information System (INIS)

    Chen Shengzu

    2003-01-01

    The technique of High Energy Positron Imaging (HEPI) is a new development and extension of Positron Emission Tomography (PET). It consists of High Energy Collimation Imaging (HECI), Dual Head Coincidence Detection Imaging (DHCDI) and Positron Emission Tomography (PET). We describe the history of the development and the basic principles of the HEPI imaging methods in detail in this paper. Finally, the new technique of image fusion, which combines the anatomical and functional images, is also introduced briefly

  3. High speed heterostructure devices

    CERN Document Server

    Beer, Albert C; Willardson, R K; Kiehl, Richard A; Sollner, T C L Gerhard

    1994-01-01

    Volume 41 includes an in-depth review of the most important, high-speed switches made with heterojunction technology. This volume is aimed at the graduate student or working researcher who needs a broad overview and an introduction to current literature. Key Features * The first complete review of InP-based HFETs and complementary HFETs, which promise very low power and high speed * Offers a complete, three-chapter review of resonant tunneling * Provides an emphasis on circuits as well as devices.

  4. High Burnup Effects Program

    International Nuclear Information System (INIS)

    Barner, J.O.; Cunningham, M.E.; Freshley, M.D.; Lanning, D.D.

    1990-04-01

    This is the final report of the High Burnup Effects Program (HBEP). It has been prepared to present a summary, with conclusions, of the HBEP. The HBEP was an international, group-sponsored research program managed by Battelle, Pacific Northwest Laboratories (BNW). The principal objective of the HBEP was to obtain well-characterized data related to fission gas release (FGR) for light water reactor (LWR) fuel irradiated to high burnup levels. The HBEP was organized into three tasks as follows: Task 1 -- high burnup effects evaluations; Task 2 -- fission gas sampling; and Task 3 -- parameter effects study. During the course of the HBEP, a program that extended over 10 years, 82 fuel rods from a variety of sources were characterized, irradiated, and then examined in detail after irradiation. The study of fission gas release at high burnup levels was the principal objective of the program and it may be concluded that no significant enhancement of fission gas release at high burnup levels was observed for the examined rods. The rim effect, an as yet unquantified contributor to athermal fission gas release, was concluded to be the one truly high-burnup effect. Though burnup enhancement of fission gas release was observed to be low, a full understanding of the rim region and rim effect has not yet emerged and this may be a potential area of further research. 25 refs., 23 figs., 4 tabs

  5. Kilburn High Road Revisited

    Directory of Open Access Journals (Sweden)

    Cristina Capineri

    2016-07-01

    Full Text Available Drawing on John Agnew's (1987) theoretical framework for the analysis of place (location, locale and sense of place) and on Doreen Massey's (1991) interpretation of Kilburn High Road (London), the contribution develops an analysis of the notion of place in the case study of Kilburn High Road by comparing the semantics emerging from Doreen Massey's interpretation of Kilburn High Road in the late Nineties with those from a selection of noisy and unstructured volunteered geographic information collected from Flickr photos and Tweets harvested in 2014–2015. The comparison shows how sense of place is dynamic and changing over time and explores Kilburn High Road through the categories of location, locale and sense of place derived from the qualitative analysis of VGI content and annotations. The contribution shows how VGI can contribute to discovering the unique relationship between people and place, which takes the form given by Doreen Massey to Kilburn High Road and then moves on to the many forms given by people experiencing Kilburn High Road through a photo, a Tweet or a simple narrative. Finally, the paper suggests that the analysis of VGI content can contribute to detecting the relevant features of street life, from infrastructure to citizens' perceptions, which should be taken into account for a more human-centered approach in planning or service management.

  6. High Gradient Accelerator Research

    International Nuclear Information System (INIS)

    Temkin, Richard

    2016-01-01

    The goal of the MIT program of research on high gradient acceleration is the development of advanced acceleration concepts that lead to a practical and affordable next generation linear collider at the TeV energy level. Other applications, which are more near-term, include accelerators for materials processing; medicine; defense; mining; security; and inspection. The specific goals of the MIT program are: • Pioneering theoretical research on advanced structures for high gradient acceleration, including photonic structures and metamaterial structures; evaluation of the wakefields in these advanced structures • Experimental research to demonstrate the properties of advanced structures both in low-power microwave cold test and high-power, high-gradient test at megawatt power levels • Experimental research on microwave breakdown at high gradient including studies of breakdown phenomena induced by RF electric fields and RF magnetic fields; development of new diagnostics of the breakdown process • Theoretical research on the physics and engineering features of RF vacuum breakdown • Maintaining and improving the Haimson / MIT 17 GHz accelerator, the highest frequency operational accelerator in the world, a unique facility for accelerator research • Providing the Haimson / MIT 17 GHz accelerator facility as a facility for outside users • Active participation in the US DOE program of High Gradient Collaboration, including joint work with SLAC and with Los Alamos National Laboratory; participation of MIT students in research at the national laboratories • Training the next generation of Ph.D. students in the field of accelerator physics.

  7. High Caloric Diet for ALS Patients: High Fat, High Carbohydrate or High Protein

    Directory of Open Access Journals (Sweden)

    Sarvin Sanaie

    2015-01-01

    Full Text Available ALS is a fatal motor neurodegenerative disease characterized by muscle atrophy and weakness, dysarthria, and dysphagia. The mean survival of ALS patients is three to five years, with 50% of those diagnosed dying within three years of onset (1). A multidisciplinary approach is crucial to set an appropriate plan for metabolic and nutritional support in ALS. Nutritional management incorporates a continuous assessment and implementation of dietary modifications throughout the duration of the disease. The nutritional and metabolic approaches to ALS should start when the diagnosis of ALS is made and should become an integral part of the continuous care of the patient, including nutritional surveillance, dietary counseling, management of dysphagia, and enteral nutrition when needed. Malnutrition and lean body mass loss are frequent findings in ALS patients, necessitating comprehensive energy requirement assessment for these patients. Malnutrition is an independent prognostic factor for survival in ALS, with a 7.7-fold increase in risk of death, and is estimated to develop in one quarter to half of people with ALS (2). Adequate calorie and protein provision would diminish muscle loss in this vulnerable group of patients. Although the appropriate amount of energy to be administered is yet to be established, a high calorie diet is expected to be effective for potential improvement of survival; ALS patients do not normally receive an adequate intake of energy. A growing number of clinicians suspect that a high calorie diet implemented early in the disease may help people with ALS meet their increased energy needs and extend their survival. Certain high calorie supplements appear to be safe and well tolerated by people with ALS according to studies led by Universitätsklinikum Ulm, and appear to stabilize body weight within 3 months. In a recent study by Wills et al., intake of high-carbohydrate low-fat supplements has been recommended in ALS patients (3).

  8. Modeling High-Dimensional Multichannel Brain Signals

    KAUST Repository

    Hu, Lechuan; Fortin, Norbert J.; Ombao, Hernando

    2017-01-01

    aspects: first, there are major statistical and computational challenges for modeling and analyzing high-dimensional multichannel brain signals; second, there is no set of universally agreed measures for characterizing connectivity. To model multichannel

  9. High availability using virtualization

    International Nuclear Information System (INIS)

    Calzolari, Federico; Arezzini, Silvia; Ciampa, Alberto; Mazzoni, Enrico; Domenici, Andrea; Vaglini, Gigliola

    2010-01-01

    High availability has always been one of the main problems for a data center. Until now, high availability was achieved by host-per-host redundancy, a highly expensive method in terms of hardware and human costs. A new approach to the problem is offered by virtualization. Using virtualization, it is possible to achieve a redundancy system for all the services running on a data center. This new approach to high availability allows the running virtual machines to be distributed over a small number of servers, by exploiting the features of the virtualization layer: starting, stopping and moving virtual machines between physical hosts. The 3RC system is based on a finite state machine, providing the possibility to restart each virtual machine on any physical host, or reinstall it from scratch. A complete infrastructure has been developed to install operating system and middleware in a few minutes. To virtualize the main servers of a data center, a new procedure has been developed to migrate physical to virtual hosts. The whole Grid data center SNS-PISA is currently running in a virtual environment under the high availability system.
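
A restart-or-reinstall policy of the kind described can be sketched as a small finite state machine. This is an illustrative toy, not the actual 3RC implementation; the state and event names are assumptions:

```python
# Hypothetical 3RC-style recovery automaton: on host failure a VM is
# restarted on any available host; if the restart fails it is
# reinstalled from scratch.
TRANSITIONS = {
    ("running", "host_failed"): "restarting",
    ("restarting", "restart_ok"): "running",
    ("restarting", "restart_failed"): "reinstalling",
    ("reinstalling", "install_ok"): "running",
}

class VirtualMachine:
    def __init__(self, name):
        self.name = name
        self.state = "running"

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

vm = VirtualMachine("grid-ce-01")   # hypothetical VM name
vm.handle("host_failed")            # -> restarting
vm.handle("restart_failed")         # -> reinstalling
print(vm.handle("install_ok"))      # -> running
```

In a real deployment each transition would trigger an action on the virtualization layer (start, stop, or migrate a VM); here the table only records the resulting state.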

  10. High temperature storage loop :

    Energy Technology Data Exchange (ETDEWEB)

    Gill, David Dennis; Kolb, William J.

    2013-07-01

    A three year plan for thermal energy storage (TES) research was created at Sandia National Laboratories in the spring of 2012. This plan included a strategic goal of providing test capability for Sandia and for the nation in which to evaluate high temperature storage (>650°C) technology. The plan was to scope, design, and build a flow loop that would be compatible with a multitude of high temperature heat transfer/storage fluids. The High Temperature Storage Loop (HTSL) would be reconfigurable so that it was useful not only for storage testing, but also for high temperature receiver testing and high efficiency power cycle testing as well. In that way, HTSL was part of a much larger strategy for Sandia to provide a research and testing platform that would be integral for the evaluation of individual technologies funded under the SunShot program. DOE's SunShot program seeks to reduce the price of solar technologies to 6¢/kWhr to be cost competitive with carbon-based fuels. The HTSL project sought to provide evaluation capability for these SunShot supported technologies. This report includes the scoping, design, and budgetary costing aspects of this effort.

  11. Cryogenic high current discharges

    International Nuclear Information System (INIS)

    Meierovich, B.E.

    1994-01-01

    Z-pinches formed from frozen deuterium fibers by a rapidly rising current have enhanced stability and high neutron yield. Efforts to understand this enhanced stability and neutron yield on the basis of the classical picture of Bennett equilibrium of the current channel have not given satisfactory results. The traditional approach does not take into account the essential difference between frozen deuterium fiber Z-pinches and the usual Z-pinches such as exploding wires or classical gas-puffed Z-pinches. The very low temperature of the fiber atoms (10 K), together with the rapidly rising current, results in the coexistence of a high current channel with un-ionized fiber atoms for a substantial period of time; this coexistence persists throughout the current risetime. The present approach takes into account the difference between breakdown in a dielectric deuterium fiber and breakdown in a metallic wire. This difference is essential to the understanding of specific features of cryogenic high current discharges. Z-pinches in frozen deuterium fibers should be considered a qualitatively new phenomenon on the boundary of cryogenic and high current physics. It marks the start of a new branch of plasma physics: the physics of cryogenic high current discharges

  12. High Altitude and Heart

    Directory of Open Access Journals (Sweden)

    Murat Yalcin

    2011-04-01

    Full Text Available Nowadays, situations associated with high altitude, such as mountaineering and aviation, increasingly draw people's attention. Gas pressure decreases and hypoxia is encountered when climbing higher. The physiological and pathological responses of the human body differ at different heights. It is therefore important to know the physiological and pathological changes that may occur with altitude and their clinical outcomes. Acute mountain sickness caused by high altitude and high-altitude cerebral edema are preventable diseases with appropriate precautions. Atmospheric oxygen, which decreases with height, initiates many adaptive mechanisms. These adaptation mechanisms and acclimatization vary widely among individuals because of factors such as environment, exercise and cold. High altitude causes various changes in the cardiovascular system through different mechanisms. Although normal individuals easily adapt to these changes, they can lead to undesirable results in people with heart disease. For this reason, effective evaluation of people with known heart disease before travel to high altitude, the complications due to altitude-related changes, and the recommendations that can be made to these patients should be known. [TAF Prev Med Bull 2011; 10(2): 211-222]

  13. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... People Working to Stop Diabetes Common Terms Diabetes Statistics Infographics Living with Diabetes Home Recently Diagnosed Where ... Basics Symptoms Type 1 Type 2 Gestational Myths Statistics Common Terms Genetics Living With Diabetes Recently Diagnosed ...

  14. Total mass difference statistics algorithm: a new approach to identification of high-mass building blocks in electrospray ionization Fourier transform ion cyclotron mass spectrometry data of natural organic matter.

    Science.gov (United States)

    Kunenkov, Erast V; Kononikhin, Alexey S; Perminova, Irina V; Hertkorn, Norbert; Gaspar, Andras; Schmitt-Kopplin, Philippe; Popov, Igor A; Garmash, Andrew V; Nikolaev, Evgeniy N

    2009-12-15

    The ultrahigh-resolution Fourier transform ion cyclotron resonance (FTICR) mass spectrum of natural organic matter (NOM) contains several thousand peaks with dozens of molecules matching the same nominal mass. Such complexity poses a significant challenge for automatic data interpretation, in which the most difficult task is molecular formula assignment, especially in the case of heavy and/or multielement ions. In this study, a new universal algorithm for automatic treatment of FTICR mass spectra of NOM and humic substances based on total mass difference statistics (TMDS) has been developed and implemented. The algorithm enables a blind search for unknown building blocks (instead of a priori known ones) by revealing repetitive patterns present in spectra. In this respect, it differs from all previously developed approaches. This algorithm was implemented in the FIRAN software for fully automated analysis of mass data with high peak density. The specific feature of FIRAN is its ability to assign formulas to heavy and/or multielement molecules using a "virtual elements" approach. To verify the approach, it was used for processing mass spectra of sodium polystyrene sulfonate (PSS, M(w) = 2200 Da) and polymethacrylate (PMA, M(w) = 3290 Da), which produce heavy multielement and multiply-charged ions. Application of TMDS identified unambiguously the monomers present in the polymers, consistent with their structure: C(8)H(7)SO(3)Na for PSS and C(4)H(6)O(2) for PMA. It also allowed unambiguous formula assignment to all multiply-charged peaks including the heaviest peak in the PMA spectrum at mass 4025.6625 with charge state 6- (mass bias -0.33 ppm). Application of the TMDS algorithm to processing data on the Suwannee River FA has proven its unique capacities in analysis of spectra with high peak density: it has not only identified the known small building blocks in the structure of FA such as CH(2), H(2), C(2)H(2)O, O, but also the heavier unit at 154.027 amu. The latter was
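
The core idea of mass-difference statistics, histogramming all pairwise peak differences so that recurring differences stand out as candidate building blocks, can be illustrated with a short sketch. This is an illustrative toy, not the FIRAN implementation; the function name, bin width, cutoff and threshold are assumptions:

```python
from collections import Counter
from itertools import combinations

def mass_difference_statistics(peak_masses, bin_width=0.001, min_count=3):
    """Histogram all pairwise mass differences in a peak list.

    Differences that recur far more often than chance correspond to
    candidate building blocks (e.g. CH2 at ~14.016 Da). Returns
    (difference, count) pairs, most frequent first.
    """
    counts = Counter()
    for m1, m2 in combinations(sorted(peak_masses), 2):
        diff = m2 - m1
        if diff > 200.0:  # assumed cutoff: ignore very large gaps
            continue
        counts[round(diff / bin_width) * bin_width] += 1
    return [(d, n) for d, n in counts.most_common() if n >= min_count]

# Toy peak list built from repeated CH2 (~14.016 Da) additions:
peaks = [200.0 + 14.016 * k for k in range(6)]
top_diff, top_count = mass_difference_statistics(peaks)[0]
print(round(top_diff, 3), top_count)  # the CH2 spacing dominates
```

A real spectrum would of course contain thousands of peaks and many overlapping series; the TMDS paper's contribution lies in making this search blind and in handling heavy, multiply-charged ions, which the sketch does not attempt.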

  15. Microstructure in Nuclear Spectra at High Excitation

    International Nuclear Information System (INIS)

    Ericson, T.E.O.

    1969-01-01

    The statistical microstructure of highly excited systems is illustrated by the distribution and fluctuations of levels, widths and cross-sections of nuclei, both for the case of sharp resonances and the continuum case. The coexistence of simple modes of excitation with statistical effects, in terms of strength functions, is illustrated by isobaric analogue states. An analogy is drawn with similar phenomena for coherent light, in solid-state physics and in high-energy physics. (author)

  16. High resolution data acquisition

    Science.gov (United States)

    Thornton, Glenn W.; Fuller, Kenneth R.

    1993-01-01

    A high resolution event interval timing system measures short time intervals such as occur in high energy physics or laser ranging. Timing is provided from a clock (38) pulse train (37) and analog circuitry (44) for generating a triangular wave (46) synchronously with the pulse train (37). The triangular wave (46) has an amplitude and slope functionally related to the time elapsed during each clock pulse in the train. A converter (18, 32) forms a first digital value of the amplitude and slope of the triangle wave at the start of the event interval and a second digital value of the amplitude and slope of the triangle wave at the end of the event interval. A counter (26) counts the clock pulse train (37) during the interval to form a gross event interval time. A computer (52) then combines the gross event interval time and the first and second digital values to output a high resolution value for the event interval.
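
The interpolation scheme described above, a coarse clock-pulse count refined by the digitized amplitude and slope of the synchronous triangular wave at the start and end of the interval, can be sketched numerically. The triangle-wave model, peak amplitude and 10 ns clock period below are illustrative assumptions, not values from the patent:

```python
def fine_phase(amplitude, slope, peak=1.0):
    """Fraction of a clock period elapsed, recovered from a sampled
    (amplitude, slope) pair of the synchronous triangular wave.

    Assumed wave model: rises from -peak to +peak over the first half
    period, then falls back over the second half.
    """
    if slope > 0:
        return (amplitude + peak) / (4.0 * peak)       # 0.0 .. 0.5
    return 0.5 + (peak - amplitude) / (4.0 * peak)     # 0.5 .. 1.0

def event_interval(gross_counts, start_sample, end_sample,
                   clock_period_ns=10.0):
    """Combine the coarse pulse count with the two fine interpolations."""
    frac = fine_phase(*end_sample) - fine_phase(*start_sample)
    return (gross_counts + frac) * clock_period_ns

# 7 whole clock periods plus half a period of a 10 ns clock:
print(event_interval(7, (0.0, +1), (0.0, -1)))  # -> 75.0 (ns)
```

The resolution of such a scheme is set by the ADC that digitizes the triangle wave rather than by the clock frequency, which is the point of the interpolation.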

  17. High energy radiation detector

    International Nuclear Information System (INIS)

    Vosburgh, K.G.

    1975-01-01

    The high energy radiation detector described comprises a set of closely spaced wedge reflectors. Each wedge reflector is composed of three sides forming identical isosceles triangles with a common apex and an open base forming an equilateral triangle. The length of one side of the base is less than the thickness of the coat of material sensitive to high energy radiation. The wedge reflectors reflect the light photons spreading to the rear of the coat in such a way that each reflected track is parallel to the incident track of the rearward-spreading light photon. The angle of the three isosceles triangles with a common apex is between 85 and 95 deg. The first main surface of the coat of high energy radiation sensitive material is in contact with the projecting edges of the surface of the wedge reflectors of the reflecting element

  18. Theoretical high energy physics

    International Nuclear Information System (INIS)

    Lee, T.D.

    1993-01-01

    Brief reports are given on the work of several professors. The following areas are included: quantum chromodynamics calculations using numerical lattice gauge theory and a high-speed parallel computer; the ''spin wave'' description of bosonic particles moving on a lattice with same-site exclusion; a high-temperature expansion to 13th order for the O(4)-symmetric φ⁴ model on a four-dimensional F₄ lattice; spin waves and lattice bosons; superconductivity of C₆₀; meson-meson interferometry in heavy-ion collisions; baryon number violation in the Standard Model in high-energy collisions; hard thermal loops in QCD; electromagnetic interactions of anyons; the relation between Bose-Einstein and BCS condensations; Euclidean wormholes with topology S¹ × S² × R; vacuum decay and symmetry breaking by radiative corrections; inflationary solutions to the cosmological horizon and flatness problems; and magnetically charged black holes

  19. High-tech entrepreneurship

    DEFF Research Database (Denmark)

    Bernasconi, Michel; Harris, Simon; Mønsted, Mette

    High-tech businesses form a crucial part of entrepreneurial activity - in some ways representing very typical examples of entrepreneurship, yet in some ways representing quite different challenges. The uncertainty in innovation and advanced technology makes it difficult to use conventional economic planning models, and also means that the management skills used in this area must be more responsive to issues of risk, uncertainty and evaluation than in conventional business opportunities. Whilst entrepreneurial courses do reflect the importance of high-tech businesses, they often lack the resources...; entrepreneurial finance; marketing technological innovations; and high-tech incubation management. Including case studies to give practical insights into genuine business examples, this comprehensive book has a distinctly 'real-world' focus throughout. Edited by a multi-national team, this comprehensive book...

  20. High brightness electron accelerator

    International Nuclear Information System (INIS)

    Sheffield, R.L.; Carlsten, B.E.; Young, L.M.

    1994-01-01

    A compact high brightness linear accelerator is provided for use, e.g., in a free electron laser. The accelerator has a first plurality of accelerating cavities having end walls with four coupling slots for accelerating electrons to high velocities in the absence of quadrupole fields. A second plurality of cavities receives the high velocity electrons for further acceleration, where each of the second cavities has end walls with two coupling slots for acceleration in the absence of dipole fields. The accelerator also includes a first cavity with an extended length to provide for phase matching the electron beam along the accelerating cavities. A solenoid is provided about the photocathode that emits the electrons, where the solenoid is configured to provide a substantially uniform magnetic field over the photocathode surface to minimize emittance of the electrons as the electrons enter the first cavity. 5 figs

  1. High-energy detector

    Science.gov (United States)

    Bolotnikov, Aleksey E [South Setauket, NY; Camarda, Giuseppe [Farmingville, NY; Cui, Yonggang [Upton, NY; James, Ralph B [Ridge, NY

    2011-11-22

    The preferred embodiments are directed to a high-energy detector that is electrically shielded using an anode, a cathode, and a conducting shield to substantially reduce or eliminate electrically unshielded area. The anode and the cathode are disposed at opposite ends of the detector and the conducting shield substantially surrounds at least a portion of the longitudinal surface of the detector. The conducting shield extends longitudinally to the anode end of the detector and substantially surrounds at least a portion of the detector. Signals read from one or more of the anode, cathode, and conducting shield can be used to determine the number of electrons that are liberated as high-energy particles impinge on the detector. A correction technique can be implemented to correct for liberated electrons that become trapped, improving the energy resolution of the high-energy detectors disclosed herein.

  2. Timetabling at High Schools

    DEFF Research Database (Denmark)

    Sørensen, Matias

    High school institutions face a number of important planning problems during each school year. This Ph.D. thesis considers two of these planning problems: the High School Timetabling Problem (HSTP) and the Consultation Timetabling Problem (CTP). Furthermore a framework for handling various planning... based on the publicly available XHSTT format for modeling instances and solutions of the HSTP, and the Danish High School Timetabling Problem (DHSTP). For both problems a complex Mixed-Integer Programming (MIP) model is developed, and in both cases empirical tests are performed on a large number of real-life datasets... The second part contains the main scientific papers composed during the Ph.D. study. The third part of the thesis also contains scientific papers, but these are included as an appendix. In the HSTP, the goal is to obtain a timetable for the forthcoming school year. A timetable consists of lectures scheduled...
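
    At its core, timetabling of this kind assigns lectures to time slots subject to conflict constraints. The MIP models in the thesis are far richer, but a toy exhaustive search (hypothetical lecture names and slots) conveys the structure:

    ```python
    from itertools import product

    def timetable(lectures, slots, conflicts):
        """Return an assignment lecture -> slot in which no conflicting
        pair (e.g. same class or same teacher) shares a slot, or None if
        infeasible. Exhaustive search standing in for a MIP solver."""
        for combo in product(slots, repeat=len(lectures)):
            assign = dict(zip(lectures, combo))
            if all(assign[a] != assign[b] for a, b in conflicts):
                return assign
        return None

    plan = timetable(["math", "physics", "gym"], [1, 2],
                     conflicts=[("math", "physics")])
    ```

    Real instances add soft objectives (teacher preferences, idle-time penalties) on top of these hard constraints, which is where MIP formulations and heuristics earn their keep.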

  3. High power microwaves

    CERN Document Server

    Benford, James; Schamiloglu, Edl

    2016-01-01

    Following in the footsteps of its popular predecessors, High Power Microwaves, Third Edition continues to provide a wide-angle, integrated view of the field of high power microwaves (HPMs). This third edition includes significant updates in every chapter as well as a new chapter on beamless systems that covers nonlinear transmission lines. Written by an experimentalist, a theorist, and an applied theorist, respectively, the book offers complementary perspectives on different source types. The authors address: * How HPM relates historically and technically to the conventional microwave field * The possible applications for HPM and the key criteria that HPM devices have to meet in order to be applied * How high power sources work, including their performance capabilities and limitations * The broad fundamental issues to be addressed in the future for a wide variety of source types The book is accessible to several audiences. Researchers currently in the field can widen their understanding of HPM. Present or pot...

  4. High frequency energy measurements

    International Nuclear Information System (INIS)

    Stotlar, S.C.

    1981-01-01

    High-frequency (> 100 MHz) energy measurements present special problems to the experimenter. Environment or available electronics often limit the applicability of a given detector type. The physical properties of many detectors are frequency dependent and, in some cases, the physical effect employed can be frequency dependent. State-of-the-art measurements generally involve a detection scheme in association with high-speed electronics and a method of data recording. Events can be single-shot or repetitive, requiring real-time, sampling, or digitizing data recording. Potential modification of the pulse by the detector and the associated electronics should not be overlooked. This presentation will review typical applications, methods of choosing a detector, and high-speed detectors. Special considerations and limitations of some applications and devices will be described

  5. Clustering high dimensional data

    DEFF Research Database (Denmark)

    Assent, Ira

    2012-01-01

    High-dimensional data, i.e., data described by a large number of attributes, pose specific challenges to clustering. The so-called ‘curse of dimensionality’, coined originally to describe the general increase in complexity of various computational problems as dimensionality increases, is known...... to render traditional clustering algorithms ineffective. The curse of dimensionality, among other effects, means that with increasing number of dimensions, a loss of meaningful differentiation between similar and dissimilar objects is observed. As high-dimensional objects appear almost alike, new approaches...... for clustering are required. Consequently, recent research has focused on developing techniques and clustering algorithms specifically for high-dimensional data. Still, open research issues remain. Clustering is a data mining task devoted to the automatic grouping of data based on mutual similarity. Each cluster...
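
    The loss of distance contrast can be demonstrated directly: for points drawn uniformly at random, the relative gap between the nearest and farthest point shrinks as dimensionality grows (a self-contained illustration; the point count and seed are arbitrary choices):

    ```python
    import math
    import random

    def distance_contrast(dim, n_points=50, seed=1):
        """Relative spread (d_max - d_min) / d_min of distances from a
        random query point to n_points random points in [0, 1]^dim."""
        rng = random.Random(seed)
        pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
        query = [rng.random() for _ in range(dim)]
        dists = [math.dist(p, query) for p in pts]
        return (max(dists) - min(dists)) / min(dists)

    low_dim = distance_contrast(2)     # ample contrast in 2 dimensions
    high_dim = distance_contrast(500)  # distances concentrate in 500-D
    ```

    With near-identical distances to every point, nearest-neighbor and density arguments underlying traditional clustering lose their discriminative power, which motivates the subspace and projected clustering techniques mentioned above.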

  6. High-Tc superconductor applications

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    There has been much speculation about new products and business opportunities which high-Tc superconductors might make possible. However, with the exception of one Japanese survey, there have not been any recognized forecasts suggesting a timeframe and relative economic impact for proposed high-Tc products. The purpose of this survey is to provide definitive projections of the timetable for high-Tc product development, based on the combined forecasts of the leading U.S. superconductivity experts. The FTS panel of experts on high-Tc superconductor applications, representing both business and research, forecast the commercialization and economic impact for 28 classes of electronic, magnetic, communications, instrumentation, transportation, industrial, and power generation products. In most cases, the forecasts predict the occurrence of developments within a 90% statistical confidence limit of 2-to-3 years. The report provides background information on the 28 application areas, as well as other information useful for strategic planners. The panel also forecast high-Tc research spending, markets, and international competitiveness, and provided insight into how the industry will evolve

  7. High strength alloys

    Science.gov (United States)

    Maziasz, Phillip James [Oak Ridge, TN; Shingledecker, John Paul [Knoxville, TN; Santella, Michael Leonard [Knoxville, TN; Schneibel, Joachim Hugo [Knoxville, TN; Sikka, Vinod Kumar [Oak Ridge, TN; Vinegar, Harold J [Bellaire, TX; John, Randy Carl [Houston, TX; Kim, Dong Sub [Sugar Land, TX

    2010-08-31

    High strength metal alloys are described herein. At least one composition of a metal alloy includes chromium, nickel, copper, manganese, silicon, niobium, tungsten and iron. System, methods, and heaters that include the high strength metal alloys are described herein. At least one heater system may include a canister at least partially made from material containing at least one of the metal alloys. At least one system for heating a subterranean formation may include a tubular that is at least partially made from a material containing at least one of the metal alloys.

  8. High loading uranium plate

    International Nuclear Information System (INIS)

    Wiencek, T.C.; Domagala, R.F.; Thresh, H.R.

    1990-01-01

    Two embodiments of a high uranium fuel plate are disclosed which contain a meat comprising structured uranium compound confined between a pair of diffusion-bonded ductile metal cladding plates uniformly covering the meat, the meat having a uniform high fuel loading comprising a content of uranium compound greater than about 45 vol.% at a porosity not greater than about 10 vol.%. In a first embodiment, the meat is a plurality of parallel wires of uranium compound. In a second embodiment, the meat is a dispersion compact containing uranium compound. The fuel plates are fabricated by a hot isostatic pressing process

  9. JUNOS High Availability

    CERN Document Server

    Sonderegger, James; Milne, Kieran; Palislamovic, Senad

    2009-01-01

    Whether your network is a complex carrier or just a few machines supporting a small enterprise, JUNOS High Availability will help you build reliable and resilient networks that include Juniper Networks devices. With this book's valuable advice on software upgrades, scalability, remote network monitoring and management, high-availability protocols such as VRRP, and more, you'll have your network uptime at the five, six, or even seven nines -- or 99.99999% of the time. Rather than focus on "greenfield" designs, the authors explain how to intelligently modify multi-vendor networks. You'll learn

  10. High availability IT services

    CERN Document Server

    Critchley, Terry

    2014-01-01

    This book starts with the basic premise that a service comprises the 3Ps: products, processes, and people. Moreover, these entities and their sub-entities interlink to support the services that end users require to run and support a business. This widens the scope of any availability design far beyond hardware and software. It also increases the potential for service failure for reasons beyond just hardware and software: the concept of logical outages. High Availability IT Services details the considerations for designing and running highly available "services" and not just the systems

  11. High intensity hadron accelerators

    International Nuclear Information System (INIS)

    Teng, L.C.

    1989-05-01

    This rapporteur report consists mainly of two parts. Part I is an abridged review of the status of all High Intensity Hadron Accelerator projects in the world, in semi-tabulated form for quick reference and comparison. Part II is a brief discussion of the salient features of the different technologies involved. The discussion is based mainly on my personal experiences and opinions, tempered, I hope, by the discussions I participated in during the various parallel sessions of the workshop. In addition, appended at the end is my evaluation and expression of the merits of high intensity hadron accelerators as research facilities for nuclear and particle physics

  12. Unexpected high plasma cobalamin

    DEFF Research Database (Denmark)

    Arendt, Johan F B; Nexo, Ebba

    2013-01-01

    It is well-established that more than 8% of patients examined for vitamin B12 deficiency unexpectedly have increased plasma levels of the vitamin, but so far there are no guidelines for the clinical interpretation of such findings. In this review, we summarise known associations between high plasma...... cobalamin binding proteins, transcobalamin and haptocorrin. Based on current knowledge, we suggest a strategy for the clinical interpretation of unexpected high plasma cobalamin. Since a number of the associated diseases are critical and life-threatening, the strategy promotes the concept of 'think...

  13. High speed rotary drum

    Energy Technology Data Exchange (ETDEWEB)

    Sagara, H

    1970-03-25

    A high speed rotary drum is disclosed in which the rotor vessel is a double-wall structure comprising an inner wave-shaped pipe inserted coaxially within an outer straight pipe, the object being to provide a strengthened composite light-weight structure. Since force induced axial deformation of the straight pipe and radial deformation of the corrugated pipe are small, the composite effectively resists external forces and, if the waves of the inner pipe are given a sufficient amplitude, the thickness of both pipes may be reduced to lower the overall weight. Thus high angular velocities can be obtained to separate U/sup 235/ from gaseous UF/sub 6/.

  14. Theoretical high energy physics

    International Nuclear Information System (INIS)

    Lee, T.D.

    1991-01-01

    This report discusses theoretical research in high energy physics at Columbia University. Some of the research topics discussed are: quantum chromodynamics with dynamical fermions; lattice gauge theory; scattering of neutrinos by photons; atomic physics constraints on the properties of ultralight-ultraweak gauge bosons; black holes; Chern-Simons physics; S-channel theory of superconductivity; charged boson systems; gluon-gluon interactions; high energy scattering in the presence of instantons; anyon physics; causality constraints on primordial magnetic monopoles; charged black holes with scalar hair; properties of Chern-Simons-Higgs solitons; and the extended inflationary universe

  15. High Pressure Biomass Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, Pradeep K [Georgia Tech Research Corporation, Atlanta, GA (United States)

    2016-07-29

    According to the Billion Ton Report, the U.S. has a large supply of biomass available that can supplement fossil fuels for producing chemicals and transportation fuels. Agricultural waste, forest residue, and energy crops offer potential benefits: renewable feedstock, zero to low CO2 emissions depending on the specific source, and domestic supply availability. Biomass can be converted into chemicals and fuels using one of several approaches: (i) the biological platform, which converts corn into ethanol by depolymerization of cellulose to form sugars followed by fermentation; (ii) low-temperature pyrolysis to obtain bio-oils, which must be treated to reduce oxygen content via HDO (hydrodeoxygenation); and (iii) high-temperature pyrolysis to produce syngas (CO + H2). This last approach consists of producing syngas using the thermal platform, which can be used to produce a variety of chemicals and fuels. The goal of this project was to develop an improved understanding of the gasification of biomass at high-pressure conditions and how various gasification parameters might affect the gasification behavior. Since most downstream applications of syngas conversion (e.g., alcohol synthesis, Fischer-Tropsch synthesis, etc.) involve high-pressure catalytic processes, there is an interest in carrying out the biomass gasification at high pressure, which can potentially reduce the gasifier size and subsequent downstream cleaning processes. It is traditionally accepted that high pressure should increase the gasification rates (kinetic effect). There is also precedence from the coal gasification literature of the 1970s suggesting that high-pressure gasification would be a beneficial route to consider. The traditional approach of using a thermogravimetric analyzer (TGA) or high-pressure thermogravimetric analyzer (PTGA) worked well in understanding the kinetics of coal gasification, which was useful in designing high-pressure coal gasification processes. However

  16. High temperature radioisotope capsule

    International Nuclear Information System (INIS)

    Bradshaw, G.B.

    1976-01-01

    A high temperature radioisotope capsule made up of three concentric cylinders, with the isotope fuel located within the innermost cylinder is described. The innermost cylinder has hemispherical ends and is constructed of a tantalum alloy. The intermediate cylinder is made of a molybdenum alloy and is capable of withstanding the pressure generated by the alpha particle decay of the fuel. The outer cylinder is made of a platinum alloy of high resistance to corrosion. A gas separates the innermost cylinder from the intermediate cylinder and the intermediate cylinder from the outer cylinder

  17. High alumina refractories

    International Nuclear Information System (INIS)

    Simao, L.C.; Lopes, A.B.; Galvao Filho, N.B.; Souza, R.B. de

    1989-01-01

    High alumina refractories with 92 to 96.5% Al 2 O 3 were produced using brown and white fused alumina as aggregates. These refractories present only α-alumina and mullite as crystalline mineralogical phases. Other physical and chemical characteristics are similar to those found in refractories produced in Brazil, Japan and the U.S.A. The most important physical and chemical tests used for the characterization of the raw materials and refractories, complemented by those performed at high temperatures, plus X-ray diffractometry and optical microscopy, are presented, besides the refractory formulation and the main production parameters

  18. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  19. Highly Robust Methods in Data Mining

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2013-01-01

    Roč. 8, č. 1 (2013), s. 9-24 ISSN 1452-4864 Institutional support: RVO:67985807 Keywords : data mining * robust statistics * high-dimensional data * cluster analysis * logistic regression * neural networks Subject RIV: BB - Applied Statistics, Operational Research

  20. High-speed data search

    Science.gov (United States)

    Driscoll, James N.

    1994-01-01

    The high-speed data search system developed for KSC incorporates existing and emerging information retrieval technology to help a user intelligently and rapidly locate information found in large textual databases. This technology includes: natural language input; statistical ranking of retrieved information; an artificial intelligence concept called semantics, where 'surface level' knowledge found in text is used to improve the ranking of retrieved information; and relevance feedback, where user judgements about viewed information are used to automatically modify the search for further information. Semantics and relevance feedback are features of the system which are not available commercially. The system further demonstrates a focus on paragraphs of information to decide relevance, and it can be used (without modification) to intelligently search all kinds of document collections, such as collections of legal documents, medical documents, news stories, patents, and so forth. The purpose of this paper is to demonstrate the usefulness of statistical ranking, our semantic improvement, and relevance feedback.
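
    Statistical ranking and relevance feedback of the kind described can be sketched with TF-IDF weights and a Rocchio-style query update (a generic textbook formulation, not the KSC system; the documents and weighting constants are made up):

    ```python
    import math
    from collections import Counter

    def build_idf(docs):
        """Inverse document frequency for every term in the collection."""
        n = len(docs)
        df = Counter(t for d in docs for t in set(d.lower().split()))
        return {t: math.log(n / df[t]) + 1.0 for t in df}

    def vectorize(text, idf):
        """Sparse TF-IDF vector as a term -> weight dict."""
        tf = Counter(text.lower().split())
        return {t: c * idf.get(t, 0.0) for t, c in tf.items()}

    def score(query_vec, doc_vec):
        """Dot-product relevance score used for statistical ranking."""
        return sum(w * doc_vec.get(t, 0.0) for t, w in query_vec.items())

    def rocchio(query_vec, relevant_vecs, alpha=1.0, beta=0.75):
        """Relevance feedback: move the query toward judged-relevant docs."""
        new = {t: alpha * w for t, w in query_vec.items()}
        for dv in relevant_vecs:
            for t, w in dv.items():
                new[t] = new.get(t, 0.0) + beta * w / len(relevant_vecs)
        return new

    docs = ["launch pad safety procedures",
            "orbiter thermal tile inspection",
            "launch weather criteria"]
    idf = build_idf(docs)
    dvecs = [vectorize(d, idf) for d in docs]
    query = vectorize("launch safety", idf)
    ranked = sorted(range(len(docs)), key=lambda i: -score(query, dvecs[i]))
    fed_back = rocchio(query, [dvecs[ranked[0]]])
    ```

    After feedback, the expanded query also carries weight on terms like "pad" and "procedures" that the user never typed, which is how viewed-document judgements automatically modify the search.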

  1. High School Principals and the High School Journalism Program.

    Science.gov (United States)

    Peterson, Jane W.

    A study asked selected high school principals to respond to statements about the value of high school journalism to the high school student and about the rights and responsibilities of the high school journalist. These responses were then checked against such information as whether or not the high school principal had worked on a high school…

  2. High performance conductometry

    International Nuclear Information System (INIS)

    Saha, B.

    2000-01-01

    Inexpensive but high performance systems have emerged progressively for basic and applied measurements in physical and analytical chemistry on one hand, and for on-line monitoring and leak detection in plants and facilities on the other. Salient features of the developments will be presented with specific examples

  3. High beta tokamaks

    International Nuclear Information System (INIS)

    Dory, R.A.; Berger, D.P.; Charlton, L.A.; Hogan, J.T.; Munro, J.K.; Nelson, D.B.; Peng, Y.K.M.; Sigmar, D.J.; Strickler, D.J.

    1978-01-01

    MHD equilibrium, stability, and transport calculations are made to study the accessibility and behavior of ''high beta'' tokamak plasmas in the range β approximately 5 to 15 percent. For next generation devices, beta values of at least 8 percent appear to be accessible and stable if there is a conducting surface nearby

  4. High temperature niobium alloys

    International Nuclear Information System (INIS)

    Wojcik, C.C.

    1991-01-01

    Niobium alloys are currently being used in various high temperature applications such as rocket propulsion, turbine engines and lighting systems. This paper presents an overview of the various commercial niobium alloys, including basic manufacturing processes, properties and applications. Current activities for new applications include powder metallurgy, coating development and fabrication of advanced porous structures for lithium cooled heat pipes

  5. High School Book Fairs

    Science.gov (United States)

    Fitzgerald, Marianne

    2006-01-01

    Many secondary students have given up the joy of reading. When asked why they don't read for pleasure, students came up with many different reasons, the first being lack of time. High school students are busy with after school jobs, sports, homework, etc. With the growing number of students enrolled in AP classes, not only is there not much time…

  6. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available

  7. Highly oxidized superconductors

    Science.gov (United States)

    Morris, Donald E.

    1994-01-01

    Novel superconducting materials in the form of compounds, structures or phases are formed by performing otherwise known syntheses in a highly oxidizing atmosphere rather than that created by molecular oxygen at atmospheric pressure or below. This leads to the successful synthesis of novel superconducting compounds which are thermodynamically stable at the conditions under which they are formed.

  8. High strength ferritic alloy

    International Nuclear Information System (INIS)

    1977-01-01

    A high strength ferritic steel is specified in which the major alloying elements are chromium and molybdenum, with smaller quantities of niobium, vanadium, silicon, manganese and carbon. The maximum swelling is specified for various irradiation conditions. Rupture strength is also specified. (U.K.)

  9. High temperature interface superconductivity

    International Nuclear Information System (INIS)

    Gozar, A.; Bozovic, I.

    2016-01-01

    Highlight: • This review article covers the topic of high temperature interface superconductivity. • New materials and techniques used for achieving interface superconductivity are discussed. • We emphasize the role played by the differences in structure and electronic properties at the interface with respect to the bulk of the constituents. - Abstract: High-T_c superconductivity at interfaces has a history of more than a couple of decades. In this review we focus our attention on copper-oxide based heterostructures and multi-layers. We first discuss the technique, atomic layer-by-layer molecular beam epitaxy (ALL-MBE) engineering, that enabled High-T_c Interface Superconductivity (HT-IS), and the challenges associated with the realization of high quality interfaces. Then we turn our attention to the experiments which shed light on the structure and properties of interfacial layers, allowing comparison to those of single-phase films and bulk crystals. Both ‘passive’ hetero-structures as well as surface-induced effects by external gating are discussed. We conclude by comparing HT-IS in cuprates and in other classes of materials, especially Fe-based superconductors, and by examining the grand challenges currently lying ahead for the field.

  10. High altitude organic gold

    DEFF Research Database (Denmark)

    Pouliot, Mariève; Pyakurel, Dipesh; Smith-Hall, Carsten

    2018-01-01

    Ethnopharmacological relevance Ophiocordyceps sinensis (Berk.) G.H.Sung, J.M.Sung, Hywel-Jones & Spatafora, a high altitude Himalayan fungus-caterpillar product found in alpine meadows in China, Bhutan, Nepal, and India, has been used in the Traditional Chinese Medicine system for over 2000 years...

  11. High Brightness OLED Lighting

    Energy Technology Data Exchange (ETDEWEB)

    Spindler, Jeffrey [OLEDWorks LLC; Kondakova, Marina [OLEDWorks LLC; Boroson, Michael [OLEDWorks LLC; Hamer, John [OLEDWorks LLC

    2016-05-25

    In this work we describe the technology developments behind our current and future generations of high brightness OLED lighting panels. We have developed white and amber OLEDs with excellent performance based on the stacking approach. Current products achieve 40-60 lm/W, while future developments focus on achieving 80 lm/W or higher.

  12. Knees Lifted High

    Centers for Disease Control (CDC) Podcasts

    The Eagle Books are a series of four books that are brought to life by wise animal characters - Mr. Eagle, Miss Rabbit, and Coyote - who engage Rain That Dances and his young friends in the joy of physical activity, eating healthy foods, and learning from their elders about health and diabetes prevention. Knees Lifted High gives children fun ideas for active outdoor play.

  13. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Hyperglycemia (High Blood Glucose) Hyperglycemia is the technical ...

  14. Theoretical high energy physics

    International Nuclear Information System (INIS)

    Lee, T.D.

    1992-01-01

    This progress report discusses research by Columbia University staff in high energy physics. Some of the topics discussed are as follows: lattice gauge theory; quantum chromodynamics; parity doublets; solitons; baryon number violation; black holes; magnetic monopoles; gluon plasma; Chern-Simons theory; and the inflationary universe

  15. High-Sensitivity Spectrophotometry.

    Science.gov (United States)

    Harris, T. D.

    1982-01-01

    Selected high-sensitivity spectrophotometric methods are examined, and comparisons are made of their relative strengths and weaknesses and the circumstances for which each can best be applied. Methods include long path cells, noise reduction, laser intracavity absorption, thermocouple calorimetry, photoacoustic methods, and thermo-optical methods.…

  16. Investing in High School

    Science.gov (United States)

    Green, Daniel G.

    2012-01-01

    Strapped for cash, a Massachusetts high school creates its own venture capital fund to incentivize teachers to create programs that improve student learning. The result has been higher test scores and higher job satisfaction. One important program is credited with helping close the achievement gap at the school, while others have helped ambitious…

  17. High-fiber foods

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/patientinstructions/000193.htm High-fiber foods ... Read food labels carefully to see how much fiber they have. Choose foods that have higher amounts of fiber, such as ...

  18. High resolution drift chambers

    International Nuclear Information System (INIS)

    Va'vra, J.

    1985-07-01

    High precision drift chambers capable of achieving less than or equal to 50 μm resolutions are discussed. In particular, we compare so called cool and hot gases, various charge collection geometries, several timing techniques and we also discuss some systematic problems. We also present what we would consider an ''ultimate'' design of the vertex chamber. 50 refs., 36 figs., 6 tabs

  19. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ... Hyperglycemia (High Blood Glucose) Hyperglycemia is the technical term ... body can't use insulin properly. What Causes Hyperglycemia? A number of things can cause hyperglycemia: If ...

  20. High luminosity particle colliders

    International Nuclear Information System (INIS)

    Palmer, R.B.; Gallardo, J.C.

    1997-03-01

    The authors consider the high energy physics advantages, disadvantages and luminosity requirements of hadron (pp, p anti-p), lepton (e⁺e⁻, μ⁺μ⁻) and photon-photon colliders. Technical problems in obtaining increased energy in each type of machine are presented. The machines' relative sizes are also discussed

  1. Fascination at high pressures

    International Nuclear Information System (INIS)

    Chidambaram, R.

    1992-01-01

    Research at high pressures has developed into an interdisciplinary area which has important implications for, and applications in, the areas of physics, chemistry, materials sciences, planetary sciences, biology, engineering sciences and technology. The state-of-the-art in this field is reviewed and future directions are indicated. (M.G.B.)

  2. CSTI High Capacity Power

    International Nuclear Information System (INIS)

    Winter, J.M.

    1989-01-01

    The SP-100 program was established in 1983 by DOD, DOE, and NASA as a joint program to develop the technology necessary for space nuclear power systems for military and civil application. During FY-86 and 87, the NASA SP-100 Advanced Technology Program was devised to maintain the momentum of promising technology advancement efforts started during Phase 1 of SP-100 and to strengthen, in key areas, the chances for successful development and growth capability of space nuclear reactor power systems for future space applications. In FY-88, the Advanced Technology Program was incorporated into NASA's new Civil Space Technology Initiative (CSTI). The CSTI Program was established to provide the foundation for technology development in automation and robotics, information, propulsion, and power. The CSTI High Capacity Power Program builds on the technology efforts of the SP-100 program, incorporates the previous NASA SP-100 Advanced Technology project, and provides a bridge to NASA Project Pathfinder. The elements of CSTI High Capacity Power development include Conversion Systems, Thermal Management, Power Management, System Diagnostics, and Environmental Interactions. Technology advancement in all areas, including materials, is required to assure the high reliability and 7 to 10 year lifetime demanded for future space nuclear power systems. The overall program will develop and demonstrate the technology base required to provide a wide range of modular power systems as well as allowing mission independence from solar and orbital attitude requirements. Several recent advancements in CSTI High Capacity power development will be discussed

  3. High Collection Nonimaging Optics

    Science.gov (United States)

    Winston, Roland

    1989-07-01

    Nonimaging optics departs from the methods of traditional optical design to develop instead techniques for maximizing the collecting power of concentrating elements and systems. Designs which exceed the concentration attainable with focusing techniques by factors of four or more and approach the theoretical limit are possible (ideal concentrators). The methodology for designing high collection nonimaging systems is described.

  4. High energy astrophysics

    International Nuclear Information System (INIS)

    Shklorsky, I.S.

    1979-01-01

    A selected list of articles of accessible recent review articles and conference reports, wherein up-to-date summaries of various topics in the field of high energy astrophysics can be found, is presented. A special report outlines work done in the Soviet Union in this area. (Auth.)

  5. High frequency electromagnetic dosimetry

    CERN Document Server

    Sánchez-Hernández, David A

    2009-01-01

    Along with the growth of RF and microwave technology applications, there is mounting concern about the possible adverse effects of electromagnetic radiation on human health. Addressing this issue and putting it into perspective, this groundbreaking resource provides critical details on the latest advances in high frequency electromagnetic dosimetry.

  6. Reshaping High School English.

    Science.gov (United States)

    Pirie, Bruce

    This book takes up the question of what shape high school English studies should take in the coming years. It describes an English program that blends philosophical depth with classroom practicality. Drawing examples from commonly taught texts such as "Macbeth,""To Kill a Mockingbird," and "Lord of the Flies," the…

  7. Nuclei in high forms

    International Nuclear Information System (INIS)

    Szymanski, Z.; Berger, J.F.; Heenen, P.H.; Heyde, K.; Haas, B.; Janssens, R.; Paya, D.; Gogny, D.; Huber, G.; Bjoernholm, S.; Brack, M.

    1991-01-01

    The purpose of the 1991 Joliot-Curie Summer School is to review the most recent advances in the understanding of nuclear physics following the considerable progress in gamma spectroscopy. It covers the following topics: highly and super-deformed nuclei, nuclear structures, the mean-field approach and beyond, fission isomers, long-lived nuclear excitations and metal clusters

  8. General Algorithm (High level)

    Indian Academy of Sciences (India)

    General Algorithm (High level). Iteratively: use the tightness property to remove the points of P1, ..., Pi; use random sampling to get a random sample (of enough points) from the next largest cluster, Pi+1; use the random sampling procedure to approximate ci+1 using the ...
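
    The iterative peeling described in this record can be sketched in code. This is a hypothetical illustration under assumed details: clusters are peeled off largest-first, a uniform random sample is dominated by the largest remaining cluster, its mean approximates that cluster's center, and a fixed radius stands in for the tightness property; the names `approx_centers`, `radius` and `sample_size` are invented for the sketch.

```python
import random

def approx_centers(points, k, radius, sample_size=40, seed=0):
    """Hypothetical sketch: peel off the k largest clusters one at a time.

    A uniform random sample from the remaining points is dominated by the
    largest remaining cluster, so the sample mean approximates that
    cluster's center; `radius` stands in for the tightness property used
    to remove the points of the clusters found so far."""
    rng = random.Random(seed)
    remaining = list(points)
    centers = []
    for _ in range(k):
        if not remaining:
            break
        sample = rng.sample(remaining, min(sample_size, len(remaining)))
        cx = sum(p[0] for p in sample) / len(sample)
        cy = sum(p[1] for p in sample) / len(sample)
        centers.append((cx, cy))
        # Tightness: drop every point close to the approximated center.
        remaining = [p for p in remaining
                     if (p[0] - cx) ** 2 + (p[1] - cy) ** 2 > radius ** 2]
    return centers
```

    With two well-separated clusters of unequal size, the first sample is dominated by the larger cluster, so its points are removed first and the smaller cluster is recovered on the next pass.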

  9. High Temperature Electrolysis

    DEFF Research Database (Denmark)

    Elder, Rachael; Cumming, Denis; Mogensen, Mogens Bjerg

    2015-01-01

    High temperature electrolysis of carbon dioxide, or co-electrolysis of carbon dioxide and steam, has a great potential for carbon dioxide utilisation. A solid oxide electrolysis cell (SOEC), operating between 500 and 900 °C, is used to reduce carbon dioxide to carbon monoxide. If steam is also i...

  10. High on walking

    DEFF Research Database (Denmark)

    Woythal, Bente Martinsen; Haahr, Anita; Dreyer, Pia

    2018-01-01

    a leg, and people who live with Parkinson’s disease. The analysis of the data is inspired by Paul Ricoeur’s philosophy of interpretation. Four themes were identified: (a) I feel high in two ways; (b) Walking has to be automatic; (c) Every Monday, I walk with the girls in the park; and (d) I dream...

  11. High-temperature superconductivity

    International Nuclear Information System (INIS)

    Lynn, J.W.

    1990-01-01

    This book discusses development in oxide materials with high superconducting transition temperature. Systems with Tc well above liquid nitrogen temperature are already a reality and higher Tc's are anticipated. The author discusses how the idea of a room-temperature superconductor appears to be a distinctly possible outcome of materials research

  12. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ...

  13. High brightness ion source

    International Nuclear Information System (INIS)

    Dreyfus, R.W.; Hodgson, R.T.

    1975-01-01

    A high brightness ion beam is obtainable by using lasers to excite atoms or molecules from the ground state to an ionized state in increments, rather than in one step. The spectroscopic resonances of the atom or molecule are used, so that relatively long wavelength, low power lasers can be used to obtain such an ion beam

  14. High energy battery. Hochenergiebatterie

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, H.; Beyermann, G.; Bulling, M.

    1992-03-26

    In a high energy battery with a large number of individual cells in a housing with a cooling medium flowing through it, it is proposed that the cooling medium should be guided so that it only affects one or both sides of the cells thermally.

  15. Ghana's high forests

    NARCIS (Netherlands)

    Oduro, K.A.

    2016-01-01

    Deforestation and forest degradation in the tropics have been receiving both scientific and political attention in recent decades due to its impacts on the environment and on human livelihoods. In Ghana, the continuous decline of forest resources and the high demand for timber have raised

  16. High energy beam cooling

    International Nuclear Information System (INIS)

    Berger, H.; Herr, H.; Linnecar, T.; Millich, A.; Milss, F.; Rubbia, C.; Taylor, C.S.; Meer, S. van der; Zotter, B.

    1980-01-01

    The group concerned itself with the analysis of cooling systems whose purpose is to maintain the quality of the high energy beams in the SPS in spite of gas scattering, RF noise, magnet ripple and beam-beam interactions. Three types of systems were discussed. The status of these activities is discussed below. (orig.)

  17. Danish High Performance Concretes

    DEFF Research Database (Denmark)

    Nielsen, M. P.; Christoffersen, J.; Frederiksen, J.

    1994-01-01

    In this paper the main results obtained in the research program High Performance Concretes in the 90's are presented. This program was financed by the Danish government and was carried out in cooperation between The Technical University of Denmark, several private companies, and Aalborg University...... concretes, workability, ductility, and confinement problems....

  18. High energy colliders

    International Nuclear Information System (INIS)

    Palmer, R.B.; Gallardo, J.C.

    1997-02-01

    The authors consider the high energy physics advantages, disadvantages and luminosity requirements of hadron (pp, p anti-p), lepton (e⁺e⁻, μ⁺μ⁻) and photon-photon colliders. Technical problems in obtaining increased energy in each type of machine are presented. The machines' relative sizes are also discussed

  19. Rocky Mountain High.

    Science.gov (United States)

    Hill, David

    2001-01-01

    Describes Colorado's Eagle Rock School, which offers troubled teens a fresh start by transporting them to a tuition-free campus high in the mountains. The program encourages spiritual development as well as academic growth. The atmosphere is warm, loving, structured, and nonthreatening. The article profiles several students' experiences at the…

  20. High performance homes

    DEFF Research Database (Denmark)

    Beim, Anne; Vibæk, Kasper Sánchez

    2014-01-01

    . Consideration of all these factors is a precondition for a truly integrated practice and as this chapter demonstrates, innovative project delivery methods founded on the manufacturing of prefabricated buildings contribute to the production of high performance homes that are cost effective to construct, energy...

  1. High Energy Physics

    Science.gov (United States)

    Argonne National Laboratory High Energy Physics Division home page, listing research areas and groups: Collider Physics, Cosmic Frontier, Cosmic Frontier Theory & Computing, Detector R&D, Electronic Design, Mechanical Design, Neutrino Physics, Theoretical Physics, and HEP Division seminars

  2. High Selectivity Oxygen Delignification

    Energy Technology Data Exchange (ETDEWEB)

    Arthur J. Ragauskas

    2005-09-30

    The overall objective of this program was to develop improved extended oxygen delignification (EOD) technologies for current U.S. pulp mill operations. This was accomplished by: (1) identifying pulping conditions that optimize O and OO performance; (2) identifying structural features of lignin that enhance reactivity towards EOD of high kappa pulps; (3) identifying factors minimizing carbohydrate degradation and improving pulp strength of EOD high kappa pulps; (4) developing a simple, reproducible method of quantifying yield gains from EOD; and (5) developing process conditions that significantly reduce the capital requirements of EOD while optimizing the yield benefits. Key research outcomes included demonstrating that a mini-O sequence such as (E+O)Dkf:0.05(E+O) or Dkf:0.05(E+O)(E+O) without interstage washing could capture approximately 60% of the delignification efficiency of a conventional O-stage, without the major capital requirements associated with an O-stage, for conventional SW kraft pulps. The rate of formation and loss of fiber charge during an O-stage can be employed to maximize net fiber charge. Optimal fiber charge development and delignification are two independent parameters and do not parallel each other. It is possible to utilize an O-stage to enhance the overall cellulosic fiber charge of low and high kappa SW kraft pulps, which is beneficial for physical strength properties. The application of NIR and multivariate analysis was developed into a rapid and simple method of determining the yield of pulp from an oxygen delignification stage that has real-world mill applications. A focus point of this program was the demonstration that kraft pulping conditions and oxygen delignification of high and low-kappa SW and HW pulps are intimately related. Improved physical pulp properties and yield can be delivered by controlling the H-factor and active alkali charge.
Low AA softwood kraft pulp with a kappa number 30 has an average improvement of 2% in
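
    The NIR-plus-multivariate yield method mentioned in this abstract can be caricatured with an ordinary least-squares calibration. This is only an illustrative stand-in (a real application would typically use PLS regression on measured spectra); the function names and the synthetic data are invented for the sketch.

```python
import numpy as np

def fit_yield_model(spectra, yields):
    """Fit a linear multivariate calibration: pulp yield modeled as a
    linear function of absorbances at several NIR wavelengths plus an
    intercept. (Illustrative stand-in for a PLS-style model.)"""
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])  # add intercept column
    coef, *_ = np.linalg.lstsq(X, yields, rcond=None)
    return coef

def predict_yield(spectra, coef):
    """Apply the fitted calibration to new spectra."""
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])
    return X @ coef
```

    Calibrated once against pulps of known yield, such a model turns a fast NIR scan into a yield estimate without a wet-chemistry determination, which is the practical appeal the abstract describes.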

  3. Very high multiplicity hadron processes

    International Nuclear Information System (INIS)

    Mandzhavidze, I.; Sisakyan, A.

    2000-01-01

    The paper contains a description of a first attempt to understand extremely inelastic high energy hadron collisions, in which the multiplicity of produced hadrons considerably exceeds its mean value. Problems with existing model predictions are discussed. The real-time finite-temperature S-matrix theory is built in order to obtain model-free predictions. This allows one to include statistical effects in the considerations and to build the phenomenology. The questions to experiment are formulated at the very end of the paper

  4. High Voltage Seismic Generator

    Science.gov (United States)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary result of a year of cooperation between three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor into a spark gap. The whole appliance is powered from a li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured and, on this basis, the specification of elements for the final device was chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. A field test was then conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water, and geophones were placed on the ground in a straight line. A comparison of the signal registered with a hammer source and with the sparker source was made. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. The biggest advantage of the presented device, after its signal characteristics, is its size of 0.5 x 0.25 x 0.2 m and weight of approximately 7 kg. These features, together with the small li-ion battery, make
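
    The energy budget of such a capacitor-bank source follows from E = ½CV². A minimal sketch with illustrative numbers only; the abstract gives no ratings for the actual device, so the capacitance, voltage and discharge time below are invented:

```python
def stored_energy_j(capacitance_f, voltage_v):
    """Energy stored in a capacitor bank charged to voltage_v: E = 1/2 C V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

def average_discharge_power_w(energy_j, discharge_time_s):
    """Average power if the bank releases its energy in discharge_time_s."""
    return energy_j / discharge_time_s

# Illustrative only: a 100 uF bank at 5 kV stores 1.25 kJ; dumped through
# the spark gap in 100 us, that is an average of 12.5 MW.
```

    The very short thyristor-switched discharge is what turns a modest stored energy into the high instantaneous power needed to drive a seismic pulse.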

  5. Highly pathogenic avian influenza.

    Science.gov (United States)

    Swayne, D E; Suarez, D L

    2000-08-01

    Highly pathogenic (HP) avian influenza (AI) (HPAI) is an extremely contagious, multi-organ systemic disease of poultry leading to high mortality, and caused by some H5 and H7 subtypes of type A influenza virus, family Orthomyxoviridae. However, most AI virus strains are mildly pathogenic (MP) and produce either subclinical infections or respiratory and/or reproductive diseases in a variety of domestic and wild bird species. Highly pathogenic avian influenza is a List A disease of the Office International des Epizooties, while MPAI is neither a List A nor List B disease. Eighteen outbreaks of HPAI have been documented since the identification of AI virus as the cause of fowl plague in 1955. Mildly pathogenic avian influenza viruses are maintained in wild aquatic bird reservoirs, occasionally crossing over to domestic poultry and causing outbreaks of mild disease. Highly pathogenic avian influenza viruses do not have a recognised wild bird reservoir, but can occasionally be isolated from wild birds during outbreaks in domestic poultry. Highly pathogenic avian influenza viruses have been documented to arise from MPAI viruses through mutations in the haemagglutinin surface protein. Prevention of exposure to the virus and eradication are the accepted methods for dealing with HPAI. Control programmes, which imply allowing a low incidence of infection, are not an acceptable method for managing HPAI, but have been used during some outbreaks of MPAI. The components of a strategy to deal with MPAI or HPAI include surveillance and diagnosis, biosecurity, education, quarantine and depopulation. Vaccination has been used in some control and eradication programmes for AI.

  6. High population increase rates.

    Science.gov (United States)

    1991-09-01

    In addition to its economic and ethnic difficulties, the USSR faces several pressing demographic problems, including high population increase rates in several of its constituent republics. It has now become clear that although the country's rigid centralized planning succeeded in covering the basic needs of people, it did not lead to welfare growth. Since the 1970s, the Soviet economy has remained sluggish, which has led to increases in the death and birth rates. Furthermore, the ideology that held that demography could be entirely controlled by the country's political and economic system is contradicted by current Soviet reality, which shows that religion and ethnicity also play a significant role in demographic dynamics. Currently, Soviet republics fall under 2 categories--areas with high or low natural population increase rates. Republics with low rates have Christian populations (Armenia, Moldavia, Georgia, Byelorussia, Russia, Lithuania, Estonia, Latvia, Ukraine), while republics with high rates are Muslim (Tadzhikistan, Uzbekistan, Turkmenistan, Kirgizia, Azerbaijan, Kazakhstan). The latter group has natural increase rates as high as 3.3%. Although the USSR as a whole is not considered a developing country, the latter group of republics fits the description of the UNFPA's priority list. Another serious demographic issue facing the USSR is its extremely high rate of abortion. This is especially true in the republics with low birth rates, where up to 60% of all pregnancies are terminated by induced abortions. Up to 1/5 of the USSR's annual health care budget is spent on clinical abortions -- money which could be better spent on the production of contraceptives. Along with the recent political and economic changes, the USSR is now eager to deal with its demographic problems.
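
    A 3.3% natural increase rate implies a population doubling time of roughly two decades, from the standard exponential-growth relation t = ln 2 / r:

```python
import math

def doubling_time_years(annual_rate):
    """Doubling time for exponential growth: t = ln(2) / r,
    with annual_rate given as a fraction (0.033 for 3.3%)."""
    return math.log(2) / annual_rate

# At the 3.3% rate cited for the fastest-growing republics,
# the population doubles in about 21 years.
```
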

  7. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from distributed storage and the large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...
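
    The memory-locality point can be made concrete with the classic row-major versus column-major traversal of a matrix. In compiled numerics code the storage-order loop is markedly faster because consecutive accesses share cache lines; in pure CPython the effect is muted, so this sketch only contrasts the two access patterns, which compute the same sum:

```python
def sum_row_major(matrix):
    """Traverse in storage order: the inner loop walks one row, so
    consecutive accesses are adjacent in memory (cache-friendly)."""
    total = 0.0
    for row in matrix:
        for x in row:
            total += x
    return total

def sum_col_major(matrix):
    """Traverse against storage order: each inner-loop access jumps to a
    different row, touching a new cache line per element in compiled code."""
    total = 0.0
    for j in range(len(matrix[0])):
        for row in matrix:
            total += row[j]
    return total
```
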

  8. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ...

  9. Hyperglycemia (High Blood Glucose)

    Medline Plus

    Full Text Available ...

  10. High energy nuclear physics

    International Nuclear Information System (INIS)

    Meyer, J.

    1988-01-01

    The 1988 progress report of the High Energy Nuclear Physics laboratory (Polytechnic School, France) is presented. The laboratory research program is focused on the fundamental physics of interactions, on new techniques for the acceleration of charged particles and on nuclear double beta decay. The experiments are performed on the following topics: the measurement of inclusive π⁰ production and photon production in very high energy nucleus-nucleus interactions, and nucleon stability. Concerning the experiments under construction, a new detector for LEP, the study and simulation of hadronic showers in a calorimeter and the H1 experiment (HERA) are described. The future research programs and the published papers are listed [fr

  11. Dual Campus High School

    Directory of Open Access Journals (Sweden)

    Carmen P. Mombourquette

    2013-04-01

    Full Text Available September 2010 witnessed the opening of the first complete dual campus high school in Alberta. Catholic Central High School, which had existed since 1967 in one building, now offered courses to students on two campuses. The “dual campus” philosophy was adopted to ensure maximum program flexibility for students. The philosophy, however, was destined to affect student engagement and staff efficacy, as the change in organizational structure, campus locations, and course availability was dramatic. Changing the school's organizational structure also had the potential to affect student achievement. A mixed-methods study utilizing engagement surveys, efficacy scales, and interviews with students and teachers was used to ascertain the degree of impact. The results of the study showed minimal impact on student engagement, a minor negative impact on staff efficacy, and a slight increase in student achievement.

  12. High resolution photoelectron spectroscopy

    International Nuclear Information System (INIS)

    Arko, A.J.

    1988-01-01

    Photoelectron Spectroscopy (PES) covers a very broad range of measurements, disciplines, and interests. As the next generation light source, the FEL will result in improvements over the undulator that are larger than the undulator's improvements over bending magnets. The combination of high flux and high inherent resolution will result in several orders of magnitude gain in signal to noise over measurements using synchrotron-based undulators. The latter still require monochromators, whose resolution is invariably strongly energy-dependent, so that in the regions of interest for many experiments (hν > 100 eV) they will not have a resolving power much over 1000. In order to study some of the interesting phenomena in actinides (heavy fermions, e.g.) one would need resolving powers of 10⁴ to 10⁵. These values are only reachable with the FEL
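
    The resolving-power figures in this abstract translate directly into energy resolution via ΔE = E / R. A small sketch with the numbers quoted above:

```python
def energy_resolution_ev(photon_energy_ev, resolving_power):
    """Energy resolution dE = E / R, for resolving power R = E / dE."""
    return photon_energy_ev / resolving_power

# At 100 eV, R = 1000 gives 0.1 eV; the heavy-fermion studies mentioned
# in the abstract call for R = 1e4 to 1e5, i.e. 10 meV down to 1 meV.
```
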

  13. High-Performance Networking

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    The series will start with a historical introduction about what people saw as high performance message communication in their time and how that developed into today's standard computer network communication. It will be followed by a far more technical part that uses the High Performance Computer Network standards of the 90's, with 1 Gbit/sec systems, as an introduction for an in-depth explanation of the three new 10 Gbit/s network and interconnect technology standards that already exist or are emerging. Where necessary for a good understanding, sidesteps will be included to explain important protocols as well as some details of the relevant Wide Area Network (WAN) standards, including some basics of wavelength multiplexing (DWDM). Some remarks will be made concerning the rapidly expanding applications of networked storage.

  14. High temperature structural silicides

    International Nuclear Information System (INIS)

    Petrovic, J.J.

    1997-01-01

    Structural silicides have important high temperature applications in oxidizing and aggressive environments. Most prominent are MoSi₂-based materials, which are borderline ceramic-intermetallic compounds. MoSi₂ single crystals exhibit macroscopic compressive ductility at temperatures below room temperature in some orientations. Polycrystalline MoSi₂ possesses elevated temperature creep behavior which is highly sensitive to grain size. MoSi₂-Si₃N₄ composites show an important combination of oxidation resistance, creep resistance, and low temperature fracture toughness. Current potential applications of MoSi₂-based materials include furnace heating elements, molten metal lances, industrial gas burners, aerospace turbine engine components, diesel engine glow plugs, and materials for glass processing

  15. High temperature materials characterization

    Science.gov (United States)

    Workman, Gary L.

    1990-01-01

    A lab facility for measuring elastic moduli up to 1700 °C was constructed and delivered. It was shown that the ultrasonic method can be used to determine elastic constants of materials from room temperature to their melting points. Coupling high frequency acoustic energy into a sample remains a difficult task; even now, new coupling materials and higher power ultrasonic pulsers are being suggested. The surface was only scratched in terms of showing the full capabilities of either technique used, especially since there is such a large learning curve in developing proper methodologies for taking measurements into the high temperature region. The laser acoustic system does not seem to have sufficient precision at this time to replace the normal buffer rod methodology.

  16. High field electron linacs

    International Nuclear Information System (INIS)

    Le Duff, J.

    1985-12-01

    High field electron linacs are considered as potential candidates to provide very high energies beyond LEP. For almost twenty years little improvement was made in linac technology, as linacs were mostly kept at low and medium energies for use as injectors for storage rings. Today, both their efficiency and their performance are being reconsidered; for instance, the pulse compression scheme developed at SLAC and introduced to upgrade the energy of that linac is a first step towards a new generation of linear accelerators. However, this is not enough in terms of power consumption, and more development is needed to improve both the efficiency of accelerating structures and the performance of RF power sources

  17. High latitude ionospheric structure

    International Nuclear Information System (INIS)

    1984-06-01

    The Earth's ionosphere is an important element in solar-terrestrial energy transfer processes. As a major terrestrial sink for many solar and magnetospheric events, the ionosphere has characteristic features that can be traced to such seemingly remote phenomena as solar flares, radiation belt wave-particle interactions and magnetospheric substorms. In considering the multitude of solar-terrestrial plasma interactions, it is important to recognize that the high-latitude ionosphere is not altogether a simple receptor of various energy deposition processes. The high-latitude ionosphere plays an active feedback role by controlling the conductivity at the base of far-reaching magnetic field lines and by providing a plasma source for the magnetosphere. Indeed, the role of the ionosphere during magnetospheric substorms is emerging as a topic for meaningful study in the overall picture of magnetospheric-ionospheric coupling

  18. High voltage isolation transformer

    Science.gov (United States)

    Clatterbuck, C. H.; Ruitberg, A. P. (Inventor)

    1985-01-01

    A high voltage isolation transformer is provided with primary and secondary coils separated by discrete electrostatic shields from the surfaces of the insulating spools on which the coils are wound. The electrostatic shields are formed by coatings of a compound with a low electrical conductivity which completely encase the coils and adhere to the surfaces of the insulating spools adjacent to the coils. Coatings of the compound also line the axial bores of the spools, thereby forming electrostatic shields separating the spools from the legs of a ferromagnetic core extending through the bores. The transformer is able to isolate a high constant potential applied to one of its coils, without the occurrence of sparking or corona, by coupling the coatings lining the axial bores to the ferromagnetic core and by coupling one terminal of each coil to the respective coating encasing the coil.

  19. High thermal load component

    International Nuclear Information System (INIS)

    Fuse, Toshiaki; Tachikawa, Nobuo.

    1996-01-01

    A cooling tube made of pure copper is connected to the inner portion of an armour (heat resistant member) made of an anisotropic carbon/carbon composite (CFC) material. The CFC material has high heat conductivity in the longitudinal direction of its fibers and low conductivity perpendicular to them. Fibers extending through the armour from the part of the heat receiving surface just above the cooling tube are directly connected to the cooling tube. A portion of the fibers extending from the parts of the heat receiving surface other than those just above the cooling tube is also directly bonded to the cooling tube; the remaining fibers are disposed so as to surround the cooling tube. The armour and the cooling tube are soldered using an active metal flux. With such procedures, high thermal load components for use in a thermonuclear reactor are formed which have excellent heat removal characteristics and rarely develop defects such as cracking and peeling. (I.N.)

  20. High Latitude Polygons

    Science.gov (United States)

    2005-01-01

    26 September 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows polygonal patterned ground on a south high-latitude plain. The outlines of the polygons, like the craters and hills in this region, are somewhat enhanced by the presence of bright frost left over from the previous winter. On Earth, polygons at high latitudes would usually be attributed to the seasonal freezing and thawing cycles of ground ice. The origin of similar polygons on Mars is less certain, but might also be an indicator of ground ice. Location near: 75.3°S, 113.2°W Image width: 3 km (1.9 mi) Illumination from: upper left Season: Southern Spring