WorldWideScience

Sample records for statistically realistic populations

  1. Realistic thermodynamic and statistical-mechanical measures for neural synchronization.

    Science.gov (United States)

    Kim, Sang-Yoon; Lim, Woochang

    2014-04-15

Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with the calculation of both O and Ms. However, it is practically difficult to obtain VG directly in real experiments. To overcome this difficulty, instead of VG, we employ the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, to enable practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, more accurate characterization of weak sparse spike synchronization can be achieved in terms of the realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
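The IPSR referred to above is typically estimated by kernel-smoothing the spike raster. A minimal sketch is given below; the Gaussian kernel, its width, and the function name are illustrative assumptions, not the authors' exact estimator:

```python
import math

def ipsr(spike_trains, t, sigma=1.0):
    """Instantaneous population spike rate at time t (per ms, if times are in ms).

    Each spike contributes a Gaussian kernel of width sigma centred on its
    spike time; the kernel sum is averaged over the number of neurons.
    """
    total = 0.0
    for train in spike_trains:          # one list of spike times per neuron
        for s in train:
            total += math.exp(-0.5 * ((t - s) / sigma) ** 2)
    return total / (len(spike_trains) * sigma * math.sqrt(2.0 * math.pi))
```

Evaluating `ipsr` on a dense time grid yields the smooth population-rate curve that plays the role of VG in the measures described above.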

  2. Quantifying introgression risk with realistic population genetics

    OpenAIRE

    Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy

    2012-01-01

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, rep...

  3. Quantifying introgression risk with realistic population genetics.

    Science.gov (United States)

    Ghosh, Atiyo; Meirmans, Patrick G; Haccou, Patsy

    2012-12-07

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes.
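As a toy illustration of the hazard-rate idea (not the authors' model, which couples complex genetics, repeated invasions and demographic stochasticity), a constant per-year hazard can be estimated from simulated waiting times until the first successful establishment; the yearly establishment probability `p_est` is a placeholder parameter:

```python
import random

def time_to_introgression(p_est, rng):
    """Years until a yearly invasion first establishes (geometric trial)."""
    t = 1
    while rng.random() >= p_est:
        t += 1
    return t

def hazard_rate(p_est, n_rep=100_000, seed=1):
    """Monte Carlo estimate of the per-year hazard rate of introgression.

    For geometric waiting times the inverse mean waiting time recovers
    the per-year establishment probability.
    """
    rng = random.Random(seed)
    times = [time_to_introgression(p_est, rng) for _ in range(n_rep)]
    return 1.0 / (sum(times) / n_rep)
```

The point of a hazard-rate formulation, as the abstract stresses, is that it is a probabilistic per-unit-time risk and therefore not manipulable by arbitrary choices of observation window.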

  4. Quantifying introgression risk with realistic population genetics

    NARCIS (Netherlands)

    Ghosh, A.; Meirmans, P.G.; Haccou, P.

    2012-01-01

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified

  5. Photo-Realistic Statistical Skull Morphotypes: New Exemplars for Ancestry and Sex Estimation in Forensic Anthropology.

    Science.gov (United States)

    Caple, Jodi; Stephan, Carl N

    2017-05-01

    Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars-and their statistical exaggerations or extremes-retain the high-resolution detail of the original photographic dataset, making them the ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.

  6. Statistical thermodynamics of clustered populations.

    Science.gov (United States)

    Matsoukas, Themis

    2014-08-01

    We present a thermodynamic theory for a generic population of M individuals distributed into N groups (clusters). We construct the ensemble of all distributions with fixed M and N, introduce a selection functional that embodies the physics that governs the population, and obtain the distribution that emerges in the scaling limit as the most probable among all distributions consistent with the given physics. We develop the thermodynamics of the ensemble and establish a rigorous mapping to regular thermodynamics. We treat the emergence of a so-called giant component as a formal phase transition and show that the criteria for its emergence are entirely analogous to the equilibrium conditions in molecular systems. We demonstrate the theory by an analytic model and confirm the predictions by Monte Carlo simulation.

  7. Statistical Processing Algorithms for Human Population Databases

    Directory of Open Access Journals (Sweden)

    Camelia COLESCU

    2012-01-01

Full Text Available The article describes some algorithms for statistical functions applied to a human population database. The samples are specific to the most interesting periods, when the evolution of the statistical data shows spectacular values. The article also describes the most useful forms of graphical presentation of the results

  8. Statistical Models of Adaptive Immune populations

    Science.gov (United States)

    Sethna, Zachary; Callan, Curtis; Walczak, Aleksandra; Mora, Thierry

The availability of large (10⁴-10⁶ sequences) datasets of B or T cell populations from a single individual allows reliable fitting of complex statistical models for naïve generation, somatic selection, and hypermutation. It is crucial to utilize a probabilistic/informational approach when modeling these populations. The inferred probability distributions allow for population characterization, calculation of probability distributions of various hidden variables (e.g. number of insertions), as well as statistical properties of the distribution itself (e.g. entropy). In particular, the differences between the T cell populations of embryonic and mature mice will be examined as a case study. Comparing these populations, as well as proposed mixed populations, provides a concrete exercise in model creation, comparison, choice, and validation.

  9. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of the order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of pros and cons of the order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
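The figure of 124 runs follows from the non-parametric one-sided tolerance-limit formula (Wilks, as extended to several outputs by Guba et al.). A sketch of that computation under the standard 95/95 interpretation; function names are illustrative:

```python
from math import comb

def confidence(n, p, gamma=0.95):
    """Confidence that the p-th largest of n i.i.d. Monte Carlo runs bounds
    the gamma-quantile when p outputs are covered jointly (Wilks/Guba).

    This is P(at most n - p samples fall below the gamma-quantile).
    """
    return sum(comb(n, j) * gamma**j * (1.0 - gamma)**(n - j)
               for j in range(n - p + 1))

def min_runs(p, gamma=0.95, beta=0.95):
    """Smallest number of runs giving at least beta confidence."""
    n = p
    while confidence(n, p, gamma) < beta:
        n += 1
    return n
```

With a single output (`p = 1`) this reproduces the familiar 59 runs; with the three 10 CFR 50.46 criteria (`p = 3`) it yields the 124 simulations cited above.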

  10. Statistical dynamics of regional populations and economies

    Science.gov (United States)

    Huo, Jie; Wang, Xu-Ming; Hao, Rui; Wang, Peng

Quantitative analysis of human behavior and social development is becoming a hot spot in interdisciplinary studies. A statistical analysis of the population and GDP of 150 cities in China from 1990 to 2013 is conducted. The results indicate that the cumulative probability distributions of the populations and of the GDPs each obey a shifted power law. To understand these characteristics, a generalized Langevin equation describing the variation of population is proposed, based on the correlations between population and GDP as well as random fluctuations of the related factors. The equation is transformed into the Fokker-Planck equation to express the evolution of the population distribution. The general solution demonstrates a transition of the distribution from a normal Gaussian distribution to a shifted power law, which implies a critical point in time at which the transition takes place. The shifted power-law distribution in the supercritical situation is qualitatively in accordance with the empirical result. The distribution of the GDPs is derived from the well-known Cobb-Douglas production function. The result presents a change, in the supercritical situation, from a shifted power law to a Gaussian distribution. This is a surprising result: the regional GDP distribution of our world will one day be Gaussian. Discussion based on the changing trend of economic growth suggests this will hold. These theoretical attempts may thus draw a historical picture of our society in terms of population and economy.
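The generalized Langevin equation itself is not reproduced in the abstract, but stochastic differential equations of this type are commonly integrated with the Euler-Maruyama scheme. A generic sketch follows; the drift and diffusion functions are placeholders, not the authors' model:

```python
import math
import random

def euler_maruyama(drift, diffusion, x0, dt, n_steps, seed=0):
    """Integrate dX = drift(X) dt + diffusion(X) dW by Euler-Maruyama.

    dW is a Gaussian increment with variance dt; returns the full path.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        x = x + drift(x) * dt + diffusion(x) * dw
        path.append(x)
    return path
```

Running an ensemble of such paths and histogramming the endpoints is the numerical counterpart of solving the associated Fokker-Planck equation for the evolving population distribution.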

11. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

Full Text Available Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each individual, e.g. by identification of individual-specific dosings. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals and theophylline as an application example. The variability and co-variability of physiological parameters are quantified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach allows for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making e.g. from preclinical to

  12. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230+ dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
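The fuzzy C-means step at the heart of the segmentation above can be sketched in one dimension. The paper's variant is 3D and bias-corrected, so this plain version only illustrates the alternating membership/centre updates that all FCM variants share:

```python
def fuzzy_c_means(data, c, m=2.0, n_iter=100):
    """Minimal 1-D fuzzy C-means; returns sorted cluster centres.

    Membership: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1));
    centre update: membership-weighted mean of the data.
    Centres are initialised evenly across the data range (deterministic).
    """
    lo, hi = min(data), max(data)
    centres = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    for _ in range(n_iter):
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centres]  # avoid divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        centres = [sum(u[k][i] ** m * x for k, x in enumerate(data)) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centres)
```

In the segmentation pipeline the analogue of `u` gives each voxel a fractional glandular density rather than a hard tissue label, which is exactly what the multiple fractional tissue classes above exploit.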

  13. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    Science.gov (United States)

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  14. Constraining the Statistics of Population III Binaries

    Science.gov (United States)

    Stacy, Athena; Bromm, Volker

    2012-01-01

We perform a cosmological simulation in order to model the growth and evolution of Population III (Pop III) stellar systems in a range of host minihalo environments. A Pop III multiple system forms in each of the ten minihaloes, and the overall mass function is top-heavy compared to the currently observed initial mass function in the Milky Way. Using a sink particle to represent each growing protostar, we examine the binary characteristics of the multiple systems, resolving orbits on scales as small as 20 AU. We find a binary fraction of approx. 36%, with semi-major axes as large as 3000 AU. The distribution of orbital periods is slightly peaked at periods below approx. 900 yr, while the distribution of mass ratios is relatively flat. Of all sink particles formed within the ten minihaloes, approx. 50% are lost to mergers with larger sinks, and 50% of the remaining sinks are ejected from their star-forming disks. The large binary fraction may have important implications for Pop III evolution and nucleosynthesis, as well as the final fate of the first stars.

  15. Pseudo-populations a basic concept in statistical surveys

    CERN Document Server

    Quatember, Andreas

    2015-01-01

This book emphasizes that artificial or pseudo-populations play an important role in statistical surveys from finite universes in two ways: first, the concept of pseudo-populations may substantially improve users' understanding of various aspects of sampling theory and survey methodology; an example of this scenario is the Horvitz-Thompson estimator. Second, there are statistical procedures in which pseudo-populations actually have to be generated. An example of such a scenario can be found in simulation studies in the field of survey sampling, where close-to-reality pseudo-populations are generated from known sample and population data to form the basis for the simulation process. The chapters focus on estimation methods, sampling techniques, nonresponse, questioning designs and statistical disclosure control. This book is a valuable reference for understanding the importance of the pseudo-population concept and applying it in teaching and research.
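The Horvitz-Thompson estimator mentioned above weights each sampled unit by the inverse of its inclusion probability; in the pseudo-population reading, 1/πᵢ is the number of population units each sampled unit "stands for". A minimal sketch:

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total.

    Each sampled value y_i is weighted by 1/pi_i, its probability of
    inclusion in the sample; the weighted sum is design-unbiased.
    """
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))
```

For example, under simple random sampling of n = 2 units from N = 4 (so πᵢ = 0.5 for every unit), a sample with values 3 and 5 estimates the population total as 3/0.5 + 5/0.5 = 16.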

  16. Data on education: from population statistics to epidemiological research

    DEFF Research Database (Denmark)

    Pallesen, Palle Bo; Tverborgvik, Torill; Rasmussen, Hanna Barbara

    2010-01-01

    BACKGROUND: Level of education is in many fields of research used as an indicator of social status. METHODS: Using Statistics Denmark's register for education and employment of the population, we examined highest completed education with a birth-cohort perspective focusing on people born between...... of population trends by use of extrapolated values, solutions are less obvious in epidemiological research using individual level data....

  17. The Statistical Modeling of the Trends Concerning the Romanian Population

    Directory of Open Access Journals (Sweden)

    Gabriela OPAIT

    2014-11-01

Full Text Available This paper presents the statistical modeling of the resident population of Romania, i.e. the total Romanian population, by means of the "Least Squares Method". Any country develops through growth of its population, and hence of its workforce, which is a factor influencing the growth of the Gross Domestic Product (G.D.P.). The "Least Squares Method" is a statistical technique for determining the best-fit trend line of a model.
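For a linear trend, the "Least Squares Method" reduces to the normal equations for slope and intercept. A minimal sketch (variable names are illustrative):

```python
def least_squares_trend(years, values):
    """Fit the trend line y = a + b*t by ordinary least squares.

    b = sum((t - mean_t)(y - mean_y)) / sum((t - mean_t)^2),
    a = mean_y - b * mean_t  (the normal equations for a line).
    """
    n = len(years)
    mean_t = sum(years) / n
    mean_y = sum(values) / n
    b = (sum((t - mean_t) * (y - mean_y) for t, y in zip(years, values)) /
         sum((t - mean_t) ** 2 for t in years))
    a = mean_y - b * mean_t
    return a, b
```

Extrapolating the fitted line to future years gives the kind of population trend forecast the paper discusses.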

18. Travel for the 2004 American Statistical Association Biannual Radiation Meeting: "Radiation in Realistic Environments: Interactions Between Radiation and Other Factors"

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, David J.

    2009-07-21

The 16th ASA Conference on Radiation and Health, held June 27-30, 2004 in Beaver Creek, CO, offered a unique forum for discussing research related to the effects of radiation exposures on human health in a multidisciplinary setting. The Conference furnishes investigators in health-related disciplines the opportunity to learn about new quantitative approaches to their problems and furnishes statisticians the opportunity to learn about new applications for their discipline. The Conference was attended by about 60 scientists including statisticians, epidemiologists, biologists and physicists interested in radiation research. For the first time, ten recipients of Young Investigator Awards participated in the conference. The Conference began with a debate on the question: “Do radiation doses below 1 cGy increase cancer risks?” The keynote speaker was Dr. Martin Lavin, who gave a banquet presentation on the timely topic “How important is ATM?” The focus of the 2004 Conference on Radiation and Health was Radiation in Realistic Environments: Interactions Between Radiation and Other Risk Modifiers. The sessions of the conference included: Radiation, Smoking, and Lung Cancer; Interactions of Radiation with Genetic Factors: ATM; Radiation, Genetics, and Epigenetics; and Radiotherapeutic Interactions. The Conference on Radiation and Health is held biannually, and participants are looking forward to the 17th conference to be held in 2006.

  19. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis.

    Science.gov (United States)

    Ducrot, Virginie; Péry, Alexandre R R; Lagadic, Laurent

    2010-11-12

Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at environmentally realistic concentrations of diquat in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable to analyse long-term effects of complex exposure patterns based upon full life-cycle experiment data. Obtained model outputs (e.g. no-effect concentrations) could be directly used for chemical risk assessment.
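A threshold hazard model of the kind described can be sketched as follows: the hazard is zero below a no-effect threshold and grows linearly with the excess concentration above it, and survival is the exponential of the accumulated hazard over the (time-varying) exposure profile. Parameter names are illustrative; the authors' model is fitted to the full life-cycle data:

```python
import math

def survival(conc_profile, threshold, kill_rate, dt):
    """Survival probability under a threshold hazard model.

    Hazard at each time step: h = kill_rate * max(0, C(t) - threshold);
    survival over the profile: S = exp(-sum(h * dt)).
    """
    cumulative_hazard = sum(kill_rate * max(0.0, c - threshold) * dt
                            for c in conc_profile)
    return math.exp(-cumulative_hazard)
```

Because the hazard accumulates only while the concentration exceeds the threshold, repeated acute pulses of diquat produce step-wise drops in survival separated by flat recovery periods, matching the exposure pattern described above.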

  20. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  1. Agent-based modeling traction force mediated compaction of cell-populated collagen gels using physically realistic fibril mechanics.

    Science.gov (United States)

    Reinhardt, James W; Gooch, Keith J

    2014-02-01

    Agent-based modeling was used to model collagen fibrils, composed of a string of nodes serially connected by links that act as Hookean springs. Bending mechanics are implemented as torsional springs that act upon each set of three serially connected nodes as a linear function of angular deflection about the central node. These fibrils were evaluated under conditions that simulated axial extension, simple three-point bending and an end-loaded cantilever. The deformation of fibrils under axial loading varied <0.001% from the analytical solution for linearly elastic fibrils. For fibrils between 100 μm and 200 μm in length experiencing small deflections, differences between simulated deflections and their analytical solutions were <1% for fibrils experiencing three-point bending and <7% for fibrils experiencing cantilever bending. When these new rules for fibril mechanics were introduced into a model that allowed for cross-linking of fibrils to form a network and the application of cell traction force, the fibrous network underwent macroscopic compaction and aligned between cells. Further, fibril density increased between cells to a greater extent than that observed macroscopically and appeared similar to matrical tracks that have been observed experimentally in cell-populated collagen gels. This behavior is consistent with observations in previous versions of the model that did not allow for the physically realistic simulation of fibril mechanics. The significance of the torsional spring constant value was then explored to determine its impact on remodeling of the simulated fibrous network. Although a stronger torsional spring constant reduced the degree of quantitative remodeling that occurred, the inclusion of torsional springs in the model was not necessary for the model to reproduce key qualitative aspects of remodeling, indicating that the presence of Hookean springs is essential for this behavior. These results suggest that traction force mediated matrix
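The fibril mechanics described above combine Hookean axial springs along each link with torsional springs acting on triplets of serially connected nodes. A minimal 2D sketch of the two force laws (spring constants and function names are illustrative):

```python
import math

def axial_force(node_a, node_b, rest_length, k_axial):
    """Hookean force magnitude along a fibril link (positive = tension)."""
    length = math.dist(node_a, node_b)
    return k_axial * (length - rest_length)

def torsional_torque(a, b, c, k_torsion):
    """Restoring torque at central node b of the triplet a-b-c.

    Linear in the angular deflection from the straight (pi) configuration,
    as in the model's torsional-spring rule.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cosang = max(-1.0, min(1.0, dot / (math.hypot(*v1) * math.hypot(*v2))))
    return k_torsion * (math.pi - math.acos(cosang))
```

A straight triplet returns zero torque, so fibrils resist bending but carry no load at rest; the axial rule alone reproduces the linearly elastic extension behaviour validated against the analytical solution above.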

  2. Statistics and predictions of population, energy and environment problems

    International Nuclear Information System (INIS)

    Sobajima, Makoto

    1999-03-01

In a situation where the world's population, especially in developing countries, is growing rapidly, humankind faces the global problem that it cannot live securely for centuries to come unless people find places to live, obtain food, and peacefully secure the energy necessary for living. To this end, humankind has to consider what behavior to adopt in a finite environment, and to discuss, agree and act. Though energy has long been regarded as a symbol of improved living, and has been demanded and used accordingly, its use has come to be limited because it places a growing burden on the global environment. The questions are whether there is a sufficient energy source that does not impose a cost on the environment; whether nuclear energy, regarded as such a source, can sustain its resources for long and remain competitive in the market; what the prospects are for new energy sources to compensate if the use of nuclear energy is restricted by a society fearing radioactivity; and whether promising options exist for the future. One concerned with the study of energy cannot proceed without knowing these things. The statistical materials compiled here are thought to be useful for that purpose, and are collected mainly from sources presenting future predictions based on past records. Since such predictions are important for planning future measures, these databases are expected to be improved for better accuracy. (author)

  3. Water Fluoridation Statistics - Percent of PWS population receiving fluoridated water

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2000-2014. Water Fluoridation Statistics is a biennial report of the percentage and number of people receiving fluoridated water from 2000 through 2014, originally...

  5. Are Statisticians Cold-Blooded Bosses? A New Perspective on the "Old" Concept of Statistical Population

    Science.gov (United States)

    Lu, Yonggang; Henning, Kevin S. S.

    2013-01-01

    Spurred by recent writings regarding statistical pragmatism, we propose a simple, practical approach to introducing students to a new style of statistical thinking that models nature through the lens of data-generating processes, not populations. (Contains 5 figures.)

  6. Statistical guidelines for detecting past population shifts using ancient DNA

    DEFF Research Database (Denmark)

    Mourier, Tobias; Ho, Simon Y. W.; Gilbert, Tom

    2012-01-01

    Populations carry a genetic signal of their demographic past, providing an opportunity for investigating the processes that shaped their evolution. Our ability to infer population histories can be enhanced by including ancient DNA data. Using serial-coalescent simulations and a range of both...... quantitative and temporal sampling schemes, we test the power of ancient mitochondrial sequences and nuclear single-nucleotide polymorphisms (SNPs) to detect past population bottlenecks. Within our simulated framework, mitochondrial sequences have only limited power to detect subtle bottlenecks and/or fast...... results provide useful guidelines for scaling sampling schemes and for optimizing our ability to infer past population dynamics. In addition, our results suggest that many ancient DNA studies may face power issues in detecting moderate demographic collapses and/or highly dynamic demographic shifts when...

  7. A Case Study in Elementary Statistics: The Florida Panther Population

    Science.gov (United States)

    Lazowski, Andrew; Stopper, Geffrey

    2013-01-01

    We describe a case study that was created to intertwine the fields of biology and mathematics. This project is given in an elementary probability and statistics course for non-math majors. Some goals of this case study include: to expose students to biology in a math course, to apply probability to real-life situations, and to display how far a…

  8. Dose conversion coefficients for monoenergetic electrons incident on a realistic human eye model with different lens cell populations.

    Science.gov (United States)

    Nogueira, P; Zankl, M; Schlattl, H; Vaz, P

    2011-11-07

    The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation (the germinative cells), absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for an electron energy of 0.7 MeV.

  9. Dose conversion coefficients for monoenergetic electrons incident on a realistic human eye model with different lens cell populations

    International Nuclear Information System (INIS)

    Nogueira, P; Vaz, P; Zankl, M; Schlattl, H

    2011-01-01

    The radiation-induced posterior subcapsular cataract has long been generally accepted to be a deterministic effect that does not occur at doses below a threshold of at least 2 Gy. Recent epidemiological studies indicate that the threshold for cataract induction may be much lower or that there may be no threshold at all. A thorough study of this subject requires more accurate dose estimates for the eye lens than those available in ICRP Publication 74. Eye lens absorbed dose per unit fluence conversion coefficients for electron irradiation were calculated using a geometrical model of the eye that takes into account different cell populations of the lens epithelium, together with the MCNPX Monte Carlo radiation transport code package. For the cell population most sensitive to ionizing radiation (the germinative cells), absorbed dose per unit fluence conversion coefficients were determined that are up to a factor of 4.8 higher than the mean eye lens absorbed dose conversion coefficients for electron energies below 2 MeV. Comparison of the results with previously published values for a slightly different eye model showed generally good agreement for all electron energies. Finally, the influence of individual anatomical variability was quantified by positioning the lens at various depths below the cornea. A depth difference of 2 mm between the shallowest and the deepest location of the germinative zone can lead to a difference between the resulting absorbed doses of up to nearly a factor of 5000 for an electron energy of 0.7 MeV.

  10. Modeling individual movement decisions of brown hare (Lepus europaeus) as a key concept for realistic spatial behavior and exposure: A population model for landscape-level risk assessment.

    Science.gov (United States)

    Kleinmann, Joachim U; Wang, Magnus

    2017-09-01

    Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities being determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
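    The abstract above describes the "probabilistic walk" only qualitatively. A minimal Python sketch of one such decision step might look like the following; the linear scoring of food, cover, and conspecific occupancy, and all weights and cell values, are illustrative assumptions, not the published model:

```python
import random

def movement_step(neighbors, food, cover, occupancy, rng,
                  w_food=1.0, w_cover=0.5, w_crowd=1.0):
    """Choose the next cell with probability proportional to an
    attractiveness score built from food resources, vegetation cover,
    and the presence of conspecifics (crowding is repulsive).
    The linear score and the weights are illustrative assumptions."""
    scores = [max(w_food * food[c] + w_cover * cover[c]
                  - w_crowd * occupancy[c], 1e-9)
              for c in neighbors]
    # Roulette-wheel selection over the positive scores
    r = rng.uniform(0.0, sum(scores))
    acc = 0.0
    for cell, s in zip(neighbors, scores):
        acc += s
        if r <= acc:
            return cell
    return neighbors[-1]

# Hypothetical 3-cell neighborhood: attribute maps per cell
food = {"A": 0.9, "B": 0.1, "C": 0.5}
cover = {"A": 0.2, "B": 0.8, "C": 0.5}
occupancy = {"A": 0.0, "B": 0.0, "C": 1.0}

rng = random.Random(42)
steps = [movement_step(["A", "B", "C"], food, cover, occupancy, rng)
         for _ in range(1000)]
print(steps.count("A"), steps.count("B"), steps.count("C"))
```

    Over many steps, the food-rich, unoccupied cell "A" is chosen most often, while the crowded cell "C" is almost never chosen, which is the qualitative behavior the model description implies.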

  11. Microcomputer package for statistical analysis of microbial populations.

    Science.gov (United States)

    Lacroix, J M; Lavoie, M C

    1987-11-01

    We have developed a Pascal system to compare microbial populations from different ecological sites using microcomputers. The values calculated are: the coverage value and its standard error, the minimum similarity and the geometric similarity between two biological samples, and the Lambda test, which calculates the ratio of the mean similarity between two subsets to the mean similarity within subsets. The system is written for Apple II, IBM, or compatible computers, but it can run on any computer that supports CP/M if the programs are recompiled for that system.
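    The Lambda-style comparison described above can be sketched in a few lines of modern Python (the original package is Pascal). The `min_similarity` function below is one common proportion-overlap similarity; the exact formulas of Lacroix and Lavoie may differ, so treat this as an assumed, illustrative form:

```python
from itertools import combinations

def min_similarity(p, q):
    """Minimum similarity between two samples given as dicts of
    taxon -> relative abundance: the summed overlap of proportions."""
    taxa = set(p) | set(q)
    return sum(min(p.get(t, 0.0), q.get(t, 0.0)) for t in taxa)

def lambda_ratio(group1, group2):
    """Illustrative Lambda statistic: mean similarity *between* the two
    subsets divided by mean similarity *within* subsets."""
    between = [min_similarity(a, b) for a in group1 for b in group2]
    within = [min_similarity(a, b)
              for g in (group1, group2)
              for a, b in combinations(g, 2)]
    return (sum(between) / len(between)) / (sum(within) / len(within))

# Two ecological sites, two samples each (hypothetical abundances)
site1 = [{"strep": 0.7, "lacto": 0.3}, {"strep": 0.6, "lacto": 0.4}]
site2 = [{"strep": 0.2, "lacto": 0.8}, {"strep": 0.3, "lacto": 0.7}]
ratio = lambda_ratio(site1, site2)
print(ratio)
```

    A ratio below 1 indicates that samples from different sites resemble each other less than replicates from the same site, i.e. the two populations differ.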

  12. Alternating event processes during lifetimes: population dynamics and statistical inference.

    Science.gov (United States)

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

    In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.
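    The incidence-prevalence-duration relationship mentioned above can be illustrated with a toy alternating renewal process: in the long run, the fraction of time spent in exacerbation equals the mean exacerbation duration divided by the mean cycle length. Exponential durations are an assumed convenience here, not part of the paper's nonparametric approach:

```python
import random

def simulate_alternating(mean_exac, mean_remis, horizon, rng):
    """Simulate one patient's exacerbation-remission process with
    exponential durations (an assumed, illustrative choice) and
    return the fraction of the horizon spent in exacerbation."""
    t, in_exac_time = 0.0, 0.0
    while t < horizon:
        d_exac = rng.expovariate(1.0 / mean_exac)
        in_exac_time += min(d_exac, horizon - t)
        t += d_exac
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mean_remis)  # remission spell
    return in_exac_time / horizon

rng = random.Random(1)
fractions = [simulate_alternating(2.0, 8.0, 1000.0, rng) for _ in range(200)]
estimate = sum(fractions) / len(fractions)
print(estimate)  # close to 2 / (2 + 8) = 0.2, the duration-based prevalence
```

    This recovers the familiar prevalence = incidence x duration logic that the paper generalizes to processes observed over a lifetime.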

  13. Evaluation of quantitative imaging methods for organ activity and residence time estimation using a population of phantoms having realistic variations in anatomy and uptake

    International Nuclear Information System (INIS)

    He Bin; Du Yong; Segars, W. Paul; Wahl, Richard L.; Sgouros, George; Jacene, Heather; Frey, Eric C.

    2009-01-01

    Estimating organ residence times is an essential part of patient-specific dosimetry for radioimmunotherapy (RIT). Quantitative imaging methods for RIT are often evaluated using a single physical or simulated phantom but are intended to be applied clinically where there is variability in patient anatomy, biodistribution, and biokinetics. To provide a more relevant evaluation, the authors have thus developed a population of phantoms with realistic variations in these factors and applied it to the evaluation of quantitative imaging methods both to find the best method and to demonstrate the effects of these variations. Using whole body scans and SPECT/CT images, organ shapes and time-activity curves of 111In ibritumomab tiuxetan were measured in dosimetrically important organs in seven patients undergoing a high dose therapy regimen. Based on these measurements, we created a 3D NURBS-based cardiac-torso (NCAT) phantom population. SPECT and planar data at realistic count levels were then simulated using previously validated Monte Carlo simulation tools. The projections from the population were used to evaluate the accuracy and variation in accuracy of residence time estimation methods that used a time series of SPECT and planar scans. Quantitative SPECT (QSPECT) reconstruction methods were used that compensated for attenuation, scatter, and the collimator-detector response. Planar images were processed with a conventional (CPlanar) method that used geometric mean attenuation and triple-energy window scatter compensation and a quantitative planar (QPlanar) processing method that used model-based compensation for image degrading effects. Residence times were estimated from activity estimates made at each of five time points. The authors also evaluated hybrid methods that used CPlanar or QPlanar time-activity curves rescaled to the activity estimated from a single QSPECT image. The methods were evaluated in terms of mean relative error and standard deviation of the

  14. The aging population in the twenty-first century: statistics for health policy

    National Research Council Canada - National Science Library

    Gilford, Dorothy M

    1988-01-01

    ... on Statistics for an Aging Population, Sam Shapiro, Chair. Committee on National Statistics, Commission on Behavioral and Social Sciences and Education, National Research Council. National Academy Press, Washington, D.C., 1988.

  15. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  16. An audit of the statistics and the comparison with the parameter in the population

    Science.gov (United States)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed for sample statistics to closely estimate particular population parameters remains an open issue. Although a sample size may have been calculated with reference to the objective of a study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. Until now, a guideline using a p-value less than 0.05 has been widely used as inferential evidence. This study therefore audited results computed from various subsamples and statistical analyses and compared them with the parameters of three different populations. Eight types of statistical analysis, with eight subsamples for each analysis, were examined. The statistics were found to be consistent and close to the parameters when the sample covered at least 15% to 35% of the population. A larger sample size is needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters of a medium-sized population.
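    The audit design described above is easy to reproduce in miniature. The sketch below draws repeated subsamples of a synthetic numerical population at several coverage fractions and shows the sample mean tightening toward the parameter as coverage grows; the population, fractions, and replication count are arbitrary illustrative choices, not the study's data:

```python
import random
import statistics

rng = random.Random(0)
# Synthetic finite population for one numerical variable
population = [rng.gauss(50, 10) for _ in range(2000)]
mu = statistics.mean(population)  # the population parameter

mean_abs_error = {}
for frac in (0.05, 0.15, 0.35):
    n = int(frac * len(population))
    # Repeatedly subsample and record how far the statistic lands
    # from the parameter
    errors = [abs(statistics.mean(rng.sample(population, n)) - mu)
              for _ in range(200)]
    mean_abs_error[frac] = statistics.mean(errors)
    print(f"{frac:.0%} sample: mean |error| = {mean_abs_error[frac]:.3f}")
```

    The error shrinks monotonically with coverage, consistent with the study's observation that statistics stabilize once the sample covers a sizable fraction of the population.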

  17. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  18. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  19. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  20. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  2. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the year 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  3. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also includes historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  4. Directions for new developments on statistical design and analysis of small population group trials.

    Science.gov (United States)

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated where at least several hundreds of patients could be recruited. These methods may not be suitable to evaluate therapies if the sample size is unavoidably small, a situation usually termed small populations. The specific sample size cut-off where the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small

  5. A population based statistical model for daily geometric variations in the thorax

    NARCIS (Netherlands)

    Szeto, Yenny Z.; Witte, Marnix G.; van Herk, Marcel; Sonke, Jan-Jakob

    2017-01-01

    To develop a population based statistical model of the systematic interfraction geometric variations between the planning CT and first treatment week of lung cancer patients for inclusion as an uncertainty term in future probabilistic planning. Deformable image registrations between the planning CT and

  6. Radiation belt seed population and its association with the relativistic electron dynamics: A statistical study: Radiation Belt Seed Population

    International Nuclear Information System (INIS)

    Tang, C. L.; Wang, Y. X.; Ni, B.; Zhang, J.-C.

    2017-01-01

    Using the Van Allen Probes data, we study the radiation belt seed population and its association with relativistic electron dynamics during 74 geomagnetic storm events. Based on the flux changes of 1 MeV electrons before and after the storm peak, these storm events are divided into two groups, “non-preconditioned” and “preconditioned”. The statistical study shows that the storm intensity is of significant importance for the distribution of the seed population (336 keV electrons) in the outer radiation belt. However, substorm intensity can also be important to the evolution of the seed population for some geomagnetic storm events. For non-preconditioned storm events, the correlation between the peak fluxes and their L-shell locations of the seed population and relativistic electrons (592 keV, 1.0 MeV, 1.8 MeV, and 2.1 MeV) is consistent with the energy-dependent dynamic processes in the outer radiation belt. For preconditioned storm events, the correlation between the features of the seed population and relativistic electrons is not fully consistent with the energy-dependent processes. It is suggested that the good correlation between the radiation belt seed population and ≤1.0 MeV electrons contributes to the prediction of the evolution of ≤1.0 MeV electrons in the Earth’s outer radiation belt during periods of geomagnetic storms.

  7. Population activity statistics dissect subthreshold and spiking variability in V1.

    Science.gov (United States)

    Bányai, Mihály; Koman, Zsombor; Orbán, Gergő

    2017-07-01

    Response variability, as measured by fluctuating responses upon repeated performance of trials, is a major component of neural responses, and its characterization is key to interpreting high-dimensional population recordings. Response variability and covariability display predictable changes upon changes in stimulus and cognitive or behavioral state, providing an opportunity to test the predictive power of models of neural variability. Still, there is little agreement on which model to use as a building block for population-level analyses, and the model of variability is often treated as a matter of choice. We investigate two competing models, the doubly stochastic Poisson (DSP) model, which assumes stochasticity at spike generation, and the rectified Gaussian (RG) model, which traces variability back to membrane potential variance, to analyze stimulus-dependent modulation of both single-neuron and pairwise response statistics. Using a pair of model neurons, we demonstrate that the two models predict similar single-cell statistics. However, the DSP and RG models make contradictory predictions about the joint statistics of spiking responses. To test the models against data, we build a population model to simulate stimulus change-related modulations in pairwise response statistics. We use single-unit data from the primary visual cortex (V1) of monkeys to show that while model predictions for variance are qualitatively similar to experimental data, only the RG model's predictions are compatible with joint statistics. These results suggest that models using Poisson-like variability might fail to capture important properties of response statistics. We argue that membrane potential-level modeling of stochasticity provides an efficient strategy to model correlations. NEW & NOTEWORTHY Neural variability and covariability are puzzling aspects of cortical computations. For efficient decoding and prediction, models of information encoding in neural populations hinge on an appropriate model of
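    The contrast between the two variability models can be illustrated with a toy simulation. Both generators below are deliberately simplified stand-ins (the RG readout in particular is a deterministic rounding of a rectified Gaussian rate, not the authors' full model), intended only to show how rate fluctuations push spike-count variability above the Poisson level:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's simple Poisson sampler (fine for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def dsp_count(mean_rate, rate_sd, rng):
    """Doubly stochastic Poisson: the rate fluctuates across trials,
    then spikes are generated as a Poisson count."""
    rate = max(rng.gauss(mean_rate, rate_sd), 0.0)
    return poisson(rate, rng)

def rg_count(mean_u, sd_u, gain, rng):
    """Rectified Gaussian: a Gaussian 'membrane potential' is
    thresholded at zero and scaled into a count (a deliberately
    simplified, deterministic readout)."""
    u = rng.gauss(mean_u, sd_u)
    return round(gain * max(u, 0.0))

def fano(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m

rng = random.Random(7)
n = 20000
dsp = [dsp_count(5.0, 2.0, rng) for _ in range(n)]
rg = [rg_count(1.0, 0.8, 5.0, rng) for _ in range(n)]
print(fano(dsp))  # > 1: rate fluctuations add supra-Poisson variance
print(fano(rg))
```

    For a mixed Poisson process the count variance equals the mean plus the rate variance, so the DSP Fano factor exceeds 1; the two models can nevertheless be tuned to match single-cell statistics, which is why, as the abstract argues, joint statistics are needed to tell them apart.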

  8. A statistical method for testing epidemiological results, as applied to the Hanford worker population

    International Nuclear Information System (INIS)

    Brodsky, A.

    1979-01-01

    Some recent reports of Mancuso, Stewart and Kneale claim findings of radiation-produced cancer in the Hanford worker population. These claims are based on statistical computations that use small differences in accumulated exposures between groups dying of cancer and groups dying of other causes; actual mortality and longevity were not reported. This paper presents a statistical method for evaluation of actual mortality and longevity longitudinally over time, as applied in a primary analysis of the mortality experience of the Hanford worker population. Although available, this method was not utilized in the Mancuso-Stewart-Kneale paper. The author's preliminary longitudinal analysis shows that the gross mortality experience of persons employed at Hanford during the 1943-70 interval did not differ significantly from that of certain controls, when both employees and controls were selected from families with two or more offspring and comparisons were matched by age, sex, race and year of entry into employment. This result is consistent with findings reported by Sanders (Health Phys. vol.35, 521-538, 1978). The method utilizes an approximate chi-square (1 D.F.) statistic for testing population subgroup comparisons, as well as the cumulation of chi-squares (1 D.F.) for testing the overall result of a particular type of comparison. The method is available for computer testing of the Hanford mortality data, and could also be adapted to morbidity or other population studies. (author)
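
    The cumulation described above can be sketched as follows: each matched subgroup comparison yields an approximate chi-square statistic with 1 degree of freedom, and independent statistics sum to a chi-square with as many degrees of freedom as there are comparisons. This is an illustrative reconstruction, not Brodsky's exact formulation; the discordant-pair form of the statistic and the counts are assumptions.

```python
def chi2_1df(a, b):
    # Approximate chi-square (1 d.f.) for one matched comparison:
    # a = comparisons where the employee died first,
    # b = comparisons where the matched control died first.
    return (a - b) ** 2 / (a + b)

def cumulative_chi2(pairs):
    # Independent 1-d.f. chi-squares sum to a chi-square whose
    # degrees of freedom equal the number of comparisons cumulated.
    stat = sum(chi2_1df(a, b) for a, b in pairs)
    return stat, len(pairs)

# Hypothetical discordant-pair counts for three comparison types.
stat, df = cumulative_chi2([(30, 20), (14, 16), (9, 11)])
```

    The total is then referred to a chi-square table with `df` degrees of freedom to judge the overall result of that comparison type.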

  9. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

    In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative … and population genetic data. We show that in the presence of repeated selective sweeps on relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence … selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained …

  10. Reconciling Dwarf Galaxies with ΛCDM Cosmology: Simulating A Realistic Population of Satellites Around a Milky Way-Mass Galaxy

    OpenAIRE

    Wetzel, Andrew R.; Hopkins, Philip F.; Kim, Ji-Hoon; Faucher-Giguère, Claude-André; Kereš, Dušan; Quataert, Eliot

    2016-01-01

    © 2016. The American Astronomical Society. All rights reserved. Low-mass "dwarf" galaxies represent the most significant challenges to the cold dark matter (CDM) model of cosmological structure formation. Because these faint galaxies are (best) observed within the Local Group (LG) of the Milky Way (MW) and Andromeda (M31), understanding their formation in such an environment is critical. We present first results from the Latte Project: the Milky Way on Feedback in Realistic Environments (FI...

  11. Cross-population validation of statistical distance as a measure of physiological dysregulation during aging.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi

    2014-09-01

    Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
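
    The "statistical distance" above is essentially a Mahalanobis distance of an individual's biomarker profile from the population centroid. A minimal sketch, simplified to the uncorrelated case (diagonal covariance), where the distance reduces to the root sum of squared z-scores; the actual measure uses the full covariance matrix of the 14 biomarkers:

```python
import math

def statistical_distance(profile, means, sds):
    # Mahalanobis distance under a diagonal-covariance simplification:
    # z-score each biomarker against the reference population mean/sd,
    # then take the Euclidean norm of the z-score vector.
    z = [(x - m) / s for x, m, s in zip(profile, means, sds)]
    return math.sqrt(sum(v * v for v in z))
```

    A larger distance marks a "stranger" biomarker profile relative to the reference population, which is what the papers associate with age and mortality.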

  12. 4P: fast computing of population genetics statistics from large DNA polymorphism panels.

    Science.gov (United States)

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and MacOs versions are provided, as well as the source code for easier pipeline implementations.

  13. A statistical framework for the validation of a population exposure model based on personal exposure data

    Science.gov (United States)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations and infiltration of outdoor air-pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, so personal exposure data should become more and more accessible. In view of planned cohort field-campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.
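
    One simple instance of assimilating individual measurements into a population-level estimate is a conjugate normal-normal update: the modeled county-scale exposure serves as a prior, and personal sensor readings update it. This is only an illustrative stand-in for the paper's full hierarchical framework; all parameter values below are hypothetical.

```python
def posterior_exposure(prior_mean, prior_var, measurements, noise_var):
    # Conjugate normal-normal Bayesian update with known sensor noise
    # variance: precisions add, and the posterior mean is a
    # precision-weighted average of prior mean and data mean.
    n = len(measurements)
    xbar = sum(measurements) / n
    prec = 1.0 / prior_var + n / noise_var          # posterior precision
    mean = (prior_mean / prior_var + n * xbar / noise_var) / prec
    return mean, 1.0 / prec

# Hypothetical modeled prior (50 units) updated by three personal readings.
mean, var = posterior_exposure(50.0, 25.0, [58.0, 62.0, 60.0], 16.0)
```

    The posterior mean lands between the modeled prior and the sensor average, and its variance shrinks as readings accumulate.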

  14. Statistical characteristics of dynamics for population migration driven by the economic interests

    Science.gov (United States)

    Huo, Jie; Wang, Xu-Ming; Zhao, Ning; Hao, Rui

    2016-06-01

    Population migration typically occurs under some constraints, which can deeply affect the structure of a society and some other related aspects. Therefore, it is critical to investigate the characteristics of population migration. Data from the China Statistical Yearbook indicate that the regional gross domestic product per capita relates to the population size via a linear or power-law relation. In addition, the distribution of population migration sizes or relative migration strength introduced here is dominated by a shifted power-law relation. To reveal the mechanism that creates the aforementioned distributions, a dynamic model is proposed based on the population migration rule that migration is facilitated by higher financial gains and abated by fewer employment opportunities at the destination, considering the migration cost as a function of the migration distance. The calculated results indicate that the distribution of the relative migration strength is governed by a shifted power-law relation, and that the distribution of migration distances is dominated by a truncated power-law relation. These results suggest that the use of a power law to fit a distribution may not always be suitable. Additionally, from the modeling framework, one can infer that it is the randomness and determinacy that jointly create the scaling characteristics of the distributions. The calculation also demonstrates that the network formed by active nodes, representing the immigration and emigration regions, usually evolves from an ordered state with a non-uniform structure to a disordered state with a uniform structure, which is evidenced by the increasing structural entropy.
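
    The structural entropy mentioned at the end can be sketched as the normalized Shannon entropy of each node's share of total migration flow: a perfectly uniform network scores 1, a concentrated one scores lower. The normalization by ln(N) is an assumption made here for illustration.

```python
import math

def structural_entropy(flows):
    # Shannon entropy of each node's share of the total flow,
    # normalized by ln(N) so a uniform network scores exactly 1.0.
    total = sum(flows)
    ps = [f / total for f in flows if f > 0]
    if len(ps) < 2:
        return 0.0          # a single active node carries no disorder
    return -sum(p * math.log(p) for p in ps) / math.log(len(ps))
```

    As flow concentrates on a few nodes the value drops toward 0, matching the ordered-to-disordered evolution the abstract describes in reverse.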

  15. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Background. Mexico has the world’s fifth largest population of amphibians and is the country with the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species’ risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits, so including both in extinction assessments might render a more accurate picture of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables accounting for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we examined have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a valuable first step in assigning conservation priorities to poorly known species.

  16. A statistical assessment of population trends for data deficient Mexican amphibians.

    Science.gov (United States)

    Quintero, Esther; Thessen, Anne E; Arias-Caballero, Paulina; Ayala-Orozco, Bárbara

    2014-01-01

    Background. Mexico has the world's fifth largest population of amphibians and is the country with the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits, so including both in extinction assessments might render a more accurate picture of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables accounting for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we examined have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a valuable first step in assigning conservation priorities to poorly known species.
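
    As a sketch of the Random Forests idea used in the two records above (bootstrapped trees voting on a population-trend label), here is a toy stdlib-only version that uses one-feature decision stumps in place of full trees; in practice scikit-learn's `RandomForestClassifier` would be used, and the features below (range size, abundance index) are hypothetical:

```python
import random

def stump_fit(X, y, feat):
    # Exhaustively pick the threshold/polarity on one feature that
    # minimizes misclassifications on the given (bootstrap) sample.
    best = None
    for t in sorted({row[feat] for row in X}):
        for pol in (1, -1):
            pred = [pol if row[feat] > t else -pol for row in X]
            err = sum(p != label for p, label in zip(pred, y))
            if best is None or err < best[0]:
                best = (err, t, pol)
    return (feat, best[1], best[2])

def forest_fit(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        sample = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        feat = rng.randrange(d)                          # random feature per "tree"
        trees.append(stump_fit([X[i] for i in sample],
                               [y[i] for i in sample], feat))
    return trees

def forest_predict(trees, row):
    # Majority vote over all stumps.
    vote = sum(pol if row[feat] > t else -pol for feat, t, pol in trees)
    return 1 if vote > 0 else -1

# Hypothetical per-species features: [range size, abundance index].
# Label +1 = stable population trend, -1 = decreasing.
X = [[1, 1], [2, 1], [1, 2], [8, 9], [9, 8], [9, 9]]
y = [-1, -1, -1, 1, 1, 1]
trees = forest_fit(X, y)
```

    Real Random Forests also average impurity reductions per feature to rank variable importance, which is how the studies identify the traits that make species vulnerable.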

  17. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    Science.gov (United States)

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows one to detect possible errors in the dataset to be analysed and to check the validity of assumptions required for more complex analyses. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship among two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
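
    Since the review singles out the Poisson model as the most suitable for MN counts, a minimal hedged example is a likelihood-ratio test of equal Poisson MN rates in an exposed versus a control group; the statistic is compared with the chi-square 1 d.f. critical value (3.84 at the 5% level). The counts are invented, and real analyses would adjust for age, gender and smoking via Poisson regression.

```python
import math

def poisson_loglik(counts, rate):
    # Log-likelihood of i.i.d. Poisson counts at the given rate (rate > 0).
    return sum(c * math.log(rate) - rate - math.lgamma(c + 1) for c in counts)

def lr_test_equal_rates(g1, g2):
    # Likelihood-ratio statistic for H0: both groups share one MN rate.
    # Refer the result to a chi-square distribution with 1 d.f.
    r1, r2 = sum(g1) / len(g1), sum(g2) / len(g2)
    r0 = (sum(g1) + sum(g2)) / (len(g1) + len(g2))
    return 2 * (poisson_loglik(g1, r1) + poisson_loglik(g2, r2)
                - poisson_loglik(g1 + g2, r0))
```
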

  18. Non-statistically populated autoionizing levels of Li-like carbon: Hidden-crossings

    International Nuclear Information System (INIS)

    Deveney, E.F.; Krause, H.F.; Jones, N.L.

    1995-01-01

    The intensities of the Auger-electron lines from autoionizing (AI) states of Li-like (1s2s2l) configurations excited in ion-atom collisions vary as functions of the collision parameters such as, for example, the collision velocity. A statistical population of the three-electron levels is at best incomplete and underscores the intricate dynamical development of the electronic states. The authors compare several experimental studies to calculations using "hidden-crossing" techniques to explore some of the details of these Auger-electron intensity variation phenomena. The investigations show promising results suggesting that Auger-electron intensity variations can be used to probe collision dynamics

  19. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    method provides a simple goodness of fit test by comparing the observed SFS with the expected SFS under a given model of population size changes. By the use of Monte Carlo estimation the expected time between coalescent events can be estimated and the expected SFS can thereby be evaluated. Using …). The OR is interpreted as the effect of an exposure on the probability of being diseased at the end of follow-up, while the interpretation of the IRR is the effect of an exposure on the probability of becoming diseased. Through a simulation study, the OR from a classical case-control study is shown to be an inconsistent … the classical chi-square statistics we are able to infer single parameter models. Multiple parameter models, e.g. multiple epochs, are harder to identify. By introducing the inference of population size back in time as an inverse problem, the second procedure applies the theory of smoothing splines to infer …

  20. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    Science.gov (United States)

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime-scene matches that of a suspect, the weight of DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is required to establish and expand the databases that reflect the actual allele frequencies in the relevant population. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16) including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
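
    The profile-frequency (pM) calculation referred to above multiplies single-locus genotype frequencies across loci, with Wright's inbreeding coefficient F correcting the textbook Hardy-Weinberg terms: homozygotes p² + p(1−p)F, heterozygotes 2pq(1−F). A sketch under those standard formulas; the allele frequencies are invented, and forensic practice layers further corrections (e.g. minimum allele frequencies) on top:

```python
def genotype_freq(p, q=None, f=0.0):
    # Homozygote (one allele frequency): p^2 + p(1-p)F.
    # Heterozygote (two allele frequencies): 2pq(1-F).
    if q is None:
        return p * p + p * (1 - p) * f
    return 2 * p * q * (1 - f)

def profile_frequency(loci, f=0.0):
    # loci: list of (p,) for homozygous or (p, q) for heterozygous loci.
    # Multiplying across loci assumes linkage equilibrium between them.
    pm = 1.0
    for alleles in loci:
        pm *= genotype_freq(*alleles, f=f)
    return pm
```

    With F = 0.0106, as estimated for the Hungarian Databank samples, homozygote frequencies rise slightly and heterozygote frequencies fall slightly relative to F = 0.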

  1. 15 CFR 50.40 - Fee structure for statistics for city blocks in the 1980 Census of Population and Housing.

    Science.gov (United States)

    2010-01-01

    ... blocks in the 1980 Census of Population and Housing. 50.40 Section 50.40 Commerce and Foreign Trade... the 1980 Census of Population and Housing. (a) As part of the regular program of the 1980 census, the Census Bureau will publish printed reports containing certain summary population and housing statistics...

  2. Can Family Planning Service Statistics Be Used to Track Population-Level Outcomes?

    Science.gov (United States)

    Magnani, Robert J; Ross, John; Williamson, Jessica; Weinberger, Michelle

    2018-03-21

    The need for annual family planning program tracking data under the Family Planning 2020 (FP2020) initiative has contributed to renewed interest in family planning service statistics as a potential data source for annual estimates of the modern contraceptive prevalence rate (mCPR). We sought to assess (1) how well a set of commonly recorded data elements in routine service statistics systems could, with some fairly simple adjustments, track key population-level outcome indicators, and (2) whether some data elements performed better than others. We used data from 22 countries in Africa and Asia to analyze 3 data elements collected from service statistics: (1) number of contraceptive commodities distributed to clients, (2) number of family planning service visits, and (3) number of current contraceptive users. Data quality was assessed via analysis of mean square errors, using the United Nations Population Division World Contraceptive Use annual mCPR estimates as the "gold standard." We also examined the magnitude of several components of measurement error: (1) variance, (2) level bias, and (3) slope (or trend) bias. Our results indicate modest levels of tracking error for data on commodities to clients (7%) and service visits (10%), and somewhat higher error rates for data on current users (19%). Variance and slope bias were relatively small for all data elements. Level bias was by far the largest contributor to tracking error. Paired comparisons of data elements in countries that collected at least 2 of the 3 data elements indicated a modest advantage of data on commodities to clients. None of the data elements considered was sufficiently accurate to be used to produce reliable stand-alone annual estimates of mCPR. However, the relatively low levels of variance and slope bias indicate that trends calculated from these 3 data elements can be productively used in conjunction with the Family Planning Estimation Tool (FPET) currently used to produce annual m
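
    The decomposition above (variance, level bias, slope bias) can be sketched by regressing the tracking errors on time: the mean error captures level bias, the fitted slope captures trend bias, and the residual scatter is the variance component. This is a generic decomposition, not necessarily the authors' exact computation, and the series are invented.

```python
def mse_components(est, gold):
    # Decompose tracking error (service-statistics estimate minus
    # gold-standard mCPR) into level bias, slope bias and variance.
    n = len(est)
    e = [a - b for a, b in zip(est, gold)]
    level = sum(e) / n                                   # mean error
    tbar = (n - 1) / 2                                   # mean of 0..n-1
    sxx = sum((t - tbar) ** 2 for t in range(n))
    slope = sum((t - tbar) * (et - level)
                for t, et in zip(range(n), e)) / sxx
    resid = [et - level - slope * (t - tbar)
             for t, et in zip(range(n), e)]
    return {
        "level_bias": level,
        "slope_bias": slope,
        "variance": sum(r * r for r in resid) / n,
        "mse": sum(et * et for et in e) / n,
    }
```

    A series that simply runs 2 points above the gold standard shows pure level bias (the paper's dominant error source) with zero slope bias and zero variance.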

  3. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  4. A statistical investigation into the stability of iris recognition in diverse population sets

    Science.gov (United States)

    Howard, John J.; Etter, Delores M.

    2014-05-01

    Iris recognition is increasingly being deployed on population-wide scales for important applications such as border security, social service administration, criminal identification and general population management. The error rates for this incredibly accurate form of biometric identification are established using well known, laboratory quality datasets. However, it has long been acknowledged in biometric theory that not all individuals have the same likelihood of being correctly serviced by a biometric system. Typically, techniques for identifying clients that are likely to experience a false non-match or a false match error are carried out on a per-subject basis. This research makes the novel hypothesis that certain ethnic groups are more or less likely to experience a biometric error. Through established statistical techniques, we demonstrate this hypothesis to be true and document the notable effect that the ethnicity of the client has on iris similarity scores. Understanding the expected impact of ethnic diversity on iris recognition accuracy is crucial to the future success of this technology as it is deployed in areas where the target population consists of clientele from a range of geographic backgrounds, such as border crossings and immigration check points.

  5. Census 2012 Core Based Statistical Area (CBSAs) Polygons with Population Estimates, US EPA Region 9, 2014, USCB

    Data.gov (United States)

    U.S. Environmental Protection Agency — Core Based Statistical Areas (CBSAs) from the US Census Bureau's TIGER files download website, joined with 2014 population estimate data downloaded from the US...

  6. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn, and what is its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  7. Realistic Material Appearance Modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Filip, Jiří; Hatka, Martin

    2010-01-01

    Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords: bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf

  8. DHLAS: A web-based information system for statistical genetic analysis of HLA population data.

    Science.gov (United States)

    Thriskos, P; Zintzaras, E; Germenis, A

    2007-03-01

    DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using JAVA and the R system; it runs on a Java Virtual Machine and its user-interface is web-based, powered by the servlet engine TOMCAT. It utilizes STRUTS, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimations), (v) haplotype frequencies (estimated using the expectation-maximization algorithm) and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database, thus the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
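
    For the Hardy-Weinberg step, the asymptotic test at a biallelic locus compares observed genotype counts with the expectations p²n, 2pqn, q²n. A minimal sketch of that test only; DHLAS itself handles multiallelic HLA loci and also offers an exact test:

```python
def hwe_chi2(n_aa, n_ab, n_bb):
    # Asymptotic chi-square (1 d.f.) test of Hardy-Weinberg proportions
    # at a biallelic locus, from the observed genotype counts.
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)      # frequency of allele a
    q = 1.0 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

    A statistic above 3.84 rejects Hardy-Weinberg equilibrium at the 5% level; counts already in HWE proportions return (near) zero.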

  9. Population-based statistical inference for temporal sequence of somatic mutations in cancer genomes.

    Science.gov (United States)

    Rhee, Je-Keun; Kim, Tae-Min

    2018-04-20

    It is well recognized that accumulation of somatic mutations in cancer genomes plays a role in carcinogenesis; however, the temporal sequence and evolutionary relationship of somatic mutations remain largely unknown. In this study, we built a population-based statistical framework to infer the temporal sequence of acquisition of somatic mutations. Using the model, we analyzed the mutation profiles of 1954 tumor specimens across eight tumor types. As a result, we identified tumor type-specific directed networks composed of 2-15 cancer-related genes (nodes) and their mutational orders (edges). The most common ancestors identified in pairwise comparison of somatic mutations were TP53 mutations in breast, head/neck, and lung cancers. The known relationship of KRAS to TP53 mutations in colorectal cancers was identified, as well as potential ancestors of TP53 mutation such as NOTCH1, EGFR, and PTEN mutations in head/neck, lung and endometrial cancers, respectively. We also identified apoptosis-related genes enriched with ancestor mutations in lung cancers and a relationship between APC hotspot mutations and TP53 mutations in colorectal cancers. While evolutionary analysis of cancers has focused on clonal versus subclonal mutations identified in individual genomes, our analysis aims to further discriminate ancestor versus descendant mutations in population-scale mutation profiles that may help select cancer drivers with clinical relevance.

  10. A Statistical Model for Generating a Population of Unclassified Objects and Radiation Signatures Spanning Nuclear Threats

    International Nuclear Information System (INIS)

    Nelson, K.; Sokkappa, P.

    2008-01-01

    This report describes an approach for generating a simulated population of plausible nuclear threat radiation signatures spanning a range of variability that could be encountered by radiation detection systems. In this approach, we develop a statistical model for generating random instances of smuggled nuclear material. The model is based on physics principles and bounding cases rather than on intelligence information or actual threat device designs. For this initial stage of work, we focus on random models using fissile material and do not address scenarios using non-fissile materials. The model has several uses. It may be used as a component in a radiation detection system performance simulation to generate threat samples for injection studies. It may also be used to generate a threat population to be used for training classification algorithms. In addition, we intend to use this model to generate an unclassified 'benchmark' threat population that can be openly shared with other organizations, including vendors, for use in radiation detection systems performance studies and algorithm development and evaluation activities. We assume that a quantity of fissile material is being smuggled into the country for final assembly and that shielding may have been placed around the fissile material. In terms of radiation signature, a nuclear weapon is basically a quantity of fissile material surrounded by various layers of shielding. Thus, our model of smuggled material is expected to span the space of potential nuclear weapon signatures as well. For computational efficiency, we use a generic 1-dimensional spherical model consisting of a fissile material core surrounded by various layers of shielding. The shielding layers and their configuration are defined such that the model can represent the potential range of attenuation and scattering that might occur. The materials in each layer and the associated parameters are selected from probability distributions that span the
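
    The 1-dimensional layered model above can be caricatured as narrow-beam attenuation through concentric shells, I = I₀·exp(−Σ μᵢtᵢ). Realistic signature generation must also treat scattering, buildup and geometry, so this is only an illustrative fragment with invented coefficients:

```python
import math

def transmission(layers):
    # layers: list of (mu, thickness) pairs, with mu the linear
    # attenuation coefficient (1/cm) and thickness in cm.
    # Narrow-beam (no-scatter) approximation only.
    return math.exp(-sum(mu * t for mu, t in layers))
```

    Sampling the number of layers, their materials (hence μ) and thicknesses from probability distributions, as the report describes, then yields a population of attenuation factors spanning the plausible signature space.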

  11. Margin improvement initiatives: realistic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chan, P.K.; Paquette, S. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Cunning, T.A. [Department of National Defence, Ottawa, ON (Canada); French, C.; Bonin, H.W. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Pandey, M. [Univ. of Waterloo, Waterloo, ON (Canada); Murchie, M. [Cameco Fuel Manufacturing, Port Hope, ON (Canada)

    2014-07-01

    With reactor core aging, safety margins are particularly tight. Two realistic and practical approaches are proposed here to recover margins. The first project is related to the use of a small amount of neutron absorbers in CANDU Natural Uranium (NU) fuel bundles. Preliminary results indicate that the fuelling transient and subsequent reactivity peak can be lowered to improve the reactor's operating margins, with minimal impact on burnup when less than 1000 mg of absorbers is added to a fuel bundle. The second project involves the statistical analysis of fuel manufacturing data to demonstrate safety margins. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to generate input for ELESTRES and ELOCA. It is found that the fuel response distributions are far below industrial failure limits, implying that margin exists in the current fuel design. (author)
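    The distribution-fitting step in the second project can be sketched as follows. The measurement data are synthetic stand-ins (not the Cameco datasets), and the hand-off of the sampled inputs to ELESTRES/ELOCA is not shown:

```python
import numpy as np

# Illustrative only: fit a normal distribution to a synthetic
# manufacturing measurement, then draw Monte Carlo inputs for
# repeated fuel-performance-code runs.
rng = np.random.default_rng(0)

# Synthetic stand-in for a measured dimension, e.g. a pellet diameter (mm)
measured = rng.normal(loc=12.15, scale=0.01, size=500)

# Fitted parameters of the assumed normal distribution
mu, sigma = measured.mean(), measured.std(ddof=1)

# Sampled inputs for 1000 Monte Carlo code runs
mc_inputs = rng.normal(mu, sigma, size=1000)
```

    Comparing the resulting distribution of fuel responses against the industrial failure limits is then what demonstrates the available margin.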

  12. Getting realistic; Endstation Demut

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2004-01-28

    The fuel cell hype of the turn of the millennium is over, and the industry is getting realistic. If they arrive at all, fuel cell systems for private single-family and multi-family dwellings will not reach market maturity until the next decade. With a Europe-wide field test, Vaillant intends to advance the development of PEM technology. (orig.)

  13. Segmenting CT prostate images using population and patient-specific statistics for radiotherapy

    International Nuclear Information System (INIS)

    Feng, Qianjin; Foskey, Mark; Chen Wufan; Shen Dinggang

    2010-01-01

    Purpose: In the segmentation of sequential treatment-time CT prostate images acquired in image-guided radiotherapy, accurately capturing the intrapatient variation of the patient under therapy is more important than capturing interpatient variation. However, using the traditional deformable-model-based segmentation methods, it is difficult to capture intrapatient variation when the number of samples from the same patient is limited. This article presents a new deformable model, designed specifically for segmenting sequential CT images of the prostate, which leverages both population and patient-specific statistics to accurately capture the intrapatient variation of the patient under therapy. Methods: The novelty of the proposed method is twofold: First, a weighted combination of gradient and probability distribution function (PDF) features is used to build the appearance model to guide model deformation. The strengths of each feature type are emphasized by dynamically adjusting the weight between the profile-based gradient features and the local-region-based PDF features during the optimization process. An additional novel aspect of the gradient-based features is that, to alleviate the effect of feature inconsistency in the regions of gas and bone adjacent to the prostate, the optimal profile length at each landmark is calculated by statistically investigating the intensity profile in the training set. The resulting gradient-PDF combined feature produces more accurate and robust segmentations than general gradient features. Second, an online learning mechanism is used to build shape and appearance statistics for accurately capturing intrapatient variation. Results: The performance of the proposed method was evaluated on 306 images of 24 patients. Compared to traditional gradient features, the proposed gradient-PDF combination features increased the segmentation success ratio by 5.2% (from 94.1% to 99.3%). To evaluate the effectiveness of online

  14. Segmenting CT prostate images using population and patient-specific statistics for radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Qianjin; Foskey, Mark; Chen Wufan; Shen Dinggang [Biomedical Engineering College, South Medical University, Guangzhou (China) and Department of Radiology, University of North Carolina, Chapel Hill, North Carolina 27510 (United States); Department of Radiation Oncology, University of North Carolina, Chapel Hill, North Carolina 27599 (United States); Biomedical Engineering College, South Medical University, Guangzhou 510510 (China); Department of Radiology, University of North Carolina, Chapel Hill, North Carolina 27510 (United States)

    2010-08-15

    Purpose: In the segmentation of sequential treatment-time CT prostate images acquired in image-guided radiotherapy, accurately capturing the intrapatient variation of the patient under therapy is more important than capturing interpatient variation. However, using the traditional deformable-model-based segmentation methods, it is difficult to capture intrapatient variation when the number of samples from the same patient is limited. This article presents a new deformable model, designed specifically for segmenting sequential CT images of the prostate, which leverages both population and patient-specific statistics to accurately capture the intrapatient variation of the patient under therapy. Methods: The novelty of the proposed method is twofold: First, a weighted combination of gradient and probability distribution function (PDF) features is used to build the appearance model to guide model deformation. The strengths of each feature type are emphasized by dynamically adjusting the weight between the profile-based gradient features and the local-region-based PDF features during the optimization process. An additional novel aspect of the gradient-based features is that, to alleviate the effect of feature inconsistency in the regions of gas and bone adjacent to the prostate, the optimal profile length at each landmark is calculated by statistically investigating the intensity profile in the training set. The resulting gradient-PDF combined feature produces more accurate and robust segmentations than general gradient features. Second, an online learning mechanism is used to build shape and appearance statistics for accurately capturing intrapatient variation. Results: The performance of the proposed method was evaluated on 306 images of 24 patients. Compared to traditional gradient features, the proposed gradient-PDF combination features increased the segmentation success ratio by 5.2% (from 94.1% to 99.3%). To evaluate the effectiveness of online
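    The dynamic weighting between profile-based gradient features and region-based PDF features can be caricatured as below. The linear weight schedule and the score names are assumptions for illustration, not the authors' actual energy formulation:

```python
# Sketch of blending two appearance-match scores during optimization,
# shifting emphasis from gradient features (early, coarse alignment)
# toward PDF features (late, local refinement). Schedule is invented.
def combined_score(grad_score, pdf_score, iteration, n_iter):
    w = 1.0 - iteration / max(n_iter - 1, 1)  # weight decays from 1 to 0
    return w * grad_score + (1.0 - w) * pdf_score

early = combined_score(0.8, 0.4, iteration=0, n_iter=10)  # all gradient
late = combined_score(0.8, 0.4, iteration=9, n_iter=10)   # all PDF
```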

  15. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very few studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  16. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very few studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  17. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. We discuss advantages and limits of the method and its
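    Steps 2 and 3 above (trend regression plus residual downscaling) can be caricatured in a few lines. The data are synthetic, and area-to-point kriging is replaced by a trivial stand-in in which each unit's residual is simply carried to its pixels:

```python
import numpy as np

# Coarse-unit densities and a unit-level covariate (both synthetic)
unit_density = np.array([3.2, 5.1, 1.8])       # wild boar / km^2 per unit
unit_covariate = np.array([0.4, 0.7, 0.2])     # e.g. mean forest cover

# Step 2a: trend model fitted at the unit level (simple linear fit)
slope, intercept = np.polyfit(unit_covariate, unit_density, 1)

# Step 2b: residuals at the units (to be spatially downscaled by kriging
# in the real method; carried over unchanged in this toy version)
residual = unit_density - (slope * unit_covariate + intercept)

# Step 3: fine-scale prediction = trend at pixel covariates + residual
pixel_covariate = np.array([0.35, 0.65, 0.25])  # one pixel per unit here
pixel_density = slope * pixel_covariate + intercept + residual
```

    By construction, trend plus residual reproduces the unit densities exactly, which is the coherence property the decomposition relies on.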

  18. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
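    The design-based estimation the abstract refers to can be illustrated with a tiny artificial population; the unit values and sample size below are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# A small artificial population of unit values (e.g. tree volumes)
population = np.array([4., 9., 2., 7., 5., 8., 3., 6.])
N, n = len(population), 4

# Simple random sample without replacement
sample = population[rng.choice(N, size=n, replace=False)]

# Under SRS the inclusion probability is n/N, so the
# Horvitz-Thompson-type expansion estimator of the total is:
total_hat = (N / n) * sample.sum()
```

    The design (how units are selected) and the estimator (how the sample is expanded) must match, which is the interrelationship the text emphasizes.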

  19. Realistic Planning Scenarios.

    Science.gov (United States)

    1987-07-01

    independent multiracial government, dominated primarily by the Zulu tribe and the local Asian population, had been proclaimed and aspired to control all of the ... concentrated most of South Africa's remaining English-speaking population, and by the reigning Chief of the Zulu tribe, speaking for the self-styled ... Africa. Facilities in one or more northern African countries -- Morocco, Egypt, Sudan, Kenya, Somalia -- could be critical to U.S. military actions in the

  20. A comparison of statistical methods for genomic selection in a mice population

    Directory of Open Access Journals (Sweden)

    Neves Haroldo HR

    2012-11-01

    Background: The availability of high-density panels of SNP markers has opened new perspectives for marker-assisted selection strategies, such that genotypes for these markers are used to predict the genetic merit of selection candidates. Because the number of markers is often much larger than the number of phenotypes, marker effect estimation is not a trivial task. The objective of this research was to compare the predictive performance of ten different statistical methods employed in genomic selection, by analyzing data from a heterogeneous stock mice population. Results: For the five traits analyzed (W6W: weight at six weeks; WGS: growth slope; BL: body length; %CD8+: percentage of CD8+ cells; CD4+/CD8+: ratio between CD4+ and CD8+ cells), within-family predictions were more accurate than across-family predictions, although this superiority in accuracy varied markedly across traits. For within-family prediction, two kernel methods, Reproducing Kernel Hilbert Spaces Regression (RKHS) and Support Vector Regression (SVR), were the most accurate for W6W, while a polygenic model also had comparable performance. A form of ridge regression assuming that all markers contribute to the additive variance (RR_GBLUP) figured among the most accurate for WGS and BL, while two variable selection methods (LASSO and Random Forest, RF) had the greatest predictive abilities for %CD8+ and CD4+/CD8+. RF, RKHS, SVR and RR_GBLUP outperformed the remaining methods in terms of bias and inflation of predictions. Conclusions: Methods with large conceptual differences reached very similar predictive abilities and a clear re-ranking of methods was observed as a function of the trait analyzed. Variable selection methods were more accurate than the others for %CD8+ and CD4+/CD8+, and these traits are likely to be influenced by a smaller number of QTL than the others. Judged by their overall performance across traits and computational requirements, RR
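    One of the methods named above, ridge regression with a common variance contribution for all markers (RR_GBLUP-style), can be sketched on simulated genotypes. The data, dimensions, and penalty value are illustrative assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 0/1/2 genotype matrix and polygenic phenotypes
n_animals, n_markers = 200, 500
X = rng.integers(0, 3, size=(n_animals, n_markers)).astype(float)
true_beta = rng.normal(0, 0.05, n_markers)     # many small marker effects
y = X @ true_beta + rng.normal(0, 1.0, n_animals)

# Ridge solution: all markers shrunk equally toward zero
lam = 100.0  # illustrative penalty, not tuned
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

# Genomic estimated breeding values
gebv = X @ beta_hat
```

    Shrinking all markers equally is exactly the assumption that distinguishes RR_GBLUP from variable-selection methods such as LASSO, which instead set most effects to zero.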

  1. Statistical Support for Analysis of the Social Stratification and Economic Inequality of the Country’s Population

    Directory of Open Access Journals (Sweden)

    Aksyonova Irina V.

    2017-12-01

    The aim of the article is to summarize the theoretical and methodological as well as the information and analytical support for statistical research on economic and social stratification in society, and to analyse the differentiation of the population of Ukraine in terms of the economic component of social inequality. The theoretical and methodological level of the research is examined, and criteria for social stratification and inequality in society, together with systems, models and theories of social stratification of the population, are singled out. Indicators of social and economic statistics on the differentiation of the population by income level are considered as the research tools. The analysis concludes that the economic inequality of the population leads to changes in the social structure, which requires the formation of a new social stratification of society. The basis of social stratification is indicators of the population's well-being, which require comprehensive study. Prospects for further research in this area are the analysis of the components of economic inequality that determine and influence the social stratification of the country's population, the formation of the middle class, and the study of the components of the human development index as a cross-cutting indicator of the socio-economic inequality of the population.

  2. Derivation of cell population kinetic parameters from clinical statistical data (program RAD3)

    International Nuclear Information System (INIS)

    Cohen, L.

    1978-01-01

    Cellular lethality models generally require up to 6 parameters to simulate a clinical course of fractionated radiation therapy and to derive an estimate of the cellular surviving fraction for a given treatment scheme. These parameters are the mean cellular lethal dose, the extrapolation number, the ratio of sublethal to irreparable events, the regeneration rate, the repopulation limit (cell cycles), and a field-size or tumor-volume factor. A computer program (RAD3) was designed to derive best-fitting values for these parameters in relation to available clinical data based on the assumption that if a number of different fractionation schemes yield similar reactions, the cellular surviving fractions will be about equal in each instance. Parameters were derived for a variety of human tissues from which realistic iso-effect functions could be generated
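    As a rough illustration of how such parameters feed a survival calculation, a classical multitarget model can be applied per fraction. The D0 and extrapolation-number values below are hypothetical, and repair, regeneration, and volume effects (the remaining RAD3 parameters) are omitted:

```python
import numpy as np

# Multitarget single-fraction survival: S1 = 1 - (1 - exp(-d/D0))**n,
# raised to the number of fractions. D0 = mean cellular lethal dose,
# n_extrap = extrapolation number; both values here are illustrative.
def surviving_fraction(d_per_fraction, n_fractions, D0=1.2, n_extrap=3.0):
    s1 = 1.0 - (1.0 - np.exp(-d_per_fraction / D0)) ** n_extrap
    return s1 ** n_fractions

# Example: a conventional course of 30 fractions of 2 Gy
sf = surviving_fraction(2.0, 30)
```

    RAD3's fitting task is the inverse problem: choose the parameter values so that fractionation schemes known to produce similar clinical reactions yield approximately equal surviving fractions.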

  3. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  4. [Analysis of the definitive statistics of the 11th General Population and Housing Census].

    Science.gov (United States)

    Aguayo Hernandez, J R

    1992-01-01

    The 11th General Census of Population and Housing conducted in March 1990 enumerated 2,204,054 inhabitants in Sinaloa, for a density of 37.9 per sq. km. Sinaloa's population thus increased sevenfold from 297,000 in 1900. The proportion of Sinaloans in Mexico's population increased from 2.2% in 1900 to 2.7% in 1990. 38.4% of the population was under age 14, 57.0% was aged 14-64, and 4.6% was over 65. The greatest challenge for the year 2010 will be to meet the demand for educational facilities, employment, and services for the growing elderly population. Sinaloa's population grew at an annual rate of 1.1 between 1980-90. 17 of its 18 municipios showed slowing growth rates between 1980-90, with only Escuinapa increasing its rate. Sinaloa's growth rate of 1.8% is still relatively high, and the population in the year 2000 is projected at 2.6 million. Population distribution and migration present problems that should be more actively addressed. Urban-urban migration is increasing in importance. In 1990, Sinaloa had 5247 localities, of which only 85 had more than 2500 inhabitants and 4717 had fewer than 500. Growth of midsize localities with 500-2499 inhabitants may constitute an alternative allowing the demographic deconcentration and decentralization that Sinaloa urgently requires. The lack of jobs, infrastructure, educational and health services, housing, and food in the dispersed 4717 communities with fewer than 500 inhabitants makes them sources of emigration. Sinaloa's population is concentrated along the coast and in the 3 valleys of the north and central regions, which contain 80.8% of the population. One-third of the population lives on 12.1% of the territory in 2 municipios, while 12 municipios covering 67% of the territory contain just 24% of the population. Sinaloa's growth rate has declined from 4.3% between 1960-70 to 3.7% from 1970-80 and 1.8% in 1980-90.

  5. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    Science.gov (United States)

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

    We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and more integrated with other data sources compared to coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate. The close agreement between vital statistics and coroner classification of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.
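    The sensitivity figure reported above is a simple ratio of linked counts. The flagged count below is invented purely to be consistent with the reported ~96.9%; only the denominator comes from the abstract:

```python
# Sensitivity = records coded as self-harm in vital statistics,
# divided by coroner-confirmed suicides (the reference standard).
coroner_suicides = 10153       # reference-standard deaths (from the text)
vital_stats_flagged = 9836     # hypothetical count coded as self-harm

sensitivity = vital_stats_flagged / coroner_suicides
```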

  6. ddClone: joint statistical inference of clonal populations from single cell and bulk tumour sequencing data.

    Science.gov (United States)

    Salehi, Sohrab; Steif, Adi; Roth, Andrew; Aparicio, Samuel; Bouchard-Côté, Alexandre; Shah, Sohrab P

    2017-03-01

    Next-generation sequencing (NGS) of bulk tumour tissue can identify constituent cell populations in cancers and measure their abundance. This requires computational deconvolution of allelic counts from somatic mutations, which may be incapable of fully resolving the underlying population structure. Single cell sequencing (SCS) is a more direct method, although its replacement of NGS is impeded by technical noise and sampling limitations. We propose ddClone, which analytically integrates NGS and SCS data, leveraging their complementary attributes through joint statistical inference. We show on real and simulated datasets that ddClone produces more accurate results than can be achieved by either method alone.

  7. Population Genomics and the Statistical Values of Race: An Interdisciplinary Perspective on the Biological Classification of Human Populations and Implications for Clinical Genetic Epidemiological Research

    Directory of Open Access Journals (Sweden)

    Koffi N. Maglo

    2016-02-01

    The biological status and biomedical significance of the concept of race as applied to humans continue to be contentious issues, despite the use of advanced statistical and clustering methods to determine continental ancestry. It is thus imperative for researchers to understand the limitations as well as the potential uses of the concept of race in biology and biomedicine. This paper deals with the theoretical assumptions behind cluster analysis in human population genomics. Adopting an interdisciplinary approach, it demonstrates that the hypothesis attributing the clustering of human populations to frictional effects of landform barriers at continental boundaries is empirically incoherent. It then contrasts the scientific status of the cluster and cline constructs in human population genomics, and shows how clusters may be instrumentally produced. It also shows how the statistical values of race vindicate Darwin's argument that race is evolutionarily meaningless. Finally, the paper explains why, due to spatiotemporal parameters, evolutionary forces and socio-cultural factors influencing population structure, continental ancestry may be pragmatically relevant to global and public health genomics. Overall, this work demonstrates that, from a biological systematic and evolutionary taxonomical perspective, human races/continental groups or clusters have no natural meaning or objective biological reality. In fact, the utility of racial categorizations in research and in clinics can be explained by spatiotemporal parameters, socio-cultural factors and evolutionary forces affecting disease causation and treatment response.

  8. Population-based cancer survival in the United States: Data, quality control, and statistical methods.

    Science.gov (United States)

    Allemani, Claudia; Harewood, Rhea; Johnson, Christopher J; Carreira, Helena; Spika, Devon; Bonaventure, Audrey; Ward, Kevin; Weir, Hannah K; Coleman, Michel P

    2017-12-15

    Robust comparisons of population-based cancer survival estimates require tight adherence to the study protocol, standardized quality control, appropriate life tables of background mortality, and centralized analysis. The CONCORD program established worldwide surveillance of population-based cancer survival in 2015, analyzing individual data on 26 million patients (including 10 million US patients) diagnosed between 1995 and 2009 with 1 of 10 common malignancies. In this Cancer supplement, we analyzed data from 37 state cancer registries that participated in the second cycle of the CONCORD program (CONCORD-2), covering approximately 80% of the US population. Data quality checks were performed in 3 consecutive phases: protocol adherence, exclusions, and editorial checks. One-, 3-, and 5-year age-standardized net survival was estimated using the Pohar Perme estimator and state- and race-specific life tables of all-cause mortality for each year. The cohort approach was adopted for patients diagnosed between 2001 and 2003, and the complete approach for patients diagnosed between 2004 and 2009. Articles in this supplement report population coverage, data quality indicators, and age-standardized 5-year net survival by state, race, and stage at diagnosis. Examples of tables, bar charts, and funnel plots are provided in this article. Population-based cancer survival is a key measure of the overall effectiveness of services in providing equitable health care. The high quality of US cancer registry data, 80% population coverage, and use of an unbiased net survival estimator ensure that the survival trends reported in this supplement are robustly comparable by race and state. The results can be used by policymakers to identify and address inequities in cancer survival in each state and for the United States nationally. Cancer 2017;123:4982-93. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  9. Non-Gaussianity and statistical anisotropy from vector field populated inflationary models

    CERN Document Server

    Dimastrogiovanni, Emanuela; Matarrese, Sabino; Riotto, Antonio

    2010-01-01

    We present a review of vector field models of inflation and, in particular, of the statistical anisotropy and non-Gaussianity predictions of models with SU(2) vector multiplets. Non-Abelian gauge groups introduce a richer amount of predictions compared to the Abelian ones, mostly because of the presence of vector fields self-interactions. Primordial vector fields can violate isotropy leaving their imprint in the comoving curvature fluctuations zeta at late times. We provide the analytic expressions of the correlation functions of zeta up to fourth order and an analysis of their amplitudes and shapes. The statistical anisotropy signatures expected in these models are important and, potentially, the anisotropic contributions to the bispectrum and the trispectrum can overcome the isotropic parts.

  10. CONSTRUCTION OF STATISTICAL MODEL THE OVERALL POPULATION OF THE RUSSIAN FEDERATION ON THE BASIS OF RETROSPECTIVE FORECAST

    Directory of Open Access Journals (Sweden)

    Ol’ga Sergeevna Kochegarova

    2017-06-01

    The article considers a retrospective forecast of the total population of the Russian Federation for the period 2001-2017: a comparative analysis of the actual total population of the Russian Federation as of 20.03.2017, according to the Federal State Statistics Service, against the forecast values. The forecasting model was selected by fitting growth curves on the basis of correlation and regression analysis and the least squares method. The regression equation was chosen so as to minimize the approximation error of the time-series levels. Analysis of the significance of the selected regression equation by statistical methods supports the choice of model and the possibility of its use for population estimates. Purpose: to assess the significance of the selected regression equations for forecasting the population. Methodology: fitting of growth curves on the basis of correlation and regression analysis and the least squares method. Results: the effectiveness of the constructed model for forecasting demographic processes was confirmed. Practical implications: the obtained results can be used when building forecasts of demographic processes.
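    A retrospective forecast of this kind can be sketched with a least-squares growth-curve fit: fit on all but the latest observation, then compare the forecast against the held-out actual value. The series below is synthetic, not the actual Rosstat data, and the linear curve is just the simplest member of the growth-curve family:

```python
import numpy as np

# Synthetic annual population series (millions), roughly level with noise
years = np.arange(2001, 2017)
pop_mln = (146.3 - 0.1 * (years - 2001)
           + np.random.default_rng(5).normal(0, 0.05, years.size))

# Fit a straight line by least squares on all but the final year
coef = np.polyfit(years[:-1], pop_mln[:-1], 1)

# Retrospective forecast of the held-out final year, and its relative error
forecast_last = np.polyval(coef, years[-1])
approx_error = abs(forecast_last - pop_mln[-1]) / pop_mln[-1]
```

    A small approximation error on the held-out year is the kind of evidence the article uses to justify the model for forward projections.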

  11. Implementing a generic method for bias correction in statistical models using random effects, with spatial and population dynamics examples

    DEFF Research Database (Denmark)

    Thorson, James T.; Kristensen, Kasper

    2016-01-01

    Statistical models play an important role in fisheries science when reconciling ecological theory with available data for wild populations or experimental studies. Ecological models increasingly include both fixed and random effects, and are often estimated using maximum likelihood techniques...... configurations of an age-structured population dynamics model. This simulation experiment shows that the epsilon-method and the existing bias-correction method perform equally well in data-rich contexts, but the epsilon-method is slightly less biased in data-poor contexts. We then apply the epsilon......-method to a spatial regression model when estimating an index of population abundance, and compare results with an alternative bias-correction algorithm that involves Markov-chain Monte Carlo sampling. This example shows that the epsilon-method leads to a biologically significant difference in estimates of average...
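    The kind of bias the epsilon-method corrects arises whenever a nonlinear function of a random effect is reported: in general E[f(u)] differs from f(E[u]). The simplest closed-form case, shown here as an assumed illustration rather than the paper's method, is the lognormal mean:

```python
import numpy as np

# For u ~ Normal(mu, sigma^2), E[exp(u)] = exp(mu + sigma^2 / 2),
# so the naive plug-in exp(E[u]) is biased low.
mu, sigma = 0.0, 1.0
naive = np.exp(mu)                      # plug-in estimate -> 1.0
corrected = np.exp(mu + sigma**2 / 2)   # true mean of the lognormal

# Empirical check by simulation
rng = np.random.default_rng(11)
empirical = np.exp(rng.normal(mu, sigma, 200_000)).mean()
```

    The epsilon-method generalizes this correction to arbitrary derived quantities of random-effects models, where no closed form is available.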

  12. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    that can be impractical and sometime impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently...... developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical...... phenomena captured in the reference photographs, (i.e. the transfer of photographic-realism). An overview of most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of proposed methods to visual survey, virtual cinematography, as well as mobile...

  13. Generating realistic images using Kray

    Science.gov (United States)

    Tanski, Grzegorz

    2004-07-01

    Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing, and photon mapping.

  14. FINANCIAL BEHAVIOR OF POPULATION ON LOAN FOR HOUSING - A STATISTICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Ana-Gabriela Babucea

    2009-11-01

    Full Text Available The economic crisis has halted the consumerism momentum of Romanians. The prospect of job loss, exchange-rate changes, or inflation has made many people extremely cautious when contracting a loan, for fear they cannot repay it. In this paper we undertake a retrospective analysis of the growth of housing credit to the population and build a multifactorial regression model for the factors through which the economic crisis exerts its influence.

  15. Surface Area of Patellar Facets: Inferential Statistics in the Iraqi Population

    Directory of Open Access Journals (Sweden)

    Ahmed Al-Imam

    2017-01-01

    Full Text Available Background. The patella is the largest sesamoid bone in the body; its three-dimensional complexity necessitates biomechanical perfection. Numerous pathologies occur at the patellofemoral unit which may end in degenerative changes. This study aims to test for statistical correlation between the surface areas of the patellar facets and other patellar morphometric parameters. Materials and Methods. Forty dry human patellae were studied. The morphometry of each patella was measured using a digital Vernier caliper, an electronic balance, and the image analysis software ImageJ. The patellar facetal surface area was correlated with patellar weight, height, width, and thickness. Results. Inferential statistics demonstrated a linear correlation between total facetal surface area and patellar weight, height, width, and thickness. The correlation was strongest for surface area versus patellar weight. The lateral facetal area was consistently larger than the medial facetal area; the p value was <0.001 (one-tailed t-test) for right patellae, and a similarly significant p value of <0.001 (one-tailed t-test) was found for left patellae. Conclusion. These data are vital for the restoration of the normal biomechanics of the patellofemoral unit; they should be consulted during knee surgeries and implant design and are of indispensable anthropometric, interethnic, and biometric value.
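A minimal sketch of the two inferential steps this abstract reports, on synthetic morphometry: Pearson correlation of facet area against weight, and a one-tailed paired t statistic for lateral versus medial areas. The sample size matches the study's 40 patellae, but all values, units, and the lateral/medial split are invented.

```python
# Synthetic illustration of correlation + one-tailed paired t-test.
import numpy as np

rng = np.random.default_rng(0)
n = 40
weight = rng.uniform(10.0, 25.0, n)                  # grams (hypothetical)
area = 30.0 + 5.0 * weight + rng.normal(0, 4.0, n)   # mm^2, linear in weight
lateral = area * 0.55 + rng.normal(0, 1.0, n)        # lateral facet made larger
medial = area * 0.45 + rng.normal(0, 1.0, n)

r = np.corrcoef(weight, area)[0, 1]        # strength of the linear correlation

d = lateral - medial                       # paired differences
t = d.mean() / (d.std(ddof=1) / np.sqrt(n))  # paired t statistic, H1: mean(d) > 0
```

A large positive t here corresponds to the one-tailed test of "lateral facet larger than medial" reported in the abstract.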

  16. The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.

    Directory of Open Access Journals (Sweden)

    Andrzej Tukiendorf

    Full Text Available In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of both methods revealed strong agreement. Direct agreement between M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of Marczewski-Steinhaus' taxonomic approach as a tool for the detection of novel patterns of data in ischemic stroke patients and the prediction of disease outcome. In terms of the identification of fairly frequent types of stroke patients using their age, National Institutes of Health Stroke Scale (NIHSS), and diabetes mellitus (DM) status, when dealing with rough characteristics of patients, four particular types of patients are recognized, which cannot be identified by means of routine clinical methods. Following the obtained taxonomical outcomes, a strong correlation between the health status at the moment of admission to the emergency department (ED) and the subsequent recovery of patients is established. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.
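Cohen's κ corrects raw agreement for agreement expected by chance. A small sketch with an illustrative 2×2 cross-classification table (invented counts, not the study's data, which gave 83% direct agreement and κ = 0.766):

```python
# Cohen's kappa from a square agreement table (illustrative counts).
def cohens_kappa(table):
    """table[i][j] = count of cases labelled i by rater A and j by rater B."""
    k = len(table)
    total = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(k)) / total          # observed agreement
    row = [sum(r) for r in table]                               # rater A marginals
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    p_exp = sum(row[m] * col[m] for m in range(k)) / total ** 2   # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

table = [[45, 5], [5, 45]]      # 90% direct agreement, symmetric marginals
kappa = cohens_kappa(table)     # 0.8 for this table
```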

  17. The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.

    Science.gov (United States)

    Tukiendorf, Andrzej; Kaźmierski, Radosław; Michalak, Sławomir

    2013-01-01

    In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of both methods revealed strong agreement. Direct agreement between M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of Marczewski-Steinhaus' taxonomic approach as a tool for the detection of novel patterns of data in ischemic stroke patients and the prediction of disease outcome. In terms of the identification of fairly frequent types of stroke patients using their age, National Institutes of Health Stroke Scale (NIHSS), and diabetes mellitus (DM) status, when dealing with rough characteristics of patients, four particular types of patients are recognized, which cannot be identified by means of routine clinical methods. Following the obtained taxonomical outcomes, a strong correlation between the health status at the moment of admission to the emergency department (ED) and the subsequent recovery of patients is established. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.

  18. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
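The normal-distribution machinery this article introduces leads directly to a confidence interval for a sample mean. A sketch with hypothetical measurements, using the familiar 1.96 critical value of the standard normal distribution:

```python
# 95% confidence interval for a population mean from a small sample,
# assuming approximate normality (hypothetical measurements).
import math

sample = [7.2, 6.8, 7.5, 7.1, 6.9, 7.3, 7.0, 7.2]
n = len(sample)
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / (n - 1)   # sample variance
se = math.sqrt(var / n)                                # standard error of the mean

z = 1.96                        # ~95% coverage under a normal model
ci = (mean - z * se, mean + z * se)
```

For samples this small a t critical value would widen the interval slightly; the z value is used here to match the article's normal-distribution framing.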

  19. Identifying Copy Number Variants under Selection in Geographically Structured Populations Based on F-statistics

    Directory of Open Access Journals (Sweden)

    Hae-Hiang Song

    2012-06-01

    Full Text Available Large-scale copy number variants (CNVs) in the human provide the raw material for delineating population differences, as natural selection may have affected at least some of the CNVs thus far discovered. Although the examination of relatively large numbers of specific ethnic groups has recently started in regard to inter-ethnic group differences in CNVs, identifying and understanding particular instances of natural selection have not been performed. The traditional FST measure, obtained from differences in allele frequencies between populations, has been used to identify CNV loci subject to geographically varying selection. Here, we review advances in, and the application of, multinomial-Dirichlet likelihood methods of inference for identifying genome regions that have been subject to natural selection with the FST estimates. The content presented is not new; however, this review clarifies how the application of the methods to CNV data, which remains largely unexplored, is possible. A hierarchical Bayesian method, which is implemented via Markov chain Monte Carlo, estimates locus-specific FST and can identify outlying CNV loci with large values of FST. By applying this Bayesian method to the publicly available CNV data, we identified the CNV loci that show signals of natural selection, which may elucidate the genetic basis of human disease and diversity.
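For a biallelic locus in two demes, the locus-level FST underlying this outlier approach reduces to comparing pooled and within-population heterozygosity. A sketch with invented allele frequencies (the hierarchical Bayesian machinery reviewed here adds a prior model on top of this basic quantity):

```python
# Wright's FST for a biallelic locus in two populations (toy frequencies).
def fst(p1, p2):
    """FST = (H_T - H_S) / H_T for allele frequencies p1, p2 in two demes."""
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                        # pooled heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2    # mean within demes
    return (h_t - h_s) / h_t

similar = fst(0.55, 0.45)     # nearly identical frequencies: FST near 0
divergent = fst(0.9, 0.1)     # strongly diverged frequencies: outlier candidate
```

Loci with unusually large FST relative to the genome-wide distribution are the candidates for geographically varying selection.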

  20. Population dynamics at high Reynolds number

    NARCIS (Netherlands)

    Perlekar, P.; Benzi, R.; Nelson, D.R.; Toschi, F.

    2010-01-01

    We study the statistical properties of population dynamics evolving in a realistic two-dimensional compressible turbulent velocity field. We show that the interplay between turbulent dynamics and population growth and saturation leads to quasi-localization and a remarkable reduction in the carrying capacity.

  1. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  2. The unauthorized Mexican immigrant population and welfare in Los Angeles County: a comparative statistical analysis.

    Science.gov (United States)

    Marcelli, E A; Heer, D M

    1998-01-01

    "Using a unique 1994 Los Angeles County Household Survey of foreign-born Mexicans and the March 1994 and 1995 Current Population Surveys, we estimate the number of unauthorized Mexican immigrants (UMIs) residing in Los Angeles County, and compare their use of seven welfare programs with that of other non-U.S. citizens and U.S. citizens. Non-U.S. citizens were found to be no more likely than U.S. citizens to have used welfare, and UMIs were 11% (14%) less likely than other non-citizens (U.S.-born citizens).... We demonstrate how results differ depending on the unit of analysis employed, and on which programs constitute 'welfare'." excerpt

  3. A statistical dynamics approach to the study of human health data: Resolving population scale diurnal variation in laboratory data

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2010-01-01

    Statistical physics and information theory are applied to the clinical chemistry measurements present in a patient database containing 2.5 million patients' data over a 20-year period. Despite the seemingly naive approach of aggregating all patients over all times (with respect to particular clinical chemistry measurements), both a diurnal signal in the decay of the time-delayed mutual information and the presence of two sub-populations with differing health are detected. This provides a proof in principle that the highly fragmented data in electronic health records have potential for being useful in defining disease and human phenotypes.
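Time-delayed mutual information, the quantity whose decay carried the diurnal signal, can be estimated from a 2-D histogram of a series against its own lagged copy. The sketch below uses an AR(1) process as a stand-in for a clinical time series, so that dependence decays with lag; the bin count and series length are arbitrary choices.

```python
# Histogram estimate of time-delayed mutual information (nats) on a
# synthetic AR(1) series; dependence, and hence MI, decays with the lag.
import numpy as np

def delayed_mi(x, lag, bins=8):
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)       # marginal of x(t)
    py = p.sum(axis=0, keepdims=True)       # marginal of x(t + lag)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n = 5000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.9 * x[i - 1] + rng.normal()    # correlation decays as 0.9**lag

mi_short = delayed_mi(x, lag=1)     # strong dependence at short delay
mi_long = delayed_mi(x, lag=100)    # essentially independent at long delay
```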

  4. Statistical Methods for Population Genetic Inference Based on Low-Depth Sequencing Data from Modern and Ancient DNA

    DEFF Research Database (Denmark)

    Korneliussen, Thorfinn Sand

    Due to the recent advances in DNA sequencing technology genomic data are being generated at an unprecedented rate and we are gaining access to entire genomes at population level. The technology does, however, not give direct access to the genetic variation and the many levels of preprocessing...... that is required before being able to make inferences from the data introduces multiple levels of uncertainty, especially for low-depth data. Therefore methods that take into account the inherent uncertainty are needed for being able to make robust inferences in the downstream analysis of such data. This poses...... a problem for a range of key summary statistics within populations genetics where existing methods are based on the assumption that the true genotypes are known. Motivated by this I present: 1) a new method for the estimation of relatedness between pairs of individuals, 2) a new method for estimating...

  5. Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example

    Directory of Open Access Journals (Sweden)

    Gmel Gerrit

    2010-03-01

    Full Text Available Background: Alcohol consumption is a major risk factor in the global burden of disease, with overall volume of exposure as the principal underlying dimension. Two main sources of data on volume of alcohol exposure are available: surveys and per capita consumption derived from routine statistics such as taxation. As both sources have significant problems, this paper presents an approach that triangulates information from both sources into disaggregated estimates in line with the overall level of per capita consumption. Methods: A modeling approach was applied to the US using data from a large and representative survey, the National Epidemiologic Survey on Alcohol and Related Conditions. Different distributions (log-normal, gamma, Weibull) were used to model consumption among drinkers in subgroups defined by sex, age, and ethnicity. The gamma distribution was used to shift the fitted distributions in line with the overall volume as derived from per capita estimates. Implications for alcohol-attributable fractions were presented, using liver cirrhosis as an example. Results: The triangulation of survey data with aggregated per capita consumption data proved feasible and allowed for modeling of alcohol exposure disaggregated by sex, age, and ethnicity. These models can be used in combination with risk relations for burden of disease calculations. Sensitivity analyses showed that the chosen gamma distribution yielded very similar results, in terms of fit and alcohol-attributable mortality, to the other tested distributions. Conclusions: Modeling alcohol consumption via the gamma distribution was feasible. To further refine this approach, research should focus on the main assumptions underlying the approach to explore differences between volume estimates derived from surveys and per capita consumption figures.
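The core of the triangulation can be sketched as a method-of-moments gamma fit to survey data, followed by rescaling the scale parameter so the distribution's mean matches the external per-capita figure. The synthetic consumption data and the 18 g/day target below are invented for illustration.

```python
# Gamma fit (method of moments) to synthetic survey consumption, then a
# rescaling of the scale parameter to match an external per-capita mean.
import numpy as np

rng = np.random.default_rng(2)
survey = rng.gamma(shape=1.5, scale=8.0, size=2000)   # grams/day, synthetic

# Method of moments: shape k = mean^2/var, scale theta = var/mean.
m, v = survey.mean(), survey.var()
k, theta = m * m / v, v / m

# Shifting to the per-capita mean: for a gamma distribution the mean is
# k * theta, so rescaling theta preserves the shape while moving the mean.
per_capita_mean = 18.0
theta_adj = per_capita_mean / k
```

Rescaling only theta keeps the shape of the consumption distribution from the survey while forcing its total volume to agree with the routine statistics.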

  6. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  7. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  8. Full dose reduction potential of statistical iterative reconstruction for head CT protocols in a predominantly pediatric population

    Science.gov (United States)

    Mirro, Amy E.; Brady, Samuel L.; Kaufman, Robert A.

    2016-01-01

    Purpose: To implement the maximum level of statistical iterative reconstruction that can be used to establish dose-reduced head CT protocols in a primarily pediatric population. Methods: Select head examinations (brain, orbits, sinus, maxilla, and temporal bones) were investigated. Dose-reduced head protocols using adaptive statistical iterative reconstruction (ASiR) were compared for image quality with the original filtered back projection (FBP) reconstructed protocols in phantom using the following metrics: image noise frequency (change in perceived appearance of noise texture), image noise magnitude, contrast-to-noise ratio (CNR), and spatial resolution. Dose reduction estimates were based on computed tomography dose index (CTDIvol) values. Patient CTDIvol and image noise magnitude were assessed in 737 pre- and post-dose-reduction examinations. Results: Image noise texture was acceptable up to 60% ASiR for the Soft reconstruction kernel (at both 100 and 120 kVp), and up to 40% ASiR for the Standard reconstruction kernel. Implementation of 40% and 60% ASiR led to an average reduction in CTDIvol of 43% for brain, 41% for orbits, 30% for maxilla, 43% for sinus, and 42% for temporal bone protocols for patients between 1 month and 26 years, while maintaining an average noise magnitude difference of 0.1% (range: −3% to 5%), improving CNR of low-contrast soft tissue targets, and improving spatial resolution of high-contrast bony anatomy, as compared to FBP. Conclusion: This study demonstrates a methodology for maximizing patient dose reduction while maintaining image quality using statistical iterative reconstruction for a primarily pediatric population undergoing head CT examination. PMID:27056425

  9. Planetary populations in the mass-period diagram: A statistical treatment of exoplanet formation and the role of planet traps

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Yasuhiro [Currently EACOA Fellow at Institute of Astronomy and Astrophysics, Academia Sinica (ASIAA), Taipei 10641, Taiwan. (China); Pudritz, Ralph E., E-mail: yasu@asiaa.sinica.edu.tw, E-mail: pudritz@physics.mcmaster.ca [Also at Origins Institute, McMaster University, Hamilton, ON L8S 4M1, Canada. (Canada)

    2013-11-20

    The rapid growth of observed exoplanets has revealed the existence of several distinct planetary populations in the mass-period diagram. Two of the most surprising are (1) the concentration of gas giants around 1 AU and (2) the accumulation of a large number of low-mass planets with tight orbits, also known as super-Earths and hot Neptunes. We have recently shown that protoplanetary disks have multiple planet traps that are characterized by orbital radii in the disks and halt rapid type I planetary migration. By coupling planet traps with the standard core accretion scenario, we showed that one can account for the positions of planets in the mass-period diagram. In this paper, we demonstrate quantitatively that most gas giants formed at planet traps tend to end up around 1 AU, with most of these being contributed by dead zones and ice lines. We also show that a large fraction of super-Earths and hot Neptunes are formed as 'failed' cores of gas giants—this population being constituted by comparable contributions from dead zone and heat transition traps. Our results are based on the evolution of forming planets in an ensemble of disks where we vary only the lifetimes of disks and their mass accretion rates onto the host star. We show that a statistical treatment of the evolution of a large population of planetary cores caught in planet traps accounts for the existence of three distinct exoplanetary populations—the hot Jupiters, the more massive planets around r = 1 AU, and the short-period super-Earths and hot Neptunes. There are very few populations that feed into the large orbital radii characteristic of the imaged Jovian planet, which agrees with recent surveys. Finally, we find that low-mass planets in tight orbits become the dominant planetary population for low-mass stars (M* ≤ 0.7 M☉).

  10. Planetary populations in the mass-period diagram: A statistical treatment of exoplanet formation and the role of planet traps

    International Nuclear Information System (INIS)

    Hasegawa, Yasuhiro; Pudritz, Ralph E.

    2013-01-01

    The rapid growth of observed exoplanets has revealed the existence of several distinct planetary populations in the mass-period diagram. Two of the most surprising are (1) the concentration of gas giants around 1 AU and (2) the accumulation of a large number of low-mass planets with tight orbits, also known as super-Earths and hot Neptunes. We have recently shown that protoplanetary disks have multiple planet traps that are characterized by orbital radii in the disks and halt rapid type I planetary migration. By coupling planet traps with the standard core accretion scenario, we showed that one can account for the positions of planets in the mass-period diagram. In this paper, we demonstrate quantitatively that most gas giants formed at planet traps tend to end up around 1 AU, with most of these being contributed by dead zones and ice lines. We also show that a large fraction of super-Earths and hot Neptunes are formed as 'failed' cores of gas giants—this population being constituted by comparable contributions from dead zone and heat transition traps. Our results are based on the evolution of forming planets in an ensemble of disks where we vary only the lifetimes of disks and their mass accretion rates onto the host star. We show that a statistical treatment of the evolution of a large population of planetary cores caught in planet traps accounts for the existence of three distinct exoplanetary populations—the hot Jupiters, the more massive planets around r = 1 AU, and the short-period super-Earths and hot Neptunes. There are very few populations that feed into the large orbital radii characteristic of the imaged Jovian planet, which agrees with recent surveys. Finally, we find that low-mass planets in tight orbits become the dominant planetary population for low-mass stars (M* ≤ 0.7 M☉).

  11. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  12. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  13. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  14. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  15. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankin...

  16. [The application of the multidimensional statistical methods in the evaluation of the influence of atmospheric pollution on the population's health].

    Science.gov (United States)

    Surzhikov, V D; Surzhikov, D V

    2014-01-01

    The search for and measurement of causal relationships between exposure to air pollution and the health state of the population are based on system analysis and risk assessment, applied to improve the quality of research. For this purpose, modern statistical analysis is applied, using tests of independence, principal component analysis, and discriminant function analysis. The analysis separated four principal components from the set of atmospheric pollutants: for diseases of the circulatory system, the main principal component is associated with concentrations of suspended solids, nitrogen dioxide, carbon monoxide, and hydrogen fluoride; for respiratory diseases, the main principal component is closely associated with suspended solids, sulfur dioxide, nitrogen dioxide, and charcoal black. The discriminant function was shown to serve as a measure of the level of air pollution.
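The principal-component step can be sketched with a covariance eigendecomposition. The four columns below stand in for the four pollutants named in this abstract, but the data, loadings, and noise level are all invented.

```python
# Toy PCA: four pollutant series sharing one latent "pollution level".
import numpy as np

rng = np.random.default_rng(3)
base = rng.normal(size=(200, 1))                       # latent factor
# Columns: suspended solids, NO2, CO, HF (hypothetical loadings + noise).
X = base @ np.array([[1.0, 0.8, 0.6, 0.4]]) + 0.2 * rng.normal(size=(200, 4))

Xc = X - X.mean(axis=0)                    # center each pollutant series
cov = Xc.T @ Xc / (len(X) - 1)             # sample covariance matrix
eigval, eigvec = np.linalg.eigh(cov)       # eigenvalues in ascending order
explained = eigval[-1] / eigval.sum()      # variance share of the first PC
```

When one latent factor drives all pollutants, the first principal component captures most of the variance, which mirrors the "main principal component" language of the abstract.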

  17. Realistic rhetoric and legal decision

    Directory of Open Access Journals (Sweden)

    João Maurício Adeodato

    2017-06-01

    Full Text Available The text aims to lay the foundations of a realistic rhetoric, from a descriptive perspective on how legal decisions actually take place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective on judicial activism in complex societies.

  18. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

    This paper presents a generic, DBMS independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple...... to generate data even for large database schemas with complex inter- and intra table relationships. The model also makes it possible to generate data with very accurate characteristics....
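A toy version of the graph-directed generation idea: parent tables are filled first, and child rows draw their foreign keys from already-generated parent keys, which is how inter-table relationships stay consistent. Table names and row counts here are invented; the actual tool drives this from a schema graph.

```python
# Minimal sketch of referentially consistent test-data generation.
import random

random.seed(4)

# Parent table generated first.
customers = [{"id": i, "name": f"customer-{i}"} for i in range(100)]

# Child table: each foreign key is sampled from existing parent keys,
# so inter-table relationships hold by construction.
orders = []
for oid in range(500):
    parent = random.choice(customers)
    orders.append({"id": oid, "customer_id": parent["id"]})

valid_ids = {c["id"] for c in customers}
violations = [o for o in orders if o["customer_id"] not in valid_ids]
```

In a real generator the sampling distribution over parent keys (uniform, skewed, and so on) is what makes the data "realistic" rather than merely valid.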

  19. Realist cinema as world cinema

    OpenAIRE

    Nagib, Lucia

    2017-01-01

    The idea that “realism” is the common denominator across the vast range of productions normally labelled as “world cinema” is widespread and seemingly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema’s creative peaks, with the physical and historical environment,...

  20. Monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm on permanent plots: sampling methods and statistical properties of data

    Science.gov (United States)

    A.R. Mason; H.G. Paul

    1994-01-01

    Procedures for monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm are recommended based on many years' experience in sampling these species in eastern Oregon and Washington. It is shown that statistically reliable estimates of larval density can be made for a population by sampling host trees in a series of permanent plots in a...

  1. Generalized Warburg impedance on realistic self-affine fractals ...

    Indian Academy of Sciences (India)

    Administrator

    Generalized Warburg impedance on realistic self-affine fractals: Comparative study of statistically corrugated and isotropic roughness. RAJESH KUMAR and RAMA KANT. Journal of Chemical Sciences, Vol. 121, No. 5, September 2009, pp. 579–588. 1. R^c_L(ω) on page 582, column 2, para 2, after eq. (8) should read as ...

  2. Mapping cell populations in flow cytometry data for cross‐sample comparison using the Friedman–Rafsky test statistic as a distance measure

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu

    2015-01-01

    Abstract Flow cytometry (FCM) is a fluorescence‐based single‐cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap‐FR, a novel method for cell population mapping across FCM samples. FlowMap‐FR is based on the Friedman–Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap‐FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap‐FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap‐FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap‐FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap‐FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback–Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL‐distance in distinguishing

  3. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell
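    The Friedman-Rafsky statistic at the heart of FlowMap-FR can be sketched with a minimal minimum-spanning-tree implementation (an illustrative sketch of the standard FR construction, not FlowMap-FR's actual code): pool the two samples, build a Euclidean MST over the pooled points, and count the edges that join points from different samples. Well-separated populations yield few cross-sample edges; equivalent populations yield many.

    ```python
    import numpy as np

    def fr_statistic(x, y):
        """Friedman-Rafsky statistic: number of edges in the Euclidean
        minimum spanning tree of the pooled sample that connect points
        from different samples (small value -> separated distributions)."""
        pooled = np.vstack([x, y])
        labels = np.array([0] * len(x) + [1] * len(y))
        n = len(pooled)
        # full pairwise Euclidean distance matrix (fine for small n)
        d = np.linalg.norm(pooled[:, None, :] - pooled[None, :, :], axis=2)
        # Prim's algorithm, counting cross-sample edges as they are added
        in_tree = np.zeros(n, dtype=bool)
        in_tree[0] = True
        best = d[0].copy()               # cheapest edge from tree to each node
        parent = np.zeros(n, dtype=int)  # tree endpoint of that cheapest edge
        cross = 0
        for _ in range(n - 1):
            j = int(np.argmin(np.where(in_tree, np.inf, best)))
            cross += int(labels[j] != labels[parent[j]])
            in_tree[j] = True
            closer = d[j] < best
            parent[closer] = j
            best = np.minimum(best, d[j])
        return cross
    ```

    For two clearly separated clusters the MST contains a single bridging edge, so the statistic is 1; for two samples drawn from the same distribution roughly half of the edges cross between samples.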

  4. Predicting foraging wading bird populations in Everglades National Park from seasonal hydrologic statistics under different management scenarios

    Science.gov (United States)

    Kwon, Hyun-Han; Lall, Upmanu; Engel, Vic

    2011-09-01

    The ability to map relationships between ecological outcomes and hydrologic conditions in the Everglades National Park (ENP) is a key building block for their restoration program, a primary goal of which is to improve conditions for wading birds. This paper presents a model linking wading bird foraging numbers to hydrologic conditions in the ENP. Seasonal hydrologic statistics derived from a single water level recorder are well correlated with water depths throughout most areas of the ENP, and are effective as predictors of wading bird numbers when using a nonlinear hierarchical Bayesian model to estimate the conditional distribution of bird populations. Model parameters are estimated using a Markov chain Monte Carlo (MCMC) procedure. Parameter and model uncertainty is assessed as a byproduct of the estimation process. Water depths at the beginning of the nesting season, the average dry season water level, and the numbers of reversals from the dry season recession are identified as significant predictors, consistent with the hydrologic conditions considered important in the production and concentration of prey organisms in this system. Long-term hydrologic records at the index location allow for a retrospective analysis (1952-2006) of foraging bird numbers showing low frequency oscillations in response to decadal fluctuations in hydroclimatic conditions. Simulations of water levels at the index location used in the Bayesian model under alternative water management scenarios allow the posterior probability distributions of the number of foraging birds to be compared, thus providing a mechanism for linking management schemes to seasonal rainfall forecasts.

  5. Tuukka Kaidesoja on Critical Realist Transcendental Realism

    Directory of Open Access Journals (Sweden)

    Groff Ruth

    2015-09-01

    Full Text Available I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja’s objections to the views that he attributes to critical realists are not persuasive.

  6. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Directory of Open Access Journals (Sweden)

    Shi-Yi Chen

    Full Text Available Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.

  7. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Science.gov (United States)

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
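    Two of the basic allele-frequency statistics in PopSc's first and third categories, expected heterozygosity and Wright's F_ST, can be sketched directly from their textbook formulas. This is a generic illustration of the standard definitions, not PopSc's actual API; note that F_ST is undefined when the total heterozygosity H_T is zero.

    ```python
    def expected_heterozygosity(freqs):
        """He = 1 - sum(p_i^2) for allele frequencies p_i at one locus."""
        return 1.0 - sum(p * p for p in freqs)

    def wrights_fst(subpops):
        """F_ST = (H_T - mean(H_S)) / H_T over equally weighted
        subpopulations, where H_S is the within-subpopulation and H_T the
        total expected heterozygosity (from the pooled mean frequencies)."""
        k = len(subpops)
        hs = sum(expected_heterozygosity(p) for p in subpops) / k
        # mean frequency of each allele across subpopulations
        pbar = [sum(pop[i] for pop in subpops) / k
                for i in range(len(subpops[0]))]
        ht = expected_heterozygosity(pbar)
        return (ht - hs) / ht
    ```

    Two subpopulations fixed for different alleles give F_ST = 1 (complete differentiation), while identical subpopulations give F_ST = 0.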

  8. Survey of Approaches to Generate Realistic Synthetic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.

  9. Realistic Simulation of Rice Plant

    Directory of Open Access Journals (Sweden)

    Wei-long DING

    2011-09-01

    Full Text Available The existing research results on virtual modeling of the rice plant, however, are far from perfect compared to those for other crops, due to the plant's complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented based on an analysis of its morphological characteristics at different stages. Firstly, the geometrical shape, the bending status, and the structural distortion of rice leaves are simulated. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and the various types of panicle branches are generated, and the spatial shape of the rice panicle is thereby created. A parametric L-system is employed to generate its topological structures, and a finite-state automaton is adopted to describe the development of the geometrical structures. Finally, computer visualization of the three-dimensional morphologies of the rice plant at both the organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and the generated three-dimensional images are realistic.

  10. Estadísticos poblacionales de Triatoma rubrovaria en condiciones de laboratorio Population statistics of Triatoma rubrovaria in laboratory

    Directory of Open Access Journals (Sweden)

    Elena B Oscherov

    2005-04-01

    monitored weekly and kept under controlled temperature (28±3°C) and relative humidity (63±10%). A life table was constructed and other vital statistics were calculated and recorded. RESULTS: Higher mortality was recorded from the first through the fourth nymphal stadium. A constant decrease was seen from the fifth nymphal instar. Life expectancy dropped linearly after the critical stages were overcome. Mean adult survival was 50.2 weeks. The first oviposition occurred after 40.6 weeks. The fecundity was 859.6 eggs, with an average of 22.8 eggs per female. The reproductive period was 37.7 weeks. The generation time was 55.3 weeks and the net reproduction rate was 133.7. The intrinsic rate of weekly increment was 0.088. In a stable age distribution the population would be composed of 25.3% eggs, 72.3% nymphs and 2.4% adults. Adults accounted for more than 70% of the total reproductive value. CONCLUSIONS: Triatoma rubrovaria had a long survival as imago, a late first reproduction and a low intrinsic rate of natural increase.
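    The vital statistics reported in this abstract are mutually consistent under the standard life-table approximation r ≈ ln(R0)/T, where R0 is the net reproduction rate and T the generation time. A quick check, taking the reported values as givens:

    ```python
    import math

    # Values reported in the abstract for Triatoma rubrovaria
    R0 = 133.7   # net reproduction rate
    T = 55.3     # generation time, in weeks

    # First-order approximation to the Euler-Lotka intrinsic rate of increase
    r = math.log(R0) / T
    print(f"r ~= {r:.4f} per week")  # close to the reported weekly rate of 0.088
    ```

    The small remaining discrepancy is expected, since the authors presumably solved the Euler-Lotka equation exactly rather than using this approximation.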

  11. Exophobic Quasi-Realistic Heterotic String Vacua

    CERN Document Server

    Assel, Benjamin; Faraggi, Alon E; Kounnas, Costas; Rizos, John

    2009-01-01

    We demonstrate the existence of heterotic-string vacua that are free of massless exotic fields. The need to break the non-Abelian GUT symmetries in k=1 heterotic-string models by Wilson lines, while preserving the GUT embedding of the weak-hypercharge and the GUT prediction sin²θ_W(M_GUT) = 3/8, necessarily implies that the models contain states with fractional electric charge. Such states are severely restricted by observations, and must be confined or sufficiently massive and diluted. We construct the first quasi-realistic heterotic-string models in which the exotic states do not appear in the massless spectrum, and only exist, as they must, in the massive spectrum. The SO(10) GUT symmetry is broken to the Pati-Salam subgroup. Our PS heterotic-string models contain adequate Higgs representations to break the GUT and electroweak symmetry, as well as colour Higgs triplets that can be used for the missing partner mechanism. By statistically sampling the space of Pati-Salam vacua we demonstrate the abundan...

  12. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

    Full Text Available Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement technology specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide based single-channel microarrays. All this makes the model a valuable tool for example in validation of data analysis algorithms.

  13. GIS-Mapping and Statistical Analyses to Identify Climate-Vulnerable Communities and Populations Exposed to Superfund Sites

    Science.gov (United States)

    Climate change-related cumulative health risks are expected to be disproportionately greater for overburdened communities, due to differential proximity and exposures to chemical sources and flood zones. Communities and populations vulnerable to climate change-associated impacts ...

  14. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal/hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.

  15. Progress in realistic LOCA analysis

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Ohkawa, K.

    2004-01-01

    In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to calculate the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several millions of dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for

  16. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  17. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II)

    Science.gov (United States)

    2015-09-30

    Interim PCoD approach. In both of these case studies we relied on expert knowledge to link disturbance to vital rates. In the right whale case study...the Interim Population Consequences of Disturbance (PCoD) Approach: Quantifying and Assessing the Effects of UK Offshore Renewable Energy

  18. Statistical properties of superimposed stationary spike trains.

    Science.gov (United States)

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
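    A minimal PPD generator along the lines described can be sketched as follows. This is an illustrative implementation, not the authors' published algorithm: each inter-spike interval is the dead-time plus an exponential waiting time whose rate is rescaled so the train attains the target mean rate, and a superposition is simply the sorted merge of independent trains.

    ```python
    import numpy as np

    def ppd_train(rate, dead_time, t_max, rng):
        """Spike times of a Poisson process with dead-time (PPD).
        The exponential part is rescaled so the mean rate equals `rate`;
        requires rate * dead_time < 1."""
        lam = rate / (1.0 - rate * dead_time)
        times = []
        t = 0.0
        while True:
            t += dead_time + rng.exponential(1.0 / lam)
            if t > t_max:
                return np.array(times)
            times.append(t)

    def superposition(n_trains, rate, dead_time, t_max, rng):
        """Pooled spike train of n independent PPDs (which, as the abstract
        notes, is not itself a Poisson process)."""
        return np.sort(np.concatenate(
            [ppd_train(rate, dead_time, t_max, rng) for _ in range(n_trains)]))
    ```

    Every ISI of a single PPD train is at least the dead-time, while the pooled train can contain arbitrarily short ISIs between spikes from different component trains.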

  19. Functional responses of human hunters to their prey - why harvest statistics may not always reflect changes in prey population abundance

    DEFF Research Database (Denmark)

    Kahlert, Johnny Abildgaard; Fox, Anthony David; Heldbjerg, Henning

    pigeon Columba palumbus, coot Fulica atra, grey partridge Perdix perdix, roe deer Capreolus capreolus and brown hare Lepus europaeus in Denmark. If we consider hunting a form of predator-prey interaction, the annual kill can be viewed as a predator functional response to prey population size. Convergence...

  20. Technical Basis Document: A Statistical Basis for Interpreting Urinary Excretion of Plutonium Based on Accelerator Mass Spectrometry (AMS) for Selected Atoll Populations in the Marshall Islands

    International Nuclear Information System (INIS)

    Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G

    2007-01-01

    We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from 239Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (239Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of 239Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout, or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both

  1. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  2. Interferometric data modelling: issues in realistic data generation

    International Nuclear Information System (INIS)

    Mukherjee, Soma

    2004-01-01

    This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e., incorporating the non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICMs) with the application of auto-regressive moving average (ARMA) models. The data obtained from the model are validated by standard statistical tests, e.g., the KS test and the Akaike minimum criterion. The results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled, and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework
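    An ARMA-based component model of the kind described can be sketched with a direct recursion. This is a generic ARMA(p, q) simulator with assumed coefficients, not the study's fitted model; it assumes the AR coefficients are stable so the burn-in leaves an approximately stationary series.

    ```python
    import numpy as np

    def simulate_arma(ar, ma, n, rng, burn=500):
        """Simulate ARMA(p, q) noise driven by white Gaussian noise e:
        x[t] = sum_i ar[i]*x[t-1-i] + e[t] + sum_j ma[j]*e[t-1-j].
        The first `burn` samples are discarded as transient."""
        e = rng.standard_normal(n + burn)
        x = np.zeros(n + burn)
        for t in range(max(len(ar), len(ma)), n + burn):
            x[t] = (sum(a * x[t - 1 - i] for i, a in enumerate(ar))
                    + e[t]
                    + sum(b * e[t - 1 - j] for j, b in enumerate(ma)))
        return x[burn:]
    ```

    For example, `simulate_arma([0.5], [0.3], 5000, rng)` produces correlated noise whose lag-1 autocorrelation is about 0.66, which is the kind of controllable structure that makes ARMA components convenient for injection and efficiency studies.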

  3. Oral cancer statistics in India on the basis of first report of 29 population-based cancer registries

    Science.gov (United States)

    Sharma, Swati; Satyanarayana, L; Asthana, Smitha; Shivalingesh, KK; Goutham, Bala Subramanya; Ramachandra, Sujatha

    2018-01-01

    Objectives: To summarize and provide an overview of age-specific oral cancer incidence reported in 29 population-based cancer registries in India. Materials and Methods: Secondary data on age-adjusted incidence rates (AARs) of oral cancer and other associated sites for all ages (0–75 years) were collected from the report of the National Cancer Registry Programme 2012–2014, covering 29 population-based cancer registries. Results: Among both males and females, mouth cancer had the maximum AAR (64.8) in the central zone, while oropharynx cancer had the minimum AAR (0) in all regions. Conclusion: Oral cancer incidence increases with age, with a typical pattern of cancer of the associated sites of the oral cavity seen in the northeast region. PMID:29731552

  4. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient C((n-4)/2, ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  5. Results of recent calculations using realistic potentials

    International Nuclear Information System (INIS)

    Friar, J.L.

    1987-01-01

    Results of recent calculations for the triton using realistic potentials with strong tensor forces are reviewed, with an emphasis on progress made using the many different calculational schemes. Several test problems are suggested. 49 refs., 5 figs

  6. Sotsialistlik realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, covers the Estonian artists Enn Põldroos, Nikolai Kormashov, and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla.

  7. Impact of some types of mass gatherings on current suicide risk in an urban population: statistical and negative binominal regression analysis of time series.

    Science.gov (United States)

    Usenko, Vasiliy S; Svirin, Sergey N; Shchekaturov, Yan N; Ponarin, Eduard D

    2014-04-04

    Many studies have investigated the impact of a wide range of social events on suicide-related behaviour. However, these studies have predominantly examined national events. The aim of this study is to provide a statistical evaluation of the relationship between mass gatherings in some relatively small urban sub-populations and the general suicide rates of a major city. The data were gathered in the Ukrainian city of Dnipropetrovsk, with a population of 1 million people, in 2005-2010. Suicide attempts, suicides, and the total amount of suicide-related behaviours were registered daily for each sex. Bivariate and multivariate statistical analysis, including negative binomial regression, were applied to assess the risk of suicide-related behaviour in the city's general population for 7 days before and after 427 mass gatherings, such as concerts, football games, and non-regular mass events organized by the Orthodox Church and new religious movements. The bivariate and multivariate statistical analyses found significant changes in some suicide-related behaviour rates in the city's population after certain kinds of mass gatherings. In particular, we observed an increased relative risk (RR) of male suicide-related behaviour after a home defeat of the local football team (RR = 1.32, p = 0.047; regression coefficient beta = 0.371, p = 0.002), and an increased risk of male suicides (RR = 1.29, p = 0.006; beta =0.255, p = 0.002), male suicide-related behaviour (RR = 1.25, p = 0.019; beta =0.251, p football games and mass events organized by new religious movements involved a relatively small part of an urban population (1.6 and 0.3%, respectively), we observed a significant increase of the some suicide-related behaviour rates in the whole population. It is likely that the observed effect on suicide-related behaviour is related to one's personal presence at the event rather than to its broadcast. Our findings can be explained largely in

  8. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
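As a loose illustration of one of the voting rules mentioned, here is a minimal Borda-count sketch; the candidates and rankings are hypothetical, not from the paper:

```python
def borda_scores(rankings, candidates):
    """Borda count: a candidate ranked p-th (0-indexed) among m candidates earns m-1-p points."""
    m = len(candidates)
    scores = {c: 0 for c in candidates}
    for ranking in rankings:
        for p, c in enumerate(ranking):
            scores[c] += m - 1 - p
    return scores

# Three voters' rankings of candidates a, b, c
rankings = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
scores = borda_scores(rankings, ["a", "b", "c"])
print(max(scores, key=scores.get))  # -> a
```

The paper's claim is that, under suitable statistical regularities, rules like this one tend to select an optimal or near-optimal utilitarian choice even from noisy rankings.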

  9. Evolution in Cloud Population Statistics of the MJO: From AMIE Field Observations to Global Cloud-Permitting Models

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Chidong [Univ. of Miami, Coral Gables, FL (United States)

    2016-08-14

    Motivated by the success of the AMIE/DYNAMO field campaign, which collected unprecedented observations of cloud and precipitation over the tropical Indian Ocean in October 2011 – March 2012, this project explored how such observations can be applied to assist the development of global cloud-permitting models by evaluating and correcting model biases in cloud statistics. The main accomplishments of this project fall into four categories: generating observational products for model evaluation, using AMIE/DYNAMO observations to validate global model simulations, using AMIE/DYNAMO observations in numerical studies with cloud-permitting models, and providing leadership in the field. Results from this project provide valuable information for building a seamless bridge between the DOE ASR program's component on process-level understanding of cloud processes in the tropics and the RGCM focus on global variability and regional extremes. In particular, experience gained from this project should be directly applicable to the evaluation and improvement of ACME, especially as it transitions to a non-hydrostatic variable-resolution model.

  10. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    Science.gov (United States)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through the problem solving approach. This study was a quasi-experimental research with a non-equivalent experiment group design. The population of this study was all students of grade VII in one junior high school in Palopo, in the second semester of academic year 2015/2016. Two classes were selected purposively as the research sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. Experiment group I was taught through the RME approach, whereas experiment group II was taught through the problem solving approach. Data were collected by giving students a pretest and a posttest. The analysis used descriptive statistics and inferential statistics based on a t-test. From the descriptive statistics, the average mathematics score of students taught using the problem solving approach was similar to that of students taught using the RME approach, with both at the high category. In addition, it can be concluded that (1) there was no difference in the mathematics learning achievement of students taught using the RME approach and students taught using the problem solving approach, and (2) the quality of learning achievement under the two approaches was the same, both at the high category.
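The inferential comparison described rests on a two-sample t-test. A minimal pooled (equal-variance) t statistic in plain Python, with hypothetical posttest scores rather than the study's data:

```python
from math import sqrt

def pooled_t(x, y):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)  # pooled variance
    return (mx - my) / sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical posttest scores for two small groups
rme = [82, 85, 78, 90, 88]
problem_solving = [80, 84, 79, 87, 86]
print(round(pooled_t(rme, problem_solving), 3))
```

A t value near zero, as with similar group means here, is consistent with the study's finding of no significant difference.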

  11. Use of remote sensing, geographic information systems, and spatial statistics to assess spatio-temporal population dynamics of Heterodera glycines and soybean yield quantity and quality

    Science.gov (United States)

    Moreira, Antonio Jose De Araujo

    Soybean, Glycine max (L.) Merr., is an important source of oil and protein worldwide, and soybean cyst nematode (SCN), Heterodera glycines, is among the most important yield-limiting factors in soybean production worldwide. Early detection of SCN is difficult because soybean plants infected by SCN often do not exhibit visible symptoms. It was hypothesized, however, that reflectance data obtained by remote sensing from soybean canopies may be used to detect plant stress caused by SCN infection. Moreover, reflectance measurements may be related to soybean growth and yield. Two field experiments were conducted from 2000 to 2002 to study the relationships among reflectance data, quantity and quality of soybean yield, and SCN population densities. The best relationships between reflectance and the quantity of soybean grain yield occurred when reflectance data were obtained late August to early September. Similarly, reflectance was best related to seed oil and seed protein content and seed size when measured during late August/early September. Grain quality-reflectance relationships varied spatially and temporally. Reflectance measured early or late in the season had the best relationships with SCN population densities measured at planting. Soil properties likely affected reflectance measurements obtained at the beginning of the season and somehow may have been related to SCN population densities at planting. Reflectance data obtained at the end of the growing season likely was affected by early senescence of SCN-infected soybeans. Spatio-temporal aspects of SCN population densities in both experiments were assessed using spatial statistics and regression analyses. In the 2000 and 2001 growing seasons, spring-to-fall changes in SCN population densities were best related to SCN population densities at planting for both experiments. However, within-season changes in SCN population densities were best related to SCN population densities at harvest for both experiments in

  12. A Model of Biological Attacks on a Realistic Population

    Science.gov (United States)

    Carley, Kathleen M.; Fridsma, Douglas; Casman, Elizabeth; Altman, Neal; Chen, Li-Chiou; Kaminsky, Boris; Nave, Demian; Yahja, Alex

    The capability to assess the impacts of large-scale biological attacks and the efficacy of containment policies is critical and requires knowledge-intensive reasoning about social response and disease transmission within a complex social system. There is a close linkage among social networks, transportation networks, disease spread, and early detection. Spatial dimensions related to public gathering places, such as hospitals, nursing homes, and restaurants, can play a major role in epidemics [Klovdahl et al. 2001]. Like natural epidemics, bioterrorist attacks unfold within spatially defined, complex social systems, and the societal and networked response can have profound effects on their outcome. This paper focuses on bioterrorist attacks, but the model has been applied to emergent and familiar diseases as well.

  13. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient C((n-4)/2, ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  14. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, except for the lack of an oral cavity, was created which proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.

  15. Russian Subjects on the Territory of the Grand Duchy of Finland (according to the Russian and Finnish Population Statistics of the late 19th century)

    Directory of Open Access Journals (Sweden)

    Sergey G. Kashchenko

    2017-03-01

    Full Text Available Historical demographic research currently plays an important role in multi-disciplinary projects in the historical, social and political sciences. It is of great importance for migration and social policy studies and also concerns economics, material and intellectual culture, and inter-ethnic and inter-faith relations. Border regions with high population mobility are of particular interest. A search in the Russian State Historical Archive uncovered a complex of primary census material concerning the Russian population in the Grand Duchy of Finland. Thus it became possible to introduce into scientific use previously unstudied documents containing data on Russian subjects, mainly military men, stationed at Helsinki, Sveaborg, Tavastgus, Torneo and a number of other garrisons. Russian military men in those towns comprised quite a noticeable element in the composition of the population. There is no doubt that Russian officers and their family members were part of the town elite in terms of their social status, life experience, and level of education. Consequently, the primary documents of the 1897 census give us a unique opportunity to see from the inside the demographic situation of the Russian garrisons accommodated in the Vyborg Governorate at the end of the 19th century, and to add living colors related to the biographies of certain people to the dry statistical picture of the town's population.

  16. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    Science.gov (United States)

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  17. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada.

    Science.gov (United States)

    Smylie, Janet; Firestone, Michelle

    Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is, however, a double standard with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous-specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges, including misclassification errors and non-response bias, systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous-specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations.

  18. Visualization and modeling of sub-populations of compositional data: statistical methods illustrated by means of geochemical data from fumarolic fluids

    Science.gov (United States)

    Pawlowsky-Glahn, Vera; Buccianti, Antonella

    In the investigation of fluid samples of a volcanic system, collected during a given period of time, one of the main goals is to discover cause-effect relationships that allow us to explain changes in the chemical composition. They might be caused by physicochemical factors, such as temperature, pressure, or non-conservative behavior of some chemical constituents (addition or subtraction of material), among others. The presence of subgroups of observations showing different behavior is evidence of unusually complex situations, which might render even more difficult the analysis and interpretation of observed phenomena. These cases require appropriate statistical techniques as well as sound a priori hypotheses concerning underlying geological processes. The purpose of this article is to present the state of the art in the methodology for a better visualization of compositional data, as well as for detecting statistically significant sub-populations. The scheme of this article is to present first the application, and then the underlying methodology, with the aim of the first motivating the second. Thus, the first part has the goal to illustrate how to understand and interpret results, whereas the second is devoted to expose how to perform a study of this kind. The case study is related to the chemical composition of a fumarole of Vulcano Island (southern Italy), called F14. The volcanic activity at Vulcano Island is subject to a continuous program of geochemical surveillance from 1978 up to now, and the large data set of observations contains the main chemical composition of volcanic gases as well as trace element concentrations in the condensates of fumarolic gases. Out of the complete set of measured components, the variables H2S, HF and As, determined in samples collected from 1978 to 1993 (As is not available in recent samples), are used to characterize two groups in the original population, which proved to be statistically distinct. 
The choice of the variables is
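Compositional data (parts of a whole) are usually analysed after a log-ratio transform. A minimal sketch of the centered log-ratio (clr) transform, a standard tool in this methodology, though not necessarily the authors' exact procedure:

```python
from math import exp, log

def clr(composition):
    """Centered log-ratio transform: log of each part over the geometric mean of all parts."""
    g = exp(sum(log(x) for x in composition) / len(composition))  # geometric mean
    return [log(x / g) for x in composition]

# A hypothetical 3-part composition (e.g., relative abundances of three species);
# clr coordinates always sum to zero.
print([round(v, 3) for v in clr([0.7, 0.2, 0.1])])
```

Working in clr coordinates moves the data from the constrained simplex into ordinary Euclidean space, where standard multivariate statistics (and sub-population detection) can be applied.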

  19. Iterated interactions method. Realistic NN potential

    International Nuclear Information System (INIS)

    Gorbatov, A.M.; Skopich, V.L.; Kolganova, E.A.

    1991-01-01

    The method of iterated potential is tested in the case of realistic fermionic systems. As a basis for comparison, calculations of the ¹⁶O system (using various versions of realistic NN potentials) by means of the angular potential-function method, as well as operators of pairing correlation, were used. The convergence of the genealogical series is studied for the central Malfliet-Tjon potential. In addition, the mathematical technique of microscopic calculations is improved: new equations for correlators in odd states are suggested, and the technique of leading terms is applied for the first time to calculations of heavy p-shell nuclei in the basis of angular potential functions.

  20. Are there realistically interpretable local theories?

    International Nuclear Information System (INIS)

    d'Espagnat, B.

    1989-01-01

    Although it rests on strongly established proofs, the statement that no realistically interpretable local theory is compatible with some experimentally testable predictions of quantum mechanics seems at first sight to be incompatible with a few general ideas and clear-cut statements occurring in recent theoretical work by Griffiths, Omnes, and Ballentine and Jarrett. It is shown here that in fact none of the developments due to these authors can be considered as a realistically interpretable local theory, so that there is no valid reason for suspecting that the existing proofs of the statement in question are all flawed

  1. A Radiosity Approach to Realistic Image Synthesis

    Science.gov (United States)

    1992-12-01

    AD-A259 082. AFIT/GCE/ENG/92D-09. A Radiosity Approach to Realistic Image Synthesis. Thesis, Richard L. Remington, Captain, USAF. Approved for public release; distribution unlimited. ... assistance in creating the input geometry file for the AWACS aircraft interior. Without his assistance, a good model for the diffuse radiosity implementation

  2. Falling in the elderly: Do statistical models matter for performance criteria of fall prediction? Results from two large population-based studies.

    Science.gov (United States)

    Kabeshova, Anastasiia; Launay, Cyrille P; Gromov, Vasilii A; Fantino, Bruno; Levinoff, Elise J; Allali, Gilles; Beauchet, Olivier

    2016-01-01

    To compare performance criteria (i.e., sensitivity, specificity, positive predictive value, negative predictive value, area under the receiver operating characteristic curve and accuracy) of linear and non-linear statistical models for fall risk in older community-dwellers. Participants were recruited in two large population-based studies, "Prévention des Chutes, Réseau 4" (PCR4, n=1760, cross-sectional design, retrospective collection of falls) and "Prévention des Chutes Personnes Agées" (PCPA, n=1765, cohort design, prospective collection of falls). Six linear statistical models (i.e., logistic regression, discriminant analysis, Bayes network algorithm, decision tree, random forest, boosted trees), three non-linear statistical models corresponding to artificial neural networks (multilayer perceptron, genetic algorithm and neuroevolution of augmenting topologies [NEAT]) and the adaptive neuro-fuzzy inference system (ANFIS) were used. Falls ≥1 characterizing fallers and falls ≥2 characterizing recurrent fallers were used as outcomes. Data of the studies were analyzed separately and together. NEAT and ANFIS had better performance criteria compared to the other models. The highest performance criteria were reported with NEAT when using the PCR4 database and falls ≥1, and with both NEAT and ANFIS when pooling data together and using falls ≥2. However, sensitivity and specificity were unbalanced. Sensitivity was higher than specificity when identifying fallers, whereas the converse was found when predicting recurrent fallers. Our results showed that NEAT and ANFIS were non-linear statistical models with the best performance criteria for the prediction of falls, but their sensitivity and specificity were unbalanced, underscoring that the models should be used respectively for the screening of fallers and the diagnosis of recurrent fallers. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
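The performance criteria compared here (sensitivity, specificity, positive and negative predictive values, accuracy) all derive from a confusion matrix. A minimal sketch with hypothetical screening counts:

```python
def performance_criteria(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true fallers correctly flagged
        "specificity": tn / (tn + fp),   # non-fallers correctly ruled out
        "ppv": tp / (tp + fp),           # flagged subjects who really fall
        "npv": tn / (tn + fn),           # cleared subjects who really don't
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical result: 80 fallers detected, 20 missed, 150 non-fallers cleared, 50 flagged in error
print(performance_criteria(tp=80, fp=50, tn=150, fn=20))
```

The imbalance the authors describe shows up directly in such a table: a screening-oriented model pushes sensitivity up at the cost of specificity, and a diagnostic model does the reverse.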

  3. MetAssimulo:Simulation of Realistic NMR Metabolic Profiles

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2010-10-01

    Full Text Available Abstract Background Probing the complex fusion of genetic and environmental interactions, metabolic profiling (or metabolomics/metabonomics), the study of small molecules involved in metabolic reactions, is a rapidly expanding 'omics' field. A major technique for capturing metabolite data is 1H-NMR spectroscopy, and this yields highly complex profiles that require sophisticated statistical analysis methods. However, experimental data are difficult to control and expensive to obtain. Thus data simulation is a productive route to aid algorithm development. Results MetAssimulo is a MATLAB-based package that has been developed to simulate 1H-NMR spectra of complex mixtures such as metabolic profiles. Drawing data from a metabolite standard spectral database in conjunction with concentration information input by the user or constructed automatically from the Human Metabolome Database, MetAssimulo is able to create realistic metabolic profiles containing large numbers of metabolites with a range of user-defined properties. Current features include the simulation of two groups ('case' and 'control') specified by means and standard deviations of concentrations for each metabolite. The software enables addition of spectral noise with a realistic autocorrelation structure at user-controllable levels. A crucial feature of the algorithm is its ability to simulate both intra- and inter-metabolite correlations, the analysis of which is fundamental to many techniques in the field. Further, MetAssimulo is able to simulate shifts in NMR peak positions that result from matrix effects such as pH differences, which are often observed in metabolic NMR spectra and pose serious challenges for statistical algorithms. Conclusions No other software is currently able to simulate NMR metabolic profiles with such complexity and flexibility. 
This paper describes the algorithm behind MetAssimulo and demonstrates how it can be used to simulate realistic NMR metabolic profiles with

  4. On Realistically Attacking Tor with Website Fingerprinting

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2016-10-01

    Full Text Available Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.

  5. Satellite Maps Deliver More Realistic Gaming

    Science.gov (United States)

    2013-01-01

    When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.

  6. Realistic searches on stretched exponential networks

    Indian Academy of Sciences (India)

    We consider navigation or search schemes on networks which have a degree distribution of the stretched exponential form P(k) ∝ exp(−k^γ). In addition, the linking probability is taken to be dependent on social distances and is governed by a parameter. The searches are realistic in the sense that not all search chains can be completed.
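A stretched exponential degree distribution of this form can be sketched as a normalized probability mass function; the truncation at kmax and the value of γ below are illustrative choices, not taken from the paper:

```python
from math import exp

def stretched_exponential_pmf(kmax, gamma):
    """Normalized degree distribution P(k) ∝ exp(-k**gamma), truncated at kmax."""
    weights = [exp(-(k ** gamma)) for k in range(1, kmax + 1)]
    z = sum(weights)  # normalization constant
    return [w / z for w in weights]

pmf = stretched_exponential_pmf(kmax=50, gamma=0.5)
print(round(sum(pmf), 6))  # -> 1.0
```

For 0 < γ < 1 the tail decays more slowly than a pure exponential but faster than a power law, which is what distinguishes these networks from scale-free ones.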

  7. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap; Bae, Sangwon; Knauer, Christian; Lee, Mira; Shin, Chansu; Vigneron, Antoine E.

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing

  8. Estimating the population distribution of usual 24-hour sodium excretion from timed urine void specimens using a statistical approach accounting for correlated measurement errors.

    Science.gov (United States)

    Wang, Chia-Yih; Carriquiry, Alicia L; Chen, Te-Ching; Loria, Catherine M; Pfeiffer, Christine M; Liu, Kiang; Sempos, Christopher T; Perrine, Cria G; Cogswell, Mary E

    2015-05-01

    High US sodium intake and national reduction efforts necessitate developing a feasible and valid monitoring method across the distribution of low-to-high sodium intake. We examined a statistical approach using timed urine voids to estimate the population distribution of usual 24-h sodium excretion. A sample of 407 adults, aged 18-39 y (54% female, 48% black), collected each void in a separate container for 24 h; 133 repeated the procedure 4-11 d later. Four timed voids (morning, afternoon, evening, overnight) were selected from each 24-h collection. We developed gender-specific equations to calibrate total sodium excreted in each of the one-void (e.g., morning) and combined two-void (e.g., morning + afternoon) urines to 24-h sodium excretion. The calibrated sodium excretions were used to estimate the population distribution of usual 24-h sodium excretion. Participants were then randomly assigned to modeling (n = 160) or validation (n = 247) groups to examine the bias in estimated population percentiles. Median bias in predicting selected percentiles (5th, 25th, 50th, 75th, 95th) of usual 24-h sodium excretion with one-void urines ranged from -367 to 284 mg (-7.7 to 12.2% of the observed usual excretions) for men and -604 to 486 mg (-14.6 to 23.7%) for women, and with two-void urines from -338 to 263 mg (-6.9 to 10.4%) and -166 to 153 mg (-4.1 to 8.1%), respectively. Four of the 6 two-void urine combinations produced no significant bias in predicting selected percentiles. Our approach to estimate the population usual 24-h sodium excretion, which uses calibrated timed-void sodium to account for day-to-day variation and covariance between measurement errors, produced percentile estimates with relatively low biases across low-to-high sodium excretions. This may provide a low-burden, low-cost alternative to 24-h collections in monitoring population sodium intake among healthy young adults and merits further investigation in other population subgroups. © 2015 American
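The calibration step described, relating timed-void sodium to 24-h excretion, can be sketched as an ordinary least-squares line; the paired values below are hypothetical, not study data:

```python
def fit_calibration(timed_void_na, full_24h_na):
    """Least-squares line calibrating timed-void sodium (mg) to 24-h excretion (mg)."""
    n = len(timed_void_na)
    mx = sum(timed_void_na) / n
    my = sum(full_24h_na) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(timed_void_na, full_24h_na))
    sxx = sum((x - mx) ** 2 for x in timed_void_na)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical paired measurements: morning-void sodium vs. full 24-h sodium (mg)
a, b = fit_calibration([600, 800, 1000, 1200], [2400, 3100, 3900, 4600])
print(round(a, 1), round(b, 3))  # -> 170.0 3.7
```

The study's actual approach is more elaborate (gender-specific equations, day-to-day variation, and covariance between measurement errors), but the core idea is a calibration curve of this kind applied before estimating population percentiles.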

  9. Separable expansion for realistic multichannel scattering problems

    International Nuclear Information System (INIS)

    Canton, L.; Cattapan, G.; Pisent, G.

    1987-01-01

    A new approach to the multichannel scattering problem with realistic local or nonlocal interactions is developed. By employing the negative-energy solutions of uncoupled Sturmian eigenvalue problems referring to simple auxiliary potentials, the coupling interactions appearing in the original multichannel problem are approximated by finite-rank potentials. By resorting to integral-equation techniques, the coupled-channel equations are then reduced to linear algebraic equations which can be straightforwardly solved. Compact algebraic expressions for the relevant scattering matrix elements are thus obtained. The convergence of the method is tested in the single-channel case with realistic optical potentials. Excellent agreement is obtained with a few terms in the separable expansion for both real and absorptive interactions.

  10. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system, considering practical cost implications. The proposed model accounts for hidden or otherwise unaccounted practical costs involved in PMU installation. Consideration of these hidden but significant costs, an integral part of total PMU installation cost, was inspired by practical experience on a real-life project. The proposed model focuses on the minimization of total realistic costs instead of the widely used theoretical criterion of a minimal number of PMUs. The model has been applied to the IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, and New England 39-bus systems, a large 300-bus power system, and the real-life Danish grid. A comparison of the presented results with those reported by traditional methods is also shown to justify the effectiveness
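As a toy illustration of cost-driven PMU placement (not the paper's actual formulation), the sketch below brute-forces the cheapest placement that observes every bus of a small network, under the common simplification that a PMU observes its own bus and all neighbours; the network and costs are invented:

```python
from itertools import combinations

# Toy 5-bus ring; a PMU at a bus observes that bus and its neighbours.
adjacency = {1: {2, 5}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {1, 4}}
cost = {1: 1.0, 2: 1.0, 3: 1.5, 4: 1.0, 5: 1.0}  # hypothetical per-bus installation costs

def observed(placement):
    """Set of buses made observable by PMUs at the given buses."""
    buses = set()
    for b in placement:
        buses |= {b} | adjacency[b]
    return buses

# Exhaustive search for the cheapest fully observing placement (fine for tiny systems).
best_cost, best_set = min(
    (sum(cost[b] for b in subset), subset)
    for r in range(1, len(adjacency) + 1)
    for subset in combinations(adjacency, r)
    if observed(subset) == set(adjacency)
)
print(best_cost, best_set)  # -> 2.0 (1, 4)
```

Minimizing total cost rather than PMU count is exactly the shift the abstract argues for: with non-uniform per-site costs, the cheapest observable placement need not be one with the fewest units.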

  11. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

    Full Text Available Deep Convolutional Neural Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  12. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp(2)/sp(3) hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  13. Non realist tendencies in new Turkish cinema

    OpenAIRE

    Can, İclal

    2016-01-01

    http://hdl.handle.net/11693/29111 Thesis (M.S.): Bilkent University, Department of Communication and Design, İhsan Doğramacı Bilkent University, 2016. Includes bibliographical references (leaves 113-123). The realist tendency, which had been dominant in cinema, became more apparent with Italian neorealism, which affected other national cinemas to a large extent. With changing and developing socio-economic and cultural dynamics, realism has gradually stopped being a natural const...

  14. Security of quantum cryptography with realistic sources

    International Nuclear Information System (INIS)

    Lutkenhaus, N.

    1999-01-01

    The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need to give a precise security statement which adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)
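    The loss/dark-count trade-off described in this record can be illustrated with a back-of-envelope calculation. The sketch below is hypothetical and greatly simplified (an asymptotic BB84-style bound r = 1 - 2*h2(e) with an ideal single-photon source, not the full individual-attack analysis of the paper); all parameter values are illustrative only.

```python
# Hypothetical sketch: channel loss plus detector dark counts raise the
# quantum bit error rate (QBER) until no secure key can be distilled.
import math

def h2(p):
    """Binary entropy function."""
    return 0.0 if p <= 0.0 or p >= 1.0 else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def key_fraction(loss_db, p_dark=1e-5, e_detector=0.01):
    """Crude asymptotic BB84 key fraction r = 1 - 2*h2(e) per pulse sent."""
    eta = 10 ** (-loss_db / 10)      # channel + detector transmission
    p_click = eta + p_dark           # a signal photon or a dark count fires
    # dark counts give random outcomes (error 1/2); detected photons
    # err with the intrinsic detector error rate
    e = (0.5 * p_dark + e_detector * eta) / p_click
    return max(0.0, 1.0 - 2.0 * h2(e))

r_low_loss  = key_fraction(10)   # 10 dB loss: dark counts negligible
r_high_loss = key_fraction(60)   # 60 dB loss: dark counts dominate, QBER -> 1/2
```

    At 60 dB the detection events are dominated by dark counts, the error rate approaches one half, and the crude key fraction drops to zero, which is the qualitative point of the abstract.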

  15. Quantum cryptography: towards realization in realistic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Imoto, M; Koashi, M; Shimizu, K [NTT Basic Research Laboratories, 3-1 Morinosato-Wakamiya, Atsugi-shi, Kanagawa 243-01 (Japan); Huttner, B [Universite de Geneve, GAP-optique, 20, Rue de l'Ecole de Medecine, CH-1211 Geneve 4 (Switzerland)]

    1997-05-11

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography considering realistic conditions. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author) 15 refs., 1 fig., 1 tab.

  16. Security of quantum cryptography with realistic sources

    Energy Technology Data Exchange (ETDEWEB)

    Lutkenhaus, N [Helsinki Institute of Physics, P.O. Box 9, 00014 Helsingin yliopisto (Finland)

    1999-08-01

    The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need to give a precise security statement which adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)

  17. Quantum cryptography: towards realization in realistic conditions

    International Nuclear Information System (INIS)

    Imoto, M.; Koashi, M.; Shimizu, K.; Huttner, B.

    1997-01-01

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography considering realistic conditions. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author)

  18. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: basic concepts and the meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and nonideal lattice models; imperfect gas theory and liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of macromolecular systems; and quantum statistics.

  19. Electron percolation in realistic models of carbon nanotube networks

    International Nuclear Information System (INIS)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-01-01

    The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
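    The record above describes Monte-Carlo percolation of soft-shell nanotube networks. As a hedged, much-simplified illustration of the same idea, the sketch below replaces nanotubes with penetrable spheres in a unit box and checks for a spanning cluster with union-find; all names and parameter values are hypothetical, not the authors'.

```python
# Simplified continuum-percolation sketch: penetrable spheres stand in
# for the far more detailed soft-shell CNT model of the abstract.
import random

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path halving
        i = parent[i]
    return i

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def percolates(n, radius, seed=0):
    """Place n penetrable spheres; report whether a cluster spans x=0..1."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random(), rng.random()) for _ in range(n)]
    parent = list(range(n + 2))         # two virtual nodes: the walls
    LEFT, RIGHT = n, n + 1
    for i, (x, y, z) in enumerate(pts):
        if x < radius:
            union(parent, LEFT, i)
        if x > 1 - radius:
            union(parent, RIGHT, i)
        for j in range(i):
            dx, dy, dz = x - pts[j][0], y - pts[j][1], z - pts[j][2]
            if dx*dx + dy*dy + dz*dz < (2 * radius) ** 2:   # shells overlap
                union(parent, i, j)
    return find(parent, LEFT) == find(parent, RIGHT)

# Spanning probability rises sharply with object density near threshold.
p_low  = sum(percolates(50,  0.08, s) for s in range(20)) / 20
p_high = sum(percolates(400, 0.08, s) for s in range(20)) / 20
```

    Sweeping the density (or the overlap criterion, playing the role of the hard-core/soft-shell radii ratio) locates the percolation threshold, which is the quantity the full CNT simulations track.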

  20. Electron percolation in realistic models of carbon nanotube networks

    Science.gov (United States)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-09-01

    The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.

  1. Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books

    Science.gov (United States)

    Kelley, Jane E.; Darragh, Janine J.

    2011-01-01

    Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…

  2. Detection and statistics of gusts

    DEFF Research Database (Denmark)

    Hannesdóttir, Ásta; Kelly, Mark C.; Mann, Jakob

    In this project, a more realistic representation of gusts, based on statistical analysis, will account for the variability observed in real-world gusts. The gust representation will focus on temporal, spatial, and velocity scales that are relevant for modern wind turbines and which possibly affect...

  3. A statistical mechanical theory of proton transport kinetics in hydrogen-bonded networks based on population correlation functions with applications to acids and bases.

    Science.gov (United States)

    Tuckerman, Mark E; Chandra, Amalendu; Marx, Dominik

    2010-09-28

    Extraction of relaxation times, lifetimes, and rates associated with the transport of topological charge defects in hydrogen-bonded networks from molecular dynamics simulations is a challenge because proton transfer reactions continually change the identity of the defect core. In this paper, we present a statistical mechanical theory that allows these quantities to be computed in an unbiased manner. The theory employs a set of suitably defined indicator or population functions for locating a defect structure and their associated correlation functions. These functions are then used to develop a chemical master equation framework from which the rates and lifetimes can be determined. Furthermore, we develop an integral equation formalism for connecting various types of population correlation functions and derive an iterative solution to the equation, which is given a graphical interpretation. The chemical master equation framework is applied to the problems of both hydronium and hydroxide transport in bulk water. For each case it is shown that the theory establishes direct links between the defect's dominant solvation structures, the kinetics of charge transfer, and the mechanism of structural diffusion. A detailed analysis is presented for aqueous hydroxide, examining both reorientational time scales and relaxation of the rotational anisotropy, which is correlated with recent experimental results for these quantities. Finally, for OH(-)(aq) it is demonstrated that the "dynamical hypercoordination mechanism" is consistent with available experimental data while other mechanistic proposals are shown to fail. As a means of going beyond the linear rate theory valid from short up to intermediate time scales, a fractional kinetic model is introduced in the Appendix in order to describe the nonexponential long-time behavior of time-correlation functions. Within the mathematical framework of fractional calculus the power law decay ∼t(-σ), where σ is a parameter of the
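    The population-correlation-function machinery described above can be illustrated on a toy trajectory. The sketch below is a hypothetical, minimal stand-in (a synthetic two-state Markov indicator function, not the paper's hydrogen-bond populations): it computes C(t) = <h(0)h(t)>/<h> and reads a lifetime off its initial decay.

```python
# Illustrative sketch (not the authors' code): estimate a 'defect lifetime'
# from an indicator-function trajectory via the population correlation
# function C(t) = <h(0)h(t)>/<h>, which relaxes exponentially for simple
# first-order kinetics.
import math
import random

def two_state_trajectory(n, p_stay=0.95, seed=1):
    """Synthetic 0/1 indicator ('population') trajectory: a symmetric
    two-state Markov chain that stays put with probability p_stay."""
    rng = random.Random(seed)
    h, traj = 1, []
    for _ in range(n):
        traj.append(h)
        if rng.random() > p_stay:
            h = 1 - h
    return traj

def population_tcf(h, tmax):
    """C(t) = <h(0)h(t)> / <h>, averaged over all time origins."""
    mean_h = sum(h) / len(h)
    return [sum(h[i] * h[i + t] for i in range(len(h) - t))
            / (len(h) - t) / mean_h for t in range(tmax)]

h = two_state_trajectory(200_000)
c = population_tcf(h, 5)
# For this chain C(t) - 1/2 decays geometrically; the t=0 -> t=1 decay
# gives a lifetime estimate (theory value -1/ln(0.9), about 9.5 steps).
tau_est = -1.0 / math.log((c[1] - 0.5) / (c[0] - 0.5))
```

    The paper's chemical-master-equation framework generalizes exactly this step: correlation functions of suitably defined indicator functions feed a rate-matrix fit instead of a single exponential.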

  4. Realistic and efficient 2D crack simulation

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements, essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation) with diverse potential in military and civil M&S applications such as training and mission planning.

  5. Generating realistic environments for cyber operations development, testing, and training

    Science.gov (United States)

    Berk, Vincent H.; Gregorio-de Souza, Ian; Murphy, John P.

    2012-06-01

    Training effective cyber operatives requires realistic network environments that incorporate the structural and social complexities representative of the real world. Network traffic generators facilitate repeatable experiments for the development, training and testing of cyber operations. However, current network traffic generators, ranging from simple load testers to complex frameworks, fail to capture the realism inherent in actual environments. In order to improve the realism of network traffic generated by these systems, it is necessary to quantitatively measure the level of realism in generated traffic with respect to the environment being mimicked. We categorize realism measures into statistical, content, and behavioral measurements, and propose various metrics that can be applied at each level to indicate how effectively the generated traffic mimics the real world.
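    One of the "statistical" realism measurements proposed above can be sketched, under assumptions, as a divergence between empirical feature distributions of real and generated traffic. The packet-size buckets, toy samples, and the choice of Jensen-Shannon divergence below are all illustrative, not the authors' metrics.

```python
# Hypothetical statistical realism score: Jensen-Shannon divergence
# between packet-size distributions (0 = distributions identical).
import math
from collections import Counter

def js_divergence(sample_a, sample_b, bins):
    """JSD between two empirical distributions, histogrammed into `bins`."""
    def hist(sample):
        c = Counter(min(b for b in bins if x <= b) for x in sample)
        total = sum(c.values())
        return {b: c.get(b, 0) / total for b in bins}
    p, q = hist(sample_a), hist(sample_b)
    m = {b: 0.5 * (p[b] + q[b]) for b in bins}
    def kl(d, e):
        return sum(d[b] * math.log2(d[b] / e[b]) for b in bins if d[b] > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

bins = [64, 128, 512, 1500]                       # packet-size buckets (bytes)
real      = [60, 60, 1500, 1500, 1500, 400, 80]   # toy "captured" sizes
generated = [64, 64, 64, 1500, 1500, 300, 90]     # toy generator output
uniform   = [64, 128, 512, 1500, 64, 128, 512]    # naive load-tester output

score_close = js_divergence(real, generated, bins)
score_far   = js_divergence(real, uniform, bins)
```

    A lower score means the generator mimics the capture more closely; content and behavioral measures from the paper's taxonomy would be layered on top of statistical scores like this one.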

  6. Towards an agential realist concept of learning

    DEFF Research Database (Denmark)

    Plauborg, Helle

    2018-01-01

    Drawing on agential realism, this article explores how learning can be understood. An agential realist way of thinking about learning is sensitive to the complexity that characterises learning as a phenomenon. Thus, learning is seen as a dynamic and emergent phenomenon, constantly undergoing...... processes of becoming and expanding the range of components involved in such constitutive processes. With inspiration from Barad’s theorisation of spatiality, temporality and the interdependence of discourse and materiality, this article focuses on timespacemattering and material-discursivity. Concepts...

  7. MANAJEMEN LABA: PERILAKU MANAJEMEN OPPORTUNISTIC ATAU REALISTIC ?

    Directory of Open Access Journals (Sweden)

    I Nyoman Wijana Asmara Putra

    2011-01-01

    Full Text Available Earnings management is a still-attractive issue. It is often associated with negative behavior conducted by management for its own interest. In fact, it also has a different side to be examined: there is another motivation to do so, such as to improve the company's operation. This literature study aims to review management's motivation for doing earnings management, whether opportunistic or realistic: what conflicts earnings management brings, what the pros and cons about it are, what would happen if earnings were not managed, and whether the company would be better off or worse off.

  8. Scaling up complex interventions: insights from a realist synthesis.

    Science.gov (United States)

    Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan

    2016-12-19

    Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant

  9. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate TMS models, owing to the difficulties of realistically modeling the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head, with isotropic and anisotropic electrical conductivities in five different tissues of the head, and of the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  10. Realistic Affective Forecasting: The Role of Personality

    Science.gov (United States)

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-01-01

    Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463

  11. Urban renewal, gentrification and health equity: a realist perspective.

    Science.gov (United States)

    Mehdipanah, Roshanak; Marra, Giulia; Melis, Giulia; Gelormino, Elena

    2018-04-01

    Up to now, research has focused on the effects of urban renewal programs and their impacts on health. While some of this research points to potential negative health effects due to gentrification, evidence that addresses the complexity of this relation is much needed. This paper seeks to better understand when, why and how health inequities arise from urban renewal interventions that result in gentrification. A realist review, a qualitative systematic review method aimed at better explaining the relation between context, mechanism and outcomes, was used. A literature search was done to identify theoretical models of how urban renewal programs can result in gentrification, which in turn could have negative impacts on health. A systematic approach was then used to identify peer-reviewed studies that provided evidence to support or refute the initial assumptions. Urban renewal programs that resulted in gentrification tended to have negative health effects primarily among low-income residents. Urban renewal policies that were inclusive of vulnerable populations from the beginning were less likely to result in gentrification and more likely to positively impact health through physical and social improvements. Research has shown that urban renewal policies have significant impacts on vulnerable populations and that those resulting in gentrification can have negative health consequences for these populations. A better understanding of this is needed to inform future policies and advocate for a community-participatory model that includes such populations in the early planning stages.

  12. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  13. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana; Ait Abderrahmane, Hamid; Upadhyay, Ranjit Kumar; Kumari, Nitu

    2013-01-01

    We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates on its fractal dimension as well as provide numerical simulations to visualise the spatiotemporal chaos.
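    The qualitative setting above can be reproduced, under assumptions, with a toy integration of a three-species chain whose generalist top predator carries a Leslie-Gower-type quadratic growth term of the kind responsible for blowup. The exact equations and all parameter values below are hypothetical stand-ins, deliberately chosen tame, not the paper's model.

```python
# Toy three-species chain: logistic prey u, Holling-II middle predator v,
# and a generalist top predator z with a z^2 (Leslie-Gower-like) term.
def rhs(u, v, z):
    du = u * (1.0 - u) - u * v / (u + 0.5)                 # prey
    dv = -0.3 * v + u * v / (u + 0.5) - v * z / (v + 0.5)  # middle predator
    dz = 0.05 * z * z - z * z / (v + 10.0)                 # generalist top predator
    return du, dv, dz

def rk4(state, dt, steps):
    """Classical fourth-order Runge-Kutta integration."""
    u, v, z = state
    for _ in range(steps):
        k1 = rhs(u, v, z)
        k2 = rhs(u + dt/2*k1[0], v + dt/2*k1[1], z + dt/2*k1[2])
        k3 = rhs(u + dt/2*k2[0], v + dt/2*k2[1], z + dt/2*k2[2])
        k4 = rhs(u + dt*k3[0], v + dt*k3[1], z + dt*k3[2])
        u += dt/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        v += dt/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
        z += dt/6 * (k1[2] + 2*k2[2] + 2*k3[2] + k4[2])
    return u, v, z

# With these tame parameters v stays far below the level at which the z^2
# growth term would dominate, so the top predator decays instead of
# blowing up; pushing the z^2 coefficient up reverses that balance.
u, v, z = rk4((0.8, 0.3, 0.5), dt=0.01, steps=5000)
```

    The blowup analyzed in the paper corresponds to the regime where the quadratic growth of z outruns its saturating loss term, so z reaches infinity in finite time.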

  14. Resolving conflict realistically in today's health care environment.

    Science.gov (United States)

    Smith, S B; Tutor, R S; Phillips, M L

    2001-11-01

    Conflict is a natural part of human interaction, and when properly addressed, results in improved interpersonal relationships and positive organizational culture. Unchecked conflict may escalate to verbal and physical violence. Conflict that is unresolved creates barriers for people, teams, organizational growth, and productivity, leading to cultural disintegration within the establishment. By relying on interdependence and professional collaboration, all parties involved grow and, in turn, benefit the organization and population served. When used in a constructive manner, conflict resolution can help all parties involved see the whole picture, thus allowing freedom for growth and change. Conflict resolution is accomplished best when emotions are controlled before entering into negotiation. Positive confrontation, problem solving, and negotiation are processes used to realistically resolve conflict. Everyone walks away a winner when conflict is resolved in a positive, professional manner (Stone, 1999).

  15. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana

    2013-05-19

    We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates on its fractal dimension as well as provide numerical simulations to visualise the spatiotemporal chaos.

  16. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    The booming electronic books (e-books), as an extension of the paper book, are popular with readers. Recently, many efforts have been put into realistic page-turning simulation of e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.
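    The geometric core of such cylinder-based page turning can be sketched as an arc-length-preserving wrap of the flat page onto a cylinder. The function below is a hypothetical minimal illustration of that single mapping, not the paper's piecewise time-dependent construction.

```python
# Minimal sketch: points of the flat page beyond the bend line x0 are
# wrapped, preserving arc length, onto a cylinder of radius r lying on
# the page plane.
import math

def wrap_point(x, y, x0, r):
    """Map a flat page point (x, y) to 3D during the turn."""
    if x <= x0:                      # still flat on the table
        return (x, y, 0.0)
    theta = (x - x0) / r             # arc length (x - x0) becomes an angle
    return (x0 + r * math.sin(theta), y, r * (1.0 - math.cos(theta)))

# A page strip wrapped on the cylinder never rises above the cylinder's
# diameter 2r, and in-page distances are preserved (isometric bending).
pts = [wrap_point(x / 50, 0.5, x0=0.6, r=0.1) for x in range(51)]
max_height = max(p[2] for p in pts)
```

    Animating the turn then amounts to varying x0, r and the cylinder position over time and blending between successive cylinders, which is where the paper's smooth-transition method comes in.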

  17. Realistic Simulations of Coronagraphic Observations with WFIRST

    Science.gov (United States)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.

  18. Operator representation for effective realistic interactions

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Dennis; Feldmeier, Hans; Neff, Thomas [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany)

    2013-07-01

    We present a method to derive an operator representation from the partial-wave matrix elements of effective realistic nucleon-nucleon potentials. This method allows one to employ modern effective interactions, which are mostly given in matrix-element representation, also in nuclear many-body methods that explicitly require the operator representation, for example ''Fermionic Molecular Dynamics'' (FMD). We present results for the operator representation of effective interactions obtained from the Argonne V18 potential with the ''Unitary Correlation Operator Method'' (UCOM) and the ''Similarity Renormalization Group'' (SRG). Moreover, the operator representation allows better insight into the nonlocal structure of the potential: while the UCOM-transformed potential shows only a quadratic momentum dependence, the momentum dependence of SRG-transformed potentials is beyond such a simple polynomial form.

  19. Level density from realistic nuclear potentials

    International Nuclear Information System (INIS)

    Calboreanu, A.

    2006-01-01

    Nuclear level density of some nuclei is calculated using a realistic set of single-particle states (sps). These states are derived from a parameterization of nuclear potentials that describes the observed sps over a large number of nuclei. This approach has the advantage that one can infer the level density of nuclei that are inaccessible to direct study but are very important in astrophysical processes, such as those close to the drip lines. Level densities at high excitation energies are very sensitive to the actual set of sps. The fact that the sps spectrum is finite has extraordinary consequences for nuclear reaction yields, due to the leveling-off of the level density at extremely high excitation energies, wrongly attributed so far to other nuclear effects. The single-particle level-density parameter a is extracted by fitting the calculated densities to the standard Bethe formula.

  20. Realistic microscopic level densities for spherical nuclei

    International Nuclear Information System (INIS)

    Cerf, N.

    1994-01-01

    Nuclear level densities play an important role in nuclear reactions such as the formation of the compound nucleus. We develop a microscopic calculation of the level density based on a combinatorial evaluation from a realistic single-particle level scheme. This calculation makes use of a fast Monte Carlo algorithm allowing us to consider large shell-model spaces which could not be treated previously in combinatorial approaches. Since our model relies on a microscopic basis, it can be applied to exotic nuclei with more confidence than the commonly used semiphenomenological formulas. An exhaustive comparison of our predicted neutron s-wave resonance spacings with experimental data for a wide range of nuclei is presented.
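    The combinatorial evaluation described above can be illustrated exactly on a tiny model space: counting N-fermion configurations of a single-particle spectrum at each total energy. The equidistant spectrum below is hypothetical; real applications use realistic sps schemes and Monte Carlo sampling of far larger spaces.

```python
# Toy exact combinatorial level density: number of 4-fermion
# configurations of an equidistant single-particle spectrum, as a
# function of excitation energy above the ground state.
from functools import lru_cache

LEVELS = list(range(12))     # sps energies 0, 1, ..., 11 (arbitrary units)
N = 4                        # number of identical fermions

@lru_cache(maxsize=None)
def count(idx, n_left, e_left):
    """Ways to place n_left fermions on LEVELS[idx:] with total energy e_left."""
    if n_left == 0:
        return 1 if e_left == 0 else 0
    if idx >= len(LEVELS) or e_left < 0:
        return 0
    # either skip this level, or occupy it once (Pauli principle)
    return (count(idx + 1, n_left, e_left)
            + count(idx + 1, n_left - 1, e_left - LEVELS[idx]))

e_ground = sum(LEVELS[:N])                             # filled Fermi sea: 0+1+2+3
rho = [count(0, N, e_ground + ex) for ex in range(6)]  # counts vs. excitation
```

    The counts grow with excitation energy, which is the combinatorial origin of the rising level density; the paper's Monte Carlo algorithm makes the same counting tractable when exhaustive enumeration is impossible.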

  1. HELIOSEISMOLOGY OF A REALISTIC MAGNETOCONVECTIVE SUNSPOT SIMULATION

    International Nuclear Information System (INIS)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L. Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.

  2. Incidence, treatment and recurrence of endometriosis in a UK-based population analysis using data from The Health Improvement Network and the Hospital Episode Statistics database.

    Science.gov (United States)

    Cea Soriano, Lucia; López-Garcia, Esther; Schulze-Rath, Renate; Garcia Rodríguez, Luis A

    2017-10-01

    This retrospective study used medical records from The Health Improvement Network (THIN) and Hospital Episode Statistics (HES) database to evaluate endometriosis (incidence, treatment and need for recurrent invasive procedures) in the general UK population. Women aged 12-54 years between January 2000 and December 2010, with a Read code for endometriosis, were identified in THIN. Cases were validated by manual review of free-text comments in medical records and responses to physician questionnaires. False-negative cases were identified among women with Read codes for hysterectomy or dysmenorrhea. Prescriptions of medical therapies for endometriosis were identified in THIN. Cases of single and recurrent invasive procedures were identified in women with medical records in both THIN and HES. Overall, 5087 women had a Read code for endometriosis, corresponding to an incidence of 1.02 (95% confidence interval [CI]: 0.99-1.05) per 1000 person-years. After case validation, the estimate was 1.46 (95% CI: 1.43-1.50) per 1000 person-years. Medical therapy was prescribed to 55.5% of women with endometriosis in the first year after diagnosis. In total, 48.3% of women received invasive treatment during the study period; approximately one-fifth of these women required further invasive treatment, mainly in the 3 years after the index procedure. Using Read codes as the only method to identify women with endometriosis underestimates incidence. Over half of women with recorded endometriosis are prescribed medical therapy in the first year after diagnosis. Women with diagnosed endometriosis are at risk of requiring recurrent invasive procedures.
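    The headline rates in this abstract are incidence rates per 1000 person-years. The sketch below is illustrative only; the function name and the normal-approximation confidence interval on the log scale are our assumptions, not necessarily the method used in the study.

```python
import math

def incidence_rate(cases, person_years, per=1000.0, z=1.96):
    """Incidence rate per `per` person-years with a Poisson-based
    confidence interval (normal approximation on the log scale)."""
    rate = cases / person_years * per
    se_log = 1.0 / math.sqrt(cases)   # approximate SE of log(rate)
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return rate, lo, hi
```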

  3. Realistic tissue visualization using photoacoustic image

    Science.gov (United States)

    Cho, Seonghee; Managuli, Ravi; Jeon, Seungwan; Kim, Jeesu; Kim, Chulhong

    2018-02-01

    Visualization methods are very important in biomedical imaging. As a technology that helps us understand life, biomedical imaging has the unique advantage of providing the most intuitive information in the image. This advantage can be greatly enhanced by choosing a suitable visualization method. This is more complicated for volumetric data. Volume data has the advantage of containing 3D spatial information. Unfortunately, the data itself cannot directly represent this potential value. Because images are always displayed in 2D space, visualization is the key that creates the real value of volume data. However, image processing of 3D data requires complicated algorithms for visualization and carries a high computational burden. Therefore, specialized algorithms and computational optimization are important issues for volume data. Photoacoustic imaging is a unique imaging modality that can visualize the optical properties of deep tissue. Because the color of an organism is mainly determined by its light-absorbing components, photoacoustic data can provide color information of tissue that is close to the real tissue color. In this research, we developed realistic tissue visualization using acoustic-resolution photoacoustic volume data. To achieve realistic visualization, we designed a specialized color transfer function, which depends on the depth of the tissue below the skin. We used a direct ray casting method and processed color while computing shader parameters. In the rendered results, we succeeded in obtaining realistic texture from photoacoustic data. Rays reflected at the surface were visualized in white, and color reflected from deep tissue was visualized in red, like skin tissue. We also implemented the algorithm in CUDA within an OpenGL environment for real-time interactive imaging.
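    A depth-dependent color transfer of the kind described can be sketched in a few lines. This is a hypothetical toy mapping, not the authors' shader: shallow signal is pushed toward white and deep signal toward red; the `max_depth_mm` parameter and the linear blend are assumptions.

```python
def depth_color(intensity, depth_mm, max_depth_mm=10.0):
    """Toy depth-dependent color transfer: surface signal maps toward
    white, deep signal toward red, mimicking the reddening of reflected
    color with tissue depth.  Returns an (R, G, B) tuple in [0, 1]."""
    w = min(max(depth_mm / max_depth_mm, 0.0), 1.0)  # 0 at surface, 1 deep
    return (intensity,                 # red channel kept at full strength
            intensity * (1.0 - w),     # green fades with depth
            intensity * (1.0 - w))     # blue fades with depth
```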

  4. A Realistic Seizure Prediction Study Based on Multiclass SVM.

    Science.gov (United States)

    Direito, Bruno; Teixeira, César A; Sales, Francisco; Castelo-Branco, Miguel; Dourado, António

    2017-05-01

    A patient-specific algorithm, for epileptic seizure prediction, based on multiclass support-vector machines (SVM) and using multi-channel high-dimensional feature sets, is presented. The feature sets, combined with multiclass classification and post-processing schemes, aim at the generation of alarms and reduced influence of false positives. This study considers 216 patients from the European Epilepsy Database, and includes 185 patients with scalp EEG recordings and 31 with intracranial data. The strategy was tested over a total of 16,729.80 h of inter-ictal data, including 1206 seizures. We found an overall sensitivity of 38.47% and a false positive rate per hour of 0.20. The performance of the method achieved statistical significance in 24 patients (11% of the patients). Despite the encouraging results previously reported on specific datasets, prospective demonstration on long-term EEG recordings has been limited. Our study presents a prospective analysis of a large, heterogeneous, multicentric dataset. The statistical framework, based on conservative assumptions, reflects a realistic approach compared with constrained datasets and/or in-sample evaluations. The improvement of these results, through the definition of an appropriate set of features able to improve the distinction between the pre-ictal and non-pre-ictal states and hence minimize the effect of confounding variables, remains a key aspect.
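    The reported sensitivity and false-positive rate per hour are event-based quantities that can be computed mechanically. This is a generic sketch under assumed conventions (an alarm counts as a true positive if a seizure onset falls inside its seizure-occurrence period); the function name and time units are ours, not the paper's.

```python
def prediction_performance(seizure_onsets, alarms, sop_minutes, interictal_hours):
    """Event-based seizure-prediction performance.

    An alarm is a true positive if a seizure onset falls inside its
    seizure-occurrence period (SOP); alarms with no onset in their SOP
    count as false positives.  Times are minutes from recording start.
    Returns (sensitivity in %, false positives per hour)."""
    predicted = set()
    false_alarms = 0
    for alarm in alarms:
        hits = [t for t in seizure_onsets if alarm < t <= alarm + sop_minutes]
        if hits:
            predicted.update(hits)
        else:
            false_alarms += 1
    sensitivity = 100.0 * len(predicted) / len(seizure_onsets)
    return sensitivity, false_alarms / interictal_hours
```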

  5. Realistic generation cost of solar photovoltaic electricity

    International Nuclear Information System (INIS)

    Singh, Parm Pal; Singh, Sukhmeet

    2010-01-01

    Solar photovoltaic (SPV) power plants have a long working life, zero fuel cost and negligible maintenance cost, but require a huge initial investment. The generation cost of solar electricity is mainly the cost of financing this initial investment. Therefore, the generation cost of solar electricity in different years depends on the method of repaying the loan. Currently, a levelized cost based on an equated-payment loan is used. The static levelized generation cost of solar electricity is compared with the current value of the variable generation cost of grid electricity. This improper cost comparison is inhibiting the growth of SPV electricity by creating the wrong perception that solar electricity is very expensive. In this paper a new method of loan repayment has been developed, resulting in a generation cost of SPV electricity that increases with time like that of grid electricity. A generalized capital recovery factor has been developed for a graduated-payment loan, in which the capital and interest payments in each installment are calculated by treating each loan installment as an independent loan for the relevant years. Generalized results have been calculated which can be used to determine the cost of SPV electricity for a given system at different places. Results show that for an SPV system with a specific initial investment of 5.00 cents/kWh/year, a loan period of 30 years and a loan interest rate of 4%, the levelized generation cost of SPV electricity with an equated-payment loan turns out to be 28.92 cents/kWh, while the corresponding generation cost with a graduated-payment loan with an 8% escalation in the annual installment varies from 9.51 cents/kWh in the base year to 88.63 cents/kWh in the 30th year. So, in this case, the realistic current generation cost of SPV electricity is 9.51 cents/kWh and not 28.92 cents/kWh. Further, with a graduated-payment loan, an extension of the loan period results in a sharp decline in the cost of SPV electricity in the base year. Hence, a policy change is required.
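    The contrast between the two repayment schemes can be made concrete with standard annuity algebra. This sketch is not the paper's generalized capital recovery factor derivation: it computes the equated payment from the ordinary capital recovery factor, and a graduated schedule whose installments grow by a factor (1+g) per year while their present values still sum to the principal.

```python
def equated_payment(principal, i, n):
    """Constant annual installment from the standard capital recovery
    factor: A = P * i / (1 - (1+i)^-n)."""
    return principal * i / (1.0 - (1.0 + i) ** -n)

def graduated_payments(principal, i, n, g):
    """Installments of a graduated-payment loan that escalate by (1+g)
    each year.  The first installment A1 satisfies
    sum_{t=1..n} A1*(1+g)^(t-1)/(1+i)^t = principal,
    i.e. A1 = P*(i-g)/(1 - r^n) with r = (1+g)/(1+i) for i != g."""
    r = (1.0 + g) / (1.0 + i)
    if abs(i - g) < 1e-12:
        first = principal * (1.0 + i) / n
    else:
        first = principal * (i - g) / (1.0 - r ** n)
    return [first * (1.0 + g) ** t for t in range(n)]
```

    With P = 1000, i = 4%, n = 30 and g = 8%, the first graduated installment comes out near a third of the equated one, mirroring the 9.51 versus 28.92 cents/kWh contrast reported in the abstract.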

  6. Determination of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Dietrich, Daniel L.; Ruff, Gary A.; Urban, David

    2013-01-01

    This paper expands on previous work that examined how large a fire a crew member could successfully survive and extinguish in the confines of a spacecraft. The hazards to the crew and equipment during an accidental fire include excessive pressure rise resulting in a catastrophic rupture of the vehicle skin, excessive temperatures that burn or incapacitate the crew (due to hyperthermia), and carbon dioxide build-up or accumulation of other combustion products (e.g. carbon monoxide). The previous work introduced a simplified model that treated the fire primarily as a source of heat and combustion products and a sink for oxygen, with rates prescribed (as model inputs) based on terrestrial standards. The model further treated the spacecraft as a closed system with no capability to vent to the vacuum of space. The present work extends this analysis to treat the pressure relief system(s) of the spacecraft more realistically, includes more combustion products (e.g. HF) in the analysis, and attempts to predict the fire spread and limiting fire size (based on knowledge of terrestrial fires and the known characteristics of microgravity fires) rather than prescribing them. Including the characteristics of vehicle pressure relief systems has a dramatic mitigating effect, eliminating vehicle overpressure for all but very large fires and reducing average gas-phase temperatures.

  7. Cerebral blood flow simulations in realistic geometries

    Directory of Open Access Journals (Sweden)

    Szopos Marcela

    2012-04-01

    Full Text Available The aim of this work is to compute the blood flow in the whole cerebral network, obtained from medical images such as angiographies. We use free finite element codes such as FreeFEM++. We first test the code on analytical solutions in simplified geometries. Then, we study the influence of boundary conditions on the flow and finally perform first computations on realistic meshes. The objective here is to simulate blood flow throughout the cerebral network (arterial and venous), obtained from 3D cerebral angiographies, using free finite element software such as FreeFEM++. We first carry out a detailed study of the results on analytical solutions and of the influence of the boundary conditions to be imposed in simplified geometries, before working on realistic meshes.

  8. Challenges and solutions for realistic room simulation

    Science.gov (United States)

    Begault, Durand R.

    2002-05-01

    Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and reverberation thresholds as a function of reverberation time and level within 250-Hz-2-kHz octave bands. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing computational requirements of real-time auralization systems.

  9. Realistic Scheduling Mechanism for Smart Homes

    Directory of Open Access Journals (Sweden)

    Danish Mahmood

    2016-03-01

    Full Text Available In this work, we propose a Realistic Scheduling Mechanism (RSM) to reduce user frustration and enhance appliance utility by effectively classifying appliances with their respective constraints and times of use. Algorithms are proposed for the functioning of home appliances. A 24-hour period is divided into four logical sub-time slots, each composed of 360 min, or 6 h. In these sub-time slots, only the desired appliances (with respect to the appliance classification) are scheduled, to raise appliance utility, with power consumption restricted by a dynamically modelled power usage limiter that takes into account not only the electricity consumer but also the electricity supplier. Once the appliance, time, and power usage limiter modelling is done, we use a nature-inspired heuristic algorithm, Binary Particle Swarm Optimization (BPSO), to optimally form schedules under the given constraints for each sub-time slot. These schedules tend to achieve an equilibrium between appliance utility and cost effectiveness. To validate the proposed RSM, we provide a comparative analysis of unscheduled electrical load usage, scheduling directly by BPSO, and RSM, reflecting user comfort, which is based upon cost effectiveness and appliance utility.
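    The optimization step can be sketched with the classic Kennedy-Eberhart sigmoid update for binary PSO. This is a minimal generic implementation, not the paper's RSM; the toy appliance powers, utilities, power cap and penalty weight below are hypothetical stand-ins for the paper's appliance model.

```python
import math
import random

def bpso(cost_fn, n_bits, n_particles=20, iters=100,
         w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal Binary PSO: velocities are real-valued, and each bit is
    resampled through a sigmoid of its velocity; personal and global
    bests steer the swarm toward lower cost."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost_fn(p) for p in pos]
    g = min(range(n_particles), key=lambda k: pbest_cost[k])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for k in range(n_particles):
            for b in range(n_bits):
                v = (w * vel[k][b]
                     + c1 * rng.random() * (pbest[k][b] - pos[k][b])
                     + c2 * rng.random() * (gbest[b] - pos[k][b]))
                vel[k][b] = max(-6.0, min(6.0, v))  # clamp to avoid saturation
                pos[k][b] = 1 if rng.random() < sig(vel[k][b]) else 0
            c = cost_fn(pos[k])
            if c < pbest_cost[k]:
                pbest[k], pbest_cost[k] = pos[k][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[k][:], c
    return gbest, gbest_cost

# Hypothetical appliance model: maximize utility under a power cap,
# written as a cost to minimize (penalty for exceeding the cap).
POWER = [2, 3, 1, 4, 2, 1, 3, 2]     # kW per appliance (made up)
UTILITY = [5, 4, 3, 6, 2, 1, 4, 3]   # utility per appliance (made up)
CAP = 10                             # power usage limit, kW

def appliance_cost(bits):
    load = sum(p * x for p, x in zip(POWER, bits))
    value = sum(u * x for u, x in zip(UTILITY, bits))
    return -value + 10.0 * max(0, load - CAP)
```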

  10. Comparing Realistic Subthalamic Nucleus Neuron Models

    Science.gov (United States)

    Njap, Felix; Claussen, Jens C.; Moser, Andreas; Hofmann, Ulrich G.

    2011-06-01

    The mechanism of action of clinically effective electrical high frequency stimulation is still under debate. However, recent evidence points at the specific activation of GABA-ergic ion channels. Using a computational approach, we analyze temporal properties of the spike trains emitted by biologically realistic neurons of the subthalamic nucleus (STN) as a function of GABA-ergic synaptic input conductances. Our contribution is based on a model proposed by Rubin and Terman and exhibits a wide variety of firing patterns: silent, low-spiking, moderate-spiking and intense-spiking activity. We observed that most of the cells in our network turn to silent mode when we increase the GABAA input conductance above the threshold of 3.75 mS/cm². On the other hand, insignificant changes in firing activity are observed when the input conductance is low or close to zero. We thus reproduce Rubin's model with vanishing synaptic conductances. To quantitatively compare spike trains from the original model with the modified model at different conductance levels, we apply four different (dis)similarity measures between them. We observe that the Mahalanobis distance, the Victor-Purpura metric, and the Interspike Interval distribution are sensitive to different firing regimes, whereas Mutual Information appears insensitive to these functional changes.
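    One of the listed measures, the Victor-Purpura metric, is compact enough to sketch. A minimal dynamic-programming implementation under the usual conventions (inserting or deleting a spike costs 1, shifting a spike by dt costs q|dt|); the function name is ours.

```python
def victor_purpura(t1, t2, q):
    """Victor-Purpura spike-train distance: minimal cost of turning one
    spike train into the other, where inserting/deleting a spike costs 1
    and moving a spike by dt costs q*|dt|.  t1, t2 are sorted spike times."""
    n, m = len(t1), len(t2)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = float(i)            # delete all remaining spikes
    for j in range(1, m + 1):
        d[0][j] = float(j)            # insert all remaining spikes
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1.0,
                          d[i][j - 1] + 1.0,
                          d[i - 1][j - 1] + q * abs(t1[i - 1] - t2[j - 1]))
    return d[n][m]
```

    At q = 0 the metric counts only the difference in spike counts; as q grows, moving a distant spike becomes more expensive than deleting and reinserting it (cost 2), so the metric interpolates between a rate code and a timing code.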

  11. Usage Statistics

    Science.gov (United States)

    MedlinePlus usage statistics page (https://medlineplus.gov/usestatistics.html), reporting quarterly page views and unique visitors beginning Oct-Dec 1998.

  12. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  13. Frog Statistics

    Science.gov (United States)

    Web server access statistics (wwwstats output) for the Whole Frog Project and Virtual Frog Dissection pages, excluding duplicate or extraneous accesses.

  14. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    Science.gov (United States)

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Heart Disease and Stroke Statistics

    Science.gov (United States)

    Each year, the American Heart Association, in conjunction with other organizations, compiles statistics on heart disease, stroke and cardiovascular health and disease in the population; the accompanying Heart & Stroke Statistics FAQs define terms such as prevalence.

  16. Realistic respiratory motion margins for external beam partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, Leigh; Quirk, Sarah [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Smith, Wendy L., E-mail: wendy.smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Department of Oncology, University of Calgary, Calgary, Alberta T2N 1N4 (Canada)

    2015-09-15

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was
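    The central computation, blurring a dose profile with a respiratory-motion probability density and reading off the shift of the 95% isodose, can be sketched as follows. This is a simplified 1-D illustration with an idealized rectangular profile, not the clinical pipeline; the grid spacing and histogram binning are our assumptions.

```python
import numpy as np

def respiratory_margin(dose, positions, motion_samples, isodose=0.95):
    """Blur a 1-D AP dose profile with a respiratory-motion histogram and
    return how far the rising isodose crossing moves into the field.

    dose           : profile normalized to 1.0 at the plateau
    positions      : mm grid, uniformly spaced
    motion_samples : displacements (mm) sampled from a respiratory trace
    """
    dx = positions[1] - positions[0]
    # build a motion PDF on the same grid spacing as the dose profile
    bins = np.arange(motion_samples.min() - dx, motion_samples.max() + 2 * dx, dx)
    pdf, _ = np.histogram(motion_samples, bins=bins)
    pdf = pdf / pdf.sum()
    blurred = np.convolve(dose, pdf, mode="same")
    # the blurred profile reaches the isodose level deeper inside the field;
    # the difference is the margin the field must be expanded by
    static_edge = positions[np.argmax(dose >= isodose)]
    blurred_edge = positions[np.argmax(blurred >= isodose)]
    return blurred_edge - static_edge
```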

  17. Realistic respiratory motion margins for external beam partial breast irradiation

    International Nuclear Information System (INIS)

    Conroy, Leigh; Quirk, Sarah; Smith, Wendy L.

    2015-01-01

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was

  18. Bayesian inversion using a geologically realistic and discrete model space

    Science.gov (United States)

    Jaeggli, C.; Julien, S.; Renard, P.

    2017-12-01

    Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, several major issues remain to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts in geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially adapted to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient data handling of ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool; nonetheless, the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.

  19. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  20. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced.

  1. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
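    A harmonic Poisson process is easy to simulate by a time change; the sketch below is ours, not the paper's. With intensity c/x on [a, b], the cumulative intensity is Lambda(x) = c*ln(x/a), so mapping the arrival times S_k of a unit-rate Poisson process through x = a*exp(S_k/c) yields the harmonic points, and expected counts depend only on ratios, which is the scale invariance noted in the abstract.

```python
import math
import random

def harmonic_poisson(a, b, c, rng):
    """Sample a Poisson process with harmonic intensity c/x on [a, b]
    via time change: points are a*exp(S_k/c) for unit-rate arrival
    times S_1 < S_2 < ..., kept while they stay below b."""
    points, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)      # next unit-rate arrival
        x = a * math.exp(s / c)
        if x > b:
            return points
        points.append(x)

def expected_count(x1, x2, c):
    """E[#points in [x1, x2]] = c * ln(x2/x1); scale-invariant."""
    return c * math.log(x2 / x1)
```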

  2. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  3. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  4. Histoplasmosis Statistics

    Science.gov (United States)

    ... Testing Treatment & Outcomes Health Professionals Statistics More Resources Candidiasis Candida infections of the mouth, throat, and esophagus Vaginal candidiasis Invasive candidiasis Definition Symptoms Risk & Prevention Sources Diagnosis ...

  5. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  6. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  7. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  8. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  9. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
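The location and spread measures named in this abstract can be computed directly with Python's standard library. A minimal sketch on illustrative (invented) data values:

```python
import statistics

# Illustrative measurements (hypothetical data, for demonstration only)
data = [2.1, 2.5, 2.2, 3.9, 2.4, 2.8, 2.3]

# Measures of location
print(statistics.mean(data))    # arithmetic mean, ~2.6
print(statistics.median(data))  # median: 2.4, less affected by the outlying 3.9

# Measures of spread
print(statistics.variance(data))  # sample variance, ~0.38
print(statistics.stdev(data))     # sample standard deviation, its square root
```

The mean/median contrast shows why both location measures are reported: a single large value shifts the mean noticeably but leaves the median unchanged.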

  10. Evolutionary approaches for the reverse-engineering of gene regulatory networks: A study on a biologically realistic dataset

    Directory of Open Access Journals (Sweden)

    Gidrol Xavier

    2008-02-01

    Full Text Available Abstract Background Inferring gene regulatory networks from data requires the development of algorithms devoted to structure extraction. When only static data are available, gene interactions may be modelled by a Bayesian Network (BN) that represents the presence of direct interactions from regulators to regulees by conditional probability distributions. We used enhanced evolutionary algorithms to stochastically evolve a set of candidate BN structures and found the model that best fits the data without prior knowledge. Results We proposed various evolutionary strategies suitable for the task and tested our choices using simulated data drawn from a given bio-realistic network of 35 nodes, the so-called insulin network, which has been used in the literature for benchmarking. We assessed the inferred models against this reference to obtain statistical performance results. We then compared the performances of evolutionary algorithms using two kinds of recombination operators that operate at different scales in the graphs. We introduced a niching strategy that reinforces diversity across the population and avoids trapping the algorithm in a local minimum in the early steps of learning. We show the limited effect of the mutation operator when niching is applied. Finally, we compared our best evolutionary approach with various well-known learning algorithms (MCMC, K2, greedy search, TPDA, MMHC) devoted to BN structure learning. Conclusion We studied the behaviour of an evolutionary approach enhanced by niching for the learning of gene regulatory networks with BN. We show that this approach outperforms classical structure learning methods in elucidating the original model. These results were obtained for the learning of a bio-realistic network and, more importantly, on various small datasets. This is a suitable approach for learning transcriptional regulatory networks from real datasets without prior knowledge.
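As a loose illustration of the search loop underlying such evolutionary approaches (mutation over acyclic adjacency matrices plus truncation selection), the sketch below recovers a hidden toy structure. The scoring function here is a placeholder: a real implementation would score each candidate network's fit to expression data (e.g. via BIC or marginal likelihood), and the recombination and niching operators the abstract describes are omitted.

```python
import random

random.seed(1)
N = 5  # nodes in a toy network

# Hidden "true" structure the search should recover (upper triangular = DAG).
TRUE = [[0, 1, 0, 0, 1],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 1, 0],
        [0, 0, 0, 0, 1],
        [0, 0, 0, 0, 0]]

def random_dag():
    """Random upper-triangular adjacency matrix (acyclic by construction)."""
    return [[1 if j > i and random.random() < 0.3 else 0
             for j in range(N)] for i in range(N)]

def mutate(adj):
    """Flip one edge in the upper triangle, preserving acyclicity."""
    child = [row[:] for row in adj]
    i = random.randrange(N - 1)
    j = random.randrange(i + 1, N)
    child[i][j] ^= 1
    return child

def score(adj):
    """Toy fitness: agreement with TRUE. In practice: fit to the data."""
    return sum(a == t for ra, rt in zip(adj, TRUE) for a, t in zip(ra, rt))

# Evolutionary loop: keep the 10 fittest, refill with their mutants.
pop = [random_dag() for _ in range(30)]
for _ in range(200):
    pop.sort(key=score, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = max(pop, key=score)
print(score(best))  # number of matching adjacency entries (maximum 25)
```

Restricting candidates to upper-triangular matrices is one simple way to guarantee acyclicity; general BN structure search must instead check for cycles after each edge flip.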

  11. Statistical methods for detecting and comparing periodic data and their application to the nycthemeral rhythm of bodily harm: A population based study

    LENUS (Irish Health Repository)

    Stroebel, Armin M

    2010-11-08

    Abstract Background Animals, including humans, exhibit a variety of biological rhythms. This article describes a method for the detection and simultaneous comparison of multiple nycthemeral rhythms. Methods A statistical method for detecting periodic patterns in time-related data via harmonic regression is described. The method is particularly capable of detecting nycthemeral rhythms in medical data. Additionally, a method for simultaneously comparing two or more periodic patterns is described, which derives from the analysis of variance (ANOVA). This method statistically confirms or rejects the equality of periodic patterns. Mathematical descriptions of both the detection and comparison methods are given. Results Nycthemeral rhythms of incidents of bodily harm in Middle Franconia are analyzed in order to demonstrate both methods. Every day of the week showed a significant nycthemeral rhythm of bodily harm. These seven patterns of the week were compared to each other, revealing only two different nycthemeral rhythms, one for Friday and Saturday and one for the other weekdays.
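The core of such a harmonic-regression detector can be sketched with ordinary least squares: the time of day enters through sine and cosine regressors at the target period and its harmonics. This is a generic sketch, not the authors' implementation, and the data below are synthetic.

```python
import numpy as np

def harmonic_regression(t_hours, y, period=24.0, n_harmonics=2):
    """Fit y ~ mean + sum_k [a_k cos(k*w*t) + b_k sin(k*w*t)] by least squares.

    Returns the coefficients and R^2, a simple measure of how much of the
    variation the periodic pattern explains.
    """
    t = np.asarray(t_hours, dtype=float)
    y = np.asarray(y, dtype=float)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k / period
        cols.extend([np.cos(w * t), np.sin(w * t)])
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2

# Synthetic hourly incident counts with a peak around 22:00.
rng = np.random.default_rng(0)
hours = np.arange(24 * 7)  # one week of hourly bins
counts = 10 + 4 * np.cos(2 * np.pi * (hours - 22) / 24) + rng.normal(0, 0.5, hours.size)
beta, r2 = harmonic_regression(hours, counts)
print(round(r2, 2))  # close to 1: a strong 24-hour rhythm
```

In a full analysis, the significance of the periodic pattern would be assessed with an F-test against the constant-mean model, and patterns from different days compared with an ANOVA-type test as the abstract describes.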

  12. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  13. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  14. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  15. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  16. Realist Stronghold in the Land of Thucydides? - Appraising and Resisting a Realist Tradition in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos Mikelis

    2015-10-01

    Full Text Available Given the integration of the discipline of International Relations in Greece into the global discipline over the past few decades, the article addresses how the ‘realism in and for the globe’ question is reflected in this specific case. Although the argument doesn’t go as far as to ‘recover’ forgotten IR theorists or self-proclaimed realists, a geopolitical dimension of socio-economic thought during the interwar period addressed concerns which could be related to the intricacies of realpolitik. In more recent times, certain scholars have been eager to maintain a firm stance in favor of realism, focusing on the work of ancient figures, especially Thucydides or Homer, and on questions of the offensive-defensive realism debate as well as on the connection with the English School, while others have offered fruitful insights matching the broad constructivist agenda. Overall, certain genuine arguments have appeared, reflecting diversified views about sovereignty and its function or mitigation.

  17. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  18. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  19. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  20. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  1. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  2. Gonorrhea Statistics

    Science.gov (United States)


  3. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  4. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of the knowledge about the cosmos, the scientists will have to come in terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentists approach. (Edited abstract).

  5. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models

  6. Biochemical transport modeling, estimation, and detection in realistic environments

    Science.gov (United States)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

    Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g. release time, intensity and location). We compute a bound on the expected delay before false detection in order to set the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.

  7. Predicting Flowering Behavior and Exploring Its Genetic Determinism in an Apple Multi-family Population Based on Statistical Indices and Simplified Phenotyping

    Directory of Open Access Journals (Sweden)

    Jean-Baptiste Durand

    2017-06-01

    Full Text Available Irregular flowering over years is commonly observed in fruit trees. The early prediction of tree behavior is highly desirable in breeding programmes. This study aims at performing such predictions, combining simplified phenotyping and statistics methods. Sequences of vegetative vs. floral annual shoots (AS) were observed along axes in trees belonging to five apple related full-sib families. Sequences were analyzed using Markovian and linear mixed models including year and site effects. Indices of flowering irregularity, periodicity and synchronicity were estimated, at tree and axis scales. They were used to predict tree behavior and detect QTL with a Bayesian pedigree-based analysis, using an integrated genetic map containing 6,849 SNPs. The combination of a Biennial Bearing Index (BBI) with an autoregressive coefficient (γg) efficiently predicted and classified the genotype behaviors, despite few misclassifications. Four QTLs common to BBIs and γg and one for synchronicity were highlighted and revealed the complex genetic architecture of the traits. Irregularity resulted from high AS synchronism, whereas regularity resulted from either asynchronous locally alternating or continual regular AS flowering. A relevant and time-saving method, based on a posteriori sampling of axes and statistical indices is proposed, which is efficient to evaluate the tree breeding values for flowering regularity and could be transferred to other species.

  8. Predicting Flowering Behavior and Exploring Its Genetic Determinism in an Apple Multi-family Population Based on Statistical Indices and Simplified Phenotyping.

    Science.gov (United States)

    Durand, Jean-Baptiste; Allard, Alix; Guitton, Baptiste; van de Weg, Eric; Bink, Marco C A M; Costes, Evelyne

    2017-01-01

    Irregular flowering over years is commonly observed in fruit trees. The early prediction of tree behavior is highly desirable in breeding programmes. This study aims at performing such predictions, combining simplified phenotyping and statistics methods. Sequences of vegetative vs. floral annual shoots (AS) were observed along axes in trees belonging to five apple related full-sib families. Sequences were analyzed using Markovian and linear mixed models including year and site effects. Indices of flowering irregularity, periodicity and synchronicity were estimated, at tree and axis scales. They were used to predict tree behavior and detect QTL with a Bayesian pedigree-based analysis, using an integrated genetic map containing 6,849 SNPs. The combination of a Biennial Bearing Index (BBI) with an autoregressive coefficient (γg) efficiently predicted and classified the genotype behaviors, despite few misclassifications. Four QTLs common to BBIs and γg and one for synchronicity were highlighted and revealed the complex genetic architecture of the traits. Irregularity resulted from high AS synchronism, whereas regularity resulted from either asynchronous locally alternating or continual regular AS flowering. A relevant and time-saving method, based on a posteriori sampling of axes and statistical indices is proposed, which is efficient to evaluate the tree breeding values for flowering regularity and could be transferred to other species.
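The Biennial Bearing Index used in such studies is commonly computed from a sequence of annual crop (or flowering) values as the mean relative year-to-year change. A minimal sketch of the common textbook form of the index, which may differ in detail from the authors' variant:

```python
def biennial_bearing_index(yields):
    """BBI = mean over consecutive years of |y[t+1] - y[t]| / (y[t+1] + y[t]).

    0 indicates perfectly regular bearing; 1 indicates strict alternation
    between cropping and non-cropping years.
    """
    terms = [abs(b - a) / (a + b)
             for a, b in zip(yields, yields[1:]) if a + b > 0]
    return sum(terms) / len(terms)

print(biennial_bearing_index([100, 100, 100, 100]))  # 0.0, regular bearing
print(biennial_bearing_index([120, 0, 110, 0, 130]))  # 1.0, strict alternation
```

Because each term is normalized by the two-year sum, the index is insensitive to the overall yield level and responds only to irregularity.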

  9. Global biodiversity loss: Exaggerated versus realistic estimates

    Directory of Open Access Journals (Sweden)

    John C. Briggs

    2016-06-01

    Full Text Available For the past 50 years, the public has been made to feel guilty about the tragedy of human-caused biodiversity loss due to the extinction of hundreds or thousands of species every year. Numerous articles and books from the scientific and popular press and publicity on the internet have contributed to a propaganda wave about our grievous loss and the beginning of a sixth mass extinction. However, within the past few years, questions have arisen about the validity of the data which led to the doom scenario. Here I show that, for the past 500 years, terrestrial animals (insects and vertebrates) have been losing less than two species per year due to human causes. The majority of the extinctions have occurred on oceanic islands with little effect on continental ecology. In the marine environment, losses have also been very low. At the same time, speciation has continued to occur and biodiversity gain by this means may have equaled or even surpassed the losses. While species loss is not, so far, a global conservation problem, ongoing population declines within thousands of species that are at risk on land and in the sea constitute an extinction debt that will be paid unless those species can be rescued.

  10. Entrepreneurial Education: A Realistic Alternative for Women and Minorities.

    Science.gov (United States)

    Steward, James F.; Boyd, Daniel R.

    1989-01-01

    Entrepreneurial education is a valid, realistic occupational training alternative for minorities and women in business. Entrepreneurship requires that one become involved with those educational programs that contribute significantly to one's success. (Author)

  11. Student Work Experience: A Realistic Approach to Merchandising Education.

    Science.gov (United States)

    Horridge, Patricia; And Others

    1980-01-01

    Relevant and realistic experiences are needed to prepare the student for a future career. Addresses the results of a survey of colleges and universities in the United States in regard to their student work experience (SWE) in fashion merchandising. (Author)

  12. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  13. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  14. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  15. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  16. Physisorption and desorption of H2, HD and D2 on amorphous solid water ice. Effect of mixing isotopologues on the statistical population of adsorption sites.

    Science.gov (United States)

    Amiaud, Lionel; Fillion, Jean-Hugues; Dulieu, François; Momeni, Anouchah; Lemaire, Jean-Louis

    2015-11-28

    We study the adsorption and desorption of three isotopologues of molecular hydrogen mixed on 10 ML of porous amorphous water ice (ASW) deposited at 10 K. Thermally programmed desorption (TPD) of H2, D2 and HD adsorbed at 10 K has been performed with different mixings. Various coverages of H2, HD and D2 have been explored, and a model taking into account all species adsorbed on the surface is presented in detail. The model we propose allows us to extract the parameters required to fully reproduce the desorption of H2, HD and D2 for various coverages and mixtures in the sub-monolayer regime. The model is based on a statistical description of the process in a grand-canonical ensemble, where adsorbed molecules are described following a Fermi-Dirac distribution.
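The statistical description the authors mention can be illustrated in miniature: in a grand-canonical treatment, each adsorption site of energy E is occupied with Fermi-Dirac probability, and the chemical potential is fixed by the total coverage. The site-energy distribution below (a flat band of binding energies) and the temperature are hypothetical choices for illustration, not values fitted in the paper.

```python
import numpy as np

def fermi_dirac(E, mu, T):
    """Mean occupation of a site of energy E (energies in kelvin, k_B = 1)."""
    x = np.clip((E - mu) / T, -500.0, 500.0)  # clip to avoid exp overflow
    return 1.0 / (np.exp(x) + 1.0)

def coverage(mu, energies, weights, T):
    """Total fractional coverage over a discretized site-energy distribution."""
    return float(np.sum(weights * fermi_dirac(energies, mu, T)))

def solve_mu(target, energies, weights, T, lo=-2000.0, hi=0.0):
    """Bisection for the chemical potential giving the target coverage."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if coverage(mid, energies, weights, T) < target:
            lo = mid  # too empty: raise the chemical potential
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical flat distribution of binding energies between -700 K and -300 K.
energies = np.linspace(-700.0, -300.0, 200)
weights = np.full(energies.size, 1.0 / energies.size)
mu = solve_mu(0.5, energies, weights, T=10.0)
print(round(mu))  # near -500, the median of the flat distribution
```

At 10 K the Fermi-Dirac step is sharp relative to the 400 K energy band, so the most strongly bound sites fill first; this is the mechanism by which mixing isotopologues redistributes species over adsorption sites.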

  17. Visibility in health statistics: a population data linkage study more accurately identifying Aboriginal and Torres Strait Islander Births in Victoria, Australia, 1988-2008

    Directory of Open Access Journals (Sweden)

    Rebecca Ritte

    2017-04-01

    This is the first time that the VPDC and RBDM birth data were linked in Victoria. The matched birth information established a more complete population profile of Aboriginal and/or Torres Strait Islander births. These data will provide a more accurate baseline to enhance the Victorian and Australian governments’ ability to plan services, allocate resources and evaluate funded activities aimed at eliminating disparity experienced by Aboriginal and/or Torres Strait Islander peoples. Importantly, it has established a more accurate denominator from which to calculate Aboriginal infant mortality rates for Victoria, Australia. *Until 2009, only the mother’s Indigenous identification was recorded in the VPDC.

  18. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  19. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  20. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  1. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  2. Genomic selection and association mapping in rice (Oryza sativa): effect of trait genetic architecture, training population composition, marker number and statistical model on accuracy of rice genomic selection in elite, tropical rice breeding lines.

    Science.gov (United States)

    Spindel, Jennifer; Begum, Hasina; Akdemir, Deniz; Virk, Parminder; Collard, Bertrand; Redoña, Edilberto; Atlin, Gary; Jannink, Jean-Luc; McCouch, Susan R

    2015-02-01

    Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its efficacy for breeding inbred lines of rice. We performed a genome-wide association study (GWAS) in conjunction with five-fold GS cross-validation on a population of 363 elite breeding lines from the International Rice Research Institute's (IRRI) irrigated rice breeding program and herein report the GS results. The population was genotyped with 73,147 markers using genotyping-by-sequencing. The training population, statistical method used to build the GS model, number of markers, and trait were varied to determine their effect on prediction accuracy. For all three traits, genomic prediction models outperformed prediction based on pedigree records alone. Prediction accuracies ranged from 0.31 and 0.34 for grain yield and plant height to 0.63 for flowering time. Analyses using subsets of the full marker set suggest that using one marker every 0.2 cM is sufficient for genomic selection in this collection of rice breeding materials. RR-BLUP was the best performing statistical method for grain yield where no large effect QTL were detected by GWAS, while for flowering time, where a single very large effect QTL was detected, the non-GS multiple linear regression method outperformed GS models. For plant height, in which four mid-sized QTL were identified by GWAS, random forest produced the most consistently accurate GS models. Our results suggest that GS, informed by GWAS interpretations of genetic architecture and population structure, could become an effective tool for increasing the efficiency of rice breeding as the costs of genotyping continue to decline.
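RR-BLUP, the best-performing method for grain yield in this study, amounts to ridge regression of phenotypes on all markers simultaneously, shrinking every marker effect toward zero. A minimal sketch on simulated data (the panel size, marker coding, effect sizes and penalty below are illustrative, not the IRRI dataset):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated toy panel: 300 lines x 500 biallelic markers coded -1/0/1.
n_lines, n_markers = 300, 500
X = rng.integers(-1, 2, size=(n_lines, n_markers)).astype(float)
true_effects = rng.normal(0.0, 0.1, size=n_markers)  # many small effects
y = X @ true_effects + rng.normal(0.0, 1.0, size=n_lines)

train, test = np.arange(240), np.arange(240, 300)

def rr_blup(X, y, lam):
    """Ridge (RR-BLUP) marker effects: solve (X'X + lam*I) b = X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

effects = rr_blup(X[train], y[train] - y[train].mean(), lam=100.0)
pred = X[test] @ effects
accuracy = np.corrcoef(pred, y[test])[0, 1]  # accuracy metric used in GS studies
print(round(accuracy, 2))
```

The uniform shrinkage is why RR-BLUP suits traits like yield with no large-effect QTL, while a trait controlled by one major QTL (flowering time here) is better served by a sparse model that lets a single marker carry a large effect.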

  3. Genomic selection and association mapping in rice (Oryza sativa): effect of trait genetic architecture, training population composition, marker number and statistical model on accuracy of rice genomic selection in elite, tropical rice breeding lines.

    Directory of Open Access Journals (Sweden)

    Jennifer Spindel

    2015-02-01

    Full Text Available Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its efficacy for breeding inbred lines of rice. We performed a genome-wide association study (GWAS) in conjunction with five-fold GS cross-validation on a population of 363 elite breeding lines from the International Rice Research Institute's (IRRI) irrigated rice breeding program and herein report the GS results. The population was genotyped with 73,147 markers using genotyping-by-sequencing. The training population, statistical method used to build the GS model, number of markers, and trait were varied to determine their effect on prediction accuracy. For all three traits, genomic prediction models outperformed prediction based on pedigree records alone. Prediction accuracies ranged from 0.31 and 0.34 for grain yield and plant height to 0.63 for flowering time. Analyses using subsets of the full marker set suggest that using one marker every 0.2 cM is sufficient for genomic selection in this collection of rice breeding materials. RR-BLUP was the best performing statistical method for grain yield where no large effect QTL were detected by GWAS, while for flowering time, where a single very large effect QTL was detected, the non-GS multiple linear regression method outperformed GS models. For plant height, in which four mid-sized QTL were identified by GWAS, random forest produced the most consistently accurate GS models. Our results suggest that GS, informed by GWAS interpretations of genetic architecture and population structure, could become an effective tool for increasing the efficiency of rice breeding as the costs of genotyping continue to decline.

  4. Genomic Selection and Association Mapping in Rice (Oryza sativa): Effect of Trait Genetic Architecture, Training Population Composition, Marker Number and Statistical Model on Accuracy of Rice Genomic Selection in Elite, Tropical Rice Breeding Lines

    Science.gov (United States)

    Spindel, Jennifer; Begum, Hasina; Akdemir, Deniz; Virk, Parminder; Collard, Bertrand; Redoña, Edilberto; Atlin, Gary; Jannink, Jean-Luc; McCouch, Susan R.

    2015-01-01

    Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its efficacy for breeding inbred lines of rice. We performed a genome-wide association study (GWAS) in conjunction with five-fold GS cross-validation on a population of 363 elite breeding lines from the International Rice Research Institute's (IRRI) irrigated rice breeding program and herein report the GS results. The population was genotyped with 73,147 markers using genotyping-by-sequencing. The training population, statistical method used to build the GS model, number of markers, and trait were varied to determine their effect on prediction accuracy. For all three traits, genomic prediction models outperformed prediction based on pedigree records alone. Prediction accuracies ranged from 0.31 and 0.34 for grain yield and plant height to 0.63 for flowering time. Analyses using subsets of the full marker set suggest that using one marker every 0.2 cM is sufficient for genomic selection in this collection of rice breeding materials. RR-BLUP was the best performing statistical method for grain yield where no large effect QTL were detected by GWAS, while for flowering time, where a single very large effect QTL was detected, the non-GS multiple linear regression method outperformed GS models. For plant height, in which four mid-sized QTL were identified by GWAS, random forest produced the most consistently accurate GS models. Our results suggest that GS, informed by GWAS interpretations of genetic architecture and population structure, could become an effective tool for increasing the efficiency of rice breeding as the costs of genotyping continue to decline. PMID:25689273

  5. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1987-01-01

    In-depth exploration of the implications of carrier populations and Fermi energies examines distribution of electrons in energy bands and impurity levels of semiconductors. Also: kinetics of semiconductors containing excess carriers, particularly in terms of trapping, excitation, and recombination.

  6. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
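The diagnostic-test measures the review covers (sensitivity, specificity, accuracy, likelihood ratios) all follow directly from a 2x2 table; a minimal sketch with made-up counts:

```python
# Hypothetical 2x2 diagnostic-test table (counts are purely illustrative):
#                 disease+   disease-
# test positive      90          30
# test negative      10         170
tp, fp, fn, tn = 90, 30, 10, 170

sensitivity = tp / (tp + fn)              # P(test+ | disease present)
specificity = tn / (tn + fp)              # P(test- | disease absent)
accuracy = (tp + tn) / (tp + fp + fn + tn)
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
print(f"LR+={lr_pos:.2f} LR-={lr_neg:.2f} accuracy={accuracy:.2f}")
```

A likelihood ratio above 10 (or below 0.1) is conventionally taken as strong evidence for (or against) disease; the illustrative table above gives LR+ = 6.0.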

  7. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  8. The relative greenhouse gas impacts of realistic dietary choices

    International Nuclear Information System (INIS)

    Berners-Lee, M.; Hoolohan, C.; Cammack, H.; Hewitt, C.N.

    2012-01-01

    The greenhouse gas (GHG) emissions embodied in 61 different categories of food are used, with information on the diet of different groups of the population (omnivorous, vegetarian and vegan), to calculate the embodied GHG emissions in different dietary scenarios. We calculate that the embodied GHG content of the current UK food supply is 7.4 kg CO₂e person⁻¹ day⁻¹, or 2.7 t CO₂e person⁻¹ y⁻¹. This gives total food-related GHG emissions of 167 Mt CO₂e (1 Mt = 10⁶ metric tonnes; CO₂e being the mass of CO₂ that would have the same global warming potential, when measured over 100 years, as a given mixture of greenhouse gases) for the entire UK population in 2009. This is 27% of total direct GHG emissions in the UK, or 19% of total GHG emissions from the UK, including those embodied in goods produced abroad. We calculate that potential GHG savings of 22% and 26% can be made by changing from the current UK-average diet to a vegetarian or vegan diet, respectively. Taking the average GHG saving from six vegetarian or vegan dietary scenarios compared with the current UK-average diet gives a potential national GHG saving of 40 Mt CO₂e y⁻¹. This is equivalent to a 50% reduction in current exhaust pipe emissions from the entire UK passenger car fleet. Hence realistic choices about diet can make substantial differences to embodied GHG emissions. - Highlights: ► We calculate the greenhouse gas emissions embodied in different diets. ► The embodied GHG content of the current UK food supply is 7.4 kg CO₂e person⁻¹ day⁻¹. ► Changing to a vegetarian or vegan diet reduces GHG emissions by 22–26%. ► Changing to a vegetarian or vegan diet would reduce UK GHG emissions by 40 Mt CO₂e y⁻¹.
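The per-capita and national figures in this abstract are mutually consistent; a quick arithmetic check (the UK population value below is an approximation supplied here for illustration, not a figure from the paper):

```python
daily_kg = 7.4                    # kg CO2e per person per day (from the abstract)
annual_t = daily_kg * 365 / 1000  # convert to tonnes per person per year
print(f"annual per-capita emissions: {annual_t:.2f} t CO2e")

uk_population_2009 = 62e6         # approximate 2009 UK population (assumption)
total_mt = annual_t * uk_population_2009 / 1e6
print(f"national food-related total: {total_mt:.0f} Mt CO2e")

# Saving at the midpoint (24%) of the quoted 22-26% dietary-change range
saving_mt = 0.24 * total_mt
print(f"vegetarian/vegan saving: {saving_mt:.0f} Mt CO2e per year")
```

The computed totals land on the abstract's 2.7 t person⁻¹ y⁻¹, 167 Mt, and 40 Mt figures, which is a useful sanity check on the units.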

  9. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  10. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  11. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  12. Generating Geospatially Realistic Driving Patterns Derived From Clustering Analysis Of Real EV Driving Data

    DEFF Research Database (Denmark)

    Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob

    2014-01-01

    In order to provide a vehicle fleet that realistically represents the predicted Electric Vehicle (EV) penetration for the future, a model is required that mimics people's driving behaviour rather than simply playing back collected data. When the focus is broadened from a traditional user...... scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs, based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where......

  13. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally there are two kinds of uncertainties required to be identified and quantified, which involve model uncertainties and plant status uncertainties. Particularly, it will take huge effort to systematically quantify individual model uncertainty of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full ranged BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. Regarding the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while CSAU methodology is applied to quantify the effect of plant status uncertainty on PCT calculation. Generally, DRHM methodology can generate about 80-100 K margin on PCT as compared to Appendix K bounding state LOCA analysis.

  14. TOWARD CHARACTERIZATION OF THE TYPE IIP SUPERNOVA PROGENITOR POPULATION: A STATISTICAL SAMPLE OF LIGHT CURVES FROM Pan-STARRS1

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, N. E.; Soderberg, A. M.; Chornock, R.; Berger, E.; Challis, P.; Drout, M.; Kirshner, R. P.; Lunnan, R.; Marion, G. H.; Margutti, R.; McKinnon, R.; Milisavljevic, D. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Gezari, S. [Department of Astronomy, University of Maryland, College Park, MD 20742-2421 (United States); Betancourt, M. [Department of Statistics, University of Warwick, Coventry (United Kingdom); Foley, R. J. [Astronomy Department, University of Illinois at Urbana-Champaign, 1002 West Green Street, Urbana, IL 61801 (United States); Narayan, G. [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States); Rest, A. [Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Kankare, E.; Mattila, S. [Finnish Centre for Astronomy with ESO (FINCA), University of Turku, Väisäläntie 20, 21500 Piikkiö (Finland); Smartt, S. J., E-mail: nsanders@cfa.harvard.edu [Astrophysics Research Centre, School of Mathematics and Physics, Queens University, BT7 1NN, Belfast (United Kingdom); and others

    2015-02-01

    In recent years, wide-field sky surveys providing deep multiband imaging have presented a new path for indirectly characterizing the progenitor populations of core-collapse supernovae (SNe): systematic light-curve studies. We assemble a set of 76 grizy-band Type IIP SN light curves from Pan-STARRS1, obtained over a constant survey program of 4 yr and classified using both spectroscopy and machine-learning-based photometric techniques. We develop and apply a new Bayesian model for the full multiband evolution of each light curve in the sample. We find no evidence of a subpopulation of fast-declining explosions (historically referred to as "Type IIL" SNe). However, we identify a highly significant relation between the plateau phase decay rate and peak luminosity among our SNe IIP. These results argue in favor of a single parameter, likely determined by initial stellar mass, predominantly controlling the explosions of red supergiants. This relation could also be applied for SN cosmology, offering a standardizable candle good to an intrinsic scatter of ≲ 0.2 mag. We compare each light curve to physical models from hydrodynamic simulations to estimate progenitor initial masses and other properties of the Pan-STARRS1 Type IIP SN sample. We show that correction of systematic discrepancies between modeled and observed SN IIP light-curve properties and an expanded grid of progenitor properties are needed to enable robust progenitor inferences from multiband light-curve samples of this kind. This work will serve as a pathfinder for photometric studies of core-collapse SNe to be conducted through future wide-field transient searches.

  15. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques of outdoor Augmented Reality (AR) has been an attractive topic since the last two decades considering the sizeable amount of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposed a much newer, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. Second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through its effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems. PMID:25268480

  16. Problem Posing with Realistic Mathematics Education Approach in Geometry Learning

    Science.gov (United States)

    Mahendra, R.; Slamet, I.; Budiyono

    2017-09-01

    One of the difficulties students face in learning geometry is the topic of planes, which requires them to understand abstract material. The aim of this research is to determine the effect of the Problem Posing learning model with the Realistic Mathematics Education Approach on geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results of this research indicate that the Problem Posing learning model with the Realistic Mathematics Education Approach can significantly improve students' conceptual understanding in geometry learning, especially on plane topics. This is because, under Problem Posing with the Realistic Mathematics Education Approach, students become active in constructing their knowledge, posing problems, and solving them in realistic contexts, so it is easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with the Realistic Mathematics Education Approach is appropriate for mathematics learning, especially for geometry material. Furthermore, it can improve student achievement.

  17. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques of outdoor Augmented Reality (AR) has been an attractive topic since the last two decades considering the sizeable amount of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposed a much newer, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. Second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through its effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems.

  18. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where and how much of a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.

  19. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  20. Comparative study of the effectiveness of three learning environments: Hyper-realistic virtual simulations, traditional schematic simulations and traditional laboratory

    Directory of Open Access Journals (Sweden)

    Maria Isabel Suero

    2011-10-01

    Full Text Available This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output. This new virtual environment concept, which we call hyper-realistic, transcends basic schematic simulation; it provides the user with a more realistic perception of a physical phenomenon being simulated. We compared the learning achievements of three equivalent, homogeneous groups of undergraduates—an experimental group who used only the hyper-realistic virtual laboratory, a first control group who used a schematic simulation, and a second control group who used the traditional laboratory. The three groups received the same theoretical preparation and carried out equivalent practicals in their respective learning environments. The topic chosen for the experiment was optical aberrations. An analysis of variance applied to the data of the study demonstrated a statistically significant difference (p value <0.05) between the three groups. The learning achievements attained by the group using the hyper-realistic virtual environment were 6.1 percentage points higher than those for the group using the traditional schematic simulations and 9.5 percentage points higher than those for the group using the traditional laboratory.
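The three-group analysis of variance used in this study can be sketched as follows; the scores below are invented for illustration, and `scipy.stats.f_oneway` stands in for whatever ANOVA implementation the authors used:

```python
from scipy.stats import f_oneway

# Illustrative post-test scores (not the study's data) for the three groups
hyper_realistic = [78, 82, 85, 80, 88, 84, 79, 86]
schematic       = [72, 75, 78, 74, 80, 76, 73, 77]
traditional_lab = [68, 71, 74, 70, 76, 72, 69, 73]

# One-way ANOVA: do the group means differ more than chance would explain?
f_stat, p_value = f_oneway(hyper_realistic, schematic, traditional_lab)
print(f"F={f_stat:.1f}, p={p_value:.4f}")
```

A p value below 0.05, as reported in the abstract, only says that at least one group mean differs; pairwise post-hoc comparisons are needed to attribute the difference to a specific environment.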

  1. Environmental accounting and statistics

    International Nuclear Information System (INIS)

    Bartelmus, P.L.P.

    1992-01-01

    The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs

  2. Evaluation of photovoltaic panel temperature in realistic scenarios

    International Nuclear Information System (INIS)

    Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang

    2016-01-01

    Highlights: • The developed realistic model captures more reasonably the thermal response and hysteresis effects. • The predicted panel temperature is as high as 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that are feasible to be used in realistic scenarios. Effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. While in realistic scenarios, the panel temperature variation in a day is different from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on the photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the current field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
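The efficiency losses quoted in this record follow the standard linear temperature-coefficient model; a sketch with a hypothetical operating point (the paper's own 2.9–9.0% range comes from its measured field temperatures, not from the single temperature assumed here):

```python
def efficiency_loss_pct(temp_coeff_pct_per_K, cell_temp_C, ref_temp_C=25.0):
    """Relative efficiency loss (%) from the linear temperature-coefficient
    model: loss = |coefficient| * (cell temperature - reference temperature)."""
    return abs(temp_coeff_pct_per_K) * (cell_temp_C - ref_temp_C)

# Hypothetical operating point: cell at 39 C, i.e. 14 K above the 25 C
# standard-test-condition reference
for coeff in (-0.21, -0.50):
    loss = efficiency_loss_pct(coeff, 39.0)
    print(f"coefficient {coeff}%/K -> efficiency loss {loss:.1f}%")
```

Under this assumed 14 K excess, the two bounding coefficients give losses of about 2.9% and 7.0%, inside the range the field tests reported.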

  3. Fatigue - determination of a more realistic usage factor

    International Nuclear Information System (INIS)

    Lang, H.

    2001-01-01

    The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is of crucial importance for enabling determination of a realistic usage factor. Determination of the elastic-plastic strain range using the Ke factor from fictitious elastically calculated loads is also important in the event of elastic behaviour being exceeded. This paper thus examines both points in detail. A fatigue module with additional options, which functions on this basis, is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application

  4. Putting a Realistic Theory of Mind into Agency Theory

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stea, Diego

    2014-01-01

    Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant...... concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management...

  5. Comparison of temporal realistic telecommunication base station exposure with worst-case estimation in two countries

    International Nuclear Information System (INIS)

    Mahfouz, Z.; Verloock, L.; Joseph, W.; Tanghe, E.; Gati, A.; Wiart, J.; Lautru, D.; Hanna, V. F.; Martens, L.

    2013-01-01

    The influence of temporal daily exposure to global system for mobile communications (GSM) and universal mobile telecommunications systems and high speed down-link packet access (UMTS-HSDPA) is investigated using spectrum analyser measurements in two countries, France and Belgium. Temporal variations and traffic distributions are investigated. Three different methods to estimate maximal electric-field exposure are compared. The maximal realistic (99 %) and the maximal theoretical extrapolation factor used to extrapolate the measured broadcast control channel (BCCH) for GSM and the common pilot channel (CPICH) for UMTS are presented and compared for the first time in the two countries. Similar conclusions are found in the two countries for both urban and rural areas: worst-case exposure assessment overestimates realistic maximal exposure up to 5.7 dB for the considered example. In France, the values are the highest, because of the higher population density. The results for the maximal realistic extrapolation factor at the weekdays are similar to those from weekend days. (authors)

  6. Comparison of temporal realistic telecommunication base station exposure with worst-case estimation in two countries.

    Science.gov (United States)

    Mahfouz, Zaher; Verloock, Leen; Joseph, Wout; Tanghe, Emmeric; Gati, Azeddine; Wiart, Joe; Lautru, David; Hanna, Victor Fouad; Martens, Luc

    2013-12-01

    The influence of temporal daily exposure to global system for mobile communications (GSM) and universal mobile telecommunications systems and high speed downlink packet access (UMTS-HSDPA) is investigated using spectrum analyser measurements in two countries, France and Belgium. Temporal variations and traffic distributions are investigated. Three different methods to estimate maximal electric-field exposure are compared. The maximal realistic (99 %) and the maximal theoretical extrapolation factor used to extrapolate the measured broadcast control channel (BCCH) for GSM and the common pilot channel (CPICH) for UMTS are presented and compared for the first time in the two countries. Similar conclusions are found in the two countries for both urban and rural areas: worst-case exposure assessment overestimates realistic maximal exposure up to 5.7 dB for the considered example. In France, the values are the highest, because of the higher population density. The results for the maximal realistic extrapolation factor on weekdays are similar to those on weekend days.

  7. The Statistical Fermi Paradox

    Science.gov (United States)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (both of which do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
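The SEH's key claim — that a product of many independent, arbitrarily distributed positive factors tends to a log-normal variable — is easy to check by Monte Carlo. A stdlib-only sketch (the Uniform(0.2, 2.0) factor distribution is an arbitrary illustrative choice, not one of Dole's actual factors):

```python
import math
import random
import statistics

random.seed(42)

N_FACTORS = 10      # stand-ins for the ten astrobiological factors
N_SAMPLES = 20000

# Product of independent positive random variables (here Uniform(0.2, 2.0);
# the CLT argument allows any positive distribution for each factor).
samples = []
for _ in range(N_SAMPLES):
    product = 1.0
    for _ in range(N_FACTORS):
        product *= random.uniform(0.2, 2.0)
    samples.append(product)

# If the product is ~log-normal, its logarithm should be ~Gaussian with mean
# N_FACTORS * E[ln U], where E[ln U] = (b*ln b - a*ln a)/(b - a) - 1 for U ~ Uniform(a, b).
logs = [math.log(x) for x in samples]
a, b = 0.2, 2.0
expected_mean = N_FACTORS * ((b * math.log(b) - a * math.log(a)) / (b - a) - 1)
print(round(statistics.mean(logs), 2), round(expected_mean, 2))
```

The sample mean of the log-products agrees with the analytic value, and a histogram of `logs` would look Gaussian, i.e. the product itself is approximately log-normal.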

  8. Acute exposure to realistic acid fog: effects on respiratory function and airway responsiveness in asthmatics.

    OpenAIRE

    Leduc, Dimitri; Fally, Sophie; De Vuyst, Paul; Wollast, Roland; Yernault, Jean Claude

    1995-01-01

    Naturally occurring fogs in industrialized cities are contaminated by acidic air pollutants. In Brussels, Belgium, the pH of polluted fogwater may be as low as 3, with osmolarity as low as 30 mOsm. In order to explore the short-term respiratory effects of a realistic acid-polluted fog, we collected samples of acid fog in Brussels, Belgium, a densely populated and industrialized city; we defined the characteristics of this fog and exposed asthmatic volunteers at rest through a face mask to fog...

  9. Statistical characterization of wave propagation in mine environments

    KAUST Repository

    Bakir, Onur

    2012-07-01

    A computational framework for statistically characterizing electromagnetic (EM) wave propagation through mine tunnels and galleries is presented. The framework combines a multi-element probabilistic collocation (ME-PC) method with a novel domain-decomposition (DD) integral equation-based EM simulator to obtain statistics of electric fields due to wireless transmitters in realistic mine environments. © 2012 IEEE.

  10. A possible definition of a Realistic Physics Theory

    OpenAIRE

    Gisin, Nicolas

    2014-01-01

    A definition of a Realistic Physics Theory is proposed based on the idea that, at all times, the set of physical properties possessed (at that time) by a system should unequivocally determine the probabilities of outcomes of all possible measurements.

  11. Evaluation of Highly Realistic Training for Independent Duty Corpsmen Students

    Science.gov (United States)

    2015-05-21

    that he or she can perform desired actions or behaviors (Bandura, 1977). In the present study, three types of self-efficacy were assessed: general...such as resilience. Reference: Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral

  12. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  13. Automated Finger Spelling by Highly Realistic 3D Animation

    Science.gov (United States)

    Adamo-Villani, Nicoletta; Beni, Gerardo

    2004-01-01

    We present the design of a new 3D animation tool for self-teaching (signing and reading) finger spelling, the first basic component in learning any sign language. We have designed a highly realistic hand with natural animation of the finger motions. Smoothness of motion (in real time) is achieved via programmable blending of animation segments. The…

  14. Creating a Realistic Context for Team Projects in HCI

    NARCIS (Netherlands)

    Koppelman, Herman; van Dijk, Betsy

    2006-01-01

    Team projects are nowadays common practice in HCI education. This paper focuses on the role of clients and users in team projects in introductory HCI courses. In order to provide projects with a realistic context we invite people from industry to serve as clients for the student teams. Some of them

  15. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  16. Empirical Evidence for Niss' "Implemented Anticipation" in Mathematising Realistic Situations

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2012-01-01

    Mathematisation of realistic situations is an on-going focus of research. Classroom data from a Year 9 class participating in a program of structured modelling of real situations was analysed for evidence of Niss's theoretical construct, implemented anticipation, during mathematisation. Evidence was found for two of three proposed aspects. In…

  17. Nuclear properties with realistic Hamiltonians through spectral distribution theory

    International Nuclear Information System (INIS)

    Vary, J.P.; Belehrad, R.; Dalton, B.J.

    1979-01-01

    Motivated by the need for non-perturbative methods for utilizing realistic nuclear Hamiltonians H, the authors use spectral distribution theory, based on calculated moments of H, to obtain specific bulk and valence properties of finite nuclei. The primary emphasis here is to present results for the binding energies of nuclei obtained with and without an assumed core. (Auth.)

  18. Two-Capacitor Problem: A More Realistic View.

    Science.gov (United States)

    Powell, R. A.

    1979-01-01

    Discusses the two-capacitor problem by considering the self-inductance of the circuit used and by determining how well the usual series RC circuit approximates the two-capacitor problem when realistic values of L, C, and R are chosen. (GA)

  19. Rethinking Mathematics Teaching in Liberia: Realistic Mathematics Education

    Science.gov (United States)

    Stemn, Blidi S.

    2017-01-01

    In some African cultures, the concept of division does not necessarily mean sharing money or an item equally. How an item is shared might depend on the ages of the individuals involved. This article describes the use of the Realistic Mathematics Education (RME) approach to teach division word problems involving money in a 3rd-grade class in…

  20. Improving Mathematics Teaching in Kindergarten with Realistic Mathematical Education

    Science.gov (United States)

    Papadakis, Stamatios; Kalogiannakis, Michail; Zaranis, Nicholas

    2017-01-01

    The present study investigates and compares the influence of teaching Realistic Mathematics on the development of mathematical competence in kindergarten. The sample consisted of 231 Greek kindergarten students. For the implementation of the survey, we conducted an intervention, which included one experimental and one control group. Children in…

  1. Towards a Realist Sociology of Education: A Polyphonic Review Essay

    Science.gov (United States)

    Grenfell, Michael; Hood, Susan; Barrett, Brian D.; Schubert, Dan

    2017-01-01

    This review essay evaluates Karl Maton's "Knowledge and Knowers: Towards a Realist Sociology of Education" as a recent examination of the sociological causes and effects of education in the tradition of the French social theorist Pierre Bourdieu and the British educational sociologist Basil Bernstein. Maton's book synthesizes the…

  2. Principles of maximally classical and maximally realistic quantum ...

    Indian Academy of Sciences (India)

    Principles of maximally classical and maximally realistic quantum mechanics. S M ROY. Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India. Abstract. Recently, Auberson, Mahoux, Roy and Singh proved a long-standing conjecture of Roy and Singh: in 2N-dimensional phase space, ...

  3. Place of a Realistic Teacher Education Pedagogy in an ICT ...

    African Journals Online (AJOL)

    This article is based on a study undertaken to examine the impact of introducing a realistic teacher education pedagogy (RTEP) oriented learning environment supported by ICT on distance teacher education in Uganda. It gives an overview of the quality, quantity and training of teachers in primary and secondary schools

  4. Elements of a realistic 17 GHz FEL/TBA design

    International Nuclear Information System (INIS)

    Hopkins, D.B.; Halbach, K.; Hoyer, E.H.; Sessler, A.M.; Sternbach, E.J.

    1989-01-01

    Recently, renewed interest in an FEL version of a two-beam accelerator (TBA) has prompted a study of practical system and structure designs for achieving the specified physics goals. This paper presents elements of a realistic design for an FEL/TBA suitable for a 1 TeV, 17 GHz linear collider. 13 refs., 8 figs., 2 tabs

  5. International Management: Creating a More Realistic Global Planning Environment.

    Science.gov (United States)

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  6. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author covers tests of the assumptions of randomness and normality, and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
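One topic the book mentions, a confidence interval for a population median, can be obtained distribution-free from order statistics: the number of observations falling below the median is Binomial(n, 1/2), which pins down which order statistics bracket the median. A stdlib-only sketch with hypothetical data:

```python
import math
import random

random.seed(1)
data = sorted(random.gauss(50.0, 10.0) for _ in range(25))  # hypothetical sample
n = len(data)

def binom_cdf(k, n, p=0.5):
    """P(Y <= k) for Y ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Largest j with P(Y <= j) <= alpha/2 gives the interval [X_(j+1), X_(n-j)]
# (1-indexed order statistics), which covers the median with >= 1 - alpha confidence.
alpha = 0.05
j = 0
while binom_cdf(j + 1, n) <= alpha / 2:
    j += 1
lower, upper = data[j], data[n - 1 - j]   # 0-indexed: X_(j+1) and X_(n-j)
coverage = 1 - 2 * binom_cdf(j, n)
print(round(lower, 1), round(upper, 1), round(coverage, 3))
```

For n = 25 this selects the 8th and 18th order statistics, with actual coverage slightly above the nominal 95 % because the binomial is discrete.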

  7. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation into images' subjective perceptual quality scores. The experimental results on benchmark databases present the effectiveness and generalizability of the proposed model.

  8. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

    Abstract. Background: Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods: The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results: Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions: Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  9. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    ... a good collection of official statistics of that time. ... statistical agencies and institutions to provide details of statistical activities ... several training programmes. ... successful completion of Indian Statistical Service examinations, the ...

  10. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on an accounting of inhabitant numbers or inhabitant density, applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the obtained results, underestimating the importance of population, mainly in territorial units with a predominance of rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not the ideal setting when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute numbers and density) for different administrative territorial units (parishes and BGRI - the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on the application of the traditional approach or the dasymetric
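The building areal weighting step described above can be sketched simply: a census unit's population is redistributed to its buildings in proportion to footprint area. The building IDs and areas below are hypothetical stand-ins for real BGRI/building inputs:

```python
# Dasymetric redistribution sketch: disaggregate a census unit's population
# to buildings via areal weighting (allocation proportional to footprint area).

def dasymetric_population(unit_population, building_areas):
    """Allocate unit_population across buildings proportionally to area."""
    total_area = sum(building_areas.values())
    return {bid: unit_population * area / total_area
            for bid, area in building_areas.items()}

buildings = {"A": 120.0, "B": 300.0, "C": 180.0}  # footprint areas in m^2
alloc = dasymetric_population(600, buildings)
print(alloc)  # → {'A': 120.0, 'B': 300.0, 'C': 180.0}
```

The allocation always sums back to the unit's population, so no people are created or lost; refinements restrict the target zones further, e.g. weighting by building volume or residential use.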

  11. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
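The matrix-notation linear least squares the review describes reduces, for a straight-line fit y = a + b·x, to solving the 2x2 normal equations (XᵀX)β = Xᵀy. A stdlib-only sketch with hypothetical data chosen to lie exactly on y = 1 + 2x:

```python
# Ordinary least squares for a straight-line fit via the normal equations.

def linfit(xs, ys):
    """Solve (X^T X) beta = X^T y in closed form for beta = [a, b]."""
    n = len(xs)
    sx = sum(xs); sxx = sum(x * x for x in xs)
    sy = sum(ys); sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx            # determinant of X^T X
    a = (sxx * sy - sx * sxy) / det    # intercept
    b = (n * sxy - sx * sy) / det      # slope
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 1 + 2x
a, b = linfit(xs, ys)
print(a, b)  # → 1.0 2.0
```

For real work one would use a linear-algebra library (and, for the correlated-data and constrained cases the review discusses, the full variance-covariance machinery), but the closed form shows what the matrix solution computes.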

  12. Application of statistical shape analysis for the estimation of bone and forensic age using the shapes of the 2nd, 3rd, and 4th cervical vertebrae in a young Japanese population.

    Science.gov (United States)

    Rhee, Chang-Hoon; Shin, Sang Min; Choi, Yong-Seok; Yamaguchi, Tetsutaro; Maki, Koutaro; Kim, Yong-Il; Kim, Seong-Sik; Park, Soo-Byung; Son, Woo-Sung

    2015-12-01

    From computed tomographic images, the dentocentral synchondrosis can be identified in the second cervical vertebra. This can demarcate the border between the odontoid process and the body of the 2nd cervical vertebra and serve as a good model for the prediction of bone and forensic age. Nevertheless, until now, there has been no application of the 2nd cervical vertebra based on the dentocentral synchondrosis. In this study, statistical shape analysis was used to build bone and forensic age estimation regression models. Following the principles of statistical shape analysis and principal components analysis, we used cone-beam computed tomography (CBCT) to evaluate a Japanese population (35 males and 45 females, from 5 to 19 years old). The narrowest prediction intervals among the multivariate regression models were 19.63 for bone age and 2.99 for forensic age. There was no significant difference between form space and shape space in the bone and forensic age estimation models. However, for gender comparison, the bone and forensic age estimation models for males had the higher explanatory power. This study derived an improved objective and quantitative method for bone and forensic age estimation based on only the 2nd, 3rd and 4th cervical vertebral shapes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because of the information that can be obtained about an entire population by observing only a part of it. Because of the information they provide, surveys are therefore used in many research areas. In economics, statistics are used in decision making, in choosing competitive strategies, in the analysis of certain economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyze the existing parking-space situation in a given locality.
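The idea behind the paper's parking-space survey, estimating a population quantity from a simple random sample, can be sketched in a few lines. The population values below are hypothetical stand-ins, not the paper's data:

```python
# Simple random sampling sketch: estimate a population mean from a sample.
import random

random.seed(7)
# Hypothetical population: occupied parking spaces observed at 10,000 sites.
population = [random.randint(0, 50) for _ in range(10000)]

sample = random.sample(population, 400)       # simple random sample, n = 400
estimate = sum(sample) / len(sample)          # sample mean as the estimator
true_mean = sum(population) / len(population)
print(round(estimate, 1), round(true_mean, 1))
```

Because every unit has the same inclusion probability, the sample mean is an unbiased estimator of the population mean, and its standard error shrinks as 1/√n.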

  14. Realistic Visualization of Virtual Views and Virtual Cinema

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer...... generated characters, "virtual actors", in the motion picture production increases every day. While the most known computer graphics techniques have largely been adopted successfully in nowadays fictions, it still remains very challenging to implement virtual actors which would resemble, visually, human...... beings. Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept...

  15. Photo-Realistic Image Synthesis and Virtual Cinematography

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is strictly related to the grown popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer generated...... characters, "virtual actors", in the motion picture production increases every day. While the most known computer graphics techniques have largely been adopted successfully in nowadays fictions, it still remains very challenging to implement virtual actors which would resemble, visually, human beings....... Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept is then gaining consensus...

  16. Role-playing for more realistic technical skills training.

    Science.gov (United States)

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  17. Realistic minimum accident source terms - Evaluation, application, and risk acceptance

    International Nuclear Information System (INIS)

    Angelo, P. L.

    2009-01-01

    The evaluation, application, and risk acceptance for realistic minimum accident source terms can represent a complex and arduous undertaking. This effort poses a very high impact to design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tenn., two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)

  18. Realistic electricity market simulator for energy and economic studies

    International Nuclear Information System (INIS)

    Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul

    2007-01-01

    Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)

  19. Facilities upgrade for natural forces: traditional vs. realistic approach

    International Nuclear Information System (INIS)

    Terkun, V.

    1985-01-01

    The traditional method utilized for upgrading existing buildings and equipment involves the following steps: perform a structural study using finite element analysis and some in situ testing; compare predicted member forces/stresses to material code allowables; determine strengthening schemes for those structural members judged to be weak; and estimate the cost of the required upgrades. This approach will result in structural modifications that are not only conservative but very expensive as well. The realistic structural evaluation approach uses traditional data to predict structural weaknesses as a final step. Next, using considerable information now available for buildings and equipment exposed to natural hazards, engineering judgments about the structures being evaluated can be made with a great deal of confidence. This approach does not eliminate conservatism entirely, but it does reduce it to a reasonable and realistic level. As a result, the upgrade cost goes down without compromising the low risk necessary for vital facilities

  20. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  1. Blend Shape Interpolation and FACS for Realistic Avatar

    Science.gov (United States)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

    The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus to the rapid advancement of complex virtual human facial models. With face-to-face communication being the most natural form of human interaction, facial animation systems have become attractive for sundry applications in the information technology era. The production of computer-animated movies using synthetic actors is still a challenging issue. A proposed facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can be identified immediately via very subtle changes in facial expression. Facial expressions, being a complex yet important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with both the geometric representation of the human face and the control of the facial animation. We developed a new approach that integrates blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. BSI is used to generate the natural face, while FACS is employed to reflect the exact facial muscle movements for four basic emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The results in perceiving realistic facial expressions for virtual human emotions, based on facial skin color and texture, may contribute to the development of virtual reality and game environments in computer-aided graphics animation systems.
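    The BSI step described above reduces to a weighted sum of displacement vectors between a neutral mesh and a set of target shapes. A minimal sketch (the toy mesh, target shape and weights below are illustrative inventions, not the authors' model):

```python
import numpy as np

def blend_shapes(neutral, targets, weights):
    """Linear blend-shape interpolation: offset the neutral mesh by a
    weighted sum of per-target displacement vectors."""
    result = neutral.astype(float).copy()
    for target, w in zip(targets, weights):
        result += w * (target - neutral)
    return result

# Toy 3-vertex "face" in 3D; one target shape raises the middle vertex.
neutral = np.zeros((3, 3))
smile = neutral.copy()
smile[1, 1] = 1.0  # displace vertex 1 along y

half_smile = blend_shapes(neutral, [smile], [0.5])
print(half_smile[1, 1])  # 0.5
```

    In a FACS-driven system, each action unit would map to one or more such target shapes, with the weights driven by the desired expression.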

  2. Fully Realistic Multi-Criteria Multi-Modal Routing

    OpenAIRE

    Gündling, Felix; Keyhani, Mohammad Hossein; Schnee, Mathias; Weihe, Karsten

    2014-01-01

    We report on a multi-criteria search system, in which the German long- and short-distance trains, local public transport, walking, private car, private bike, and taxi are incorporated. The system is fully realistic. Three optimization criteria are addressed: travel time, travel cost, and convenience. Our algorithmic approach computes a complete Pareto set of reasonable connections. The computational study demonstrates that, even in such a large-scale, highly complex scenario, approp...

  3. Realistically Rendering SoC Traffic Patterns with Interrupt Awareness

    DEFF Research Database (Denmark)

    Angiolini, Frederico; Mahadevan, Sharkar; Madsen, Jan

    2005-01-01

    … to generate realistic test traffic. This paper presents a selection of applications using interrupt-based synchronization; a reference methodology to split such applications into execution subflows and to adjust the overall execution stream based upon hardware events; and a reactive simulation device capable of correctly replicating such software behaviours in the MPSoC design phase. Additionally, we validate the proposed concept by showing cycle-accurate reproduction of a previously traced application flow.

  4. Realistic modeling of chamber transport for heavy-ion fusion

    International Nuclear Information System (INIS)

    Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.

    2003-01-01

    Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions.

  5. A scan for models with realistic fermion mass patterns

    International Nuclear Information System (INIS)

    Bijnens, J.; Wetterich, C.

    1986-03-01

    We consider models which have no small Yukawa couplings unrelated to symmetry. This situation is generic in higher dimensional unification where Yukawa couplings are predicted to have strength similar to the gauge couplings. Generations have then to be differentiated by symmetry properties and the structure of fermion mass matrices is given in terms of quantum numbers alone. We scan possible symmetries leading to realistic mass matrices. (orig.)

  6. Bell Operator Method to Classify Local Realistic Theories

    International Nuclear Information System (INIS)

    Nagata, Koji

    2010-01-01

    We review the historical fact of multipartite Bell inequalities with an arbitrary number of settings. An explicit local realistic model for the values of a correlation function, given in a two-setting Bell experiment (two-setting model), works only for the specific set of settings in the given experiment, but cannot construct a local realistic model for the values of a correlation function, given in a continuous-infinite settings Bell experiment (infinite-setting model), even though there exist two-setting models for all directions in space. Hence, the two-setting model does not have the property that the infinite-setting model has. Here, we show that an explicit two-setting model cannot construct a local realistic model for the values of a correlation function, given in an M-setting Bell experiment (M-setting model), even though there exist two-setting models for the M measurement directions chosen in the given M-setting experiment. Hence, the two-setting model does not have the property that the M-setting model has. (general)
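    The gap between local realistic models and quantum predictions that underlies these setting-dependence arguments can be made concrete with the standard two-setting CHSH example (an illustration, not the M-setting construction of the paper): any local realistic two-setting model obeys |S| ≤ 2, while the quantum singlet correlation reaches 2√2 at suitably chosen settings.

```python
import math

def singlet_correlation(a, b):
    """Quantum correlation E(a, b) = -cos(a - b) for the spin-singlet state."""
    return -math.cos(a - b)

def chsh(E, a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Standard optimal settings: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4.
S = chsh(singlet_correlation, 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))  # 2.828... = 2*sqrt(2) > 2: violates the local realistic bound
```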

  7. I-Love relations for incompressible stars and realistic stars

    Science.gov (United States)

    Chan, T. K.; Chan, AtMa P. O.; Leung, P. T.

    2015-02-01

    In spite of the diversity in the equations of state of nuclear matter, the recently discovered I-Love-Q relations [Yagi and Yunes, Science 341, 365 (2013), 10.1126/science.1236462], which relate the moment of inertia, tidal Love number (deformability), and the spin-induced quadrupole moment of compact stars, hold for various kinds of realistic neutron stars and quark stars. While the physical origin of such universality is still a current issue, the observation that the I-Love-Q relations of incompressible stars can well approximate those of realistic compact stars hints at a new direction to approach the problem. In this paper, by establishing recursive post-Minkowskian expansion for the moment of inertia and the tidal deformability of incompressible stars, we analytically derive the I-Love relation for incompressible stars and show that the so-obtained formula can be used to accurately predict the behavior of realistic compact stars from the Newtonian limit to the maximum mass limit.

  8. Realistic terrain visualization based on 3D virtual world technology

    Science.gov (United States)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., networking, graphics processing, and virtual worlds, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments that support geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographic visualization, computer simulation, and virtual geographic environment applications. The paper introduces the concepts and technologies of virtual worlds and virtual geographic environments, and explores the integration of realistic terrain with other geographic objects and phenomena of the natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation for constructing a mirror world or a sandbox model of the earth's landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on this foundation work of realistic terrain visualization in virtual environments.

  9. Effects of realistic force feedback in a robotic assisted minimally invasive surgery system.

    Science.gov (United States)

    Moradi Dalvand, Mohsen; Shirinzadeh, Bijan; Nahavandi, Saeid; Smith, Julian

    2014-06-01

    Robotic assisted minimally invasive surgery systems not only have the advantages of traditional laparoscopic procedures but also restore the surgeon's hand-eye coordination and improve the surgeon's precision by filtering hand tremors. Unfortunately, these benefits have come at the expense of the surgeon's ability to feel. Several research efforts have already attempted to restore this feature and study the effects of force feedback in robotic systems. The proposed methods and studies have some shortcomings. The main focus of this research is to overcome some of these limitations and to study the effects of force feedback in palpation in a more realistic fashion. A parallel robot assisted minimally invasive surgery system (PRAMiSS) with force feedback capabilities was employed to study the effects of realistic force feedback in palpation of artificial tissue samples. PRAMiSS is capable of actually measuring the tip/tissue interaction forces directly from the surgery site. Four sets of experiments using only vision feedback, only force feedback, simultaneous force and vision feedback and direct manipulation were conducted to evaluate the role of sensory feedback from sideways tip/tissue interaction forces with a scale factor of 100% in characterising tissues of varying stiffness. Twenty human subjects were involved in the experiments for at least 1440 trials. Friedman and Wilcoxon signed-rank tests were employed to statistically analyse the experimental results. Providing realistic force feedback in robotic assisted surgery systems improves the quality of tissue characterization procedures. Force feedback capability also increases the certainty of characterizing soft tissues compared with direct palpation using the lateral sides of index fingers. The force feedback capability can improve the quality of palpation and characterization of soft tissues of varying stiffness by restoring sense of touch in robotic assisted minimally invasive surgery operations.

  10. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. The LYNX data were used to construct an efficient response-surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response-surface model was implemented in both analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria can be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
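    The Monte Carlo side of such an uncertainty propagation can be sketched as follows; the quadratic response surface, input distributions and DNBR limit below are invented for illustration and are unrelated to the actual LYNX results:

```python
import random

random.seed(0)

# Hypothetical quadratic response surface for minimum DNBR as a function of
# two normalized input uncertainties x1, x2 (illustrative coefficients only).
def dnbr_surface(x1, x2):
    return 2.0 + 0.15 * x1 - 0.20 * x2 + 0.05 * x1 * x2

LIMIT = 1.3      # assumed DNB acceptance limit (illustrative)
N = 100_000

# Sample the input uncertainties and count how often the surface predicts
# a minimum DNBR below the limit.
failures = sum(
    1
    for _ in range(N)
    if dnbr_surface(random.gauss(0, 1), random.gauss(0, 1)) < LIMIT
)
print(f"P(DNBR < {LIMIT}) ~ {failures / N:.4f}")
```

    The estimated probability plays the role of the pin-level reliability; in the analytical branch the same surface would instead be propagated through first-order error formulas.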

  11. Childhood disability population-based surveillance: Assessment of the Ages and Stages Questionnaire Third Edition and Washington Group on Disability Statistics/UNICEF module on child functioning in a rural setting in South Africa.

    Science.gov (United States)

    Visser, Marieta; Nel, Mariette; Bronkhorst, Caretha; Brown, Lara; Ezendam, Zaskia; Mackenzie, Kira; van der Merwe, Deidré; Venter, Marné

    2016-01-01

    Epidemiological information on childhood disability provides the basis for a country to plan, implement and manage the provision of health, educational and social services for these vulnerable children. There is, however, currently no population-based surveillance instrument that is compatible with the International Classification of Functioning, Disability and Health (ICF), internationally comparable, methodologically sound and comprehensively researched, to identify children under 5 years of age who are living with disability in South Africa and internationally. We conducted a descriptive pilot study to investigate the sensitivity and specificity of translated versions of the Ages and Stages Questionnaire Third Edition (ASQ-III) and the Washington Group on Disability Statistics/UNICEF module on child functioning (WG/UNICEF module) as parent-reported measures. The aim of our study was to identify early childhood disabilities in children aged 24-48 months in a rural area of South Africa, to determine the appropriateness of these instruments for population-based surveillance in similar contexts internationally. This study was conducted in the Xhariep District of the Free State Province in central South Africa, with 50 carers whose children were registered on the South African Social Security Agency (SASSA) database as recipients of a grant for one of the following: Care Dependency, Child Support or Foster Care. The researchers, assisted by community healthcare workers and SASSA staff members, conducted structured interviews using forward-backward translated versions of the ASQ-III and the WG/UNICEF module. Both measurement instruments had a clinically meaningful sensitivity of 60.0%, high specificity of 95.6% for the ASQ-III and 84.4% for the WG/UNICEF module, and the two instruments agreed moderately (kappa = 0.6). Since the WG/UNICEF module is quicker to administer, easier to understand and based on the ICF, it can be considered an appropriate parent-reported measure…
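    The reported figures derive from a standard 2x2 screening table. A sketch of the computation (the counts below are hypothetical, chosen only to roughly reproduce the ASQ-III values of 60.0% sensitivity, 95.6% specificity and kappa of about 0.6):

```python
def screening_stats(tp, fn, fp, tn):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table."""
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n
    # Expected agreement under chance, from the marginal totals.
    expected = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

# Hypothetical counts for a 50-child sample with 5 cases: 3 true positives,
# 2 false negatives, 2 false positives, 43 true negatives.
sens, spec, kappa = screening_stats(tp=3, fn=2, fp=2, tn=43)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} kappa={kappa:.2f}")
```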

  12. Linking macroscopic with microscopic neuroanatomy using synthetic neuronal populations.

    Science.gov (United States)

    Schneider, Calvin J; Cuntz, Hermann; Soltesz, Ivan

    2014-10-01

    Dendritic morphology has been shown to have a dramatic impact on neuronal function. However, population features such as the inherent variability in dendritic morphology between cells belonging to the same neuronal type are often overlooked when studying computation in neural networks. While detailed models for morphology and electrophysiology exist for many types of single neurons, the role of detailed single cell morphology in the population has not been studied quantitatively or computationally. Here we use the structural context of the neural tissue in which dendritic trees exist to drive their generation in silico. We synthesize the entire population of dentate gyrus granule cells, the most numerous cell type in the hippocampus, by growing their dendritic trees within their characteristic dendritic fields bounded by the realistic structural context of (1) the granule cell layer that contains all somata and (2) the molecular layer that contains the dendritic forest. This process enables branching statistics to be linked to larger scale neuroanatomical features. We find large differences in dendritic total length and individual path length measures as a function of location in the dentate gyrus and of somatic depth in the granule cell layer. We also predict the number of unique granule cell dendrites invading a given volume in the molecular layer. This work enables the complete population-level study of morphological properties and provides a framework to develop complex and realistic neural network models.

  13. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
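    A basic building block of the spatial statistics reviewed here is a measure of correlation among neighboring observations; Moran's I is one standard such statistic (shown purely as an illustration with a toy weights matrix, not necessarily the authors' exact method):

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I statistic for spatial autocorrelation.

    values:  1-D array of observations (e.g., nematode counts per grid cell)
    weights: n x n spatial weights matrix (weights[i, j] > 0 if i, j are neighbors)
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()
    num = n * (w * np.outer(z, z)).sum()
    den = w.sum() * (z ** 2).sum()
    return num / den

# Toy 4-cell transect with rook adjacency; high counts cluster together.
counts = [10, 12, 2, 1]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(round(morans_i(counts, w), 3))  # positive: neighbors are similar
```

    A significantly positive value of I indicates the clustered damage patterns that make site-specific (rather than whole-field) nematicide application worthwhile.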

  14. Percolation dans des reseaux realistes de nanostructures de carbone

    Science.gov (United States)

    Simoneau, Louis-Philippe

    … versatility in the choice of network components that can be simulated. The tools we have developed, grouped together in the RPH-HPN software (Reseaux percolatifs hybrides - Hybrid Percolation Networks), construct random networks, detect contacts between the tubes, translate the systems into equivalent electrical circuits and calculate global properties. Infinitely many networks can share the same basic characteristics (size, diameter, etc.), and therefore the properties of a particular random network are not necessarily representative of the average properties of all such networks. To obtain those general properties, we simulate a large number of random networks with the same basic characteristics and average the resulting quantities. The network constituent elements can be spheres, rods or snakes. The use of such geometries for the network elements makes contact detection simple and quick, and more faithfully reproduces the form of carbon nanotubes. We closely control the geometrical and electrical properties of these elements through stochastic distributions of our choice. We can choose the length, diameter, orientation, chirality, tortuosity and impenetrable nature of the elements in order to properly reproduce the characteristics of real networks. We have considered statistical distribution functions that are rectangular, Gaussian, and Lorentzian, but any other distribution that can be expressed mathematically can also be envisioned. During the creation of a particular network, we generate the elements one by one, sampling each of their properties from a preselected distribution. Efficient algorithms used in various fields were adapted to our needs to manage the detection of contacts, clusters and percolation. In addition, we model contacts between rigid nanotubes more realistically using an original network-creation method that does not require a relaxation phase. Finally, we use Kirchhoff's laws to solve the equivalent electrical circuit conventionally. First, we evaluated…
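    The core loop of a stick-percolation simulation of this kind — random placement, contact detection, and cluster bookkeeping — can be sketched in 2-D as follows (a minimal illustration using a union-find cluster structure; the RPH-HPN software additionally handles 3-D geometries, tortuosity distributions and the electrical solution):

```python
import random
import math

def segments_intersect(p1, p2, p3, p4):
    """True if segments p1-p2 and p3-p4 properly cross (orientation test)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def percolates(n_sticks, length, size=1.0, seed=1):
    """Drop n random sticks in a size x size box; report whether a connected
    cluster bridges the left and right edges (union-find clustering)."""
    rng = random.Random(seed)
    sticks = []
    for _ in range(n_sticks):
        x, y = rng.uniform(0, size), rng.uniform(0, size)
        theta = rng.uniform(0, math.pi)
        dx, dy = 0.5 * length * math.cos(theta), 0.5 * length * math.sin(theta)
        sticks.append(((x - dx, y - dy), (x + dx, y + dy)))

    # Union-find with two virtual nodes: left edge (n_sticks), right (n_sticks+1).
    parent = list(range(n_sticks + 2))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)

    for i, (a, b) in enumerate(sticks):
        if min(a[0], b[0]) <= 0:
            union(i, n_sticks)          # touches left edge
        if max(a[0], b[0]) >= size:
            union(i, n_sticks + 1)      # touches right edge
        for j in range(i):
            c, d = sticks[j]
            if segments_intersect(a, b, c, d):
                union(i, j)

    return find(n_sticks) == find(n_sticks + 1)

print(percolates(400, 0.15))
```

    Averaging the spanning indicator over many seeds at fixed stick density estimates the percolation probability; sweeping the density then locates the percolation threshold.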

  15. Statistical and quantitative research

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Environmental impacts may escape detection if the statistical tests used to analyze data from field studies are inadequate or the field design is not appropriate. To alleviate this problem, PNL scientists are doing theoretical research which will provide the basis for new sampling schemes or better methods to analyze and present data. Such efforts have resulted in recommendations about the optimal size of study plots, sampling intensity, field replication, and program duration. Costs associated with any of these factors can be substantial if, for example, attention is not paid to the adequacy of a sampling scheme. In the study of the dynamics of large-mammal populations, the findings are sometimes surprising. For example, the survival of a grizzly bear population may hinge on the loss of one or two adult females per year.

  16. Toxicity effects of an environmentally realistic herbicide mixture on the seagrass Zostera noltei.

    Science.gov (United States)

    Diepens, Noël J; Buffan-Dubau, Evelyne; Budzinski, Hélène; Kallerhoff, Jean; Merlina, Georges; Silvestre, Jérome; Auby, Isabelle; Nathalie Tapie; Elger, Arnaud

    2017-03-01

    Worldwide seagrass declines have been observed due to multiple stressors. One of them is the mixture of pesticides used in intensive agriculture and in boat antifouling paints in coastal areas. The effects of mixture toxicity are complex and poorly understood. However, consideration of mixture toxicity is more realistic and ecologically relevant for environmental risk assessment (ERA). The first aim of this study was to determine the short-term effects of a realistic herbicide mixture exposure on physiological endpoints of Zostera noltei. The second aim was to assess the environmental risks of this mixture by comparing the results to previously published data. Z. noltei was exposed to a mixture of four herbicides: atrazine, diuron, irgarol and S-metolachlor, simulating the composition of a typical cocktail of contaminants in Arcachon Bay (Atlantic coast, France). Three stress biomarkers were measured after 6, 24 and 96 h: enzymatic activity of glutathione reductase, effective quantum yield (EQY) and photosynthetic pigment composition. Short-term exposure to realistic herbicide mixtures affected EQY, with almost 100% inhibition at the two highest concentrations, and photosynthetic pigments. An effect on pigment composition was detected after 6 h, with a no-observed-effect concentration (NOEC) of 1 μg/L total mixture concentration. The lowest EQY 10% effect concentration (EC10, 2 μg/L) and the pigment composition NOEC, with an assessment factor of 10, were above the maximal field concentrations along the French Atlantic coast, suggesting no potential short-term adverse effects of this particular mixture on Z. noltei. However, chronic effects on photosynthesis may lead to reduced energy reserves, which could in turn lead to effects at the whole-plant and population level. Understanding the consequences of chemical mixtures could help to improve ERA and enhance management strategies to prevent further declines of seagrass meadows worldwide.

  17. The effect of problem posing and problem solving with realistic mathematics education approach to the conceptual understanding and adaptive reasoning

    Science.gov (United States)

    Mahendra, Rengga; Slamet, Isnandar; Budiyono

    2017-12-01

    One of the difficulties students face in learning mathematics is the subject of geometry, which requires them to understand abstract things. The aim of this research is to determine the effect of the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education approach on conceptual understanding and students' adaptive reasoning in learning mathematics. This research uses a quasi-experimental design. The population of this research is all seventh-grade students of Junior High School 1 Jaten, Indonesia. The sample was taken using a stratified cluster random sampling technique. The research hypotheses were tested using t-tests. The results of this study indicate that the Problem Posing learning model with a Realistic Mathematics Education approach can significantly improve students' conceptual understanding in mathematics learning. In addition, the results also show that the Problem Solving learning model with a Realistic Mathematics Education approach can significantly improve students' adaptive reasoning in learning mathematics. Therefore, the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education approach are appropriate for mathematics learning, especially on the subject of geometry, so as to improve conceptual understanding and students' adaptive reasoning. Furthermore, the impact can improve student achievement.

  18. Gauge coupling unification in realistic free-fermionic string models

    International Nuclear Information System (INIS)

    Dienes, K.R.; Faraggi, A.E.

    1995-01-01

    We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)

  19. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
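    A common starting point for such speckle simulation is the multiplicative model of fully developed speckle, in which a deterministic echogenicity map is modulated by a Rayleigh-distributed amplitude arising from complex Gaussian scattering (a minimal sketch; the phantom's tissue-specific speckle models are more elaborate):

```python
import numpy as np

rng = np.random.default_rng(42)

def add_speckle(echo_map, sigma=1.0):
    """Multiplicative fully developed speckle: modulate the echogenicity map
    by a Rayleigh-distributed amplitude (magnitude of complex Gaussian noise)."""
    re = rng.normal(0.0, sigma, echo_map.shape)
    im = rng.normal(0.0, sigma, echo_map.shape)
    rayleigh = np.hypot(re, im)
    return echo_map * rayleigh

tissue = np.full((64, 64), 0.5)   # homogeneous "tissue" echogenicity
img = add_speckle(tissue)

# Sanity check: homogeneous fully developed speckle has a point SNR
# (mean/std) of about 1.91, independent of sigma.
print(img.mean() / img.std())
```

    Checking that the simulated texture reproduces such first-order statistics is one of the quantitative realism measures the paper refers to.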

  20. Performance Analysis of Relays in LTE for a Realistic Suburban Deployment Scenario

    DEFF Research Database (Denmark)

    Coletti, Claudio; Mogensen, Preben; Irmer, Ralf

    2011-01-01

    Relays are likely to play an important role in the deployment of Beyond 3G networks, such as LTE-Advanced, thanks to the possibility of effectively extending Macro network coverage and fulfilling the expected high data-rate requirements. Up until now, the relay technology potential and its cost-effectiveness have been widely investigated in the literature, considering mainly statistical deployment scenarios, like regular networks with uniform traffic distribution. This paper is envisaged to illustrate the performances of different relay technologies (In-band/Out-band) in a realistic suburban network scenario with real Macro site positions, user density map and spectrum band availability. Based on a proposed heuristic deployment algorithm, results show that deploying In-band relays can significantly reduce the user outage if high backhaul link quality is ensured, whereas Out-band relaying and the usage…

  1. Ultra-Reliable Communications in Failure-Prone Realistic Networks

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Lauridsen, Mads; Alvarez, Beatriz Soret

    2016-01-01

    We investigate the potential of different diversity and interference management techniques to achieve the required downlink SINR outage probability for ultra-reliable communications. The evaluation is performed in a realistic network deployment based on site-specific data from a European capital. Micro- and macroscopic diversity techniques are proved to be important enablers of ultra-reliable communications. Particularly, it is shown how a 4x4 MIMO scheme with three orders of macroscopic diversity can achieve the required SINR outage performance. Smaller gains are obtained from interference…

  2. Capturing and reproducing realistic acoustic scenes for hearing research

    DEFF Research Database (Denmark)

    Marschall, Marton; Buchholz, Jörg

    Accurate spatial audio recordings are important for a range of applications, from the creation of realistic virtual sound environments to the evaluation of communication devices, such as hearing instruments and mobile phones. Spherical microphone arrays are particularly well-suited for capturing… The properties of MOA microphone layouts and processing were investigated further by considering several order combinations. It was shown that the performance for horizontal vs. elevated sources can be adjusted by varying the order combination, but that a benefit of the higher horizontal orders can only be seen…

  3. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

    A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node can act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs, with the aim of producing more realistic simulation models and, thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and that protocol selection is subject to the scenario to which it is applied.

  4. Dynamic aperture in damping rings with realistic wigglers

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yunhai; /SLAC

    2005-05-04

    The International Linear Collider based on superconducting RF cavities requires the damping rings to have extremely small equilibrium emittance, huge circumference, fast damping time, and large acceptance. To achieve all of these requirements is a very challenging task. In this paper, we will present a systematic approach to designing the damping rings using simple cells and non-interlaced sextupoles. The designs of the damping rings with various circumferences and shapes, including dogbone, are presented. To model realistic wigglers, we have developed a new hybrid symplectic integrator for fast and accurate evaluation of the dynamic aperture of the lattices.
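
    The record above credits a hybrid symplectic integrator for fast, accurate dynamic-aperture evaluation. The defining property of symplectic integration, bounded long-term energy error, can be illustrated with the generic second-order leapfrog (kick-drift) scheme for a Hamiltonian H = p²/2 + V(q); this sketch illustrates the general technique only, not the wiggler integrator developed in the paper.

```python
def leapfrog(q, p, force, dt, steps):
    """Second-order symplectic (leapfrog) integrator for H = p^2/2 + V(q),
    where force(q) = -dV/dq. Symplectic maps preserve phase-space area, so
    the energy error stays bounded over long tracking runs instead of
    drifting -- the property that matters for dynamic-aperture studies."""
    p += 0.5 * dt * force(q)      # initial half kick
    for _ in range(steps - 1):
        q += dt * p               # drift
        p += dt * force(q)        # full kick
    q += dt * p                   # final drift
    p += 0.5 * dt * force(q)      # final half kick
    return q, p
```

    For a harmonic oscillator (force(q) = -q) started at (q, p) = (1, 0), the energy (p² + q²)/2 remains within O(dt²) of its initial value 0.5 over arbitrarily many steps, whereas a non-symplectic scheme of the same order would show secular energy drift.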

  5. Dynamic Enhanced Inter-Cell Interference Coordination for Realistic Networks

    DEFF Research Database (Denmark)

    Pedersen, Klaus I.; Alvarez, Beatriz Soret; Barcos, Sonia

    2016-01-01

    Enhanced Inter-Cell Interference Coordination (eICIC) is a key ingredient to boost the performance of co-channel Heterogeneous Networks (HetNets). eICIC encompasses two main techniques: Almost Blank Subframes (ABS), during which the macro cell remains silent to reduce the interference, and biased...... and an opportunistic approach exploiting the varying cell conditions. Moreover, an autonomous fast distributed muting algorithm is presented, which is simple, robust, and well suited for irregular network deployments. Performance results for realistic network deployments show that the traditional semi-static e...

  6. Realistic shell-model calculations for Sn isotopes

    International Nuclear Information System (INIS)

    Covello, A.; Andreozzi, F.; Coraggio, L.; Gargano, A.; Porrino, A.

    1997-01-01

    We report on a shell-model study of the Sn isotopes in which a realistic effective interaction derived from the Paris free nucleon-nucleon potential is employed. The calculations are performed within the framework of the seniority scheme by making use of the chain-calculation method. This provides practically exact solutions while cutting down the amount of computational work required by a standard seniority-truncated calculation. The behavior of the energy of several low-lying states in the isotopes with A ranging from 122 to 130 is presented and compared with the experimental one. (orig.)

  7. Turbulence studies in tokamak boundary plasmas with realistic divertor geometry

    International Nuclear Information System (INIS)

    Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.

    2001-01-01

    Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the ExB drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)

  8. Turbulence studies in tokamak boundary plasmas with realistic divertor geometry

    International Nuclear Information System (INIS)

    Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.D.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.

    1999-01-01

    Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the E x B drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)

  9. On Small Antenna Measurements in a Realistic MIMO Scenario

    DEFF Research Database (Denmark)

    Yanakiev, Boyan; Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    2010-01-01

    . The problem using coaxial cable is explained and a solution suitable for long distance channel sounding is presented. A large scale measurement campaign is then described. Special attention is paid to bring the measurement setup as close as possible to a realistic LTE network of the future, with attention......This paper deals with the challenges related to evaluating the performance of multiple, small terminal antennas within a natural MIMO environment. The focus is on the antenna measurement accuracy. First a method is presented for measuring small phone mock-ups, with the use of optical fibers...

  10. A continuous family of realistic SUSY SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bajc, Borut, E-mail: borut.bajc@ijs.si [J. Stefan Institute, Jamova cesta 39, 1000, Ljubljana (Slovenia)

    2016-06-21

    It is shown that the minimal renormalizable supersymmetric SU(5) is still realistic provided the supersymmetric scale is at least a few tens of TeV or large R-parity violating terms are considered. In the first case the vacuum is metastable, and different consistency constraints can give a bounded allowed region in the tan β − m{sub susy} plane. In the second case the mass-eigenstate electron (down quark) is a linear combination of the original electron (down quark) and Higgsino (heavy colour triplet), and the mass ratio of bino and wino is determined. Both limits lead to light gravitino dark matter.

  11. Analytical and statistical consideration on the use of the ISAG-ICAR-SNP bovine panel for parentage control, using the Illumina BeadChip technology: example on the German Holstein population.

    Science.gov (United States)

    Schütz, Ekkehard; Brenig, Bertram

    2015-02-05

    Parentage control is moving from short tandem repeats- to single nucleotide polymorphism (SNP) systems. For SNP-based parentage control in cattle, the ISAG-ICAR Committee proposes a set of 100/200 SNPs but quality criteria are lacking. Regarding German Holstein-Friesian cattle with only a limited number of evaluated individuals, the exclusion probability is not well-defined. We propose a statistical procedure for excluding single SNPs from parentage control, based on case-by-case evaluation of the GenCall score, to minimize parentage exclusion, based on miscalled genotypes. Exclusion power of the ISAG-ICAR SNPs used for the German Holstein-Friesian population was adjusted based on the results of more than 25,000 individuals. Experimental data were derived from routine genomic selection analyses of the German Holstein-Friesian population using the Illumina BovineSNP50 v2 BeadChip (20,000 individuals) or the EuroG10K variant (7000 individuals). Averages and standard deviations of GenCall scores for the 200 SNPs of the ISAG-ICAR recommended panel were calculated and used to calculate the downward Z-value. Based on minor allelic frequencies in the Holstein-Friesian population, one minus exclusion probability was equal to 1.4×10⁻¹⁰ and 7.2×10⁻²⁶, with one and two parents, respectively. Two monomorphic SNPs from the 100-SNP ISAG-ICAR core-panel did not contribute. Simulation of 10,000 parentage control combinations, using the GenCall score data from both BeadChips, showed that with a Z-value greater than 3.66 only about 2.5% parentages were excluded, based on the ISAG-ICAR recommendations (core-panel: ≥ 90 SNPs for one, ≥ 85 SNPs for two parents). When applied to real data from 1750 single parentage assessments, the optimal threshold was determined to be Z = 5.0, with only 34 censored cases and reduction to four (0.2%) doubtful parentages. About 70 parentage exclusions due to weak genotype calls were avoided, whereas true exclusions (n = 34) were
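
    The quoted "one minus exclusion probability" values arise by multiplying per-locus non-exclusion probabilities over the panel, assuming independently segregating SNPs; a monomorphic SNP has zero exclusion power and contributes a factor of one, which is why the two monomorphic core-panel SNPs do not contribute. A minimal sketch (the per-SNP values below are illustrative, not the panel's actual exclusion powers):

```python
from math import prod

def combined_nonexclusion(per_snp_pe):
    """Probability that a random, unrelated candidate parent is NOT excluded
    by any SNP in the panel, assuming loci segregate independently:
    the product over loci of (1 - per-locus exclusion probability)."""
    return prod(1.0 - pe for pe in per_snp_pe)

# Illustrative: 100 SNPs, each with 20% per-locus exclusion power, give a
# combined non-exclusion probability of 0.8**100, on the order of 1e-10 --
# the same order of magnitude as the 1.4e-10 reported for the one-parent case.
combined = combined_nonexclusion([0.2] * 100)
```

    The multiplicative structure also explains why dropping a handful of weakly called SNPs (as the Z-score censoring procedure does) barely affects the overall exclusion power of a 100- or 200-SNP panel.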

  12. Amostras complexas em inquéritos populacionais: planejamento e implicações na análise estatística dos dados Complex Sampling Design in Population Surveys: Planning and effects on statistical data analysis

    Directory of Open Access Journals (Sweden)

    Célia Landmann Szwarcwald

    2008-05-01

    health status of the population and satisfaction with healthcare from the user's point of view. Most national health surveys do not use simple random sampling, either due to budget restrictions or because of time constraints associated with data collection. In general, a combination of several probabilistic sampling methods is used to select a representative sample of the population, which is called a complex sampling design. Among the several sampling techniques, the most frequently used are simple random sampling, stratified sampling and cluster sampling. As a result of this process, the next concern is the statistical analysis of data from complex samples. This paper deals with issues related to the analysis of data obtained from surveys using complex sampling designs. It discusses the problems that arise when the statistical analysis does not incorporate the sampling design. When the design is neglected, traditional statistical analysis, based on the assumption of simple random sampling, may produce improper results not only for the mean estimates but also for the standard errors, thus compromising results, hypothesis testing, and survey conclusions. The World Health Survey (WHS) carried out in Brazil in 2003 is used to exemplify complex sampling methods.
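
    The point about neglected designs biasing mean estimates can be made concrete with an inverse-probability-weighted mean, the basic design-based estimator. The data below are hypothetical, not from the WHS:

```python
def weighted_mean(values, weights):
    """Design-based estimate of a population mean: each observation is
    weighted by the inverse of its selection probability."""
    return sum(w * y for y, w in zip(values, weights)) / sum(weights)

# Hypothetical two-stratum sample: the first three units come from an
# over-sampled stratum (weight 1), the last two from an under-sampled
# stratum (weight 5, i.e. each sampled unit represents five population units).
y = [10, 12, 11, 30, 32]
w = [1.0, 1.0, 1.0, 5.0, 5.0]
naive = sum(y) / len(y)           # 19.0 -- treats the sample as if it were SRS
weighted = weighted_mean(y, w)    # 343/13, about 26.4 -- respects the design
```

    The gap between the two estimates is exactly the kind of error the paper warns about; standard errors are similarly distorted when clustering and stratification are ignored.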

  13. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students, and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most-used genetics textbooks, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also seen in the genetics syllabi reviewed for this study. Nonetheless

  14. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  15. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult...... to keep abreast with the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  16. Statistical selection : a way of thinking !

    NARCIS (Netherlands)

    Laan, van der P.; Aarts, E.H.L.; Eikelder, ten H.M.M.; Hemerik, C.; Rem, M.

    1995-01-01

    Statistical selection of the best population is discussed in general terms and the principles of statistical selection procedures are presented. Advantages and disadvantages of Subset Selection, one of the main approaches, are indicated. The selection of an almost best population is considered and

  17. Statistical selection : a way of thinking!

    NARCIS (Netherlands)

    Laan, van der P.

    1995-01-01

    Statistical selection of the best population is discussed in general terms and the principles of statistical selection procedures are presented. Advantages and disadvantages of Subset Selection, one of the main approaches, are indicated. The selection of an almost best population is considered and

  18. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D; Kim, K. D. [and others

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphic User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others.

  19. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphic User Interface) feature was developed to enhance the user's convenience. To develop the coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and others.

  20. Music therapy for palliative care: A realist review.

    Science.gov (United States)

    McConnell, Tracey; Porter, Sam

    2017-08-01

    Music therapy has experienced a rising demand as an adjunct therapy for symptom management among palliative care patients. We conducted a realist review of the literature to develop a greater understanding of how music therapy might benefit palliative care patients and the contextual mechanisms that promote or inhibit its successful implementation. We searched electronic databases (CINAHL, Embase, Medline, and PsychINFO) for literature containing information on music therapy for palliative care. In keeping with the realist approach, we examined all relevant literature to develop theories that could explain how music therapy works. A total of 51 articles were included in the review. Music therapy was found to have a therapeutic effect on the physical, psychological, emotional, and spiritual suffering of palliative care patients. We also identified program mechanisms that help explain music therapy's therapeutic effects, along with facilitating contexts for implementation. Music therapy may be an effective nonpharmacological approach to managing distressing symptoms in palliative care patients. The findings also suggest that group music therapy may be a cost-efficient and effective way to support staff caring for palliative care patients. We encourage others to continue developing the evidence base in order to expand our understanding of how music therapy works, with the aim of informing and improving the provision of music therapy for palliative care patients.

  1. Report of the workshop on realistic SSC lattices

    International Nuclear Information System (INIS)

    1985-10-01

    A workshop was held at the SSC Central Design Group from May 29 to June 4, 1985, on topics relating to the lattice of the SSC. The workshop marked a shift of emphasis from the investigation of simplified test lattices to the development of a realistic lattice suitable for the conceptual design report. The first day of the workshop was taken up by reviews of accelerator system requirements, of the reference design solutions for these requirements, of lattice work following the reference design, and of plans for the workshop. The work was divided among four working groups. The first, chaired by David Douglas, concerned the arcs of regular cells. The second group, which studied the utility insertions, was chaired by Beat Leemann. The third group, under David E. Johnson, concerned itself with the experimental insertions, dispersion suppressors, and phase trombones. The fourth group, responsible for global lattice considerations and the design of a new realistic lattice example, was led by Ernest Courant. The papers resulting from this workshop are roughly divided into three sets: those relating to specific lattice components, to complete lattices, and to other topics. Among the salient accomplishments of the workshop were additions to and optimization of lattice components, especially those relating to lattices using 1-in-1 magnets, either horizontally or vertically separated, and the design of complete lattice examples. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database

  2. Evaluating impact of clinical guidelines using a realist evaluation framework.

    Science.gov (United States)

    Reddy, Sandeep; Wakerman, John; Westhorp, Gill; Herring, Sally

    2015-12-01

    The Remote Primary Health Care Manuals (RPHCM) project team manages the development and publication of clinical protocols and procedures for primary care clinicians practicing in remote Australia. The Central Australian Rural Practitioners Association Standard Treatment Manual, the flagship manual of the RPHCM suite, has been evaluated for accessibility and acceptability in remote clinics three times in its 20-year history. These evaluations did not consider a theory-based framework or a programme theory, resulting in some limitations in the evaluation findings. The RPHCM aims to enable evidence-based practice in remote clinics and is anecdotally reported to do so, so testing this empirically for the full suite is vital for both stakeholders and future editions of the RPHCM. The project team utilized a realist evaluation framework to assess how, why and for what the RPHCM were being used by remote practitioners. A theory regarding the circumstances in which the manuals have and have not enabled evidence-based practice in the remote clinical context was tested. The project assessed this theory for all the manuals in the RPHCM suite, across government and Aboriginal community-controlled clinics, in three regions of Australia. Implementing a realist evaluation framework to generate robust findings in this context has required innovation in the evaluation design and adaptation by researchers. This article captures the RPHCM team's experience in designing this evaluation. © 2015 John Wiley & Sons, Ltd.

  3. Ultra-realistic 3-D imaging based on colour holography

    International Nuclear Information System (INIS)

    Bjelkhagen, H I

    2013-01-01

    A review of recent progress in colour holography is provided, with new applications. Colour holography recording techniques in silver-halide emulsions are discussed. Both analogue (mainly Denisyuk) colour holograms and digitally-printed colour holograms are described, along with their recent improvements. An alternative to silver-halide materials is the panchromatic photopolymer materials, such as the DuPont and Bayer photopolymers, which are also covered. The light sources used to illuminate the recorded holograms are very important for obtaining ultra-realistic 3-D images. In particular, the new light sources based on RGB LEDs are described. They show improved image quality over today's commonly used halogen lights. Recent work in colour holography by holographers and companies in different countries around the world is included. Recording and displaying ultra-realistic 3-D images with perfect colour rendering is highly dependent on the correct recording technique using the optimal recording laser wavelengths, on the availability of improved panchromatic recording materials, and on new display light sources.

  4. Spectroscopy of light nuclei with realistic NN interaction JISP

    International Nuclear Information System (INIS)

    Shirokov, A. M.; Vary, J. P.; Mazur, A. I.; Weber, T. A.

    2008-01-01

    Recent results of our systematic ab initio studies of the spectroscopy of s- and p-shell nuclei in fully microscopic large-scale (up to a few hundred million basis functions) no-core shell-model calculations are presented. A new high-quality realistic nonlocal NN interaction JISP is used. This interaction is obtained in the J-matrix inverse-scattering approach (JISP stands for the J-matrix inverse-scattering potential) and is of the form of a small-rank matrix in the oscillator basis in each of the NN partial waves, providing a very fast convergence in shell-model studies. The current purely two-body JISP model of the nucleon-nucleon interaction, JISP16, provides not only an excellent description of two-nucleon data (deuteron properties and np scattering) with χ²/datum = 1.05 but also a better description of a wide range of observables (binding energies, spectra, rms radii, quadrupole moments, electromagnetic-transition probabilities, etc.) in all s- and p-shell nuclei than the best modern interaction models combining realistic nucleon-nucleon and three-nucleon interactions.

  5. Development of vortex model with realistic axial velocity distribution

    International Nuclear Information System (INIS)

    Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki

    2014-01-01

    A vortex is considered one of the significant phenomena which may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be well approximated by the well-known Burgers vortex model. However, the Burgers vortex model makes the simple but unrealistic assumption that the axial velocity component is horizontally constant, whereas a real free-surface vortex has an axial velocity distribution with a large radial gradient near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. This model is derived from the steady axisymmetric Navier-Stokes equation, as is the Burgers vortex model, but it considers a realistic radial distribution of the axial velocity, which is defined to be zero at the vortex center and to approach zero asymptotically at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment and shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free-surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be brought close to that of the new vortex model by using the effective axial velocity, calculated as the average value only in the vicinity of the vortex center. (author)
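
    For context, the classical Burgers vortex referred to above has the circumferential velocity profile v_θ(r) = Γ/(2πr)·(1 − exp(−ar²/(2ν))), with circulation Γ, strain rate a and kinematic viscosity ν, while its axial component u_z = 2az is independent of the radius; that radial independence is the "horizontally constant" assumption the new model replaces. A minimal sketch of the standard profile:

```python
from math import exp, pi

def burgers_vtheta(r, gamma, a, nu):
    """Circumferential velocity of the classical Burgers vortex:
    v_theta(r) = gamma/(2*pi*r) * (1 - exp(-a*r**2 / (2*nu))).
    Near the axis this tends to solid-body rotation, gamma*a*r/(4*pi*nu);
    far from the axis, to the potential vortex gamma/(2*pi*r)."""
    if r == 0.0:
        return 0.0  # the profile is regular at the axis
    return gamma / (2.0 * pi * r) * (1.0 - exp(-a * r * r / (2.0 * nu)))
```

    The new model keeps this kind of circumferential structure but replaces the r-independent axial flow with a distribution peaked at the vortex center, which is what matters for predicting the free-surface dip.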

  6. Neural Correlates of Realistic and Unrealistic Auditory Space Perception

    Directory of Open Access Journals (Sweden)

    Akiko Callan

    2011-10-01

    Full Text Available Binaural recordings can simulate externalized auditory space perception over headphones. However, if the orientation of the recorder's head and the orientation of the listener's head are incongruent, the simulated auditory space is not realistic. For example, if a person lying flat on a bed listens to an environmental sound that was recorded by microphones inserted in the ears of a person who was in an upright position, the sound simulates an auditory space rotated 90 degrees to the real-world horizontal axis. Our question is whether brain activation patterns differ between the unrealistic auditory space (i.e., the orientation of the listener's head and the orientation of the recorder's head are incongruent) and the realistic auditory space (i.e., the orientations are congruent). River sounds that were binaurally recorded either in a supine position or in an upright body position served as auditory stimuli. During fMRI experiments, participants listened to the stimuli and pressed one of two buttons indicating the direction of the water flow (horizontal/vertical). Behavioral results indicated that participants could not differentiate between the congruent and the incongruent conditions. However, neuroimaging results showed that the congruent condition activated the planum temporale significantly more than the incongruent condition.

  7. An inexpensive yet realistic model for teaching vasectomy

    Directory of Open Access Journals (Sweden)

    Taylor M. Coe

    2015-04-01

    Full Text Available Purpose Teaching the no-scalpel vasectomy is important, since vasectomy is a safe, simple, and cost-effective method of contraception. This minimally invasive vasectomy technique involves delivering the vas through the skin with specialized tools, and it is associated with fewer complications than the traditional incisional vasectomy (1). One of the most challenging steps is the delivery of the vas through a small puncture in the scrotal skin, and there is a need for a realistic and inexpensive scrotal model for beginning learners to practice this step. Materials and Methods After careful observation using several scrotal models while teaching residents and senior trainees, we developed a simplified scrotal model that uses only three components: a bicycle inner tube, latex tubing, and a Penrose drain. Results This model is remarkably realistic and allows learners to practice a challenging step in the no-scalpel vasectomy. The low cost and simple construction of the model allow wide dissemination of training in this important technique. Conclusions We propose a simple, inexpensive model that will enable learners to master the hand movements involved in delivering the vas through the skin while mitigating the risks of learning on patients.

  8. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers....

  9. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Burgos, Juan; Mahadevan, Aditya; Manavi, Kasra; Murray, Luke; Kodochygov, Anton; Zourntos, Takis; Amato, Nancy M.

    2011-01-01

    be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility

  10. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)

  12. DATA ON YOUTH, 1967, A STATISTICAL DOCUMENT.

    Science.gov (United States)

    SCHEIDER, GEORGE

    The data in this report are statistics on youth throughout the United States and in New York State. Included are data on population, school statistics, employment, family income, juvenile delinquency and youth crime (including New York City figures), and traffic accidents. The statistics are presented in the text and in tables and charts. (NH)

  13. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
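    The behavior-disease feedback loop described above can be sketched with a minimal discrete-time mean-field SIR model in which vaccination uptake rises with current prevalence. This is an illustrative toy, not a model from the review; all parameter values and the linear feedback rule are assumptions.

```python
# Minimal discrete-time mean-field SIR model with behavioural feedback:
# vaccination uptake of susceptibles rises with current prevalence.
# All parameters are illustrative assumptions, not taken from the review.

def sir_with_feedback(beta=0.3, gamma=0.1, v0=0.001, k=2.0,
                      s=0.99, i=0.01, r=0.0, steps=300):
    """Iterate the model; return the trajectory of (s, i, r, v)."""
    traj = []
    for _ in range(steps):
        v = min(v0 + k * i, 1.0)   # prevalence-dependent vaccination rate
        new_inf = beta * s * i     # new infections this step
        new_rec = gamma * i        # recoveries this step
        new_vac = v * s            # susceptibles vaccinated this step
        s, i, r = (s - new_inf - new_vac,
                   i + new_inf - new_rec,
                   r + new_rec + new_vac)
        traj.append((s, i, r, v))
    return traj

traj = sir_with_feedback()
peak_prevalence = max(i for _, i, _, _ in traj)
```

    Raising the feedback gain `k` suppresses the epidemic peak, mirroring the nonlinear coupling between vaccinating behavior and disease propagation that the review emphasizes.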

  14. MQSA National Statistics

    Science.gov (United States)

    FDA web page for the Mammography Quality Standards Act (MQSA) program presenting national statistics, with archived scorecard statistics for 2016, 2017 and 2018.

  15. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  16. Effects of the neonicotinoid pesticide thiamethoxam at field-realistic levels on microcolonies of Bombus terrestris worker bumble bees.

    Science.gov (United States)

    Laycock, Ian; Cotterell, Katie C; O'Shea-Wheller, Thomas A; Cresswell, James E

    2014-02-01

    Neonicotinoid pesticides are currently implicated in the decline of wild bee populations. Bumble bees, Bombus spp., are important wild pollinators that are detrimentally affected by ingestion of neonicotinoid residues. To date, imidacloprid has been the major focus of study into the effects of neonicotinoids on bumble bee health, but wild populations are increasingly exposed to alternative neonicotinoids such as thiamethoxam. To investigate whether environmentally realistic levels of thiamethoxam affect bumble bee performance over a realistic exposure period, we exposed queenless microcolonies of Bombus terrestris L. workers to a wide range of dosages up to 98 μg/kg in dietary syrup for 17 days. Results showed that bumble bee workers survived fewer days when presented with syrup dosed at 98 μg thiamethoxam/kg, while production of brood (eggs and larvae) and consumption of syrup and pollen in microcolonies were significantly reduced by thiamethoxam only at the two highest concentrations (39 and 98 μg/kg). In contrast, we found no detectable effect of thiamethoxam at levels typically found in the nectars of treated crops (between 1 and 11 μg/kg). By comparison with published data, we demonstrate that during an exposure to field-realistic concentrations lasting approximately two weeks, brood production in worker bumble bees is more sensitive to imidacloprid than thiamethoxam. We speculate that differential sensitivity arises because imidacloprid produces a stronger repression of feeding in bumble bees than thiamethoxam, which imposes a greater nutrient limitation on production of brood. © 2013 Published by Elsevier Inc.

  17. Shot Group Statistics for Small Arms Applications

    Science.gov (United States)

    2017-06-01

    If its probability distribution is known with sufficient accuracy, then a dispersion measure can be used to make a sound statistical inference on the unknown population standard deviations of the x and y impact-point positions. The dispersion measures treated in this report ...
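    The inference the report describes can be illustrated with a short sketch: simulated impact points, sample estimates of the population standard deviations of x and y, and two dispersion measures commonly used for shot groups (mean radius and extreme spread). The measures and values here are generic illustrations, not the report's specific treatment.

```python
import random
import statistics

# Estimate population standard deviations of x and y impact points
# from a simulated shot group, plus two common dispersion measures.
random.seed(1)
sigma_x, sigma_y = 2.0, 3.0          # "true" (population) dispersions
shots = [(random.gauss(0, sigma_x), random.gauss(0, sigma_y))
         for _ in range(500)]

sx = statistics.stdev(x for x, _ in shots)   # sample estimate of sigma_x
sy = statistics.stdev(y for _, y in shots)   # sample estimate of sigma_y

cx = statistics.fmean(x for x, _ in shots)   # group centre
cy = statistics.fmean(y for _, y in shots)
mean_radius = statistics.fmean(
    ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in shots)

# Extreme spread: largest distance between any two shots in the group.
extreme_spread = max(
    ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    for x1, y1 in shots for x2, y2 in shots)
```

    With enough shots, `sx` and `sy` converge to the population values, which is the sense in which a dispersion measure with a known distribution supports inference about the underlying standard deviations.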

  18. Generalized Warburg impedance on realistic self-affine fractals ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... We analyse the problem of impedance for a diffusion-controlled charge transfer process across an irregular interface. These interfacial irregularities are characterized as two classes of random fractals: (i) statistically isotropic self-affine fractals and (ii) statistically corrugated self-affine fractals.

  19. Generalized Warburg impedance on realistic self-affine fractals

    Indian Academy of Sciences (India)

    We analyse the problem of impedance for a diffusion-controlled charge transfer process across an irregular interface. These interfacial irregularities are characterized as two classes of random fractals: (i) statistically isotropic self-affine fractals and (ii) statistically corrugated self-affine fractals. The information about the ...

  20. Imaginary populations

    Directory of Open Access Journals (Sweden)

    A. Martínez–Abraín

    2010-01-01

    A few years ago, Camus & Lima (2002) wrote an essay to stimulate ecologists to think about how we define and use a fundamental concept in ecology: the population. They concluded, concurring with Berryman (2002), that a population is "a group of individuals of the same species that live together in an area of sufficient size to permit normal dispersal and/or migration behaviour and in which population changes are largely the results of birth and death processes". They pointed out that ecologists often forget "to acknowledge that many study units are neither natural nor even units in terms of constituting a population system", and hence claimed that we "require much more accuracy than in past decades in order to be more effective to characterize populations and predict their behaviour". They stated that this is especially necessary "in disciplines such as conservation biology or resource pest management, to avoid reaching wrong conclusions or making inappropriate decisions". As a population ecologist and conservation biologist I totally agree with these authors and, like them, I believe that greater precision and care is needed in the use and definition of ecological terms. The point I wish to stress here is that we ecologists tend to forget that when we use statistical tools to infer results from our sample to a population we work with what statisticians term "imaginary", "hypothetical" or "potential" populations. As Zar (1999) states, if our sample data consist of 40 measurements of growth rate in guinea pigs "the population about which conclusions might be drawn is the growth rates of all the guinea pigs that conceivably might have been administered the same food supplement under identical conditions". Such a population does not really exist, and hence it is considered a hypothetical or imaginary population. Compare that definition with the population concept that would be in our minds when performing such measurements. We would probably
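    The kind of inference the essay refers to, from a sample to a hypothetical population, can be sketched with Zar's guinea-pig example: a confidence interval for the mean growth rate addresses the imaginary population of all animals that might have received the same treatment. The data, the treatment mean, and the tabulated t value below are illustrative assumptions.

```python
import math
import random
import statistics

# 40 simulated growth-rate measurements standing in for Zar's example;
# the numbers are invented purely for illustration.
random.seed(42)
growth = [random.gauss(5.0, 1.2) for _ in range(40)]

n = len(growth)
mean = statistics.fmean(growth)
sem = statistics.stdev(growth) / math.sqrt(n)   # standard error of the mean

t_crit = 2.023   # two-sided 95% t critical value for df = 39 (tabulated)
ci = (mean - t_crit * sem, mean + t_crit * sem)
```

    The interval `ci` is a statement about the hypothetical population mean, not about the 40 animals actually measured, which is exactly the distinction the essay urges ecologists to keep in mind.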

  1. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data is obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is the limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive for users and also creates an interesting immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted.
    This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable

  2. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    Rényi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble, Rényi statistics is equivalent to Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, taking the thermodynamic limit, Rényi statistics is also equivalent to Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that Rényi statistics arrives at the same thermodynamic relations as those stemming from Boltzmann-Gibbs statistics in this limit.
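    The entropy behind this comparison can be stated compactly. This is the standard textbook Rényi form, with notation chosen here (not necessarily that of the paper):

```latex
% Rényi entropy for a probability distribution {p_i}:
S_q = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q}, \qquad q > 0,\; q \neq 1 .
% In the limit q -> 1 it reduces to the Boltzmann--Gibbs (Shannon) form,
% consistent with the equivalence discussed in the abstract:
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .
```

    The \(q \to 1\) limit is the regime in which the abstract's equivalence with Boltzmann-Gibbs thermodynamics is recovered.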

  3. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady and V. R. Padmawar. General Article. Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  4. Radioactive waste management in Brazil: a realistic view

    International Nuclear Information System (INIS)

    Heilbron Filho, Paulo Fernando Lavalle; Perez Guerrero, Jesus Salvador; Xavier, Ana Maria

    2014-01-01

    The objective of this article is to present a realistic view of the main issues related to the management of radioactive waste in Brazil as well as a comprehensive picture of the regulatory waste management status in the country and internationally. Technical aspects that must be considered to ensure a safe construction of near surface disposal facilities for radioactive waste of low and medium levels of radiation are addressed. Different types of deposits, the basic regulatory issues involving the licensing of these facilities, the development of a financial compensation model for the Brazilian Municipalities where deposits are to be placed, the importance of the participation of the scientific community and society in the process of radioactive waste site selection and disposal, guidance for the application of the basic requirements of safety and radiation protection, the general safety aspects involved and the current actions for the disposal of radioactive waste in Brazil are highlighted. (author)

  5. A Local Realistic Reconciliation of the EPR Paradox

    Science.gov (United States)

    Sanctuary, Bryan

    2014-03-01

    The exact violation of Bell's Inequalities is obtained with a local realistic model for spin. The model treats one particle that comprises a quantum ensemble and simulates the EPR data one coincidence at a time as a product state. Such a spin is represented by the operators σx, iσy, σz in its body frame rather than the usual set σX, σY, σZ in the laboratory frame. This model, assumed valid in the absence of a measuring probe, contains both quantum polarizations and coherences. Each carries half the EPR correlation, but only half can be measured using coincidence techniques. The model further predicts the filter angles that maximize the spin correlation in EPR experiments.

  6. How to estimate realistic energy savings in Energy Performance Certificates

    DEFF Research Database (Denmark)

    Wittchen, Kim Bjarne; Altmann, Nagmeh; Berecová, Monika

    Given the fact that most MS use fixed or other kinds of default values as boundary condition input for energy performance calculations, it is not surprising that the calculated energy performance differs from the measured energy consumption. As a consequence, the calculated energy savings due...... stationary calculation tools using monthly average values. The optimum solution for energy performance certificates and calculating realistic energy savings is to have two calculations. One calculation, using default values to calculate the label itself, and one with actual input parameters for calculating...... energy performance before and after implementing energy saving measures. Actual values though, may be difficult to identify, so there is a need to make adaptations to reality easy. Even if actual values are available, there are still issues that cause calculated energy savings to differ from the obtained...

  7. Research of shot noise based on realistic nano-MOSFETs

    Directory of Open Access Journals (Sweden)

    Xiaofei Jia

    2017-05-01

    Experimental measurements and simulation results have shown that the dominant source of current noise changes from thermal noise to shot noise with the scaling of MOSFETs, and that shot noise is suppressed by Fermi and Coulomb interactions. In this paper, a shot-noise test system is established, and experimental results confirm that shot noise is suppressed; expressions for shot noise in realistic nano-MOSFETs are derived considering the Fermi effect, the Coulomb interaction, and the coexistence of both, respectively. On this basis, the variation of shot noise with voltage, temperature and source-drain doping is investigated. The results obtained are consistent with experiments, and a theoretical explanation is given. The shot-noise test system is also suitable for conventional nanoscale electronic components, and the shot-noise model is suitable for nanoscale MOSFETs.
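    The baseline the paper builds on can be sketched with the textbook Schottky formula and a Fano-factor suppression, S = 2qIF; the paper's own derived nano-MOSFET expressions are not reproduced here, and the current and Fano values below are illustrative.

```python
# Textbook Schottky shot-noise density with Fano-factor suppression.
# This is the generic starting point, not the paper's MOSFET model.

Q_E = 1.602176634e-19  # elementary charge, in coulombs

def shot_noise_psd(current_a, fano=1.0):
    """One-sided current-noise power spectral density, A^2/Hz: S = 2 q I F."""
    return 2.0 * Q_E * current_a * fano

full = shot_noise_psd(1e-6)             # full shot noise at 1 uA (F = 1)
suppressed = shot_noise_psd(1e-6, 0.3)  # assumed suppression, F = 0.3
```

    A Fano factor F < 1 expresses exactly the kind of suppression by Fermi and Coulomb interactions that the measurements report.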

  8. From Minimal to Realistic Supersymmetric SU(5) Grand Unification

    CERN Document Server

    Altarelli, Guido; Masina, I; Altarelli, Guido; Feruglio, Ferruccio; Masina, Isabella

    2000-01-01

    We construct and discuss a "realistic" example of SUSY SU(5) GUT model, with an additional U(1) flavour symmetry, that is not plagued by the need of large fine tunings, like those associated with doublet-triplet splitting in the minimal model, and that leads to an acceptable phenomenology. This includes coupling unification with a value of alpha_s(m_Z) in much better agreement with the data than in the minimal version, an acceptable hierarchical pattern for fermion masses and mixing angles, also including neutrino masses and mixings, and a proton decay rate compatible with present limits (but the discovery of proton decay should be within reach of the next generation of experiments). In the neutrino sector the preferred solution is one with nearly maximal mixing both for atmospheric and solar neutrinos.

  9. Magnetic exchange at realistic CoO/Ni interfaces

    KAUST Repository

    Grytsiuk, Sergii

    2012-07-30

    We study the CoO/Ni interface by first principles calculations. Because the lattice mismatch is large, a realistic description requires a huge supercell. We investigate two interface configurations: in interface 1 the coupling between the Ni and Co atoms is mediated by O, whereas in interface 2 the Ni and Co atoms are in direct contact. We find that the magnetization (including the orbital moment) in interface 1 has a similar value as in bulk Ni but opposite sign, while in interface 2 it grows by 164%. The obtained magnetic moments can be explained by the local atomic environments. In addition, we find effects of charge transfer between the interface atoms. The Co 3d local density of states of interface 2 exhibits surprisingly small deviations from the corresponding bulk result, although the first coordination sphere is no longer octahedral. © Springer-Verlag 2012.

  10. Breaking with fun, educational and realistic learning games

    DEFF Research Database (Denmark)

    Duus Henriksen, Thomas

    2009-01-01

    This paper addresses the game conceptions and values that learning games inherit from regular gaming, as well as how they affect the use and development of learning games. Its key points concern the issues of thinking learning games as fun, educative and realistic, which is how learning games are commonly conceived as means for staging learning processes, and that thinking learning games so has an inhibiting effect in regard to creating learning processes. The paper draws upon a qualitative study of participants' experiences with 'the EIS Simulation', which is a computer-based learning game for teaching change management and change implementation. The EIS is played in groups, who share the game on a computer, and played by making change decisions in order to implement an IT system in an organisation. In this study, alternative participatory incentives, means for creating learning processes ...

  11. Realistic control considerations for electromagnetically levitated urban transit vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Billing, J R

    1976-04-01

    A discussion is given of realistic control considerations of suspension dynamics and vehicle/guideway interaction for electromagnetically-levitated urban transit vehicles in the context of revenue applications. The emphasis is on safety, reliability, and maintainability rather than performance. An example urban transit system is described, and the following considerations of dynamics and control are examined: stability, magnet force requirements, magnet airgap requirements, vehicle ride, and component failures. It is shown that it is a formidable problem to ensure suspension stability under all conditions; that operation on curves is a critical magnet and control system design case; that operation of the magnets in the non-linear regime is unavoidable and that component failures will be a major problem. However, good vehicle ride is to be expected. It is concluded that magnetic levitation suspension technology requires substantial development effort before it can be considered suitable for revenue operation.

  12. Hydrostatic Equilibria of Rotating Stars with Realistic Equation of State

    Science.gov (United States)

    Yasutake, Nobutoshi; Fujisawa, Kotaro; Okawa, Hirotada; Yamada, Shoichi

    Stars generally rotate, but it is a non-trivial issue to obtain hydrostatic equilibria for rapidly rotating stars theoretically, especially for baroclinic cases, in which the pressure depends not only on the density but also on the temperature and composition. Stellar structures with a realistic equation of state are clearly baroclinic, yet there are few studies of such equilibria. In this study, we propose two methods to obtain hydrostatic equilibria accounting for rotation and baroclinicity, namely the weak-solution method and the strong-solution method. The former is based on the variational principle, which is also applied to the calculation of the inhomogeneous phases, known as pasta structures, in the crusts of neutron stars. We found that this method might break the balance equation locally, and we therefore introduce the strong-solution method. Note that our method is formulated in the mass coordinate and is hence appropriate for stellar evolution calculations.

  13. Using Concrete and Realistic Data in Evaluating Initial Visualization Designs

    DEFF Research Database (Denmark)

    Knudsen, Søren; Pedersen, Jeppe Gerner; Herdal, Thor

    2016-01-01

    We explore means of designing and evaluating initial visualization ideas, with concrete and realistic data in cases where data is not readily available. Our approach is useful in exploring new domains and avenues for visualization, and contrasts other visualization work, which typically operate...... under the assumption that data has already been collected, and is ready to be visualized. We argue that it is sensible to understand data requirements and evaluate the potential value of visualization before devising means of automatic data collection. We base our exploration on three cases selected...... the design case and problem, the manner in which we collected data, and the findings obtained from evaluations. Afterwards, we describe four factors of our data collection approach, and discuss potential outcomes from it....

  14. Simulating realistic implementations of spin field effect transistor

    Science.gov (United States)

    Gao, Yunfei; Lundstrom, Mark S.; Nikonov, Dmitri E.

    2011-04-01

    The spin field effect transistor (spinFET), consisting of two ferromagnetic source/drain contacts and a Si channel, is predicted to have outstanding device and circuit performance. We carry out a rigorous numerical simulation of the spinFET based on the nonequilibrium Green's function formalism self-consistently coupled with a Poisson solver to produce the device I-V characteristics. Good agreement with the recent experiments in terms of spin injection, spin transport, and the magnetoresistance ratio (MR) is obtained. We include factors crucial for realistic devices: tunneling through a dielectric barrier, and spin relaxation at the interface and in the channel. Using these simulations, we suggest ways of optimizing the device. We propose that by choosing the right contact material and inserting tunnel oxide barriers between the source/drain and channel to filter different spins, the MR can be restored to ~2000%, which would be beneficial to the reconfigurable logic circuit application.

  15. Is islet transplantation a realistic approach to curing diabetes?

    Science.gov (United States)

    Jin, Sang-Man; Kim, Kwang-Won

    2017-01-01

    Since the report of type 1 diabetes reversal in seven consecutive patients by the Edmonton protocol in 2000, pancreatic islet transplantation has been reappraised based on accumulated clinical evidence. Although initially expected to therapeutically target long-term insulin independence, islet transplantation is now indicated for more specific clinical benefits. With the long-awaited report of the first phase 3 clinical trial in 2016, allogeneic islet transplantation is now transitioning from an experimental to a proven therapy for type 1 diabetes with problematic hypoglycemia. Islet autotransplantation has already been therapeutically proven in chronic pancreatitis with severe abdominal pain refractory to conventional treatments, and it holds promise for preventing diabetes after partial pancreatectomy due to benign pancreatic tumors. Based on current evidence, this review focuses on islet transplantation as a realistic approach to treating diabetes.

  16. Realistic limitations of detecting planets around young active stars

    Directory of Open Access Journals (Sweden)

    Pinfield D.

    2013-04-01

    Current planet-hunting methods using the radial velocity technique are limited to observing middle-aged main-sequence stars, where the signatures of stellar activity are much weaker than on young stars that have just arrived on the main sequence. In this work we apply our knowledge from surface imaging of these young stars to place realistic limitations on the possibility of detecting orbiting planets. In general we find that the magnitude of the stellar jitter is directly proportional to the stellar v sin i. For G and K dwarfs, we find that it is possible, for models with high stellar activity and low stellar v sin i, to detect a 1 Jupiter-mass planet within 50 epochs of observations, and for M dwarfs it is possible to detect a habitable-zone Earth-like planet in tens of observational epochs.

  17. Realistic electrostatic potentials in a neutron star crust

    International Nuclear Information System (INIS)

    Ebel, Claudio; Mishustin, Igor; Greiner, Walter

    2015-01-01

    We study the electrostatic properties of inhomogeneous nuclear matter which can be formed in the crusts of neutron stars or in supernova explosions. Such matter is represented by Wigner–Seitz cells of different geometries (spherical, cylindrical, cartesian), which contain nuclei, free neutrons and electrons under the conditions of electrical neutrality. Using the Thomas–Fermi approximation, we have solved the Poisson equation for the electrostatic potential and calculated the corresponding electron density distributions in individual cells. The calculations are done for different shapes and sizes of the cells and different average baryon densities. The electron-to-baryon fraction was fixed at 0.3. Using realistic electron distributions leads to a significant reduction in electrostatic energy and electron chemical potential. (paper)
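    The geometry described above can be illustrated with a toy classical sketch: a uniformly charged "nucleus" inside a uniform neutralising electron background in a spherical Wigner-Seitz cell, with the radial field from Gauss's law integrated inward from the neutral cell edge to get the potential. This is not the paper's self-consistent Thomas-Fermi calculation (the electron profile here is fixed, not solved for); all numbers and the unit choice (e = 1, 4πε0 = 1) are illustrative assumptions.

```python
# Toy electrostatics of a spherical Wigner-Seitz cell: uniform nucleus
# of charge Z plus a uniform neutralising electron background.

Z = 30.0        # nuclear charge (illustrative)
R_CELL = 50.0   # Wigner-Seitz cell radius
R_NUC = 5.0     # nuclear radius
N = 2000        # radial grid points
dr = R_CELL / N
radii = [(j + 0.5) * dr for j in range(N)]

def enclosed_charge(r):
    """Net charge inside radius r: uniform nucleus + uniform background."""
    q_nuc = Z * min(r / R_NUC, 1.0) ** 3
    q_ele = -Z * (r / R_CELL) ** 3
    return q_nuc + q_ele

# Gauss's law gives the radial field E(r) = Q_enc(r) / r^2 in these units.
field = [enclosed_charge(r) / r ** 2 for r in radii]

# Integrate E inward from the cell edge, where phi is set to zero.
phi = [0.0] * N
for j in range(N - 2, -1, -1):
    phi[j] = phi[j + 1] + 0.5 * (field[j] + field[j + 1]) * dr
```

    Because the cell is neutral, the field vanishes at the cell boundary, which is the boundary condition the Thomas-Fermi solution of the paper also satisfies.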

  18. Electron distribution in polar heterojunctions within a realistic model

    Energy Technology Data Exchange (ETDEWEB)

    Tien, Nguyen Thanh, E-mail: thanhtienctu@gmail.com [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Thao, Dinh Nhu [Center for Theoretical and Computational Physics, College of Education, Hue University, 34 Le Loi Street, Hue City (Viet Nam); Thao, Pham Thi Bich [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Quang, Doan Nhat [Institute of Physics, Vietnamese Academy of Science and Technology, 10 Dao Tan Street, Hanoi (Viet Nam)

    2015-12-15

    We present a theoretical study of the electron distribution, i.e., two-dimensional electron gas (2DEG) in polar heterojunctions (HJs) within a realistic model. The 2DEG is confined along the growth direction by a triangular quantum well with a finite potential barrier and a bent band figured by all confinement sources. Therein, interface polarization charges take a double role: they induce a confining potential and, furthermore, they can make some change in other confinements, e.g., in the Hartree potential from ionized impurities and 2DEG. Confinement by positive interface polarization charges is necessary for the ground state of 2DEG existing at a high sheet density. The 2DEG bulk density is found to be increased in the barrier, so that the scattering occurring in this layer (from interface polarization charges and alloy disorder) becomes paramount in a polar modulation-doped HJ.

  19. Magnetic exchange at realistic CoO/Ni interfaces

    KAUST Repository

    Grytsyuk, Sergiy; Cossu, Fabrizio; Schwingenschlö gl, Udo

    2012-01-01

    We study the CoO/Ni interface by first principles calculations. Because the lattice mismatch is large, a realistic description requires a huge supercell. We investigate two interface configurations: in interface 1 the coupling between the Ni and Co atoms is mediated by O, whereas in interface 2 the Ni and Co atoms are in direct contact. We find that the magnetization (including the orbital moment) in interface 1 has a similar value as in bulk Ni but opposite sign, while in interface 2 it grows by 164%. The obtained magnetic moments can be explained by the local atomic environments. In addition, we find effects of charge transfer between the interface atoms. The Co 3d local density of states of interface 2 exhibits surprisingly small deviations from the corresponding bulk result, although the first coordination sphere is no longer octahedral. © Springer-Verlag 2012.

  20. Radioactive waste management in Brazil: a realistic view

    Energy Technology Data Exchange (ETDEWEB)

    Heilbron Filho, Paulo Fernando Lavalle; Perez Guerrero, Jesus Salvador, E-mail: paulo@cnen.gov.br, E-mail: jperez@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Xavier, Ana Maria, E-mail: axavier@cnen.gov.br [Comissao Nacional de Energia Nuclear (ESPOA/CNEN-RS), Porto Alegre, RS (Brazil)

    2014-07-01

    The objective of this article is to present a realistic view of the main issues related to the management of radioactive waste in Brazil as well as a comprehensive picture of the regulatory waste management status in the country and internationally. Technical aspects that must be considered to ensure a safe construction of near surface disposal facilities for radioactive waste of low and medium levels of radiation are addressed. Different types of deposits, the basic regulatory issues involving the licensing of these facilities, the development of a financial compensation model for the Brazilian Municipalities where deposits are to be placed, the importance of the participation of the scientific community and society in the process of radioactive waste site selection and disposal, guidance for the application of the basic requirements of safety and radiation protection, the general safety aspects involved and the current actions for the disposal of radioactive waste in Brazil are highlighted. (author)

  1. Modeling and Analysis of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.

    2015-01-01

    An accidental fire inside a spacecraft is an unlikely, but very real, emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).

  2. Statistical properties of the nuclear shell-model Hamiltonian

    International Nuclear Information System (INIS)

    Dias, H.; Hussein, M.S.; Oliveira, N.A. de

    1986-01-01

    The statistical properties of the realistic nuclear shell-model Hamiltonian are investigated in sd-shell nuclei. The probability distribution of the basis-vector amplitude is calculated and compared with the Porter-Thomas distribution. The relevance of the results to the calculation of the giant resonance mixing parameter is pointed out. (Author) [pt
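
    The Porter-Thomas comparison above can be illustrated numerically: in the chaotic (GOE) limit, the scaled squared amplitudes of a normalized eigenvector follow a chi-square distribution with one degree of freedom. The sketch below is a hedged stand-in, using components of random unit vectors rather than an actual sd-shell diagonalization; the function names are illustrative, not from the paper.

```python
import math
import random

def random_amplitudes(n, rng):
    """Components of a random unit vector -- the GOE stand-in for
    eigenvector amplitudes of a chaotic shell-model Hamiltonian."""
    v = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def porter_thomas_cdf(y):
    """CDF of the Porter-Thomas law, i.e. chi-square with one degree
    of freedom: P(Y <= y) = erf(sqrt(y/2))."""
    return math.erf(math.sqrt(y / 2.0))

# Compare the empirical distribution of scaled squared amplitudes
# y = n * c^2 with the Porter-Thomas prediction.
rng = random.Random(42)
n = 1000
y_values = [n * c * c for c in random_amplitudes(n, rng)]
empirical_below_1 = sum(1 for y in y_values if y <= 1.0) / n
predicted_below_1 = porter_thomas_cdf(1.0)
```

    For a statistically mixed state the two numbers agree to within sampling noise; strong deviations from the Porter-Thomas shape would signal non-statistical (collective) components of the wave function.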

  3. STATISTICAL LITERACY: EXCESSIVE REQUIREMENT OR TODAY'S NECESSITY

    Directory of Open Access Journals (Sweden)

    M. Potapova

    2014-04-01

    Full Text Available The purpose of this paper is to investigate the concept of literacy of the population and the evolution of literacy according to present-day requirements. The approaches of scientists to the multifaceted literacy of the population and its necessity are considered. Special attention is paid to the statistical literacy of the population and its necessity in the life of every modern person.

  4. [Comment on] Statistical discrimination

    Science.gov (United States)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  5. A Statistical Framework for Microbial Source Attribution

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P; Allen, J E; Cunningham, C T

    2009-04-28

    This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node, thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them, possess certain compelling general properties in the context of microbial forensics. These include the ability to quantify the significance of a sequence 'match' or ...
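
    The second of the two distributions above, the probability that two randomly sampled nodes of an outbreak network are separated by M transmission events, can be estimated by simulation once a transmission tree is specified. The sketch below uses a toy uniform-random-infector tree as a null model; it is not the report's fitted outbreak data, and all names are illustrative.

```python
import random
from collections import Counter

def simulate_outbreak(n_nodes, rng):
    """Grow a random transmission tree: each new case is infected by a
    uniformly chosen earlier case (a simple null model only)."""
    parent = {0: None}
    for node in range(1, n_nodes):
        parent[node] = rng.randrange(node)
    return parent

def path_length(parent, a, b):
    """Number of transmission events separating nodes a and b
    (tree distance via the lowest common ancestor)."""
    ancestors = {}
    d = 0
    while a is not None:          # record a's chain with depths
        ancestors[a] = d
        a = parent[a]
        d += 1
    d = 0
    while b not in ancestors:     # climb b until the chains meet
        b = parent[b]
        d += 1
    return d + ancestors[b]

def separation_distribution(n_nodes, n_pairs, seed=0):
    """Empirical P(M) for the distance between two random nodes."""
    rng = random.Random(seed)
    parent = simulate_outbreak(n_nodes, rng)
    counts = Counter()
    for _ in range(n_pairs):
        a, b = rng.sample(range(n_nodes), 2)
        counts[path_length(parent, a, b)] += 1
    total = sum(counts.values())
    return {m: c / total for m, c in sorted(counts.items())}
```

    In the report's setting this empirical P(M) would be replaced by one estimated from contact-tracing data; the simulation scaffold stays the same.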

  6. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
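
    The "modelling kit" described above, which builds new distributions by adding and linearly transforming quantile functions, can be sketched in a few lines. The component distributions, weights and parameter values below are chosen purely for illustration and are not taken from the book.

```python
import math
import random

# Two elementary "component" quantile functions in standard form.
def q_exponential(p):            # exponential distribution, rate 1
    return -math.log(1.0 - p)

def q_logistic(p):               # standard logistic distribution
    return math.log(p / (1.0 - p))

def make_model(location, scale, weight):
    """Build a new quantile function by the addition rule:
    Q(p) = location + scale * ((1-weight)*Q1(p) + weight*Q2(p))."""
    def q(p):
        return location + scale * ((1.0 - weight) * q_exponential(p)
                                   + weight * q_logistic(p))
    return q

# Sampling is immediate: feed uniform(0,1) variates through Q.
random.seed(1)
q = make_model(location=10.0, scale=2.0, weight=0.3)
sample = [q(random.random()) for _ in range(10000)]
median = q(0.5)   # the model median, read directly from Q
```

    Because the model is defined on the probability scale, quantities like the median come straight from Q(p) with no inversion step, which is the practical appeal of the approach.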

  7. Convective aggregation in realistic convective-scale simulations

    Science.gov (United States)

    Holloway, Christopher E.

    2017-06-01

    To investigate the real-world relevance of idealized-model convective self-aggregation, five 15-day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibrium. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy shows that control runs have significant positive contributions to organization from radiation and negative contributions from surface fluxes and transport, similar to idealized runs once they become aggregated. Despite identical lateral boundary conditions for all experiments in each case, systematic differences in mean column water vapor (CWV), CWV distribution shape, and CWV autocorrelation length scale are found between the different sensitivity runs, particularly for those without interactive radiation, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations (although the organization of precipitation shows less sensitivity to interactive radiation). The magnitudes and signs of these systematic differences are consistent with a rough equilibrium between (1) equalization due to advection from the lateral boundaries and (2) disaggregation due to the absence of interactive radiation, implying disaggregation rates comparable to those in idealized runs with aggregated initial conditions and noninteractive radiation. This points to a plausible similarity in the way that radiation feedbacks maintain aggregated convection in both idealized simulations and the real world. Plain Language Summary: Understanding the processes that lead to the organization of tropical rainstorms is an important challenge for weather

  8. Effective realistic interactions for low momentum Hilbert spaces

    International Nuclear Information System (INIS)

    Weber, Dennis

    2012-01-01

    Realistic nucleon-nucleon potentials are an essential ingredient of modern microscopic many-body calculations. These potentials can be represented in two different ways: operator representation or matrix element representation. In operator representation the potential is represented by a set of quantum mechanical operators, while in matrix element representation it is defined by its matrix elements in a given basis. Many modern potentials are constructed directly in matrix element representation. While the matrix element representation can be calculated from the operator representation, the determination of the operator representation from the matrix elements is more difficult. Some methods to solve the nuclear many-body problem, such as Fermionic Molecular Dynamics (FMD) or the Green's Function Monte Carlo (GFMC) method, however, explicitly require the operator representation of the potential, as they do not work in a fixed many-body basis. It is therefore desirable to derive an operator representation also for interactions given by matrix elements. In this work a method is presented which allows the derivation of an approximate operator representation starting from the momentum-space partial wave matrix elements of the interaction. For that purpose an ansatz for the operator representation is chosen. The parameters in the ansatz are determined by a fit to the partial wave matrix elements. Since a perfect reproduction of the matrix elements in general cannot be achieved with a finite number of operators, and the quality of the results depends on the choice of the ansatz, the obtained operator representation is tested in nuclear many-body calculations and the results are compared with those from the initial interaction matrix elements. For the calculation of the nucleon-nucleon scattering phase shifts and the deuteron properties, a computer code written within this work is used. For larger nuclei the No Core Shell Model (NCSM) and FMD are applied. The described

  9. Toward developing more realistic groundwater models using big data

    Science.gov (United States)

    Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.

    2017-12-01

    Rich geological data is the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data and the difficulty of processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs, as well as screen lengths of pumping wells, through a natural neighbor interpolation method. These sources of information carry different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish consensus among the various lithological information. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation is thinning eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage

  10. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  11. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  12. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  13. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics Approved by the Cancer.Net Editorial Board, 03/ ... primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  14. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  15. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  16. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Neuroendocrine Tumor: Statistics Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  17. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  18. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  19. Implementing enhanced recovery pathways: a literature review with realist synthesis.

    Science.gov (United States)

    Coxon, Astrid; Nielsen, Karina; Cross, Jane; Fox, Chris

    2017-10-01

    Enhanced Recovery Pathways (ERPs) are an increasingly popular, evidence-based approach to surgery, designed to improve patient outcomes and reduce costs. Despite evidence demonstrating the benefits of these pathways, implementation and adherence have been inconsistent. Using realist synthesis, this review explored the current literature surrounding the implementation of ERPs in the UK. Knowledge consolidation between authors and consulting with field experts helped to guide the search strategy. Relevant medical and social science databases were searched from 2000 to 2016, as well as a general web search. A total of 17 papers were identified, including original research, reviews, case studies and guideline documents. Full texts were analysed, cross-examined, and data extracted and synthesised. Several implementation strategies were identified, including the contexts in which these operated, the subsequent mechanisms of action that were triggered, and the outcome patterns they produced. Context-Mechanism-Outcome (CMO) configurations were generated, tested, and refined. These were grouped to develop two programme theories concerning ERP implementation, one related to the strategy of consulting with staff, the other with appointing a change agent to coordinate and drive the implementation process. These theories highlight instances in which implementation could be improved. Current literature in ERP research is primarily focussed on measuring patient outcomes and cost effectiveness, and as a result, important detail regarding the implementation process is often not reported or described robustly. This review not only provides recommendations for future improvements in ERP implementation, but also highlights specific areas of focus for furthering ERP implementation research.

  20. A Data-Driven Approach to Realistic Shape Morphing

    KAUST Repository

    Gao, Lin; Lai, Yu-Kun; Huang, Qi-Xing; Hu, Shi-Min

    2013-01-01

    Morphing between 3D objects is a fundamental technique in computer graphics. Traditional methods of shape morphing focus on establishing meaningful correspondences and finding smooth interpolation between shapes. Such methods, however, only take geometric information as input and thus cannot in general avoid producing unnatural interpolation, in particular for large-scale deformations. This paper proposes a novel data-driven approach for shape morphing. Given a database with various models belonging to the same category, we treat them as data samples in the plausible deformation space. These models are then clustered to form local shape spaces of plausible deformations. We use a simple metric to reasonably represent the closeness between pairs of models. Given source and target models, the morphing problem is cast as a global optimization problem of finding a minimal distance path within the local shape spaces connecting these models. Under the guidance of intermediate models in the path, an extended as-rigid-as-possible interpolation is used to produce the final morphing. By exploiting the knowledge of plausible models, our approach produces realistic morphing for challenging cases as demonstrated by various examples in the paper. © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  1. Convective aggregation in idealised models and realistic equatorial cases

    Science.gov (United States)

    Holloway, Chris

    2015-04-01

    Idealised explicit convection simulations of the Met Office Unified Model are shown to exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen previously in other models in several recent studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapour (CWV) field. To investigate the relevance of this behaviour to the real world, these idealized simulations are compared with five 15-day cases of real organized convection in the tropics, including multiple simulations of each case testing sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. Despite similar large-scale forcing via lateral boundary conditions, systematic differences in mean CWV, CWV distribution shape, and the length scale of CWV features are found between the different sensitivity runs, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations.

  2. Nonstandard Farey sequences in a realistic diode map

    International Nuclear Information System (INIS)

    Perez, G.; Sinha, S.; Cerdeira, H.

    1991-06-01

    We study a realistic coupled map system, modelling a p - i - n diode structure. As we vary the parameter corresponding to the (scaled) external potential in the model, the dynamics goes through a flip bifurcation and then a Hopf bifurcation, and as the parameter is increased further, we find evidence of a sequence of mode-locked windows embedded in the quasiperiodic motion, with periodic attractors whose winding numbers ρ = p/q are given by a Farey series. The interesting thing about this Farey sequence is that it is generated between two parent attractors with ρ = 2/7 and 2/8, where 2/8 implies two distinct coexisting attractors with ρ = 1/4, and the correct series is obtained only when we use the parent winding number 2/8 and not 1/4. So unlike a regular Farey tree, p and q need not be relatively prime here: ρ = 2p/2q is permissible, where such attractors are actually comprised of two coexisting attractors with ρ = p/q. We also checked that the positions and widths of these windows exhibit well-defined power-law scaling. When the potential is increased further, the Farey windows still provide a ''skeleton'' for the dynamics, and within each window there is a host of other interesting dynamical features, including multiple forward and reverse Feigenbaum trees. (author). 15 refs, 7 figs

  3. Modelling of synchrotron radiation losses in realistic tokamak plasmas

    International Nuclear Information System (INIS)

    Albajar, F.; Johner, J.; Granata, G.

    2000-08-01

    Synchrotron radiation losses become significant in the power balance of high-temperature plasmas envisaged for next step tokamaks. Due to the complexity of the exact calculation, these losses are usually roughly estimated with expressions derived from a plasma description using simplifying assumptions on the geometry, radiation absorption, and density and temperature profiles. In the present article, the complete formulation of the transport of synchrotron radiation is performed for realistic conditions of toroidal plasma geometry with elongated cross-section, using an exact method for the calculation of the absorption coefficient, and for arbitrary shapes of density and temperature profiles. The effects of toroidicity and temperature profile on synchrotron radiation losses are analyzed in detail. In particular, when the electron temperature profile is almost flat in the plasma center, as for example in ITB confinement regimes, synchrotron losses are found to be much stronger than in the case where the profile is represented by its best generalized parabolic approximation, though both cases give approximately the same thermal energy contents. Such an effect is not included in present approximate expressions. Finally, we propose a seven-variable fit for the fast calculation of synchrotron radiation losses. This fit is derived from a large database, which has been generated using a code implementing the complete formulation and optimized for massively parallel computing. (author)

  4. From Delivery to Adoption of Physical Activity Guidelines: Realist Synthesis

    Directory of Open Access Journals (Sweden)

    Liliana Leone

    2017-10-01

    Full Text Available Background: Evidence-based guidelines published by health authorities for the promotion of health-enhancing physical activity (PA) continue to be implemented unsuccessfully and demonstrate a gap between evidence and policies. This review synthesizes evidence on factors influencing delivery, adoption and implementation of PA promotion guidelines within different policy sectors (e.g., health, transport, urban planning, sport, education. Methods: Published literature was initially searched using PubMed, EBSCO, Google Scholar and continued through an iterative snowball technique. The literature review spanned the period 2002–2017. The realist synthesis approach was adopted to review the content of 39 included studies. An initial programme theory with a four-step chain from evidence emersion to implementation of guidelines was tested. Results: The synthesis furthers our understanding of the link between PA guidelines delivery and the actions of professionals responsible for implementation within health services, school departments and municipalities. The main mechanisms identified for guidance implementation were scientific legitimation, enforcement, feasibility, familiarity with concepts and PA habits. Threats emerged to the successful implementation of PA guidelines at national/local jurisdictional levels. Conclusions: The way PA guidelines are developed may influence their adoption by policy-makers and professionals. Useful lessons emerged that may inform synergies between policymaking and professional practices, promoting win-win multisectoral strategies.

  5. Improved transcranial magnetic stimulation coil design with realistic head modeling

    Science.gov (United States)

    Crowther, Lawrence; Hadimani, Ravi; Jiles, David

    2013-03-01

    We are investigating Transcranial magnetic stimulation (TMS) as a noninvasive technique based on electromagnetic induction which causes stimulation of the neurons in the brain. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT) which is still widely implemented for treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS) which requires surgical implantation of electrodes in the brain. Our new designs allow new applications of the technique to be established for a variety of diagnostic and therapeutic applications of psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method for treatment. In prior work we have implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, allowing the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.

  6. Factors influencing intercultural doctor-patient communication: a realist review.

    Science.gov (United States)

    Paternotte, Emma; van Dulmen, Sandra; van der Lee, Nadine; Scherpbier, Albert J J A; Scheele, Fedde

    2015-04-01

    Due to migration, doctors see patients from different ethnic backgrounds. This causes challenges in communication. To develop training programs for doctors in intercultural communication (ICC), it is important to know which barriers and facilitators determine the quality of ICC. This study aimed to provide an overview of the literature and to explore how ICC works. A systematic search was performed to find literature published before October 2012. The search terms used were cultural, communication, healthcare worker. A realist synthesis allowed us to use an explanatory focus to understand the interplay of communication. In total, 145 articles met the inclusion criteria. We found ICC challenges due to language, cultural and social differences, and doctors' assumptions. The mechanisms were described as factors influencing the process of ICC and divided into objectives, core skills and specific skills. The results were synthesized in a framework for the development of training. The quality of ICC is influenced by the context and by the mechanisms. These mechanisms translate into practical points for training, which seem to have similarities with patient-centered communication. Training for improving ICC can be developed as an extension of the existing training for patient-centered communication. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Use of clinical guidelines in remote Australia: A realist evaluation.

    Science.gov (United States)

    Reddy, Sandeep; Orpin, Victoria; Herring, Sally; Mackie-Schneider, Stephanie; Struber, Janet

    2018-02-01

    The aim of this evaluation was to assess the acceptability, accessibility, and compliance with the 2014 editions of the Remote Primary Health Care Manuals (RPHCM) in health care centres across remote areas of Northern and Central Australia. To undertake a comprehensive evaluation that considered context, the evaluation used a realist evaluation framework. The evaluation used a variety of methods, including interviews and a survey, to develop and test a programme theory. Many remote health practitioners have adopted standardized, evidence-based practice because of the use of the RPHCM. The mechanisms that led to the use of the manuals include acceptance of the worth of the protocols to their clinical practice, reliance on manual content to guide their practice, the perception of credibility, the applicability of RPHCM content to the context, and a fear of the consequences of not using the RPHCMs. Some remote health practitioners are less inclined to use the RPHCM regularly because of a perception that the content is less suited to their needs and daily practice or it is hard to navigate or understand. The evaluation concluded that there is work to be done to widen the RPHCM user base, and organizations need to increase support for their staff to use the RPHCM protocols better. These measures are expected to enable standardized clinical practice in the remote context. © 2017 John Wiley & Sons, Ltd.

  8. Atomistic simulations of graphite etching at realistic time scales.

    Science.gov (United States)

    Aussems, D U B; Bal, K M; Morgan, T W; van de Sanden, M C M; Neyts, E C

    2017-10-01

Hydrogen-graphite interactions are relevant to a wide variety of applications, ranging from astrophysics to fusion devices and nano-electronics. In order to shed light on these interactions, atomistic simulation using Molecular Dynamics (MD) has been shown to be an invaluable tool. It suffers, however, from severe time-scale limitations. In this work we apply the recently developed Collective Variable-Driven Hyperdynamics (CVHD) method to hydrogen etching of graphite for varying inter-impact times up to a realistic value of 1 ms, which corresponds to a flux of ~10^20 m^-2 s^-1. The results show that the erosion yield, hydrogen surface coverage and species distribution are significantly affected by the time between impacts. This can be explained by the higher probability of C-C bond breaking due to the prolonged exposure to thermal stress and the subsequent transition from ion-induced to thermal-induced etching. The latter regime of thermal-induced etching, chemical erosion, is accessed here for the first time using atomistic simulations. In conclusion, this study demonstrates that accounting for long time scales significantly affects ion bombardment simulations and should not be neglected in a wide range of conditions, in contrast to what is typically assumed.

  9. The construction of "realistic" four-dimensional strings through orbifolds

    Science.gov (United States)

    Font, A.; Ibáñez, L. E.; Quevedo, F.; Sierra, A.

    1990-02-01

    We discuss the construction of "realistic" lower rank 4-dimensional strings, through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3) × SU(2) × U(1) models as well as models incorporating a left-right SU(2)_L × SU(2)_R × U(1)_{B-L} symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3 × Z_3 orbifold including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3 × Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory.

  10. The construction of 'realistic' four-dimensional strings through orbifolds

    International Nuclear Information System (INIS)

    Font, A.; Quevedo, F.; Sierra, A.

    1990-01-01

    We discuss the construction of 'realistic' lower rank 4-dimensional strings, through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3) × SU(2) × U(1) models as well as models incorporating a left-right SU(2)_L × SU(2)_R × U(1)_{B-L} symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3 × Z_3 orbifold including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3 × Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory. (orig.)

  11. Management of long term sickness absence: a systematic realist review.

    Science.gov (United States)

    Higgins, Angela; O'Halloran, Peter; Porter, Sam

    2012-09-01

    The increasing impact and costs of long term sickness absence have been well documented. However, the diversity and complexity of interventions and of the contexts in which these take place makes a traditional review problematic. Therefore, we undertook a systematic realist review to identify the dominant programme theories underlying best practice, to assess the evidence for these theories, and to throw light on important enabling or disabling contextual factors. A search of the scholarly literature from 1950 to 2011 identified 5,576 articles, of which 269 formed the basis of the review. We found that the dominant programme theories in relation to effective management related to: early intervention or referral by employers; having proactive organisational procedures; good communication and cooperation between stakeholders; and workplace-based occupational rehabilitation. Significant contextual factors were identified as the level of support for interventions from top management, the size and structure of the organisation, the level of financial and organisational investment in the management of long-term sickness absence, and the quality of relationships between managers and staff. Consequently, those with responsibility for managing absence should bear in mind the contextual factors that are likely to have an impact on interventions, and do what they can to ensure stakeholders have at least a mutual understanding (if not a common purpose) in relation to their perceptions of interventions, goals, culture and practice in the management of long term sickness absence.

  12. Nucleon decay in a realistic SO(10) SUSY GUT

    International Nuclear Information System (INIS)

    Lucas, V.; Raby, S.

    1997-01-01

    In this paper, we calculate neutron and proton decay rates and branching ratios in a predictive SO(10) SUSY GUT which agrees well with low energy data. We show that the nucleon lifetimes are consistent with the experimental bounds. The nucleon decay rates are calculated using all one-loop chargino- and gluino-dressed diagrams regardless of their chiral structure. We show that the four-fermion operator C_{jk}(u_R d_{jR})(d_{kL} ν_{τL}), commonly neglected in previous nucleon decay calculations, not only contributes significantly to nucleon decay but, for many values of the initial GUT parameters and for large tanβ, actually dominates the decay rate. As a consequence, we find that τ_p/τ_n is often substantially larger than the prediction obtained in small tanβ models. We also find that gluino-dressed diagrams, often neglected in nucleon decay calculations, contribute significantly to nucleon decay. In addition we find that the branching ratios obtained from this realistic SO(10) SUSY GUT differ significantly from the predictions obtained from "generic" SU(5) SUSY GUTs. Thus, nucleon decay branching ratios, when observed, can be used to test theories of fermion masses. copyright 1997 The American Physical Society

  13. Uncovering gender discrimination cues in a realistic setting.

    Science.gov (United States)

    Dupuis-Roy, Nicolas; Fortin, Isabelle; Fiset, Daniel; Gosselin, Frédéric

    2009-02-10

    Which face cues do we use for gender discrimination? Few studies have tried to answer this question, and the few that have typically used only a small set of grayscale stimuli, often distorted and presented a large number of times. Here, we reassessed the importance of facial cues for gender discrimination in a more realistic setting. We applied Bubbles (a technique that minimizes bias toward specific facial features and does not necessitate the distortion of stimuli) to a set of 300 color photographs of Caucasian faces, each presented only once to 30 participants. Results show that the region of the eyes and the eyebrows (probably in the light-dark channel) is the most important facial cue for accurate gender discrimination, and that the mouth region drives fast correct responses (but not fast incorrect responses); the gender discrimination information in the mouth region is concentrated in the red-green color channel. Together, these results suggest that, when color is informative in the mouth region, humans use it and respond rapidly; when it is not informative, they have to rely on the more robust but more sluggish luminance information in the eye-eyebrow region.

  14. Linear perspective limitations on virtual reality and realistic displays

    Science.gov (United States)

    Temme, Leonard A.

    2007-04-01

    The visual images of the natural world, with their immediate intuitive appeal, seem like the logical gold standard for evaluating displays. After all, since photorealistic displays look increasingly like the real world, what could be better? Part of the shortcoming of this intuitive appeal is its naiveté. Realism itself is full of potential illusions that we do not notice because, most of the time, realism is good enough for our everyday tasks. But when confronted with tasks that go beyond those for which our visual system has evolved, we may be blindsided. If we survive, blind to our erroneous perceptions and oblivious to our good fortune at having survived, we will not be any wiser next time. Realistic displays depend on linear perspective (LP), the mathematical mapping of three dimensions onto two. Despite the fact that LP is a seductively elegant system that predicts results with defined mathematical procedures, artists do not stick to the procedures, not because they are math-phobic but because LP procedures, if followed explicitly, produce ugly, limited, and distorted images. If artists bother with formal LP procedures at all, they invariably temper the renderings by eye. The present paper discusses LP assumptions, limitations, and distortions. It provides examples of kluges to cover some of these LP shortcomings. It is important to consider the limitations of LP so that we do not let either naive assumptions or the seductive power of LP guide our thinking or expectations unrealistically as we consider its possible uses in advanced visual displays.

  15. Towards realistic string vacua from branes at singularities

    Science.gov (United States)

    Conlon, Joseph P.; Maharana, Anshuman; Quevedo, Fernando

    2009-05-01

    We report on progress towards constructing string models incorporating both realistic D-brane matter content and moduli stabilisation with dynamical low-scale supersymmetry breaking. The general framework is that of local D-brane models embedded into the LARGE volume approach to moduli stabilisation. We review quiver theories on del Pezzo (dP_n) singularities including both D3 and D7 branes. We provide supersymmetric examples with three quark/lepton families and the gauge symmetries of the Standard, Left-Right Symmetric, Pati-Salam and Trinification models, without unwanted chiral exotics. We describe how the singularity structure leads to family symmetries governing the Yukawa couplings which may give mass hierarchies among the different generations. We outline how these models can be embedded into compact Calabi-Yau compactifications with LARGE volume moduli stabilisation, and state the minimal conditions for this to be possible. We study the general structure of soft supersymmetry breaking. At the singularity all leading order contributions to the soft terms (both gravity- and anomaly-mediation) vanish. We enumerate subleading contributions and estimate their magnitude. We also describe model-independent physical implications of this scenario. These include the masses of anomalous and non-anomalous U(1)'s and the generic existence of a new hyperweak force under which leptons and/or quarks could be charged. We propose that such a gauge boson could be responsible for the ghost muon anomaly recently found at the Tevatron's CDF detector.

  16. A Data-Driven Approach to Realistic Shape Morphing

    KAUST Repository

    Gao, Lin

    2013-05-01

    Morphing between 3D objects is a fundamental technique in computer graphics. Traditional methods of shape morphing focus on establishing meaningful correspondences and finding smooth interpolation between shapes. Such methods, however, only take geometric information as input and thus cannot in general avoid producing unnatural interpolation, in particular for large-scale deformations. This paper proposes a novel data-driven approach for shape morphing. Given a database with various models belonging to the same category, we treat them as data samples in the plausible deformation space. These models are then clustered to form local shape spaces of plausible deformations. We use a simple metric to reasonably represent the closeness between pairs of models. Given source and target models, the morphing problem is cast as a global optimization problem of finding a minimal distance path within the local shape spaces connecting these models. Under the guidance of intermediate models in the path, an extended as-rigid-as-possible interpolation is used to produce the final morphing. By exploiting the knowledge of plausible models, our approach produces realistic morphing for challenging cases as demonstrated by various examples in the paper. © 2013 The Eurographics Association and Blackwell Publishing Ltd.
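    The global optimization described in this abstract (a minimal-distance path through local shape spaces connecting the source and target models) is, at its core, a shortest-path search over a graph whose nodes are database models and whose edge weights come from the closeness metric. A minimal sketch with Dijkstra's algorithm; the toy graph and its distances are made up for illustration and are not the paper's actual metric:

```python
import heapq

def dijkstra_path(graph, source, target):
    """Shortest path in a weighted graph given as {node: {neighbor: distance}}."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the chain of intermediate models that guides the interpolation.
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[target]

# Toy "shape space": nodes are models, edge weights are pairwise deformation distances.
models = {
    "source": {"a": 1.0, "b": 4.0},
    "a": {"source": 1.0, "b": 1.0, "target": 5.0},
    "b": {"source": 4.0, "a": 1.0, "target": 1.0},
    "target": {"a": 5.0, "b": 1.0},
}
path, total = dijkstra_path(models, "source", "target")
print(path, total)  # ['source', 'a', 'b', 'target'] 3.0
```

    The intermediate models on the returned path ("a" and "b" here) would then guide the as-rigid-as-possible interpolation between consecutive pairs.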

  17. A realistic 3+1D Viscous Hydro Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Romatschke, Paul [Univ. of Colorado, Boulder, CO (United States)

    2015-05-31

    DoE funds were used as bridge funds for the faculty position of the PI at the University of Colorado; the total funds for Years 3-5 of the JET Topical Collaboration amounted to about 50 percent of the academic-year salary of the PI. The PI contributed to the JET Topical Collaboration by developing, testing and applying algorithms for a realistic simulation of the bulk medium created in relativistic ion collisions. Specifically, two approaches were studied, one based on a new Lattice-Boltzmann (LB) framework and one on a more traditional viscous hydrodynamics framework. Both approaches were found to be viable in principle, with the LB approach being more elegant but needing more time to develop. The traditional approach led to the super-hybrid model of ion collisions dubbed 'superSONIC', which has been successfully used for phenomenology of relativistic heavy-ion and light-on-heavy-ion collisions. In the time-frame of the JET Topical Collaboration, the Colorado group published 15 articles in peer-reviewed journals, three of which appeared in Physical Review Letters. The group graduated one Master's student during this time-frame, and two more PhD students are expected to graduate in the next few years. The PI gave more than 28 talks and presentations during this period.

  18. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but also means there is a greater potential win from optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
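    To see what such a compiler must decide, consider the nearest-neighbor constraint in isolation: a 2-qubit gate between non-adjacent qubits forces SWAP insertions, and the chosen gate order determines how many. The greedy router below, for qubits on a line, is a deliberately naive baseline for illustration; it is not the temporal-planning encoding used in the paper:

```python
def compile_line(gates, n):
    """Greedy nearest-neighbor routing on a line of n qubits.
    pos[logical] = physical position; before each 2-qubit gate, adjacent
    SWAPs move the operands together.  Returns physical instructions."""
    pos = list(range(n))
    ops = []
    for a, b in gates:
        # Move qubit a toward qubit b one physical step at a time.
        while abs(pos[a] - pos[b]) > 1:
            step = 1 if pos[a] < pos[b] else -1
            p = pos[a]
            q = pos.index(p + step)          # logical qubit at the neighboring site
            ops.append(("SWAP", p, p + step))
            pos[a], pos[q] = p + step, p
        ops.append(("GATE", pos[a], pos[b]))
    return ops

# A gate between qubits at the ends of a 3-qubit line needs one SWAP first.
ops = compile_line([(0, 2), (0, 1)], 3)
print(ops)
```

    A temporal planner improves on this baseline by choosing the order of commuting gates and overlapping gate durations so as to minimize total circuit time, rather than processing gates greedily one by one.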

  19. OFDM versus Single Carrier: A Realistic Multi-Antenna Comparison

    Directory of Open Access Journals (Sweden)

    Moonen Marc

    2004-01-01

    There is an ongoing discussion in the broadband wireless world about the respective benefits of orthogonal frequency division multiplexing (OFDM) and single carrier with frequency domain equalization (SC-FD). SC-FD allows for more relaxed front-end requirements, of which the power amplifier efficiency is very important for battery-driven terminals. OFDM, on the other hand, can yield improved BER performance at low complexity. Both schemes have extensions to multiple antennas to enhance the spectral efficiency and/or the link reliability. Moreover, both schemes have nonlinear versions using decision feedback equalization (DFE) to further improve performance of the linear equalizers. In this paper, we compare these high-performance OFDM and SC-FD schemes using multiple antennas and DFE, while also accounting for the power amplifier efficiency. To make a realistic comparison, we also consider most important digital imperfections, such as channel and noise estimation, transmit and receive filtering, clipping and quantization, as well as link layer impact. Our analysis shows that for frequency-selective channels the relative performance impact of the power amplifier is negligible compared to the frequency diversity impact. The higher frequency diversity exploitation of SC-FD allows it to outperform OFDM in most cases. Therefore, SC-FD is a suitable candidate for broadband wireless communication.

  20. Asymmetric beams and CMB statistical anisotropy

    International Nuclear Information System (INIS)

    Hanson, Duncan; Lewis, Antony; Challinor, Anthony

    2010-01-01

    Beam asymmetries result in statistically anisotropic cosmic microwave background (CMB) maps. Typically, they are studied for their effects on the CMB power spectrum; however, they more closely mimic anisotropic effects such as gravitational lensing and primordial power asymmetry. We discuss tools for studying the effects of beam asymmetry on general quadratic estimators of anisotropy, analytically for full-sky observations as well as in the analysis of realistic data. We demonstrate this methodology in application to a recently detected 9σ quadrupolar modulation effect in the WMAP data, showing that beams provide a complete and sufficient explanation for the anomaly.

  1. Realistic retrospective dose assessments to members of the public around Spanish nuclear facilities

    International Nuclear Information System (INIS)

    Jimenez, M.A.; Martin-Valdepenas, J.M.; Garcia-Talavera, M.; Martin-Matarranz, J.L.; Salas, M.R.; Serrano, J.I.; Ramos, L.M.

    2011-01-01

    In the frame of an epidemiological study carried out in the influence areas around the Spanish nuclear facilities (ISCIII-CSN, 2009. Epidemiological Study of The Possible Effect of Ionizing Radiations Deriving from The Operation of Spanish Nuclear Fuel Cycle Facilities on The Health of The Population Living in Their Vicinity. Final report December 2009. Ministerio de Ciencia e Innovacion, Instituto de Salud Carlos III, Consejo de Seguridad Nuclear. Madrid. Available from: (http://www.csn.es/images/stories/actualidad_datos/especiales/epidemiologico/epidemiological_study.pdf)), annual effective doses to the public have been assessed by the Spanish Nuclear Safety Council (CSN) for over 45 years using a retrospective realistic-dose methodology. These values are compared with data from natural radiation exposure. For the affected population, natural radiation effective doses are on average 2300 times higher than the effective doses due to the operation of nuclear installations (nuclear power stations and fuel cycle facilities). When considering the impact on the whole Spanish population, effective doses attributable to nuclear facilities represent on average 3.5 × 10^-5 mSv/y, in contrast to 1.6 mSv/y from natural radiation or 1.3 mSv/y from medical exposures. - Highlights: → Most comprehensive dose assessment to the public by nuclear facilities ever done in Spain. → Dose to the public is dominated by liquid effluent pathways for the power stations. → Dose to the public is dominated by Rn inhalation for milling and mining facilities. → Average annual doses to the public in influence areas are negligible (10 μSv/y or less). → Doses from facilities average 3.5 × 10^-2 μSv/y per person over the whole Spanish population.

  2. Realistic retrospective dose assessments to members of the public around Spanish nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, M.A., E-mail: majg@csn.es [Consejo de Seguridad Nuclear (CSN), Pedro Justo Dorado Dellmans 11, E-28040 Madrid (Spain); Martin-Valdepenas, J.M.; Garcia-Talavera, M.; Martin-Matarranz, J.L.; Salas, M.R.; Serrano, J.I.; Ramos, L.M. [Consejo de Seguridad Nuclear (CSN), Pedro Justo Dorado Dellmans 11, E-28040 Madrid (Spain)

    2011-11-15

    In the frame of an epidemiological study carried out in the influence areas around the Spanish nuclear facilities (ISCIII-CSN, 2009. Epidemiological Study of The Possible Effect of Ionizing Radiations Deriving from The Operation of Spanish Nuclear Fuel Cycle Facilities on The Health of The Population Living in Their Vicinity. Final report December 2009. Ministerio de Ciencia e Innovacion, Instituto de Salud Carlos III, Consejo de Seguridad Nuclear. Madrid. Available from: (http://www.csn.es/images/stories/actualidad_datos/especiales/epidemiologico/epidemiological_study.pdf)), annual effective doses to the public have been assessed by the Spanish Nuclear Safety Council (CSN) for over 45 years using a retrospective realistic-dose methodology. These values are compared with data from natural radiation exposure. For the affected population, natural radiation effective doses are on average 2300 times higher than the effective doses due to the operation of nuclear installations (nuclear power stations and fuel cycle facilities). When considering the impact on the whole Spanish population, effective doses attributable to nuclear facilities represent on average 3.5 × 10^-5 mSv/y, in contrast to 1.6 mSv/y from natural radiation or 1.3 mSv/y from medical exposures. - Highlights: → Most comprehensive dose assessment to the public by nuclear facilities ever done in Spain. → Dose to the public is dominated by liquid effluent pathways for the power stations. → Dose to the public is dominated by Rn inhalation for milling and mining facilities. → Average annual doses to the public in influence areas are negligible (10 μSv/y or less). → Doses from facilities average 3.5 × 10^-2 μSv/y per person over the whole Spanish population.

  3. Hierarchical statistical modeling of xylem vulnerability to cavitation.

    Science.gov (United States)

    Ogle, Kiona; Barber, Jarrett J; Willson, Cynthia; Thompson, Brenda

    2009-01-01

    Cavitation of xylem elements diminishes the water transport capacity of plants, and quantifying xylem vulnerability to cavitation is important to understanding plant function. Current approaches to analyzing hydraulic conductivity (K) data to infer vulnerability to cavitation suffer from problems such as the use of potentially unrealistic vulnerability curves, difficulty interpreting parameters in these curves, a statistical framework that ignores sampling design, and an overly simplistic view of uncertainty. This study illustrates how two common curves (exponential-sigmoid and Weibull) can be reparameterized in terms of meaningful parameters: maximum conductivity (k_sat), the water potential (-P) at which the percentage loss of conductivity (PLC) equals X% (P_X), and the slope of the PLC curve at P_X (S_X), a 'sensitivity' index. We provide a hierarchical Bayesian method for fitting the reparameterized curves to K_h data. We illustrate the method using data for roots and stems of two populations of Juniperus scopulorum and test for differences in k_sat, P_X, and S_X between different groups. Two important results emerge from this study. First, the Weibull model is preferred because it produces biologically realistic estimates of PLC near P = 0 MPa. Second, stochastic embolisms contribute an important source of uncertainty that should be included in such analyses.
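    The Weibull reparameterization described in this abstract can be made concrete. Writing PLC(P) = 100(1 - exp(-(P/b)^c)) and choosing X = 50, the shape and scale (c, b) follow from the interpretable pair (P_50, S_50): c = S_50 P_50 / (50 ln 2) and b = P_50 / (ln 2)^(1/c). A sketch of this algebra; the parameter values are hypothetical, and this is a generic reparameterization rather than the paper's exact code:

```python
import math

def weibull_plc(P, P50, S50):
    """Percentage loss of conductivity at water potential magnitude P (MPa),
    parameterized by P50 (P at 50% loss) and S50 (slope of PLC at P50, %/MPa)."""
    c = S50 * P50 / (50.0 * math.log(2.0))   # Weibull shape
    b = P50 / math.log(2.0) ** (1.0 / c)     # Weibull scale
    return 100.0 * (1.0 - math.exp(-((P / b) ** c)))

def conductivity(P, k_sat, P50, S50):
    """Hydraulic conductivity K_h, declining from k_sat as PLC rises."""
    return k_sat * (1.0 - weibull_plc(P, P50, S50) / 100.0)

# Hypothetical values for illustration:
P50, S50, k_sat = 3.0, 40.0, 5.0
print(weibull_plc(P50, P50, S50))   # 50% loss at P50, by construction
# The numerical slope at P50 should recover S50 (~40 %/MPa):
eps = 1e-6
slope = (weibull_plc(P50 + eps, P50, S50) - weibull_plc(P50 - eps, P50, S50)) / (2 * eps)
print(slope)
```

    Fitting (k_sat, P_50, S_50) directly, rather than (b, c), is what makes the parameters interpretable and comparable across roots, stems and populations.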

  4. Statistics in Schools

    Science.gov (United States)

    Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real life data. Explore the site for standards-aligned, classroom-ready activities.

  5. Transport Statistics - Transport - UNECE

    Science.gov (United States)

    Statistics on transport in the UNECE region, maintained by the UNECE Transport Division's Working Party on Transport Statistics (WP.6).

  6. Statistical Compression for Climate Model Output

    Science.gov (United States)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
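    The compress-then-conditionally-simulate idea can be illustrated with a toy Gaussian model: store only block means as summary statistics, then decompress by computing the conditional expectation (best estimate, oversmoothed) and a conditional simulation (with realistic small-scale noise) of the full field given those summaries. The exponential covariance and block-mean summaries below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)
n, block = 64, 8                       # full field length, summary block size

# Gaussian prior with exponential covariance (stand-in for the fitted model).
idx = np.arange(n)
Sigma = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 10.0)

# "Original data": one draw from the prior.
L = np.linalg.cholesky(Sigma + 1e-10 * np.eye(n))
x = L @ rng.standard_normal(n)

# Compression: keep only block means, s = A x (n/block numbers instead of n).
A = np.kron(np.eye(n // block), np.full((1, block), 1.0 / block))
s = A @ x

# Decompression: conditional distribution of x given s under the Gaussian model.
K = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T)   # "kriging" gain
x_mean = K @ s                                     # conditional expectation (smooth)
Sigma_cond = Sigma - K @ A @ Sigma                 # conditional covariance
Lc = np.linalg.cholesky(Sigma_cond + 1e-8 * np.eye(n))
x_sim = x_mean + Lc @ rng.standard_normal(n)       # conditional simulation (rough)

# The stored summaries are reproduced exactly by the conditional mean.
print(np.allclose(A @ x_mean, s))  # True
```

    The same structure carries over to the paper's setting, with daily temperature fields in place of the 1-D vector and a nonstationary spatial model in place of the exponential covariance.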

  7. Development of Realistic Safety Analysis Technology for CANDU Reactors

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Rhee, B. W.; Rho, G. H.

    2010-04-01

    The following three research items were studied to develop and establish realistic safety analysis technologies for CANDU reactors. First, WIMS-CANDU, a physics cell code for CANDU reactors, was improved and validated against physics criticality experiment data obtained through international cooperation programs. An improved physics model accounting for pressure tube creep was also developed and used to assess the effects of 0%, 2.5% and 5% diametral creep of the pressure tube on core physics parameters. Second, an interfacing module between the physics and thermal-hydraulics codes was developed to improve the reliability and convenience of calculating physics parameters, such as the power coefficient, which were previously computed by independent code systems. Finally, the important parameters governing the complex heat transfer mechanisms in crept pressure tubes were identified in order to improve the existing fuel channel models. One such parameter, the oxidation model of the Zr-steam reaction, was identified, implemented and verified against experimental data from high-pressure, high-temperature fuel channels, and the model was used in a CFD analysis of the effect of the crept pressure tube on reactor safety. The results were also used to validate the CATHENA models of the crept pressure tube, and the effects of pressure tube creep on the blowdown and post-blowdown phases during a LOCA were assessed. The results of this study can be used in the uncertainty analysis of coolant void reactivity and in assessing the effects of creep-deformed pressure tubes on physics, thermal-hydraulics and safety issues. They will also be used to improve the current design and operational safety analysis codes and to provide technical support for resolving the related issues.

  8. Magnetic resonance fingerprinting based on realistic vasculature in mice.

    Science.gov (United States)

    Pouliot, Philippe; Gagnon, Louis; Lam, Tina; Avti, Pramod K; Bowen, Chris; Desjardins, Michèle; Kakkar, Ashok K; Thorin, Eric; Sakadzic, Sava; Boas, David A; Lesage, Frédéric

    2017-04-01

    Magnetic resonance fingerprinting (MRF) was recently proposed as a novel strategy for MR data acquisition and analysis. A variant of MRF called vascular MRF (vMRF) followed, that extracted maps of three parameters of physiological importance: cerebral oxygen saturation (SatO_2), mean vessel radius and cerebral blood volume (CBV). However, this estimation was based on idealized 2-dimensional simulations of vascular networks using random cylinders and the empirical Bloch equations convolved with a diffusion kernel. Here we focus on studying the vascular MR fingerprint using real mouse angiograms and physiological values as the substrate for the MR simulations. The MR signal is calculated ab initio with a Monte Carlo approximation, by tracking the accumulated phase from a large number of protons diffusing within the angiogram. We first study the identifiability of parameters in simulations, showing that parameters are fully estimable at realistically high signal-to-noise ratios (SNR) when the same angiogram is used for dictionary generation and parameter estimation, but that large biases in the estimates persist when the angiograms are different. Despite these biases, simulations show that differences in parameters remain estimable. We then applied this methodology to data acquired using the GESFIDE sequence with SPIONs injected into 9 young wild type and 9 old atherosclerotic mice. Both the pre injection signal and the ratio of post-to-pre injection signals were modeled, using 5-dimensional dictionaries. The vMRF methodology extracted significant differences in SatO_2, mean vessel radius and CBV between the two groups, consistent across brain regions and dictionaries. Further validation work is essential before vMRF can gain wider application. Copyright © 2017 Elsevier Inc. All rights reserved.
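    The ab initio Monte Carlo signal computation described here can be caricatured in a few lines: spins random-walk through a field perturbation, accumulate phase, and the magnitude of the ensemble-averaged complex magnetization gives the signal. The sinusoidal ΔB below is a stand-in for the field computed from a real angiogram, and all parameter values are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_spins, n_steps, dt = 5000, 200, 1e-4           # 20 ms total simulated time
D = 1e-9                                         # water diffusion coefficient, m^2/s
sigma = np.sqrt(2 * D * dt)                      # 1-D Brownian step size per dt
gamma = 2 * np.pi * 42.58e6                      # proton gyromagnetic ratio, rad/s/T

def delta_B(x):
    # Stand-in field perturbation; in practice this would be derived from
    # the susceptibility of the vascular network in the angiogram.
    return 1e-8 * np.sin(2 * np.pi * x / 20e-6)  # Tesla, 20 µm spatial scale

x = rng.uniform(0.0, 20e-6, n_spins)             # initial proton positions
phase = np.zeros(n_spins)
signal = []
for _ in range(n_steps):
    x += rng.normal(0.0, sigma, n_spins)         # diffusion step
    phase += gamma * delta_B(x) * dt             # accumulated precession phase
    signal.append(np.abs(np.mean(np.exp(1j * phase))))  # ensemble signal
```

    Repeating this computation over a grid of physiological parameters (SatO_2, radius, CBV) yields the dictionary against which measured signals are matched.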

  9. Coil optimisation for transcranial magnetic stimulation in realistic head geometry.

    Science.gov (United States)

    Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J

    Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. To develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Realistic camera noise modeling with application to improved HDR synthesis

    Science.gov (United States)

    Goossens, Bart; Luong, Hiêp; Aelterman, Jan; Pižurica, Aleksandra; Philips, Wilfried

    2012-12-01

    Due to the ongoing miniaturization of digital camera sensors and the steady increase in the "number of megapixels", individual sensor elements of the camera become more sensitive to noise, degrading the final image quality. To work around this problem, sophisticated processing algorithms in the devices can exploit knowledge of the sensor characteristics (e.g., in terms of noise) to offer better image reconstruction. Although a lot of research focuses on rather simplistic noise models, such as stationary additive white Gaussian noise, only limited attention has gone to more realistic digital camera noise models. In this article, we first present a digital camera noise model that takes several processing steps in the camera into account, such as sensor signal amplification, clipping, and post-processing. We then apply this noise model to the reconstruction problem of high dynamic range (HDR) images from a small set of low dynamic range (LDR) exposures of a static scene. In the literature, HDR reconstruction is mostly performed by computing a weighted average, in which the weights are directly related to the observed pixel intensities of the LDR image. In this work, we derive a Bayesian probabilistic formulation of a weighting function that is near-optimal in the MSE sense (or, equivalently, the SNR sense) of the reconstructed HDR image, by assuming exponentially distributed irradiance values. We define the weighting function as the probability that the observed pixel intensity is approximately unbiased. The weighting function can be computed directly from the noise model parameters, which gives rise to different symmetric and asymmetric shapes when electronic noise or photon noise is dominant. We also explain how to deal with the case that some of the noise model parameters are unknown, and show how the camera response function can be estimated using the presented noise model. Finally, experimental results are provided to support our findings.
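
    A noise-aware HDR merge along the lines described above can be sketched as an inverse-variance weighted average in which clipped pixels receive zero weight. This sketch uses a generic shot-plus-read-noise model, not the paper's exact Bayesian weighting function, and all numbers are illustrative:

```python
import numpy as np

def merge_hdr(ldr_stack, exposure_times, gain=1.0, read_noise_var=4.0,
              clip_level=255):
    """Inverse-variance weighted HDR merge under a simple signal-dependent
    noise model (photon shot noise + electronic read noise)."""
    ldr = np.asarray(ldr_stack, dtype=float)
    t = np.asarray(exposure_times, dtype=float).reshape(-1, 1, 1)
    # Per-pixel noise variance: shot noise scales with intensity, plus a
    # constant electronic-noise floor.
    var = gain * ldr + read_noise_var
    w = 1.0 / var
    w[ldr >= clip_level] = 0.0      # clipped pixels carry no information
    irradiance = ldr / t            # normalise each exposure to irradiance
    wsum = np.sum(w, axis=0)
    return np.sum(w * irradiance, axis=0) / np.maximum(wsum, 1e-12)

# Two exposures of the same 2x2 scene; the long exposure clips two pixels.
short = [[10, 40], [80, 120]]
long_ = [[40, 160], [255, 255]]   # 255 = saturated
hdr = merge_hdr([short, long_], exposure_times=[1.0, 4.0])
print(hdr)
```

    Where both exposures agree on the irradiance (e.g. 10/1 and 40/4), the merge returns that value; where the long exposure saturates, only the short exposure contributes, which is the qualitative behaviour the paper's near-optimal weighting also enforces.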

  11. Achieving successful community engagement: a rapid realist review.

    Science.gov (United States)

    De Weger, E; Van Vooren, N; Luijkx, K G; Baan, C A; Drewes, H W

    2018-04-13

    Community engagement is increasingly seen as crucial to achieving high quality, efficient and collaborative care. However, organisations are still searching for the best and most effective ways to engage citizens in the shaping of health and care services. This review highlights the barriers and enablers for engaging communities in the planning, designing, governing, and/or delivering of health and care services on the macro or meso level. It provides policymakers and professionals with evidence-based guiding principles to implement their own effective community engagement (CE) strategies. A Rapid Realist Review was conducted to investigate how interventions interact with contexts and mechanisms to influence the effectiveness of CE. A local reference panel, consisting of health and care professionals and experts, assisted in the development of the research questions and search strategy. The panel's input helped to refine the review's findings. A systematic search of the peer-reviewed literature was conducted. Eight action-oriented guiding principles were identified: Ensure staff provide supportive and facilitative leadership to citizens based on transparency; foster a safe and trusting environment enabling citizens to provide input; ensure citizens' early involvement; share decision-making and governance control with citizens; acknowledge and address citizens' experiences of power imbalances between citizens and professionals; invest in citizens who feel they lack the skills and confidence to engage; create quick and tangible wins; take into account both citizens' and organisations' motivations. An especially important thread throughout the CE literature is the influence of power imbalances and organisations' willingness, or not, to address such imbalances. The literature suggests that 'meaningful participation' of citizens can only be achieved if organisational processes are adapted to ensure that they are inclusive, accessible and supportive of citizens.

  12. Realistic modelling of observed seismic motion in complex sedimentary basins

    International Nuclear Information System (INIS)

    Faeh, D.; Panza, G.F.

    1994-03-01

    Three applications of a numerical technique for realistically modelling seismic ground motion in complex two-dimensional structures are illustrated. First we consider a sedimentary basin in the Friuli region, and we model strong-motion records from an aftershock of the 1976 earthquake. Then we simulate the ground motion caused in Rome by the 1915 Fucino (Italy) earthquake, and we compare our modelling with the damage distribution observed in the town. Finally we deal with the interpretation of ground motion recorded in Mexico City as a consequence of earthquakes in the Mexican subduction zone. The synthetic signals explain the major characteristics (relative amplitudes, spectral amplification, frequency content) of the considered seismograms, and the space distribution of the available macroseismic data. For the sedimentary basin in the Friuli area, parametric studies demonstrate the strong sensitivity of the computed ground motion to small changes in the subsurface topography of the sedimentary basin, and in the velocity and quality factor of the sediments. The total energy of ground motion, determined from our numerical simulation in Rome, is in very good agreement with the distribution of damage observed during the Fucino earthquake. For epicentral distances in the range 50-100 km, the source location, and not only the local soil conditions, controls the local effects. For Mexico City, the observed ground motion can be explained as resonance effects and the excitation of local surface waves, and the theoretical and observed maximum spectral amplifications are very similar. In general, our numerical simulations permit the estimation of the maximum and average spectral amplification for specific sites, i.e., they are a very powerful tool for accurate micro-zonation. (author). 38 refs, 19 figs, 1 tab
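
    A standard way to quantify the site amplification discussed above is the spectral ratio between a sediment-site record and a nearby bedrock reference. The sketch below uses purely synthetic records (bedrock noise plus an assumed decaying 2 Hz resonance at the sediment site); it is not the paper's mode-summation technique:

```python
import numpy as np

def spectral_amplification(site, reference, dt):
    """Ratio of Fourier amplitude spectra (site / reference), a common
    measure of sediment amplification relative to a bedrock record."""
    freqs = np.fft.rfftfreq(len(site), d=dt)
    amp_site = np.abs(np.fft.rfft(site))
    amp_ref = np.abs(np.fft.rfft(reference))
    return freqs, amp_site / np.maximum(amp_ref, 1e-12)

# Synthetic example: the "sediment" record equals the "rock" record plus a
# decaying 2 Hz resonance, mimicking basin response. Values are invented.
dt = 0.01
t = np.arange(0, 20, dt)
rng = np.random.default_rng(2)
rock = rng.standard_normal(t.size)
sediment = rock + 3.0 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.1 * t)

freqs, ratio = spectral_amplification(sediment, rock, dt)
idx = np.argmin(np.abs(freqs - 2.0))
print(f"amplification at 2 Hz: {ratio[idx]:.1f}x")
```

    The ratio stays near one away from the resonance and peaks sharply at the basin frequency, which is the kind of site-specific amplification estimate micro-zonation relies on.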

  13. Realistic phantoms to characterize dosimetry in pediatric CT

    Energy Technology Data Exchange (ETDEWEB)

    Carver, Diana E.; Kost, Susan D.; Fraser, Nicholas D.; Pickens, David R.; Price, Ronald R.; Stabin, Michael G. [Vanderbilt University Medical Center, Department of Radiology and Radiological Sciences, Nashville, TN (United States); Segars, W.P. [Duke University, Carl E. Ravin Advanced Imaging Laboratories, Durham, NC (United States)

    2017-05-15

    The estimation of organ doses and effective doses for children receiving CT examinations is of high interest. Newer, more realistic anthropomorphic body models can provide information on individual organ doses and improved estimates of effective dose. Previously developed body models representing 50th-percentile individuals at reference ages (newborn, 1, 5, 10 and 15 years) were modified to represent 10th, 25th, 75th and 90th height percentiles for both genders and an expanded range of ages (3, 8 and 13 years). We calculated doses for 80 pediatric reference phantoms from simulated chest-abdomen-pelvis exams on a model of a Philips Brilliance 64 CT scanner. Individual organ and effective doses were normalized to dose-length product (DLP) and fit as a function of body diameter. We plotted organ and effective doses for the 80 reference phantoms against body diameter; the data were well fit with an exponential function. We found DLP-normalized organ dose to correlate strongly with body diameter (R²>0.95 for most organs). Similarly, we found a very strong correlation with body diameter for DLP-normalized effective dose (R²>0.99). Comparing our results to other studies, we found average agreement of approximately 10%. We provide organ and effective doses for a total of 80 reference phantoms representing normal-stature children ranging in age and body size. This information will be valuable as a replacement for the types of vendor-reported doses currently available. These data will also permit the recording and tracking of individual patient doses. Moreover, this comprehensive dose database will facilitate patient matching and the ability to predict patient-individualized dose prior to examination. (orig.)
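
    The exponential dependence on body diameter reported above can be reproduced in miniature by fitting k(d) = a·exp(-b·d) to DLP-normalised doses. The data points and coefficients below are synthetic stand-ins, not the paper's values:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic example: DLP-normalised effective dose coefficient (mSv per
# mGy*cm, made-up units) falling off exponentially with body diameter.
diameter_cm = np.array([12, 14, 16, 18, 20, 22, 24, 26], dtype=float)
true_a, true_b = 0.060, 0.045   # assumed coefficients
rng = np.random.default_rng(1)
k = (true_a * np.exp(-true_b * diameter_cm)
     * (1 + 0.02 * rng.standard_normal(diameter_cm.size)))

def model(d, a, b):
    """Exponential dose-coefficient model k(d) = a * exp(-b * d)."""
    return a * np.exp(-b * d)

(a_fit, b_fit), _ = curve_fit(model, diameter_cm, k, p0=(0.05, 0.04))

# Coefficient of determination, analogous to the R^2 values in the abstract
resid = k - model(diameter_cm, a_fit, b_fit)
r2 = 1 - np.sum(resid**2) / np.sum((k - k.mean())**2)
print(f"a={a_fit:.4f}, b={b_fit:.4f}, R^2={r2:.4f}")
```

    With such a fit in hand, a patient's measured body diameter and the scanner-reported DLP suffice to predict an individualized dose before the examination, which is the use case the abstract describes.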

  14. Realistic Goals and Processes for Future Space Astronomy Portfolio Planning

    Science.gov (United States)

    Morse, Jon

    2015-08-01

    It is generally recognized that international participation and coordination is highly valuable for maximizing the scientific impact of modern space science facilities, as well as for cost-sharing reasons. Indeed, all large space science missions, and most medium and small missions, are international, even if one country or space agency has a clear leadership role and bears most of the development costs. International coordination is a necessary aspect of future mission planning, but how that coordination is done remains debatable. I propose that the community's scientific vision is generally homogeneous enough to permit international coordination of decadal-scale strategic science goals. However, the timing and budget allocation/funding mechanisms of individual countries and/or space agencies are too disparate for effective long-term strategic portfolio planning via a single international process. Rather, I argue that coordinated space mission portfolio planning is a natural consequence of international collaboration on individual strategic missions. I review the process and outcomes of the U.S. 2010 decadal survey in astronomy & astrophysics from the perspective of a government official who helped craft the survey charter and transmitted guidance to the scientific community on behalf of a sponsoring agency (NASA), while continuing to manage the current portfolio that involved ongoing negotiations with other space agencies. I analyze the difficulties associated with projecting long-term budgets, obtaining realistic mission costs (including the additional cost burdens of international partnerships), and developing new (possibly transformational) technologies. Finally, I remark on the future role that privately funded space science missions can have in accomplishing international science community goals.

  15. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  16. SEER Statistics | DCCPS/NCI/NIH

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  17. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
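
    Two of the classical statistics mentioned above, the Pearson statistic and the likelihood-ratio statistic, are members of the Cressie-Read power-divergence family, which SciPy exposes directly. A minimal goodness-of-fit example with made-up counts:

```python
from scipy.stats import power_divergence

# Observed counts in a one-way table; with f_exp omitted, the expected
# counts are uniform. The numbers are invented for illustration.
observed = [28, 41, 36, 15]

# lambda_ selects the member of the power-divergence family:
#   1   -> Pearson chi-square statistic
#   0   -> log-likelihood ratio (G-test) statistic
#   2/3 -> the Cressie-Read recommendation
for lam, name in [(1, "Pearson"), (0, "G-test"), (2 / 3, "Cressie-Read")]:
    stat, p = power_divergence(observed, lambda_=lam)
    print(f"{name:12s} stat={stat:.3f}  p={p:.4f}")
```

    All three statistics share the same asymptotic chi-square distribution under the null, which is why the family provides a drop-in alternative to the classical tests.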

  18. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.

  19. Protocol: a realist review of user fee exemption policies for health services in Africa.

    Science.gov (United States)

    Robert, Emilie; Ridde, Valéry; Marchal, Bruno; Fournier, Pierre

    2012-01-01

    Background Four years prior to the Millennium Development Goals (MDGs) deadline, low- and middle-income countries and international stakeholders are looking for evidence-based policies to improve access to healthcare for the most vulnerable populations. User fee exemption policies are one of the potential solutions. However, the evidence is disparate, and systematic reviews have failed to provide valuable lessons. The authors propose to produce an innovative synthesis of the available evidence on user fee exemption policies in Africa to feed the policy-making process. Methods The authors will carry out a realist review to answer the following research questions: What are the outcomes of user fee exemption policies implemented in Africa? Why do they produce such outcomes? And what contextual elements come into play? This type of review aims to understand how contextual elements influence the production of outcomes through the activation of specific mechanisms, in the form of context-mechanism-outcome configurations. The review will be conducted in five steps: (1) identifying with key stakeholders the mechanisms underlying user fee exemption policies to develop the analytical framework, (2) searching for and selecting primary data, (3) assessing the quality of evidence using the Mixed-Method Appraisal Tool, (4) extracting the data using the analytical framework and (5) synthesising the data in the form of context-mechanism-outcome configurations. The output will be a middle-range theory specifying how user fee exemption policies work, for what populations and under what circumstances. Ethics and dissemination The two main target audiences are researchers who are looking for examples to implement a realist review, and policy-makers and international stakeholders looking for lessons learnt on user fee exemption. For the latter, a knowledge-sharing strategy involving local scientific and policy networks will be implemented. The study has been approved by the ethics

  20. Larval dispersal modeling of pearl oyster Pinctada margaritifera following realistic environmental and biological forcing in Ahe atoll lagoon.

    Directory of Open Access Journals (Sweden)

    Yoann Thomas

    Full Text Available Studying the larval dispersal of bottom-dwelling species is necessary to understand their population dynamics and optimize their management. The black-lip pearl oyster (Pinctada margaritifera is cultured extensively to produce black pearls, especially in French Polynesia's atoll lagoons. This aquaculture relies on spat collection, a process that can be optimized by understanding which factors influence larval dispersal. Here, we investigate the sensitivity of P. margaritifera larval dispersal kernel to both physical and biological factors in the lagoon of Ahe atoll. Specifically, using a validated 3D larval dispersal model, the variability of lagoon-scale connectivity is investigated against wind forcing, depth and location of larval release, destination location, vertical swimming behavior and pelagic larval duration (PLD factors. The potential connectivity was spatially weighted according to both the natural and cultivated broodstock densities to provide a realistic view of connectivity. We found that the mean pattern of potential connectivity was driven by the southwest and northeast main barotropic circulation structures, with high retention levels in both. Destination locations, spawning sites and PLD were the main drivers of potential connectivity, explaining respectively 26%, 59% and 5% of the variance. Differences between potential and realistic connectivity showed the significant contribution of the pearl oyster broodstock location to its own dynamics. Realistic connectivity showed larger larval supply in the western destination locations, which are preferentially used by farmers for spat collection. In addition, larval supply in the same sectors was enhanced during summer wind conditions. These results provide new cues to understanding the dynamics of bottom-dwelling populations in atoll lagoons, and show how to take advantage of numerical models for pearl oyster management.
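
    The distinction between potential and realistic connectivity described above amounts to weighting the dispersal kernel by broodstock density at each source site. A toy numerical sketch with an invented three-site connectivity matrix (none of these values come from the Ahe atoll model):

```python
import numpy as np

# Toy potential connectivity matrix C[i, j]: probability that a larva
# released at source site i settles at destination site j. Rows need not
# sum to 1 (losses leave the lagoon or die). Values are illustrative only.
potential = np.array([
    [0.30, 0.10, 0.05],
    [0.05, 0.25, 0.10],
    [0.02, 0.08, 0.40],
])

# Broodstock density at each source site (natural + cultivated stock),
# in arbitrary units; also invented.
broodstock = np.array([5.0, 1.0, 2.0])

# "Realistic" connectivity weights each source row by its spawning biomass;
# column sums then give each destination's relative larval supply.
weighted = potential * broodstock[:, None]
supply = weighted.sum(axis=0)
relative_supply = supply / supply.sum()
print("relative larval supply per destination:", np.round(relative_supply, 3))
```

    Even with an unchanged physical kernel, concentrating broodstock at one site reshapes the supply pattern, which is the sense in which the oyster population contributes to its own dynamics.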

  1. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  2. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  3. Optimizing Wind And Hydropower Generation Within Realistic Reservoir Operating Policy

    Science.gov (United States)

    Magee, T. M.; Clement, M. A.; Zagona, E. A.

    2012-12-01

    Previous studies have evaluated the benefits of utilizing the flexibility of hydropower systems to balance the variability and uncertainty of wind generation. However, previous hydropower and wind coordination studies have simplified non-power constraints on reservoir systems. For example, some studies have only included hydropower constraints on minimum and maximum storage volumes and minimum and maximum plant discharges. The methodology presented here utilizes the pre-emptive linear goal programming optimization solver in RiverWare to model hydropower operations with a set of prioritized policy constraints and objectives based on realistic policies that govern the operation of actual hydropower systems, including licensing constraints, environmental constraints, water management and power objectives. This approach accounts for the fact that not all policy constraints are of equal importance. For example target environmental flow levels may not be satisfied if it would require violating license minimum or maximum storages (pool elevations), but environmental flow constraints will be satisfied before optimizing power generation. Additionally, this work not only models the economic value of energy from the combined hydropower and wind system, it also captures the economic value of ancillary services provided by the hydropower resources. It is recognized that the increased variability and uncertainty inherent with increased wind penetration levels requires an increase in ancillary services. In regions with liberalized markets for ancillary services, a significant portion of hydropower revenue can result from providing ancillary services. Thus, ancillary services should be accounted for when determining the total value of a hydropower system integrated with wind generation. This research shows that the end value of integrated hydropower and wind generation is dependent on a number of factors that can vary by location. Wind factors include wind penetration level
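
    The pre-emptive (lexicographic) goal programming idea described above, satisfying environmental flow targets before optimising power, can be sketched as two sequential linear programs: solve the top-priority goal, then re-solve the next goal with the first goal's achievement locked in as a constraint. All numbers below are invented, and the single release variable is a crude stand-in for RiverWare's much richer operating policy:

```python
from scipy.optimize import linprog

# Variables: x = [release r, env shortfall d_env, power deviation d_pow].
r_min, r_max = 50.0, 200.0   # licence release bounds (hard constraints)
env_target = 120.0           # environmental flow target (priority 1)
power_target = 90.0          # release matching a power schedule (priority 2)

bounds = [(r_min, r_max), (0, None), (0, None)]

# Priority 1: minimise environmental shortfall, d_env >= env_target - r.
A1 = [[-1.0, -1.0, 0.0]]
b1 = [-env_target]
res1 = linprog(c=[0, 1, 0], A_ub=A1, b_ub=b1, bounds=bounds)
d_env_star = res1.fun

# Priority 2: minimise |r - power_target| without degrading priority 1.
A2 = A1 + [
    [1.0, 0.0, -1.0],    #  r - d_pow <= power_target
    [-1.0, 0.0, -1.0],   # -r - d_pow <= -power_target
    [0.0, 1.0, 0.0],     # d_env may not exceed its priority-1 optimum
]
b2 = b1 + [power_target, -power_target, d_env_star + 1e-9]
res2 = linprog(c=[0, 0, 1], A_ub=A2, b_ub=b2, bounds=bounds)

release = res2.x[0]
print(f"release = {release:.1f} (environmental target wins over power target)")
```

    The solver releases 120 rather than the power-preferred 90: the lower-priority power goal only gets optimised within the slack left by the environmental goal, mirroring the prioritised-policy behaviour described in the abstract.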

  4. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

    Full Text Available A semi-classical approximation is applied to the calculations of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with the nucleon effective mass m* < m is used. The approach provides the correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations for sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. Using a standard Woods-Saxon potential and nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated, in good qualitative agreement with experimental data.
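
    For context, the classic Fermi-gas (Bethe) formula shows how a level density parameter a = A/K enters such calculations. This textbook expression is not the paper's semi-classical result, and K = 8 MeV is an assumed illustrative value:

```python
import math

def bethe_level_density(U_mev, A, K):
    """Fermi-gas (Bethe) level density with parameter a = A/K (MeV^-1):
    rho(U) = sqrt(pi) / (12 a^(1/4) U^(5/4)) * exp(2 sqrt(a U)).
    Textbook formula for illustration only."""
    a = A / K
    return (math.sqrt(math.pi) / (12.0 * a**0.25 * U_mev**1.25)
            * math.exp(2.0 * math.sqrt(a * U_mev)))

# Level density grows extremely fast with excitation energy, e.g. for a
# mid-mass nucleus A = 56 and an assumed K = 8 MeV:
for U in (5.0, 10.0, 20.0):
    rho = bethe_level_density(U, A=56, K=8.0)
    print(f"U = {U:5.1f} MeV  rho = {rho:.3e} per MeV")
```

    The exp(2√(aU)) growth is why even modest changes in the effective mass, and hence in K, strongly affect high-temperature thermodynamic quantities.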

  5. Applied statistics for civil and environmental engineers

    CERN Document Server

    Kottegoda, N T

    2009-01-01

    Civil and environmental engineers need an understanding of mathematical statistics and probability theory to deal with the variability that affects engineers' structures, soil pressures, river flows and the like. Students, too, need to get to grips with these rather difficult concepts. This book, written by engineers for engineers, tackles the subject in a clear, up-to-date manner using a process-orientated approach. It introduces the subjects of mathematical statistics and probability theory, and then addresses model estimation and testing, regression and multivariate methods, analysis of extreme events, simulation techniques, risk and reliability, and economic decision making. 325 examples and case studies from European and American practice are included and each chapter features realistic problems to be solved. For the second edition new sections have been added on Monte Carlo Markov chain modeling with details of practical Gibbs sampling, sensitivity analysis and aleatory and epistemic uncertainties, and co...

  6. Chemists, Access, Statistics

    Science.gov (United States)

    Holmes, Jon L.

    2000-06-01

    IP-number access. Current subscriptions can be upgraded to IP-number access at little additional cost. We are pleased to be able to offer to institutions and libraries this convenient mode of access to subscriber only resources at JCE Online. JCE Online Usage Statistics We are continually amazed by the activity at JCE Online. So far, the year 2000 has shown a marked increase. Given the phenomenal overall growth of the Internet, perhaps our surprise is not warranted. However, during the months of January and February 2000, over 38,000 visitors requested over 275,000 pages. This is a monthly increase of over 33% from the October-December 1999 levels. It is good to know that people are visiting, but we would very much like to know what you would most like to see at JCE Online. Please send your suggestions to JCEOnline@chem.wisc.edu. For those who are interested, JCE Online year-to-date statistics are available. Biographical Snapshots of Famous Chemists: Mission Statement Feature Editor: Barbara Burke Chemistry Department, California State Polytechnic University-Pomona, Pomona, CA 91768 phone: 909/869-3664 fax: 909/869-4616 email: baburke@csupomona.edu The primary goal of this JCE Internet column is to provide information about chemists who have made important contributions to chemistry. For each chemist, there is a short biographical "snapshot" that provides basic information about the person's chemical work, gender, ethnicity, and cultural background. Each snapshot includes links to related websites and to a biobibliographic database. The database provides references for the individual and can be searched through key words listed at the end of each snapshot. All students, not just science majors, need to understand science as it really is: an exciting, challenging, human, and creative way of learning about our natural world. Investigating the life experiences of chemists can provide a means for students to gain a more realistic view of chemistry. 
In addition students

  7. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  8. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  9. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  10. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  11. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  12. Uterine Cancer Statistics

    Science.gov (United States)

    ... Doing AMIGAS Stay Informed Cancer Home Uterine Cancer Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool The Data Visualizations tool makes ...

  13. Tuberculosis Data and Statistics

    Science.gov (United States)

    ... Advisory Groups Federal TB Task Force Data and Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... Set) Mortality and Morbidity Weekly Reports Data and Statistics Decrease in Reported Tuberculosis Cases MMWR 2010; 59 ( ...

  14. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  15. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  16. Mental Illness Statistics

    Science.gov (United States)

    ... News & Events About Us Home > Health Information Share Statistics Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics Mental Illness Any Anxiety Disorder ...

  17. School Violence: Data & Statistics

    Science.gov (United States)

    ... Social Media Publications Injury Center School Violence: Data & Statistics Recommend on Facebook Tweet Share Compartir The first ... Vehicle Safety Traumatic Brain Injury Injury Response Data & Statistics (WISQARS) Funded Programs Press Room Social Media Publications ...

  18. Caregiver Statistics: Demographics

    Science.gov (United States)

    ... You are here Home Selected Long-Term Care Statistics Order this publication Printer-friendly version What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...

  19. Aortic Aneurysm Statistics

    Science.gov (United States)

    ... Summary Coverdell Program 2012-2015 State Summaries Data & Statistics Fact Sheets Heart Disease and Stroke Fact Sheets ... Roadmap for State Planning Other Data Resources Other Statistic Resources Grantee Information Cross-Program Information Online Tools ...

  20. Alcohol Facts and Statistics

    Science.gov (United States)

    ... Standard Drink? Drinking Levels Defined Alcohol Facts and Statistics Print version Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446 National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...

  1. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  2. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  3. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  4. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  5. NASA Orbital Debris Baseline Populations

    Science.gov (United States)

    Krisko, Paula H.; Vavrin, A. B.

    2013-01-01

    The NASA Orbital Debris Program Office has created high fidelity populations of the debris environment. The populations include objects of 1 cm and larger in Low Earth Orbit through Geosynchronous Transfer Orbit. They were designed for the purpose of assisting debris researchers and sensor developers in planning and testing. This environment is derived directly from the newest ORDEM model populations which include a background derived from LEGEND, as well as specific events such as the Chinese ASAT test, the Iridium 33/Cosmos 2251 accidental collision, the RORSAT sodium-potassium droplet releases, and other miscellaneous events. It is the most realistic ODPO debris population to date. In this paper we present the populations in chart form. We describe derivations of the background population and the specific populations added on. We validate our 1 cm and larger Low Earth Orbit population against SSN, Haystack, and HAX radar measurements.

  6. Analysis of Heterogeneous Networks with Dual Connectivity in a Realistic Urban Deployment

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Barcos, Sonia; Wang, Hua

    2015-01-01

    the performance in this realistic layout. Due to the uneven load distribution observed in realistic deployments, DC is able to provide fast load balancing gains also at relatively high load - and not only at low load as typically observed in 3GPP scenarios. For the same reason, the proposed cell selection...

  7. Realist review and synthesis of retention studies for health workers in rural and remote areas

    NARCIS (Netherlands)

    Dieleman, M.A.; Kane, Sumit; Zwanikken, Prisca A C; Gerretsen, Barend

    2011-01-01

    This report uses a realist review, which is a theory-based method, to address the questions of “why” and “how” certain rural retention interventions work better in some contexts and fail in others. Through applying a realist perspective to the review of these retention studies, a greater

  8. Neutron star models with realistic high-density equations of state

    International Nuclear Information System (INIS)

    Malone, R.C.; Johnson, M.B.; Bethe, H.A.

    1975-01-01

We calculate neutron star models using four realistic high-density models of the equation of state. We conclude that the maximum mass of a neutron star is unlikely to exceed 2 M⊙. All of the realistic models are consistent with current estimates of the moment of inertia of the Crab pulsar

  9. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  10. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  11. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  12. Youth Sports Safety Statistics

    Science.gov (United States)


  13. Collimator optimization in myocardial perfusion SPECT using the ideal observer and realistic background variability for lesion detection and joint detection and localization tasks

    Science.gov (United States)

    Ghaly, Michael; Du, Yong; Links, Jonathan M.; Frey, Eric C.

    2016-03-01

    In SPECT imaging, collimators are a major factor limiting image quality and largely determine the noise and resolution of SPECT images. In this paper, we seek the collimator with the optimal tradeoff between image noise and resolution with respect to performance on two tasks related to myocardial perfusion SPECT: perfusion defect detection and joint detection and localization. We used the Ideal Observer (IO) operating on realistic background-known-statistically (BKS) and signal-known-exactly (SKE) data. The areas under the receiver operating characteristic (ROC) and localization ROC (LROC) curves (AUCd, AUCd+l), respectively, were used as the figures of merit for both tasks. We used a previously developed population of 54 phantoms based on the eXtended Cardiac Torso Phantom (XCAT) that included variations in gender, body size, heart size and subcutaneous adipose tissue level. For each phantom, organ uptakes were varied randomly based on distributions observed in patient data. We simulated perfusion defects at six different locations with extents and severities of 10% and 25%, respectively, which represented challenging but clinically relevant defects. The extent and severity are, respectively, the perfusion defect’s fraction of the myocardial volume and reduction of uptake relative to the normal myocardium. Projection data were generated using an analytical projector that modeled attenuation, scatter, and collimator-detector response effects, a 9% energy resolution at 140 keV, and a 4 mm full-width at half maximum (FWHM) intrinsic spatial resolution. We investigated a family of eight parallel-hole collimators that spanned a large range of sensitivity-resolution tradeoffs. For each collimator and defect location, the IO test statistics were computed using a Markov Chain Monte Carlo (MCMC) method for an ensemble of 540 pairs of defect-present and -absent images that included the aforementioned anatomical and uptake variability. Sets of test statistics were
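The area under the ROC curve used as the figure of merit above can be estimated nonparametrically from paired sets of observer test statistics via the Mann-Whitney relation. The sketch below is only an illustration of that estimator; the paper's ideal-observer test statistics come from an MCMC procedure, and the toy numbers here are invented:

```python
# Nonparametric (Mann-Whitney) estimate of the area under the ROC curve
# from defect-present and defect-absent observer test statistics.
# Illustrative sketch only; not the authors' MCMC implementation.

def auc_from_test_statistics(present, absent):
    """AUC = P(t_present > t_absent) + 0.5 * P(t_present == t_absent)."""
    wins = ties = 0
    for tp in present:
        for ta in absent:
            if tp > ta:
                wins += 1
            elif tp == ta:
                ties += 1
    return (wins + 0.5 * ties) / (len(present) * len(absent))

# Toy example: defect-present statistics tend to be larger.
present = [1.2, 0.8, 1.5, 1.1]
absent = [0.3, 0.9, 0.4, 0.6]
print(auc_from_test_statistics(present, absent))  # → 0.9375
```

An AUC of 0.5 corresponds to chance performance and 1.0 to perfect defect detection, which is why it serves as a single-number figure of merit when comparing collimators.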

  14. Herd Immunity to Ebolaviruses Is Not a Realistic Target for Current Vaccination Strategies

    Directory of Open Access Journals (Sweden)

    Stuart G. Masterson

    2018-05-01

The recent West African Ebola virus pandemic, which affected >28,000 individuals, increased interest in anti-Ebolavirus vaccination programs. Here, we systematically analyzed the requirements for a prophylactic vaccination program based on the basic reproductive number (R0), i.e., the number of secondary cases that result from an individual infection. Published R0 values were determined by systematic literature research and ranged from 0.37 to 20. R0s ≥ 4 realistically reflected the critical early outbreak phases and superspreading events. Based on the R0, the herd immunity threshold (Ic) was calculated using the equation Ic = 1 − (1/R0). The critical vaccination coverage (Vc) needed to provide herd immunity was determined by including the vaccine effectiveness (E) using the equation Vc = Ic/E. At an R0 of 4, the Ic is 75%, and at an E of 90%, more than 80% of a population needs to be vaccinated to establish herd immunity. Such vaccination rates are currently unrealistic because of resistance against vaccinations, financial/logistical challenges, and a lack of vaccines that provide long-term protection against all human-pathogenic Ebolaviruses. Hence, outbreak management will for the foreseeable future depend on surveillance and case isolation. Clinical vaccine candidates are only available for Ebola viruses. Their use will need to be focused on health-care workers, potentially in combination with ring vaccination approaches.
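The two equations quoted in the abstract, Ic = 1 − (1/R0) and Vc = Ic/E, can be checked directly. A minimal sketch reproducing the abstract's worked numbers (R0 = 4, E = 90%):

```python
# Herd immunity threshold and critical vaccination coverage, following
# the equations quoted in the abstract: Ic = 1 - 1/R0 and Vc = Ic / E.

def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune (Ic)."""
    return 1.0 - 1.0 / r0

def critical_coverage(r0, effectiveness):
    """Fraction that must be vaccinated (Vc), given vaccine effectiveness E."""
    return herd_immunity_threshold(r0) / effectiveness

# At R0 = 4 and 90% vaccine effectiveness:
ic = herd_immunity_threshold(4)    # 0.75
vc = critical_coverage(4, 0.90)    # ≈ 0.833, i.e. >80% of the population
print(f"Ic = {ic:.0%}, Vc = {vc:.1%}")  # → Ic = 75%, Vc = 83.3%
```

Note how Vc grows quickly with R0: at the upper published estimate of R0 = 20, Ic is already 95%, so even a perfectly effective vaccine would require near-universal coverage.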

  15. Impacts of Realistic Urban Heating, Part I: Spatial Variability of Mean Flow, Turbulent Exchange and Pollutant Dispersion

    Science.gov (United States)

    Nazarian, Negin; Martilli, Alberto; Kleissl, Jan

    2018-03-01

As urbanization progresses, more realistic methods are required to analyze the urban microclimate. However, given the complexity and computational cost of numerical models, the effects of realistic representations should be evaluated to identify the level of detail required for an accurate analysis. We consider the realistic representation of surface heating in an idealized three-dimensional urban configuration, and evaluate the spatial variability of flow statistics (mean flow and turbulent fluxes) in urban streets. Large-eddy simulations coupled with an urban energy balance model are employed, and the heating distribution of urban surfaces is parametrized using sets of horizontal and vertical Richardson numbers, characterizing thermal stratification and heating orientation with respect to the wind direction. For all studied conditions, the thermal field is strongly affected by the orientation of heating with respect to the airflow. The modification of airflow by the horizontal heating is also pronounced for strongly unstable conditions. The formation of the canyon vortices is affected by the three-dimensional heating distribution in both spanwise and streamwise street canyons, such that the secondary vortex is seen adjacent to the windward wall. For the dispersion field, however, the overall heating of urban surfaces, and more importantly, the vertical temperature gradient, dominate the distribution of concentration and the removal of pollutants from the building canyon. Accordingly, the spatial variability of concentration is not significantly affected by the detailed heating distribution. The analysis is extended to assess the effects of three-dimensional surface heating on turbulent transfer. Quadrant analysis reveals that the differential heating also affects the dominance of ejection and sweep events and the efficiency of turbulent transfer (exuberance) within the street canyon and at the roof level, while the vertical variation of these parameters is less
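The abstract does not give the authors' exact horizontal and vertical Richardson-number definitions, but the standard bulk form conveys the idea of a nondimensional ratio of buoyancy to shear used to characterize thermal stratification. A generic sketch, with all parameter names and the example values being illustrative assumptions:

```python
# Generic bulk Richardson number, Rb = g * dTheta * h / (Theta_ref * U**2),
# as a stand-in for the stratification parameter described in the abstract.
# The paper's own horizontal/vertical definitions may differ in detail.

G = 9.81  # gravitational acceleration, m/s^2

def bulk_richardson(d_theta, height, theta_ref, wind_speed):
    """d_theta: surface-air potential temperature difference (K);
    height: characteristic building height (m);
    theta_ref: reference potential temperature (K);
    wind_speed: reference wind speed (m/s)."""
    return G * d_theta * height / (theta_ref * wind_speed ** 2)

# Example: 10 K surface heating, 20 m canyon height, light 2 m/s wind
print(bulk_richardson(10.0, 20.0, 300.0, 2.0))  # → 1.635
```

Larger magnitudes correspond to buoyancy dominating shear (strong stratification effects), while values near zero indicate mechanically driven, near-neutral flow.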

  16. Childhood disability population-based surveillance: Assessment of the Ages and Stages Questionnaire Third Edition and Washington Group on Disability Statistics/UNICEF module on child functioning in a rural setting in South Africa

    Directory of Open Access Journals (Sweden)

    Marieta Visser

    2016-09-01

    Conclusion: Since the WG/UNICEF module is quicker to administer, easier to understand and based on the ICF, it can be considered as an appropriate parent-reported measure for large-scale, population-based as well as smaller, community-specific contexts. It is, however, recommended that future research and development continues with the WG/UNICEF module to enhance its conceptual equivalence for larger-scale, population-based studies in South Africa and internationally.

  17. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    statement is true. In some cases statistical aspects of safety are misused, where the number of runs for several outputs is correct only for statistically independent outputs, or misinterpreted. We do not know the probability distribution of the output variables subjected to safety limitations. At the same time in some asymmetric distributions the 0.95/0.95 methodology simply fails: if we repeat the calculations in many cases we would get a value higher than the basic value, which means the limit violation in the calculation becomes more and more probable in the repeated analysis. Consequent application of order statistics or the application of the sign test may offer a way out of the present situation. The authors are also convinced that efforts should be made to study the statistics of the output variables, and to study the occurrence of chaos in the analyzed cases. All these observations should influence, in safety analysis, the application of best estimate methods, and underline the opinion that any realistic modeling and simulation of complex systems must include the probabilistic features of the system and the environment
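The order-statistics way out that the authors mention is usually applied through Wilks' formula: the smallest number of code runs n whose sample maximum bounds the p-quantile of the output with confidence β satisfies 1 − p^n ≥ β. A minimal sketch of that calculation (not from the paper) reproduces the classic 0.95/0.95 result of 59 runs:

```python
# Wilks' formula for one-sided nonparametric tolerance limits: find the
# smallest n with 1 - p**n >= beta, so that the maximum of n independent
# runs bounds the p-quantile with confidence beta. Illustrative sketch of
# the order-statistics methodology the record refers to.

def wilks_sample_size(p=0.95, beta=0.95):
    n = 1
    while 1.0 - p ** n < beta:
        n += 1
    return n

print(wilks_sample_size())  # → 59 runs for the 0.95/0.95 criterion
```

The appeal of this approach, as the record notes, is that it makes no assumption about the (unknown, possibly asymmetric) distribution of the output variables subject to safety limits.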

  18. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  19. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  20. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra