WorldWideScience

Sample records for statistically realistic populations

  1. Realistic thermodynamic and statistical-mechanical measures for neural synchronization.

    Science.gov (United States)

    Kim, Sang-Yoon; Lim, Woochang

    2014-04-15

    Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with calculation of both O and Ms. However, it is practically difficult to directly get VG in real experiments. To overcome this difficulty, instead of VG, we employ the instantaneous population spike rate (IPSR) which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on IPSR, to make practical characterization of the neural synchronization in both computational and experimental neuroscience. Particularly, more accurate characterization of weak sparse spike synchronization can be achieved in terms of realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
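The IPSR in this abstract is estimated from the raster of spike times. A common estimator, sketched here under the assumption of a Gaussian smoothing kernel (the paper's exact kernel choice may differ), is a population-averaged sum of kernels centered on each spike:

```python
import numpy as np

def ipsr(spike_trains, t_grid, sigma=0.005):
    """Instantaneous population spike rate: sum a Gaussian kernel over
    every spike of every neuron, then average over the population."""
    rate = np.zeros_like(t_grid)
    norm = sigma * np.sqrt(2.0 * np.pi)  # each kernel integrates to 1
    for train in spike_trains:
        for s in train:
            rate += np.exp(-0.5 * ((t_grid - s) / sigma) ** 2) / norm
    return rate / len(spike_trains)
```

Because each kernel integrates to one, the IPSR integrates to the mean spike count per neuron, which makes a convenient sanity check.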

  2. Quantifying introgression risk with realistic population genetics

    OpenAIRE

    Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy

    2012-01-01

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity.

  3. Quantifying introgression risk with realistic population genetics.

    Science.gov (United States)

    Ghosh, Atiyo; Meirmans, Patrick G; Haccou, Patsy

    2012-12-07

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes.

  4. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    Science.gov (United States)

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  5. Statistical Models of Adaptive Immune populations

    Science.gov (United States)

    Sethna, Zachary; Callan, Curtis; Walczak, Aleksandra; Mora, Thierry

    The availability of large (10^4-10^6 sequences) datasets of B or T cell populations from a single individual allows reliable fitting of complex statistical models for naïve generation, somatic selection, and hypermutation. It is crucial to utilize a probabilistic/informational approach when modeling these populations. The inferred probability distributions allow for population characterization, calculation of probability distributions of various hidden variables (e.g. number of insertions), as well as statistical properties of the distribution itself (e.g. entropy). In particular, the differences between the T cell populations of embryonic and mature mice will be examined as a case study. Comparing these populations, as well as proposed mixed populations, provides a concrete exercise in model creation, comparison, choice, and validation.

  6. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

    Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230 + dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
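The fuzzy C-means step at the heart of this segmentation can be illustrated in one dimension. This is only a sketch of the core alternating membership/center updates; the paper's three-class, bias-corrected 3D variant adds a nonuniformity correction field on top of this:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=100, seed=0):
    """Minimal 1-D fuzzy C-means: alternate between updating weighted
    cluster centers and soft memberships until (approximate) convergence."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        w = u ** m
        centers = (w @ x) / w.sum(axis=1)   # fuzzily weighted centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                  # membership update
    return centers, u
```

On well-separated intensity clusters the recovered centers land near the class means, which is the behavior the tissue-classification step relies on.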

  7. Photo-Realistic Statistical Skull Morphotypes: New Exemplars for Ancestry and Sex Estimation in Forensic Anthropology.

    Science.gov (United States)

    Caple, Jodi; Stephan, Carl N

    2017-05-01

    Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars, and their statistical exaggerations or extremes, retain the high-resolution detail of the original photographic dataset, making them the ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.

  8. Pseudo-populations a basic concept in statistical surveys

    CERN Document Server

    Quatember, Andreas

    2015-01-01

    This book emphasizes that artificial or pseudo-populations play an important role in statistical surveys from finite universes in two manners: firstly, the concept of pseudo-populations may substantially improve users’ understanding of various aspects in the sampling theory and survey methodology; an example of this scenario is the Horvitz-Thompson estimator. Secondly, statistical procedures exist in which pseudo-populations actually have to be generated. An example of such a scenario can be found in simulation studies in the field of survey sampling, where close-to-reality pseudo-populations are generated from known sample and population data to form the basis for the simulation process. The chapters focus on estimation methods, sampling techniques, nonresponse, questioning designs and statistical disclosure control. This book is a valuable reference in understanding the importance of the pseudo-population concept and applying it in teaching and research.
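The Horvitz-Thompson estimator mentioned as the book's first example weights each sampled unit by the inverse of its inclusion probability; a minimal sketch:

```python
def ht_total(sample, pi):
    """Horvitz-Thompson estimator of a finite-population total:
    each sampled value y_i is weighted by 1 / pi_i, its inclusion
    probability under the sampling design."""
    return sum(y / p for y, p in zip(sample, pi))
```

Under simple random sampling without replacement every unit has inclusion probability n/N, and averaging the estimate over repeated draws recovers the true total, illustrating its design-unbiasedness.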

  9. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  10. Population dynamics at high Reynolds number

    NARCIS (Netherlands)

    Perlekar, P.; Benzi, R.; Nelson, D.R.; Toschi, F.

    2010-01-01

    We study the statistical properties of population dynamics evolving in a realistic two-dimensional compressible turbulent velocity field. We show that the interplay between turbulent dynamics and population growth and saturation leads to quasi-localization and a remarkable reduction in the carrying

  11. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of the order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of pros and cons of the order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
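The run count of 124 follows from order statistics. A sketch of the standard Wilks/Guba-Makai calculation, taking the sample maximum of each of p outputs as a joint bound on the gamma-quantiles at confidence beta:

```python
from math import comb

def min_runs(p, gamma=0.95, beta=0.95):
    """Smallest N such that the sample maxima of p outputs jointly bound
    their gamma-quantiles with confidence beta (Guba-Makai extension of
    Wilks' formula), found by scanning N upward."""
    n = p
    while True:
        conf = sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
                   for j in range(n - p + 1))
        if conf >= beta:
            return n
        n += 1
```

min_runs(1) gives Wilks' classic 59 runs for a single output; min_runs(3) reproduces the 124 runs cited for the three acceptance criteria (PCT, LMO, CWO).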

  12. Statistical Processing Algorithms for Human Population Databases

    Directory of Open Access Journals (Sweden)

    Camelia COLESCU

    2012-01-01

    The article describes some algorithms for statistical functions applied to a human population database. The samples are specific to the most interesting periods, when the evolution of the statistical data shows spectacular values. The article also describes the most useful forms of graphical presentation of the results.

  13. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each single individual, e.g. by identification of individual-specific dosings. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals and theophylline as an application example. The variability and co-variability of physiological parameters are specified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach allows for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making e.g. from preclinical to
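The Bayesian updating at the core of this approach can be miniaturized to a single parameter on a discrete grid. This toy stand-in (the uniform prior and Gaussian likelihood are illustrative assumptions; the paper works with a full PBPK model, many parameters, and MCMC) shows the prior-times-likelihood mechanics:

```python
def grid_posterior(grid, prior, likelihood):
    """Discrete Bayesian update: posterior is proportional to
    prior * likelihood evaluated on a parameter grid, then normalized."""
    post = [p * likelihood(theta) for theta, p in zip(grid, prior)]
    z = sum(post)
    return [p / z for p in post]
```

With a flat prior, the posterior mode simply tracks the likelihood peak; with an informative physiological prior, the update pulls the estimate between prior knowledge and the new data, which is the behavior the abstract describes.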

  14. The Statistical Modeling of the Trends Concerning the Romanian Population

    Directory of Open Access Journals (Sweden)

    Gabriela OPAIT

    2014-11-01

    This paper presents the statistical modeling of the trends concerning the resident population of Romania, i.e., the total Romanian population, by means of the „Least Squares Method”. A country develops through the growth of its population, and hence of its workforce, which is a factor influencing the growth of the Gross Domestic Product (G.D.P.). The „Least Squares Method” is a statistical technique for determining the best-fit trend line of a model.
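For a linear trend, the Least Squares Method has a closed-form solution via the normal equations; a minimal sketch (the paper's actual population series is not reproduced here):

```python
def least_squares_trend(years, values):
    """Fit y = a + b*t by ordinary least squares using the closed-form
    normal-equation solution for slope b and intercept a."""
    n = len(years)
    mt = sum(years) / n
    my = sum(values) / n
    b = sum((t - mt) * (y - my) for t, y in zip(years, values)) / \
        sum((t - mt) ** 2 for t in years)
    a = my - b * mt
    return a, b
```

On exactly linear data the fit recovers the generating slope and intercept, which is a quick way to validate the implementation before applying it to a real series.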

  15. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  16. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  17. Survey of Approaches to Generate Realistic Synthetic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
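One widely used family of generators in this space is preferential attachment. A minimal Barabasi-Albert-style sketch (the report surveys many fitted models; this one is chosen only because it yields the heavy-tailed degree distributions typical of real-world graphs):

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a graph to n nodes; each new node attaches to m distinct
    existing nodes chosen with probability proportional to degree
    (implemented by sampling from the multiset of edge endpoints)."""
    random.seed(seed)
    edges = [(i, j) for i in range(m) for j in range(i + 1, m)]  # seed clique
    targets = [v for e in edges for v in e]  # endpoint multiset ~ degree
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))  # degree-proportional pick
        for t in chosen:
            edges.append((new, t))
            targets.extend((new, t))
    return edges
```

Each new node contributes exactly m edges, so the edge count is fully determined by n and m, a handy structural check when evaluating generators.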

  18. Population activity statistics dissect subthreshold and spiking variability in V1.

    Science.gov (United States)

    Bányai, Mihály; Koman, Zsombor; Orbán, Gergő

    2017-07-01

    Response variability, as measured by fluctuating responses upon repeated performance of trials, is a major component of neural responses, and its characterization is key to interpret high dimensional population recordings. Response variability and covariability display predictable changes upon changes in stimulus and cognitive or behavioral state, providing an opportunity to test the predictive power of models of neural variability. Still, there is little agreement on which model to use as a building block for population-level analyses, and models of variability are often treated as a subject of choice. We investigate two competing models, the doubly stochastic Poisson (DSP) model assuming stochasticity at spike generation, and the rectified Gaussian (RG) model tracing variability back to membrane potential variance, to analyze stimulus-dependent modulation of both single-neuron and pairwise response statistics. Using a pair of model neurons, we demonstrate that the two models predict similar single-cell statistics. However, DSP and RG models have contradicting predictions on the joint statistics of spiking responses. To test the models against data, we build a population model to simulate stimulus change-related modulations in pairwise response statistics. We use single-unit data from the primary visual cortex (V1) of monkeys to show that while model predictions for variance are qualitatively similar to experimental data, only the RG model's predictions are compatible with joint statistics. These results suggest that models using Poisson-like variability might fail to capture important properties of response statistics. We argue that membrane potential-level modeling of stochasticity provides an efficient strategy to model correlations. NEW & NOTEWORTHY Neural variability and covariability are puzzling aspects of cortical computations. For efficient decoding and prediction, models of information encoding in neural populations hinge on an appropriate model of
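The two competing models can be sketched at the single-trial level. The parameter values and the linear rectification below are illustrative assumptions, not the paper's fitted quantities:

```python
import math, random

def poisson(lam):
    """Knuth's Poisson sampler (adequate for the modest rates used here)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def dsp_count(rate_mean, rate_std):
    """Doubly stochastic Poisson: stochasticity at spike generation,
    driven by a trial-varying (rectified) firing rate."""
    return poisson(max(0.0, random.gauss(rate_mean, rate_std)))

def rg_count(v_mean, v_std, gain=10.0):
    """Rectified Gaussian: variability lives in a Gaussian 'membrane
    potential'; the count is its thresholded, scaled value."""
    return round(gain * max(0.0, random.gauss(v_mean, v_std)))
```

A quick contrast: the trial-to-trial rate fluctuation in the DSP model inflates the Fano factor well above the Poisson value of 1, the kind of single-cell statistic both models must reproduce before their joint statistics can be compared.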

  19. Statistical properties of superimposed stationary spike trains.

    Science.gov (United States)

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, like the count variability, inter-spike interval (ISI) variability and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
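A minimal sketch of the Poisson process with dead-time and the ISI statistics it shapes (the rate and dead-time values below are illustrative, not the recorded neurons'):

```python
import random

def ppd_train(rate, dead_time, t_max, seed=0):
    """Poisson process with dead-time (PPD): after each spike the unit
    is refractory for dead_time, then waits an exponential interval.
    The exponential rate is chosen so the mean ISI equals 1/rate."""
    random.seed(seed)
    exp_rate = rate / (1.0 - rate * dead_time)  # requires rate*dead_time < 1
    t, spikes = 0.0, []
    while True:
        t += dead_time + random.expovariate(exp_rate)
        if t > t_max:
            return spikes
        spikes.append(t)

def isi_cv(spikes):
    """Coefficient of variation of inter-spike intervals (1 for Poisson)."""
    isis = [b - a for a, b in zip(spikes, spikes[1:])]
    mean = sum(isis) / len(isis)
    var = sum((x - mean) ** 2 for x in isis) / len(isis)
    return var ** 0.5 / mean
```

The dead-time pushes the ISI coefficient of variation below the Poisson value of 1 (here, theory gives CV = 0.6 for rate 20 Hz and 20 ms dead-time), which is exactly the refractoriness signature the abstract argues survives superposition.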

  20. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with requirement that any realistic theory must be computationally realistic as well.

  1. Data on education: from population statistics to epidemiological research

    DEFF Research Database (Denmark)

    Pallesen, Palle Bo; Tverborgvik, Torill; Rasmussen, Hanna Barbara

    2010-01-01

    BACKGROUND: Level of education is used in many fields of research as an indicator of social status. METHODS: Using Statistics Denmark's register for education and employment of the population, we examined highest completed education with a birth-cohort perspective focusing on people born between...... of population trends by use of extrapolated values, solutions are less obvious in epidemiological research using individual level data....

  2. Generalized Warburg impedance on realistic self-affine fractals ...

    Indian Academy of Sciences (India)

    Generalized Warburg impedance on realistic self-affine fractals: Comparative study of statistically corrugated and isotropic roughness. RAJESH KUMAR and RAMA KANT. Journal of Chemical Sciences, Vol. 121, No. 5, September 2009, pp. 579–588. Erratum to the expression R_L^c(ω) following eq. (8) on page 582, column 2, para 2.

  3. An audit of the statistics and the comparison with the parameter in the population

    Science.gov (United States)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed to closely estimate the statistics for particular parameters has long been an issue. Although a sample size may have been calculated according to the objective of the study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. All this while, a guideline using a p-value of less than 0.05 has been widely used as inferential evidence. This study therefore audited results analyzed from various subsamples and statistical analyses, and compared the results with the parameters in three different populations. Eight types of statistical analysis, with eight subsamples for each, were analyzed. The statistics were found to be consistent and close to the parameters when the study sample covered at least 15% to 35% of the population. A larger sample size is needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters for a medium-sized population.
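The audit logic, comparing statistics from subsamples of increasing size against the known population parameter, can be sketched for the simplest case of a mean (a toy numeric population; the study's real datasets and eight analysis types are not reproduced):

```python
import random

def audit_subsamples(population, fractions, repeats=300, seed=0):
    """For each sampling fraction, average |sample mean - population mean|
    over repeated simple random draws without replacement."""
    random.seed(seed)
    mu = sum(population) / len(population)
    errors = {}
    for f in fractions:
        k = max(2, int(f * len(population)))
        errs = [abs(sum(random.sample(population, k)) / k - mu)
                for _ in range(repeats)]
        errors[f] = sum(errs) / repeats
    return errors
```

As the abstract reports, statistics computed from larger sampling fractions sit systematically closer to the population parameter.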

  4. Interferometric data modelling: issues in realistic data generation

    International Nuclear Information System (INIS)

    Mukherjee, Soma

    2004-01-01

    This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e. incorporating non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICM) with the application of auto regressive moving average (ARMA) models. The data obtained from the model are validated by standard statistical tests, e.g. the KS test and Akaike minimum criterion. The results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework
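An ARMA-based individual component model can be sketched as a simple recursion; the orders and coefficients below are illustrative, not the fitted interferometer values:

```python
import random

def arma(n, ar, ma, sigma=1.0, seed=0):
    """Generate n samples of ARMA(p,q) noise:
    x_t = sum_i ar[i]*x_{t-1-i} + e_t + sum_j ma[j]*e_{t-1-j},
    with e_t ~ N(0, sigma^2)."""
    random.seed(seed)
    p, q = len(ar), len(ma)
    x, e = [0.0] * n, [0.0] * n
    for t in range(n):
        e[t] = random.gauss(0.0, sigma)
        x[t] = e[t]
        x[t] += sum(ar[i] * x[t - 1 - i] for i in range(min(p, t)))
        x[t] += sum(ma[j] * e[t - 1 - j] for j in range(min(q, t)))
    return x
```

For ARMA(1,1) with coefficients phi and theta, the stationary variance is sigma^2 * (1 + 2*phi*theta + theta^2) / (1 - phi^2), which gives a quick check that the recursion is wired correctly; non-stationarity can then be injected by letting the coefficients drift between components.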

  5. Statistical dynamics of regional populations and economies

    Science.gov (United States)

    Huo, Jie; Wang, Xu-Ming; Hao, Rui; Wang, Peng

    Quantitative analysis of human behavior and social development is becoming a hot spot in interdisciplinary studies. A statistical analysis on the population and GDP of 150 cities in China from 1990 to 2013 is conducted. The result indicates that the cumulative probability distributions of the populations and of the GDPs each obey a shifted power law. In order to understand these characteristics, a generalized Langevin equation describing variation of population is proposed, which is based on the correlations between population and GDP as well as the random fluctuations of the related factors. The equation is transformed into the Fokker-Planck equation to express the evolution of population distribution. The general solution demonstrates a transition of the distribution from the normal Gaussian distribution to a shifted power law, which suggests a critical point of time at which the transition takes place. The shifted power law distribution in the supercritical situation is qualitatively in accordance with the practical result. The distribution of the GDPs is derived from the well-known Cobb-Douglas production function. The result presents a change, in the supercritical situation, from a shifted power law to the Gaussian distribution. This is a surprising result: the regional GDP distribution of our world will be the Gaussian distribution one day in the future. The discussions based on the changing trend of economic growth suggest it will be true. Therefore, these theoretical attempts may draw a historical picture of our society in the aspects of population and economy.
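A generalized Langevin equation of this kind can be integrated numerically with the Euler-Maruyama scheme. The drift and diffusion functions below are generic placeholders (the paper's actual population/GDP terms are not reproduced; the multiplicative-noise example in the test is purely illustrative):

```python
import random

def euler_maruyama(drift, diffusion, x0, dt, steps, seed=0):
    """Euler-Maruyama integration of dx = drift(x) dt + diffusion(x) dW:
    each step adds the deterministic drift plus a Gaussian increment
    with standard deviation diffusion(x) * sqrt(dt)."""
    random.seed(seed)
    path = [x0]
    for _ in range(steps):
        x = path[-1]
        dw = random.gauss(0.0, dt ** 0.5)
        path.append(x + drift(x) * dt + diffusion(x) * dw)
    return path
```

Running an ensemble of such paths and histogramming the endpoints is the numerical counterpart of solving the Fokker-Planck equation for the evolving distribution.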

  6. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    Science.gov (United States)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of the study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through the problem solving approach. It was a quasi-experimental study with a non-equivalent group design. The population was all grade VII students of a junior high school in Palopo, in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. Group I was taught with the RME approach, whereas group II was taught with the problem solving approach. Data were collected by giving students a pretest and a posttest. The analysis used descriptive statistics and inferential statistics (t-test). Based on the descriptive analysis, the average mathematics score of students taught with the problem solving approach was similar to that of students taught with the RME approach, both at the high category. It can also be concluded that (1) there was no difference in mathematics learning outcomes between students taught with the RME approach and those taught with the problem solving approach, and (2) the quality of learning achievement under the two approaches was the same, at the high category.
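The inferential step described above, a t-test comparing two independent groups, can be sketched as follows. This is a generic Welch's t-test on made-up posttest scores, not the study's actual data (the invented group sizes here do not match the study's 28 and 23 students).

```python
import math

def welch_t(sample1, sample2):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom."""
    n1, n2 = len(sample1), len(sample2)
    m1 = sum(sample1) / n1
    m2 = sum(sample2) / n2
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    t = (m1 - m2) / se
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

rme = [78, 85, 90, 74, 88, 81, 79, 92]  # hypothetical posttest scores, group I
ps = [80, 83, 87, 76, 85, 78, 82]       # hypothetical posttest scores, group II
t, df = welch_t(rme, ps)
# small |t| (well below ~2) is consistent with "no difference between groups"
```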

  7. The aging population in the twenty-first century: statistics for health policy

    National Research Council Canada - National Science Library

    Gilford, Dorothy M

    1988-01-01

    ... on Statistics for an Aging Population, Sam Shapiro, Chair; Committee on National Statistics, Commission on Behavioral and Social Sciences and Education, National Research Council. National Academy Press, Washington, D.C., 1988.

  8. Directions for new developments on statistical design and analysis of small population group trials.

    Science.gov (United States)

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies when the sample size is unavoidably small, a setting usually termed a small population. The specific sample-size cut-off at which the standard methods fail remains to be investigated. In this paper, the authors present their view on new developments for the design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in statistical methodology for the design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas, which serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, each of which uses various approaches to develop new statistical methodology for the design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients; small populations may result from rare diseases or from specific subtypes of more common diseases, and new statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for the improved design and analysis of small population clinical trials.

  9. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis.

    Science.gov (United States)

    Ducrot, Virginie; Péry, Alexandre R R; Lagadic, Laurent

    2010-11-12

    Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at diquat environmental concentrations in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable to analyse long-term effects of complex exposure patterns based upon full life cycle experiment data. Obtained model outputs (e.g. no-effect concentrations) could be directly used for chemical risk assessment.
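The threshold hazard model for time-varying concentrations can be illustrated with a simple numerical sketch. This is a generic reduced form (hazard proportional to the concentration excess over a no-effect concentration, integrated over time), not the authors' fitted model; the killing rate, threshold, and pulse pattern below are invented.

```python
import math

def survival(conc_at, k, nec, t_end, dt=0.01):
    """S(T) = exp(-integral of k * max(0, C(t) - NEC) dt): threshold hazard."""
    hazard_integral = 0.0
    t = 0.0
    while t < t_end:
        hazard_integral += k * max(0.0, conc_at(t) - nec) * dt
        t += dt
    return math.exp(-hazard_integral)

def pulsed(t, pulse_len=2.0, period=10.0, high=50.0, low=0.0):
    """Repeated acute applications: concentration 'high' at each period start."""
    return high if (t % period) < pulse_len else low

# survival after three hypothetical pulses; below the NEC no mortality accrues
s = survival(pulsed, k=0.001, nec=20.0, t_end=30.0)
```

The key property captured here is that only the time spent above the no-effect concentration contributes to mortality, so complex exposure patterns integrate naturally.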

  10. MetAssimulo:Simulation of Realistic NMR Metabolic Profiles

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2010-10-01

    Full Text Available. Abstract. Background: Probing the complex fusion of genetic and environmental interactions, metabolic profiling (or metabolomics/metabonomics), the study of small molecules involved in metabolic reactions, is a rapidly expanding 'omics' field. A major technique for capturing metabolite data is 1H-NMR spectroscopy, which yields highly complex profiles that require sophisticated statistical analysis methods. However, experimental data are difficult to control and expensive to obtain, so data simulation is a productive route to aid algorithm development. Results: MetAssimulo is a MATLAB-based package developed to simulate 1H-NMR spectra of complex mixtures such as metabolic profiles. Drawing on a metabolite standard spectral database, in conjunction with concentration information input by the user or constructed automatically from the Human Metabolome Database, MetAssimulo is able to create realistic metabolic profiles containing large numbers of metabolites with a range of user-defined properties. Current features include the simulation of two groups ('case' and 'control') specified by means and standard deviations of concentration for each metabolite. The software enables the addition of spectral noise with a realistic autocorrelation structure at user-controllable levels. A crucial feature of the algorithm is its ability to simulate both intra- and inter-metabolite correlations, the analysis of which is fundamental to many techniques in the field. Further, MetAssimulo is able to simulate shifts in NMR peak positions resulting from matrix effects such as pH differences, which are often observed in metabolic NMR spectra and pose serious challenges for statistical algorithms. Conclusions: No other software is currently able to simulate NMR metabolic profiles with such complexity and flexibility. This paper describes the algorithm behind MetAssimulo and demonstrates how it can be used to simulate realistic NMR metabolic profiles.
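A toy version of such a simulation (not MetAssimulo itself, which is a MATLAB package) can be sketched in a few lines: an NMR-like spectrum as a sum of Lorentzian lines plus AR(1) noise, the simplest noise process with a nontrivial autocorrelation structure. Peak positions, widths, and the noise level here are arbitrary.

```python
import math
import random

def lorentzian(x, x0, gamma, amp):
    """Lorentzian line shape with centre x0, half-width gamma, peak height amp."""
    return amp * gamma ** 2 / ((x - x0) ** 2 + gamma ** 2)

def simulate_spectrum(peaks, n_points=1000, x_max=10.0,
                      noise_sd=0.01, rho=0.8, seed=0):
    """Sum of Lorentzian lines plus AR(1) noise with autocorrelation rho."""
    rng = random.Random(seed)
    xs = [i * x_max / (n_points - 1) for i in range(n_points)]
    noise, prev = [], 0.0
    for _ in xs:
        prev = rho * prev + math.sqrt(1 - rho ** 2) * rng.gauss(0.0, noise_sd)
        noise.append(prev)
    spec = [sum(lorentzian(x, x0, g, a) for x0, g, a in peaks) + e
            for x, e in zip(xs, noise)]
    return xs, spec

# two hypothetical metabolite resonances at 2.5 and 7.0 "ppm"
xs, spec = simulate_spectrum([(2.5, 0.05, 1.0), (7.0, 0.05, 0.6)])
```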

  11. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available. Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal-hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.
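The nonparametric approach mentioned at the end typically rests on order statistics via Wilks' formula: the number of code runs n is chosen so that the sample maximum bounds the desired output quantile with the desired confidence, i.e. 1 - coverage^n >= confidence. A minimal sketch (a generic statement of the formula, not Westinghouse's procedure):

```python
def runs_needed(coverage=0.95, confidence=0.95):
    """Smallest n with 1 - coverage**n >= confidence: the largest of n runs
    then bounds the 'coverage' quantile with the given confidence (Wilks)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n95 = runs_needed()  # the classic 95/95 criterion -> 59 runs
```

This is why 59 code runs appear so often in best-estimate LOCA literature: 59 is the smallest n satisfying the 95/95 criterion.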

  12. Quantifying introgression risk with realistic population genetics

    NARCIS (Netherlands)

    Ghosh, A.; Meirmans, P.G.; Haccou, P.

    2012-01-01

    Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified

  13. Are Statisticians Cold-Blooded Bosses? A New Perspective on the "Old" Concept of Statistical Population

    Science.gov (United States)

    Lu, Yonggang; Henning, Kevin S. S.

    2013-01-01

    Spurred by recent writings regarding statistical pragmatism, we propose a simple, practical approach to introducing students to a new style of statistical thinking that models nature through the lens of data-generating processes, not populations. (Contains 5 figures.)

  14. A population based statistical model for daily geometric variations in the thorax

    NARCIS (Netherlands)

    Szeto, Yenny Z.; Witte, Marnix G.; van Herk, Marcel; Sonke, Jan-Jakob

    2017-01-01

    To develop a population based statistical model of the systematic interfraction geometric variations between the planning CT and first treatment week of lung cancer patients for inclusion as uncertainty term in future probabilistic planning. Deformable image registrations between the planning CT and

  15. A statistical method for testing epidemiological results, as applied to the Hanford worker population

    International Nuclear Information System (INIS)

    Brodsky, A.

    1979-01-01

    Some recent reports by Mancuso, Stewart and Kneale claim findings of radiation-produced cancer in the Hanford worker population. These claims are based on statistical computations that use small differences in accumulated exposure between groups dying of cancer and groups dying of other causes; actual mortality and longevity were not reported. This paper presents a statistical method for the longitudinal evaluation of actual mortality and longevity over time, as applied in a primary analysis of the mortality experience of the Hanford worker population. Although available, this method was not utilized in the Mancuso-Stewart-Kneale paper. The author's preliminary longitudinal analysis shows that the gross mortality experience of persons employed at Hanford during the 1943-70 interval did not differ significantly from that of certain controls, when both employees and controls were selected from families with two or more offspring and comparisons were matched by age, sex, race and year of entry into employment. This result is consistent with the findings reported by Sanders (Health Phys. vol. 35, 521-538, 1978). The method utilizes an approximate chi-square (1 D.F.) statistic for testing population subgroup comparisons, as well as the cumulation of chi-squares (1 D.F.) for testing the overall result of a particular type of comparison. The method is available for computer testing of the Hanford mortality data, and could also be adapted to morbidity or other population studies. (author)
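The chi-square machinery described in the last sentences can be sketched generically. The counts below are invented, not Hanford data: each 2x2 table compares deaths and survivors in one matched worker/control subgroup, the 1-d.f. statistics are accumulated across comparisons, and the cumulated statistic is referred to a chi-square distribution with d.f. equal to the number of comparisons.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 d.f.) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def p_value_1df(chi2):
    """Upper-tail probability for 1 d.f.: P(X >= chi2) = erfc(sqrt(chi2/2))."""
    return math.erfc(math.sqrt(chi2 / 2.0))

# invented matched comparisons: (deaths, survivors) for exposed vs control
comparisons = [((12, 188), (15, 185)), ((8, 92), (10, 90)), ((20, 280), (18, 282))]
total_chi2 = sum(chi2_2x2(a, b, c, d) for (a, b), (c, d) in comparisons)
# the cumulated statistic has d.f. equal to the number of comparisons (here 3)
```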

  16. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology, applied in the context of site-specific crop yield response, contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
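A standard building block for spatial statistics that "model correlations among neighboring observations" is Moran's I. The sketch below computes it for a hypothetical transect of nematode counts with chain adjacency; it is a generic illustration, not the paper's econometric model.

```python
def morans_i(values, weights):
    """Moran's I: (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2, z_i = x_i - mean."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)  # total weight
    num = sum(weights[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * num / den

# hypothetical nematode counts along a 6-cell transect, chain adjacency
counts = [10.0, 12.0, 11.0, 3.0, 2.0, 1.0]
adjacency = [[1 if abs(i - j) == 1 else 0 for j in range(6)] for i in range(6)]
i_stat = morans_i(counts, adjacency)  # strongly positive: neighbors are similar
```

Values of I near +1 indicate spatial clustering (hot spots worth targeting site-specifically); values near the expectation -1/(n-1) indicate no spatial structure.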

  17. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    that can be impractical and sometimes impossible. In addition, the artificial nature of the data often makes visualized virtual scenarios insufficiently realistic, in the sense that a synthetic scene is easy to distinguish visually from a natural scene. A new field of research has consequently developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high-fidelity representation of virtual scenarios while easing modeling and the simulation of physical phenomena. In particular, realism is achieved by transferring to the novel view all the physical phenomena captured in the reference photographs (i.e. the transfer of photographic realism). An overview of the most prominent approaches to realistic virtual view synthesis is presented and briefly discussed, together with applications of the proposed methods to visual survey, virtual cinematography, as well as mobile...

  18. Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books

    Science.gov (United States)

    Kelley, Jane E.; Darragh, Janine J.

    2011-01-01

    Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…

  19. Margin improvement initiatives: realistic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chan, P.K.; Paquette, S. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Cunning, T.A. [Department of National Defence, Ottawa, ON (Canada); French, C.; Bonin, H.W. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Pandey, M. [Univ. of Waterloo, Waterloo, ON (Canada); Murchie, M. [Cameco Fuel Manufacturing, Port Hope, ON (Canada)

    2014-07-01

    With reactor core aging, safety margins are particularly tight. Two realistic and practical approaches are proposed here to recover margins. The first project is related to the use of a small amount of neutron absorbers in CANDU Natural Uranium (NU) fuel bundles. Preliminary results indicate that the fuelling transient and subsequent reactivity peak can be lowered to improve the reactor's operating margins, with minimal impact on burnup when less than 1000 mg of absorbers is added to a fuel bundle. The second project involves the statistical analysis of fuel manufacturing data to demonstrate safety margins. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to generate input for ELESTRES and ELOCA. It is found that the fuel response distributions are far below industrial failure limits, implying that margin exists in the current fuel design. (author)

  20. Detection and statistics of gusts

    DEFF Research Database (Denmark)

    Hannesdóttir, Ásta; Kelly, Mark C.; Mann, Jakob

    In this project, a more realistic representation of gusts, based on statistical analysis, will account for the variability observed in real-world gusts. The gust representation will focus on temporal, spatial, and velocity scales that are relevant for modern wind turbines and which possibly affect...

  1. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  2. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  3. 4P: fast computing of population genetics statistics from large DNA polymorphism panels.

    Science.gov (United States)

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analysis of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing on multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses on large panels of genomic data. It is also particularly suitable for analyzing multiple data sets produced in simulation studies. Unix, Windows, and macOS versions are provided, as well as the source code for easier pipeline implementations.
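4P itself is a compiled stand-alone program, but the kind of statistic it computes can be illustrated. The sketch below derives a folded site frequency spectrum from a toy 0/1 genotype panel; the data are hypothetical, and 4P's actual input format and options are not reproduced here.

```python
from collections import Counter

def folded_sfs(genotype_matrix):
    """Folded site frequency spectrum: how many sites carry each minor-allele
    count. Rows are sites, columns are haploid individuals coded 0/1."""
    n = len(genotype_matrix[0])
    spectrum = Counter()
    for site in genotype_matrix:
        derived = sum(site)
        minor = min(derived, n - derived)
        if minor > 0:  # skip monomorphic sites
            spectrum[minor] += 1
    return dict(spectrum)

# hypothetical panel: 5 sites x 6 haploid samples
panel = [
    [0, 0, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1],
    [0, 0, 0, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [0, 1, 1, 1, 1, 1],
]
sfs = folded_sfs(panel)  # {minor-allele count: number of sites}
```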

  4. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient ⌊(n-4)/2⌋ choose ⌊(n-4)/4⌋ when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  5. Cross-population validation of statistical distance as a measure of physiological dysregulation during aging.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi

    2014-09-01

    Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
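The "statistical distance" used here is the Mahalanobis distance of a biomarker profile from the population centroid. A minimal two-biomarker sketch follows (hypothetical correlation structure, not the study's 14-marker panel): a profile that violates the population's correlation pattern is "stranger", i.e. farther, than one of equal magnitude that follows it.

```python
import math

def mahalanobis_2d(x, mean, cov):
    """sqrt((x-m)^T S^-1 (x-m)) for two markers, via the closed-form 2x2 inverse."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return math.sqrt(q)

# hypothetical reference population: two positively correlated biomarkers (z-scaled)
mean = [0.0, 0.0]
cov = [[1.0, 0.6], [0.6, 1.0]]
d_typical = mahalanobis_2d([1.0, 1.0], mean, cov)   # follows the correlation
d_strange = mahalanobis_2d([1.0, -1.0], mean, cov)  # violates the correlation
```

Both profiles are one standard deviation out on each marker, yet the discordant profile is twice as far in Mahalanobis terms, which is exactly the "strangeness" the dysregulation measure captures.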

  6. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
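The normal-distribution probabilities the article discusses can be checked numerically with nothing more than the error function, e.g. the familiar 68-95 rule for the standard normal distribution:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# probability mass within 1 and 2 standard deviations of the mean
within_1sd = normal_cdf(1.0) - normal_cdf(-1.0)  # about 0.683
within_2sd = normal_cdf(2.0) - normal_cdf(-2.0)  # about 0.954
```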

  7. Statistical characterization of wave propagation in mine environments

    KAUST Repository

    Bakir, Onur

    2012-07-01

    A computational framework for statistically characterizing electromagnetic (EM) wave propagation through mine tunnels and galleries is presented. The framework combines a multi-element probabilistic collocation (ME-PC) method with a novel domain-decomposition (DD) integral equation-based EM simulator to obtain statistics of electric fields due to wireless transmitters in realistic mine environments. © 2012 IEEE.

  8. Electron percolation in realistic models of carbon nanotube networks

    International Nuclear Information System (INIS)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-01-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increasing percolation threshold and a decreasing electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulations in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
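As a hedged two-dimensional analogue of such simulations (the paper's networks are three-dimensional with penetrable shells and waviness, both omitted here), the sketch below drops widthless random sticks into a unit square and tests left-to-right percolation with a union-find structure. Stick length and counts are arbitrary.

```python
import math
import random

class DSU:
    """Union-find over stick indices plus two virtual border nodes."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def crosses(p1, p2, p3, p4):
    """Proper segment intersection test (degenerate collinear cases ignored)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    return ((cross(p3, p4, p1) > 0) != (cross(p3, p4, p2) > 0) and
            (cross(p1, p2, p3) > 0) != (cross(p1, p2, p4) > 0))

def percolates(n_sticks, length=0.2, seed=0):
    """Drop random sticks in the unit square; test left-to-right connectivity."""
    rng = random.Random(seed)
    sticks = []
    for _ in range(n_sticks):
        x, y, ang = rng.random(), rng.random(), rng.random() * math.pi
        dx, dy = 0.5 * length * math.cos(ang), 0.5 * length * math.sin(ang)
        sticks.append(((x - dx, y - dy), (x + dx, y + dy)))
    dsu = DSU(n_sticks + 2)
    left, right = n_sticks, n_sticks + 1  # virtual border nodes
    for i, (a, b) in enumerate(sticks):
        if min(a[0], b[0]) <= 0.0:
            dsu.union(i, left)
        if max(a[0], b[0]) >= 1.0:
            dsu.union(i, right)
        for j in range(i):
            if crosses(a, b, *sticks[j]):
                dsu.union(i, j)
    return dsu.find(left) == dsu.find(right)

dense_percolates = percolates(1000)  # well above the percolation threshold
sparse_percolates = percolates(20)   # well below it
```

Sweeping the stick count and recording the percolation fraction over many seeds is the basic Monte Carlo experiment from which thresholds like those in the paper are estimated.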

  9. Electron percolation in realistic models of carbon nanotube networks

    Science.gov (United States)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-09-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increasing percolation threshold and a decreasing electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulations in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.

  10. Mapping cell populations in flow cytometry data for cross‐sample comparison using the Friedman–Rafsky test statistic as a distance measure

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu

    2015-01-01

    Abstract Flow cytometry (FCM) is a fluorescence‐based single‐cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap‐FR, a novel method for cell population mapping across FCM samples. FlowMap‐FR is based on the Friedman–Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap‐FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap‐FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap‐FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap‐FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap‐FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback–Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL‐distance in distinguishing

  11. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell
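The FR statistic itself is straightforward to sketch: pool the two samples, build the Euclidean minimum spanning tree, and count edges joining points from different samples; a small count indicates the samples occupy different regions of feature space. The fragment below is a generic illustration on synthetic 2-D Gaussian "populations", not the FlowMap-FR implementation.

```python
import math
import random

def mst_edges(points):
    """Prim's algorithm on the complete Euclidean graph; returns edge pairs."""
    n = len(points)
    in_tree = [False] * n
    dist = [float("inf")] * n
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v], parent[v] = d, u
    return edges

def fr_cross_edges(sample_a, sample_b):
    """Friedman-Rafsky count of MST edges linking points of different samples."""
    points = sample_a + sample_b
    labels = [0] * len(sample_a) + [1] * len(sample_b)
    return sum(1 for u, v in mst_edges(points) if labels[u] != labels[v])

rng = random.Random(0)
same = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(30)]
shifted = [(rng.gauss(4, 1), rng.gauss(4, 1)) for _ in range(30)]
overlap = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(30)]
cross_separated = fr_cross_edges(same, shifted)  # expect very few cross-edges
cross_mixed = fr_cross_edges(same, overlap)      # expect many cross-edges
```

In a full test the observed count would be compared against its permutation distribution; the raw counts above already show the qualitative behaviour.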

  12. Linking macroscopic with microscopic neuroanatomy using synthetic neuronal populations.

    Science.gov (United States)

    Schneider, Calvin J; Cuntz, Hermann; Soltesz, Ivan

    2014-10-01

    Dendritic morphology has been shown to have a dramatic impact on neuronal function. However, population features such as the inherent variability in dendritic morphology between cells belonging to the same neuronal type are often overlooked when studying computation in neural networks. While detailed models for morphology and electrophysiology exist for many types of single neurons, the role of detailed single cell morphology in the population has not been studied quantitatively or computationally. Here we use the structural context of the neural tissue in which dendritic trees exist to drive their generation in silico. We synthesize the entire population of dentate gyrus granule cells, the most numerous cell type in the hippocampus, by growing their dendritic trees within their characteristic dendritic fields bounded by the realistic structural context of (1) the granule cell layer that contains all somata and (2) the molecular layer that contains the dendritic forest. This process enables branching statistics to be linked to larger scale neuroanatomical features. We find large differences in dendritic total length and individual path length measures as a function of location in the dentate gyrus and of somatic depth in the granule cell layer. We also predict the number of unique granule cell dendrites invading a given volume in the molecular layer. This work enables the complete population-level study of morphological properties and provides a framework to develop complex and realistic neural network models.

  13. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on an accounting of the number or density of inhabitants over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units where rural occupation predominates. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not the ideal setting when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of population distribution (absolute and density) for different administrative territorial units (parishes and BGRI - the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, the estimated population density can differ by more than a factor of two depending on whether the traditional or the dasymetric approach is applied.
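
    The building areal weighting used in step iii) reduces to a simple proportional allocation. A minimal sketch with hypothetical numbers (the function name and figures are illustrative, not taken from the study):

```python
def dasymetric_allocate(unit_population, building_areas):
    """Building areal weighting: redistribute a census unit's population
    to its buildings in proportion to each building's footprint area."""
    total_area = sum(building_areas)
    return [unit_population * area / total_area for area in building_areas]

# Hypothetical census unit: 1200 inhabitants, three residential buildings (m^2).
allocation = dasymetric_allocate(1200, [300.0, 450.0, 750.0])
print(allocation)  # [240.0, 360.0, 600.0]
```

    The totals are conserved by construction, so summing the per-building populations recovers the census count for the unit.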

  14. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Science.gov (United States)

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which fall into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes convenient and straightforward calculation of statistics in research. Additionally, we provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis.
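
    Because toolkits of this kind start from allele frequencies rather than raw sequences, their simplest statistics reduce to short closed-form computations. A hedged sketch of two standard measures, Nei's expected heterozygosity and Wright's F_ST (function names are illustrative, not PopSc's actual API):

```python
def expected_heterozygosity(freqs):
    """Nei's gene diversity at one locus: He = 1 - sum(p_i**2)."""
    assert abs(sum(freqs) - 1.0) < 1e-9, "allele frequencies must sum to 1"
    return 1.0 - sum(p * p for p in freqs)

def fst(subpop_freqs):
    """Wright's F_ST from per-subpopulation allele frequencies at one locus,
    assuming equally sized subpopulations: (Ht - Hs) / Ht."""
    k = len(subpop_freqs)
    mean_freqs = [sum(f[i] for f in subpop_freqs) / k
                  for i in range(len(subpop_freqs[0]))]
    ht = expected_heterozygosity(mean_freqs)                       # total diversity
    hs = sum(expected_heterozygosity(f) for f in subpop_freqs) / k  # mean within
    return (ht - hs) / ht

print(expected_heterozygosity([0.5, 0.5]))      # 0.5
print(round(fst([[0.9, 0.1], [0.1, 0.9]]), 2))  # 0.64
```

    Two strongly diverged subpopulations (allele frequencies 0.9 vs. 0.1) give a high F_ST, as expected for a measure of genetic differentiation.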

  15. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    Directory of Open Access Journals (Sweden)

    Shi-Yi Chen

    Full Text Available Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which fall into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes convenient and straightforward calculation of statistics in research. Additionally, we provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis.

  16. Statistical Support for Analysis of the Social Stratification and Economic Inequality of the Country’s Population

    Directory of Open Access Journals (Sweden)

    Aksyonova Irina V.

    2017-12-01

    Full Text Available The aim of the article is to summarize the theoretical-methodological and information-analytical support for statistical research on economic and social stratification in society, and to analyze the differentiation of the population of Ukraine in terms of the economic component of social inequality. The theoretical and methodological level of the research is examined, and criteria for social stratification and inequality in society, as well as systems, models, and theories of social stratification of the population, are singled out. Indicators of social and economic statistics on the differentiation of the population by income level are considered as research tools. The analysis leads to the conclusion that economic inequality of the population produces changes in the social structure, which requires the formation of a new social stratification of society. The basis of social stratification is indicators of population well-being, which require comprehensive study. Prospects for further research in this area include the analysis of the components of economic inequality that determine and influence the social stratification of the country's population, the formation of the middle class, and the study of the components of the human development index as a composite indicator of the socio-economic inequality of the population.

  17. Evaluation of quantitative imaging methods for organ activity and residence time estimation using a population of phantoms having realistic variations in anatomy and uptake

    International Nuclear Information System (INIS)

    He Bin; Du Yong; Segars, W. Paul; Wahl, Richard L.; Sgouros, George; Jacene, Heather; Frey, Eric C.

    2009-01-01

    Estimating organ residence times is an essential part of patient-specific dosimetry for radioimmunotherapy (RIT). Quantitative imaging methods for RIT are often evaluated using a single physical or simulated phantom but are intended to be applied clinically where there is variability in patient anatomy, biodistribution, and biokinetics. To provide a more relevant evaluation, the authors have thus developed a population of phantoms with realistic variations in these factors and applied it to the evaluation of quantitative imaging methods both to find the best method and to demonstrate the effects of these variations. Using whole body scans and SPECT/CT images, organ shapes and time-activity curves of 111In ibritumomab tiuxetan were measured in dosimetrically important organs in seven patients undergoing a high dose therapy regimen. Based on these measurements, we created a 3D NURBS-based cardiac-torso (NCAT)-based phantom population. SPECT and planar data at realistic count levels were then simulated using previously validated Monte Carlo simulation tools. The projections from the population were used to evaluate the accuracy and variation in accuracy of residence time estimation methods that used a time series of SPECT and planar scans. Quantitative SPECT (QSPECT) reconstruction methods were used that compensated for attenuation, scatter, and the collimator-detector response. Planar images were processed with a conventional (CPlanar) method that used geometric mean attenuation and triple-energy window scatter compensation and a quantitative planar (QPlanar) processing method that used model-based compensation for image degrading effects. Residence times were estimated from activity estimates made at each of five time points. The authors also evaluated hybrid methods that used CPlanar or QPlanar time-activity curves rescaled to the activity estimated from a single QSPECT image. 
The methods were evaluated in terms of the mean relative error and the standard deviation of the residence time estimates across the phantom population.
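
    The core computation behind residence time estimation is the time-integral of an organ's activity divided by the injected activity. A minimal sketch under stated assumptions (trapezoidal integration of the five sampled time points, with a mono-exponential tail fitted to the last two points; the numbers are hypothetical, not from the study):

```python
import numpy as np

def residence_time(times_h, activities_mbq, injected_mbq):
    """Residence time (h) = time-integrated activity / injected activity.
    Trapezoidal integration over the sampled points, plus an analytic tail
    assuming the last two points define a mono-exponential decay."""
    t = np.asarray(times_h, dtype=float)
    a = np.asarray(activities_mbq, dtype=float)
    area = float(np.sum((a[1:] + a[:-1]) * np.diff(t) / 2.0))  # trapezoids
    lam = np.log(a[-2] / a[-1]) / (t[-1] - t[-2])              # tail decay constant
    area += a[-1] / lam                                        # analytic tail integral
    return area / injected_mbq

# Hypothetical organ time-activity samples (MBq at hours post-injection).
tau = residence_time([4, 24, 72, 144, 168], [40, 35, 20, 9, 7], 185.0)
print(round(tau, 1))  # ≈ 21.5 h
```

    Real dosimetry workflows fit the whole curve rather than only the tail, but the structure of the calculation is the same.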

  18. Tuukka Kaidesoja on Critical Realist Transcendental Realism

    Directory of Open Access Journals (Sweden)

    Groff Ruth

    2015-09-01

    Full Text Available I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja’s objections to the views that he attributes to critical realists are not persuasive.

  19. Comparison of temporal realistic telecommunication base station exposure with worst-case estimation in two countries

    International Nuclear Information System (INIS)

    Mahfouz, Z.; Verloock, L.; Joseph, W.; Tanghe, E.; Gati, A.; Wiart, J.; Lautru, D.; Hanna, V. F.; Martens, L.

    2013-01-01

    The influence of temporal daily exposure to global system for mobile communications (GSM) and universal mobile telecommunications system high-speed downlink packet access (UMTS-HSDPA) signals is investigated using spectrum analyser measurements in two countries, France and Belgium. Temporal variations and traffic distributions are investigated. Three different methods to estimate maximal electric-field exposure are compared. The maximal realistic (99 %) and the maximal theoretical extrapolation factors used to extrapolate the measured broadcast control channel (BCCH) for GSM and the common pilot channel (CPICH) for UMTS are presented and compared for the first time in the two countries. Similar conclusions are found in both countries for urban and rural areas: worst-case exposure assessment overestimates realistic maximal exposure by up to 5.7 dB for the considered example. The values are highest in France because of the higher population density. The results for the maximal realistic extrapolation factor for weekdays are similar to those for weekend days. (authors)

  20. Comparison of temporal realistic telecommunication base station exposure with worst-case estimation in two countries.

    Science.gov (United States)

    Mahfouz, Zaher; Verloock, Leen; Joseph, Wout; Tanghe, Emmeric; Gati, Azeddine; Wiart, Joe; Lautru, David; Hanna, Victor Fouad; Martens, Luc

    2013-12-01

    The influence of temporal daily exposure to global system for mobile communications (GSM) and universal mobile telecommunications system high-speed downlink packet access (UMTS-HSDPA) signals is investigated using spectrum analyser measurements in two countries, France and Belgium. Temporal variations and traffic distributions are investigated. Three different methods to estimate maximal electric-field exposure are compared. The maximal realistic (99 %) and the maximal theoretical extrapolation factors used to extrapolate the measured broadcast control channel (BCCH) for GSM and the common pilot channel (CPICH) for UMTS are presented and compared for the first time in the two countries. Similar conclusions are found in both countries for urban and rural areas: worst-case exposure assessment overestimates realistic maximal exposure by up to 5.7 dB for the considered example. The values are highest in France because of the higher population density. The results for the maximal realistic extrapolation factor for weekdays are similar to those for weekend days.
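
    The dB figure quoted above is a field-strength ratio. As a quick sketch of the arithmetic (the field values here are hypothetical, chosen only to reproduce the magnitude of the paper's example):

```python
import math

def overestimation_db(e_worst, e_realistic):
    """Field-strength ratio in decibels: 20 * log10(E_worst / E_realistic)."""
    return 20.0 * math.log10(e_worst / e_realistic)

# Hypothetical fields in V/m: theoretical worst case vs. realistic 99th percentile.
print(round(overestimation_db(1.93, 1.0), 1))  # 5.7 dB
```

    A 5.7 dB overestimate therefore corresponds to a worst-case field nearly twice the realistic maximum.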

  1. A statistical assessment of population trends for data deficient Mexican amphibians

    Directory of Open Access Journals (Sweden)

    Esther Quintero

    2014-12-01

    Full Text Available Background. Mexico has the world’s fifth largest population of amphibians and the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species’ risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits; including both of them in extinction assessments might therefore render a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species that have been assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we analyzed have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.

  2. A statistical assessment of population trends for data deficient Mexican amphibians.

    Science.gov (United States)

    Quintero, Esther; Thessen, Anne E; Arias-Caballero, Paulina; Ayala-Orozco, Bárbara

    2014-01-01

    Background. Mexico has the world's fifth largest population of amphibians and the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits; including both of them in extinction assessments might therefore render a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species that have been assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables that account for the predictions. Results. Our results show that most of the data deficient Mexican amphibians that we analyzed have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a very valuable first step in assigning conservation priorities for poorly known species.
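
    The Random Forests workflow described above, training on species traits and reading off out-of-bag accuracy and feature importances, can be sketched with scikit-learn. The traits, labels, and simulated rule here are entirely hypothetical stand-ins for the EOL-derived data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Hypothetical standardized traits for 300 species:
# column 0 = body size, 1 = range size, 2 = habitat breadth.
X = rng.normal(size=(300, 3))
# Simulated label: small ranges plus narrow habitat breadth drive declines.
y = (X[:, 1] + 0.5 * X[:, 2] < 0).astype(int)  # 1 = decreasing trend

clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X, y)
# Out-of-bag accuracy stands in for validation against IUCN-labelled species,
# and feature importances rank which traits drive vulnerability.
print(round(clf.oob_score_, 2), clf.feature_importances_.round(2))
```

    In the real analysis the labels come from species already assessed by the IUCN, and the fitted forest is then applied to the Data Deficient species.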

  3. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient C((n-4)/2, ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  4. Radiation belt seed population and its association with the relativistic electron dynamics: A statistical study: Radiation Belt Seed Population

    International Nuclear Information System (INIS)

    Tang, C. L.; Wang, Y. X.; Ni, B.; Zhang, J.-C.

    2017-01-01

    Using the Van Allen Probes data, we study the radiation belt seed population and its association with relativistic electron dynamics during 74 geomagnetic storm events. Based on the flux changes of 1 MeV electrons before and after the storm peak, these storm events are divided into two groups, “non-preconditioned” and “preconditioned”. The statistical study shows that the storm intensity is of significant importance for the distribution of the seed population (336 keV electrons) in the outer radiation belt. However, substorm intensity can also be important to the evolution of the seed population for some geomagnetic storm events. For non-preconditioned storm events, the correlation between the peak fluxes of the seed population and relativistic electrons (592 keV, 1.0 MeV, 1.8 MeV, and 2.1 MeV) and their L-shell locations is consistent with the energy-dependent dynamic processes in the outer radiation belt. For preconditioned storm events, the correlation between the features of the seed population and relativistic electrons is not fully consistent with the energy-dependent processes. It is suggested that the good correlation between the radiation belt seed population and ≤1.0 MeV electrons contributes to the prediction of the evolution of ≤1.0 MeV electrons in the Earth’s outer radiation belt during geomagnetic storms.
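
    The correlation analysis underlying such a study reduces to computing Pearson coefficients between per-storm peak fluxes. A minimal sketch with hypothetical numbers (the flux values below are invented for illustration, not Van Allen Probes data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two flux series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2)))

# Hypothetical log10 peak fluxes for six storms: 336 keV seed population
# vs. the 1.0 MeV relativistic channel.
seed_kev336 = [5.1, 5.8, 4.9, 6.2, 5.5, 6.0]
mev10       = [3.2, 3.9, 3.0, 4.4, 3.5, 4.1]
print(round(pearson_r(seed_kev336, mev10), 2))  # 0.99
```

    Fluxes are compared on a log scale because they span orders of magnitude between storms.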

  5. 15 CFR 50.40 - Fee structure for statistics for city blocks in the 1980 Census of Population and Housing.

    Science.gov (United States)

    2010-01-01

    ... blocks in the 1980 Census of Population and Housing. 50.40 Section 50.40 Commerce and Foreign Trade... the 1980 Census of Population and Housing. (a) As part of the regular program of the 1980 census, the Census Bureau will publish printed reports containing certain summary population and housing statistics...

  6. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, lacking only an oral cavity, was created and proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.

  7. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Full Text Available Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn, and what is its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  8. Exophobic Quasi-Realistic Heterotic String Vacua

    CERN Document Server

    Assel, Benjamin; Faraggi, Alon E; Kounnas, Costas; Rizos, John

    2009-01-01

    We demonstrate the existence of heterotic-string vacua that are free of massless exotic fields. The need to break the non-Abelian GUT symmetries in k=1 heterotic-string models by Wilson lines, while preserving the GUT embedding of the weak-hypercharge and the GUT prediction sin^2\\theta_w(M(GUT))=3/8, necessarily implies that the models contain states with fractional electric charge. Such states are severely restricted by observations, and must be confined or sufficiently massive and diluted. We construct the first quasi-realistic heterotic-string models in which the exotic states do not appear in the massless spectrum, and only exist, as they must, in the massive spectrum. The SO(10) GUT symmetry is broken to the Pati-Salam subgroup. Our PS heterotic-string models contain adequate Higgs representations to break the GUT and electroweak symmetry, as well as colour Higgs triplets that can be used for the missing partner mechanism. By statistically sampling the space of Pati-Salam vacua we demonstrate the abundan...

  9. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM combines conservative Appendix K physical models with statistical treatment of plant-status uncertainties. → DRHM can generate 50-100 K of PCT margin compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin than classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant-status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant-status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant-status uncertainty on the PCT calculation. Generally, DRHM can generate about 80-100 K of margin on PCT compared to an Appendix K bounding-state LOCA analysis.
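
    CSAU-style statistical treatment of plant-status uncertainty is commonly done with nonparametric tolerance limits: the number of code runs follows from Wilks' formula. A short sketch of that sample-size calculation (a standard result; the source does not state which order of the formula it uses):

```python
def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest number of code runs n such that the largest of n sampled PCT
    values bounds the gamma-quantile with confidence beta (first-order,
    one-sided Wilks criterion: 1 - gamma**n >= beta)."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

print(wilks_sample_size())            # 59 runs for a 95 %/95 % tolerance limit
print(wilks_sample_size(0.95, 0.99))  # 90 runs for a 95 %/99 % limit
```

    The familiar "59 runs" of BELOCA practice is exactly the first-order 95 %/95 % case.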

  10. Scaling up complex interventions: insights from a realist synthesis.

    Science.gov (United States)

    Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan

    2016-12-19

    Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant

  11. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold for undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, where complex, multiple, and interactive authentic distortions usually appear. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to subjective perceptual quality scores. Experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.
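
    The final regression stage, mapping an aggregated feature vector to subjective scores with a linear SVR, can be sketched with scikit-learn. Everything below is a synthetic stand-in: random vectors replace the filter-bank features and a noisy linear rule replaces human mean opinion scores (MOS):

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.svm import SVR

rng = np.random.default_rng(1)
# Hypothetical stand-ins: 200 images, 64-d aggregated filter-bank features,
# and subjective mean opinion scores (MOS) on a 0-100 scale.
features = rng.normal(size=(200, 64))
weights = rng.normal(size=64)
mos = features @ weights * 3.0 + 50.0 + rng.normal(scale=2.0, size=200)

# Train a linear SVR on 150 images and evaluate rank correlation on the rest;
# SROCC is the standard benchmark figure for perceptual quality models.
reg = SVR(kernel="linear", C=10.0).fit(features[:150], mos[:150])
srocc = spearmanr(reg.predict(features[150:]), mos[150:]).correlation
print(round(srocc, 2))
```

    Rank correlation is preferred over absolute error here because subjective scales differ between quality databases.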

  12. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  13. Census 2012 Core Based Statistical Area (CBSAs) Polygons with Population Estimates, US EPA Region 9, 2014, USCB

    Data.gov (United States)

    U.S. Environmental Protection Agency — Core Based Statistical Areas (CBSAs) from the US Census Bureau's TIGER files download website, joined with 2014 population estimate data downloaded from the US...

  14. Non-statistically populated autoionizing levels of Li-like carbon: Hidden-crossings

    International Nuclear Information System (INIS)

    Deveney, E.F.; Krause, H.F.; Jones, N.L.

    1995-01-01

    The intensities of the Auger-electron lines from autoionizing (AI) states of Li-like (1s2s2l) configurations excited in ion-atom collisions vary as functions of collision parameters such as the collision velocity. A statistical population of the three-electron levels is at best incomplete and underscores the intricate dynamical development of the electronic states. The authors compare several experimental studies to calculations using "hidden-crossing" techniques to explore some of the details of these Auger-electron intensity variations. The investigations show promising results, suggesting that Auger-electron intensity variations can be used to probe collision dynamics.

  15. CONSTRUCTION OF STATISTICAL MODEL THE OVERALL POPULATION OF THE RUSSIAN FEDERATION ON THE BASIS OF RETROSPECTIVE FORECAST

    Directory of Open Access Journals (Sweden)

    Ol’ga Sergeevna Kochegarova

    2017-06-01

    Full Text Available The article presents a retrospective forecast of the total population of the Russian Federation for the period 2001–2017 and a comparative analysis of the actual population on 20.03.2017, according to the Federal State Statistics Service of the Russian Federation, against the forecast value. The forecasting model was selected by fitting growth curves on the basis of correlation and regression analysis and the least squares method. The quality of the selected regression equation was judged by the smallest approximation error over the time series levels. Analysis of the significance of the selected regression equation by statistical methods supports the choice of model and its use for population estimates. Purpose: to estimate the significance of the selected regression equations for forecasting the population. Methodology: fitting growth curves on the basis of correlation and regression analysis and the least squares method. Results: the effectiveness of the constructed model for forecasting demographic processes was confirmed. Practical implications: the obtained results can be used when building forecasts of demographic processes.
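
    Growth-curve fitting by least squares, with an approximation-error check and a one-step retrospective forecast, can be sketched as follows. The population figures are rough illustrative stand-ins for the official series, not the exact data used in the article:

```python
import numpy as np

# Illustrative population totals (millions), roughly mimicking the Russian
# series for 2001-2016 (hypothetical stand-ins, not the article's data).
years = np.arange(2001, 2017)
population = np.array([146.3, 145.6, 145.0, 144.3, 143.8, 143.2, 142.8, 142.8,
                       142.7, 142.9, 142.9, 143.0, 143.3, 143.7, 146.3, 146.5])

# Fit a second-degree growth curve by ordinary least squares,
# centering the time axis for numerical conditioning.
t = years - 2008.0
coeffs = np.polyfit(t, population, deg=2)
fitted = np.polyval(coeffs, t)

# Mean absolute percentage error gauges the quality of the fitted curve,
# and evaluating the polynomial at 2017 gives the retrospective forecast.
mape = float(np.mean(np.abs((population - fitted) / population))) * 100
forecast_2017 = float(np.polyval(coeffs, 2017 - 2008.0))
print(round(mape, 2), round(forecast_2017, 1))
```

    Competing growth curves (linear, quadratic, exponential) would be fitted the same way, with the smallest approximation error deciding the model.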

  16. Statistical properties of the nuclear shell-model Hamiltonian

    International Nuclear Information System (INIS)

    Dias, H.; Hussein, M.S.; Oliveira, N.A. de

    1986-01-01

    The statistical properties of a realistic nuclear shell-model Hamiltonian are investigated in sd-shell nuclei. The probability distribution of the basis-vector amplitudes is calculated and compared with the Porter-Thomas distribution. The relevance of the results to the calculation of the giant resonance mixing parameter is pointed out. (Author) [pt

  17. Comparative study of the effectiveness of three learning environments: Hyper-realistic virtual simulations, traditional schematic simulations and traditional laboratory

    Directory of Open Access Journals (Sweden)

    Maria Isabel Suero

    2011-10-01

    Full Text Available This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output. This new virtual environment concept, which we call hyper-realistic, transcends basic schematic simulation; it provides the user with a more realistic perception of a physical phenomenon being simulated. We compared the learning achievements of three equivalent, homogeneous groups of undergraduates—an experimental group who used only the hyper-realistic virtual laboratory, a first control group who used a schematic simulation, and a second control group who used the traditional laboratory. The three groups received the same theoretical preparation and carried out equivalent practicals in their respective learning environments. The topic chosen for the experiment was optical aberrations. An analysis of variance applied to the data of the study demonstrated a statistically significant difference (p value <0.05) among the three groups. The learning achievements attained by the group using the hyper-realistic virtual environment were 6.1 percentage points higher than those for the group using the traditional schematic simulations and 9.5 percentage points higher than those for the group using the traditional laboratory.

  18. Can Family Planning Service Statistics Be Used to Track Population-Level Outcomes?

    Science.gov (United States)

    Magnani, Robert J; Ross, John; Williamson, Jessica; Weinberger, Michelle

    2018-03-21

    The need for annual family planning program tracking data under the Family Planning 2020 (FP2020) initiative has contributed to renewed interest in family planning service statistics as a potential data source for annual estimates of the modern contraceptive prevalence rate (mCPR). We sought to assess (1) how well a set of commonly recorded data elements in routine service statistics systems could, with some fairly simple adjustments, track key population-level outcome indicators, and (2) whether some data elements performed better than others. We used data from 22 countries in Africa and Asia to analyze 3 data elements collected from service statistics: (1) number of contraceptive commodities distributed to clients, (2) number of family planning service visits, and (3) number of current contraceptive users. Data quality was assessed via analysis of mean square errors, using the United Nations Population Division World Contraceptive Use annual mCPR estimates as the "gold standard." We also examined the magnitude of several components of measurement error: (1) variance, (2) level bias, and (3) slope (or trend) bias. Our results indicate modest levels of tracking error for data on commodities to clients (7%) and service visits (10%), and somewhat higher error rates for data on current users (19%). Variance and slope bias were relatively small for all data elements. Level bias was by far the largest contributor to tracking error. Paired comparisons of data elements in countries that collected at least 2 of the 3 data elements indicated a modest advantage of data on commodities to clients. None of the data elements considered was sufficiently accurate to be used to produce reliable stand-alone annual estimates of mCPR. However, the relatively low levels of variance and slope bias indicate that trends calculated from these 3 data elements can be productively used in conjunction with the Family Planning Estimation Tool (FPET) currently used to produce annual m
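The error components described above (variance, level bias, slope bias) can be sketched with a Theil-style decomposition of mean squared error. The function below is an illustrative reconstruction under that assumption, not the authors' exact procedure, and the series names are hypothetical:

```python
import numpy as np

def mse_decomposition(estimate, gold):
    """Theil-style decomposition of mean squared error into level-bias,
    slope-bias, and variance (unsystematic) components.
    estimate: annual mCPR values derived from service statistics;
    gold: the reference ("gold standard") mCPR series."""
    estimate = np.asarray(estimate, float)
    gold = np.asarray(gold, float)
    err = estimate - gold
    mse = np.mean(err ** 2)
    s_e, s_g = estimate.std(), gold.std()          # population std (ddof=0)
    r = np.corrcoef(estimate, gold)[0, 1]
    level_bias = (estimate.mean() - gold.mean()) ** 2   # systematic offset
    slope_bias = (s_e - r * s_g) ** 2                   # trend mismatch
    variance = (1.0 - r ** 2) * s_g ** 2                # residual scatter
    # The three components sum exactly to the MSE.
    return {"mse": mse, "level_bias": level_bias,
            "slope_bias": slope_bias, "variance": variance}
```

A uniformly offset series, for example, puts all of its error into the level-bias term, which matches the paper's finding that level bias dominates tracking error.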

  19. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  20. Statistical characteristics of dynamics for population migration driven by the economic interests

    Science.gov (United States)

    Huo, Jie; Wang, Xu-Ming; Zhao, Ning; Hao, Rui

    2016-06-01

    Population migration typically occurs under some constraints, which can deeply affect the structure of a society and some other related aspects. Therefore, it is critical to investigate the characteristics of population migration. Data from the China Statistical Yearbook indicate that the regional gross domestic product per capita relates to the population size via a linear or power-law relation. In addition, the distribution of population migration sizes, or relative migration strength introduced here, is dominated by a shifted power-law relation. To reveal the mechanism that creates the aforementioned distributions, a dynamic model is proposed based on the population migration rule that migration is facilitated by higher financial gains and abated by fewer employment opportunities at the destination, considering the migration cost as a function of the migration distance. The calculated results indicate that the distribution of the relative migration strength is governed by a shifted power-law relation, and that the distribution of migration distances is dominated by a truncated power-law relation. These results suggest that the use of a pure power law to fit a distribution may not always be suitable. Additionally, from the modeling framework, one can infer that it is the randomness and determinacy that jointly create the scaling characteristics of the distributions. The calculation also demonstrates that the network formed by active nodes, representing the immigration and emigration regions, usually evolves from an ordered state with a non-uniform structure to a disordered state with a uniform structure, which is evidenced by the increasing structural entropy.
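A shifted power law of the kind reported here has the Lomax form p(x) ∝ (x + x0)^(−α). As a hedged illustration (the parameter values alpha=2.5 and x0=3.0 are arbitrary, not the paper's fits), one can sample such a distribution by inverse transform and recover the exponent by maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_shifted_power_law(n, alpha, x0):
    """Inverse-transform samples from p(x) ∝ (x + x0)**(-alpha) for x >= 0
    (a Lomax / shifted power-law form)."""
    u = rng.random(n)
    return x0 * (u ** (-1.0 / (alpha - 1.0)) - 1.0)

def fit_alpha(x, x0):
    """Maximum-likelihood exponent when the shift x0 is known:
    alpha = 1 + n / sum(log(1 + x / x0))."""
    return 1.0 + len(x) / np.sum(np.log1p(x / x0))

x = sample_shifted_power_law(200_000, alpha=2.5, x0=3.0)
alpha_hat = fit_alpha(x, 3.0)   # close to the true exponent 2.5
```

On a log-log plot such samples bend away from a straight line at small x, which is exactly why a naive pure power-law fit can mislead, as the abstract cautions.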

  1. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    Science.gov (United States)

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Polychromatic Iterative Statistical Material Image Reconstruction for Photon-Counting Computed Tomography

    Directory of Open Access Journals (Sweden)

    Thomas Weidinger

    2016-01-01

    Full Text Available This work proposes a dedicated statistical algorithm to perform a direct reconstruction of material-decomposed images from data acquired with photon-counting detectors (PCDs) in computed tomography. It is based on local approximations (surrogates) of the negative logarithmic Poisson probability function. Exploiting the convexity of this function allows for parallel updates of all image pixels. Parallel updates can compensate for the rather slow convergence that is intrinsic to statistical algorithms. We investigate the accuracy of the algorithm for ideal photon-counting detectors. Complementarily, we apply the algorithm to simulation data of a realistic PCD with its spectral resolution limited by K-escape, charge sharing, and pulse-pileup. For data from both an ideal and realistic PCD, the proposed algorithm is able to correct beam-hardening artifacts and quantitatively determine the material fractions of the chosen basis materials. Via regularization we were able to achieve a reduction of image noise for the realistic PCD that is up to 90% lower compared to material images from a linear, image-based material decomposition using FBP images. Additionally, we find a dependence of the algorithm's convergence speed on the threshold selection within the PCD.

  3. Evaluation of photovoltaic panel temperature in realistic scenarios

    International Nuclear Information System (INIS)

    Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang

    2016-01-01

    Highlights: • The developed realistic model captures the thermal response and hysteresis effects more reasonably. • The predicted panel temperature is as high as 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that are practical to use in realistic scenarios. Effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows a significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. In realistic scenarios, however, the panel temperature variation over a day differs from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on the photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the current field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
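The steady-state part of such a model is commonly approximated with the textbook NOCT formula plus a linear temperature derating of efficiency. The sketch below uses that standard approximation with illustrative parameter values; it is not the paper's transient model:

```python
def cell_temperature(irradiance, t_ambient, noct=45.0):
    """Standard NOCT approximation: T_cell = T_amb + (NOCT - 20) * G / 800,
    with G in W/m^2 and temperatures in degrees C (steady state, low wind)."""
    return t_ambient + (noct - 20.0) * irradiance / 800.0

def pv_efficiency(t_cell, eta_ref=0.18, temp_coeff=0.004, t_ref=25.0):
    """Linear derating: efficiency falls by temp_coeff per degree C above t_ref.
    eta_ref and temp_coeff here are illustrative module parameters."""
    return eta_ref * (1.0 - temp_coeff * (t_cell - t_ref))

t = cell_temperature(1000.0, 25.0)   # about 56 degrees C at 1000 W/m^2, no wind
eta = pv_efficiency(t)               # derated efficiency at that temperature
```

The roughly 56 °C steady-state estimate at 1000 W/m² is consistent with the abstract's reported panel temperatures of up to 60 °C in no-wind weather.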

  4. Generating realistic environments for cyber operations development, testing, and training

    Science.gov (United States)

    Berk, Vincent H.; Gregorio-de Souza, Ian; Murphy, John P.

    2012-06-01

    Training effective cyber operatives requires realistic network environments that incorporate the structural and social complexities representative of the real world. Network traffic generators facilitate repeatable experiments for the development, training and testing of cyber operations. However, current network traffic generators, ranging from simple load testers to complex frameworks, fail to capture the realism inherent in actual environments. In order to improve the realism of network traffic generated by these systems, it is necessary to quantitatively measure the level of realism in generated traffic with respect to the environment being mimicked. We categorize realism measures into statistical, content, and behavioral measurements, and propose various metrics that can be applied at each level to indicate how effectively the generated traffic mimics the real world.

  5. Modeling individual movement decisions of brown hare (Lepus europaeus) as a key concept for realistic spatial behavior and exposure: A population model for landscape-level risk assessment.

    Science.gov (United States)

    Kleinmann, Joachim U; Wang, Magnus

    2017-09-01

    Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities being determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.

  6. Stage-Structured Population Dynamics of AEDES AEGYPTI

    Science.gov (United States)

    Yusoff, Nuraini; Budin, Harun; Ismail, Salemah

    Aedes aegypti is the main vector in the transmission of dengue fever, a vector-borne disease affecting world population living in tropical and sub-tropical countries. Better understanding of the dynamics of its population growth will help in the efforts of controlling the spread of this disease. In looking at the population dynamics of Aedes aegypti, this paper explored the stage-structured modeling of the population growth of the mosquito using the matrix population model. The life cycle of the mosquito was divided into five stages: eggs, larvae, pupae, adult1 and adult2. Developmental rates were obtained for the average Malaysian temperature and these were used in constructing the transition matrix for the matrix model. The model, which was based only on temperature, projected that the population of Aedes aegypti will blow up with time, which is not realistic. For further work, other factors need to be taken into account to obtain a more realistic result.
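A matrix population model of the kind described can be sketched in a few lines. The stage names follow the abstract, but the transition and fecundity rates below are illustrative placeholders, not the temperature-derived Malaysian values used in the study:

```python
import numpy as np

# Stage-structured (Lefkovitch) projection for the five stages named in the
# study: eggs, larvae, pupae, adult1, adult2. Columns are the current stage,
# rows the next stage; all rates here are hypothetical.
A = np.array([
    [0.0, 0.0, 0.0, 50.0, 40.0],   # eggs laid per adult1 / adult2 per step
    [0.4, 0.3, 0.0,  0.0,  0.0],   # hatching; some larvae remain larvae
    [0.0, 0.3, 0.2,  0.0,  0.0],   # pupation; some pupae remain pupae
    [0.0, 0.0, 0.5,  0.0,  0.0],   # emergence to young adult (adult1)
    [0.0, 0.0, 0.0,  0.8,  0.6],   # ageing to adult2 and adult2 survival
])

n = np.array([100.0, 0.0, 0.0, 0.0, 0.0])  # start from 100 eggs
for _ in range(10):                         # project: n(t+1) = A @ n(t)
    n = A @ n

# Long-run behaviour is governed by the dominant eigenvalue of A:
# the population grows without bound iff it exceeds 1.
lam = max(abs(np.linalg.eigvals(A)))
```

With fecundities this high the dominant eigenvalue exceeds 1, reproducing the unbounded growth the temperature-only model projects; adding density dependence or the other mortality factors the authors mention would temper it.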

  7. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

    Full Text Available Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  8. Statistical thermodynamics of clustered populations.

    Science.gov (United States)

    Matsoukas, Themis

    2014-08-01

    We present a thermodynamic theory for a generic population of M individuals distributed into N groups (clusters). We construct the ensemble of all distributions with fixed M and N, introduce a selection functional that embodies the physics that governs the population, and obtain the distribution that emerges in the scaling limit as the most probable among all distributions consistent with the given physics. We develop the thermodynamics of the ensemble and establish a rigorous mapping to regular thermodynamics. We treat the emergence of a so-called giant component as a formal phase transition and show that the criteria for its emergence are entirely analogous to the equilibrium conditions in molecular systems. We demonstrate the theory by an analytic model and confirm the predictions by Monte Carlo simulation.

  9. Socialist realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, covers the Estonian artists Enn Põldroos, Nikolai Kormashov and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla.

  10. Bivalves: From individual to population modelling

    Science.gov (United States)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual based population model for bivalves was designed, built and tested in a 0D approach, to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realistic criteria were applied to narrow down the possible combination of parameter values. Field observations obtained in the long-term and multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reproduce reasonably the timing of some peaks in the individual abundances in the mussel bed and its size distribution but the number of individuals was not well predicted. The results suggest that the mortality in the early life stages (egg and larvae) plays an important role in population dynamics, either by initial egg mortality, larvae dispersion, settlement failure or shrimp predation. Future steps include the coupling of the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larvae dispersion, settlement probability, food transport and also to simulate the feedback of the organisms' activity on the water column properties, which will result in an improvement of the food quantity and quality characterization.

  11. Are there realistically interpretable local theories?

    International Nuclear Information System (INIS)

    d'Espagnat, B.

    1989-01-01

    Although it rests on strongly established proofs, the statement that no realistically interpretable local theory is compatible with some experimentally testable predictions of quantum mechanics seems at first sight to be incompatible with a few general ideas and clear-cut statements occurring in recent theoretical work by Griffiths, Omnes, and Ballentine and Jarrett. It is shown here that in fact none of the developments due to these authors can be considered a realistically interpretable local theory, so that there is no valid reason for suspecting that the existing proofs of the statement in question are all flawed.

  12. A statistical framework for the validation of a population exposure model based on personal exposure data

    Science.gov (United States)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations, and infiltration of outdoor air pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and therefore personal exposure data should become more and more accessible. In view of planned cohort field campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.

  13. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    Science.gov (United States)

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it makes it possible to detect errors in the dataset to be analysed and to check the validity of assumptions required for more complex analyses. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
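The Poisson model recommended here for MN counts can be fitted without specialized software. The following minimal sketch (hypothetical variable names; a log-linear model with an intercept and covariates such as exposure, age or smoking) uses iteratively reweighted least squares:

```python
import numpy as np

def poisson_regression(X, y, iters=25):
    """Minimal Poisson log-linear model fitted by iteratively reweighted
    least squares (IRLS), the model class the review recommends for MN counts.
    X: design matrix whose first column is 1s for the intercept;
    y: micronucleus counts per subject."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)                 # expected MN count per subject
        w = mu                                # Poisson variance equals the mean
        z = X @ beta + (y - mu) / mu          # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta
```

In practice one would reach for `statsmodels` GLM or R's `glm(..., family = poisson)`, which add standard errors and overdispersion diagnostics; the sketch only shows the fitting mechanics.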

  14. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are limited to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems. PMID:25268480

  15. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are limited to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems.

  16. Population Genomics and the Statistical Values of Race: An Interdisciplinary Perspective on the Biological Classification of Human Populations and Implications for Clinical Genetic Epidemiological Research

    Directory of Open Access Journals (Sweden)

    Koffi N. Maglo

    2016-02-01

    Full Text Available The biological status and biomedical significance of the concept of race as applied to humans continue to be contentious issues despite the use of advanced statistical and clustering methods to determine continental ancestry. It is thus imperative for researchers to understand the limitations as well as potential uses of the concept of race in biology and biomedicine. This paper deals with the theoretical assumptions behind cluster analysis in human population genomics. Adopting an interdisciplinary approach, it demonstrates that the hypothesis that attributes the clustering of human populations to frictional effects of landform barriers at continental boundaries is empirically incoherent. It then contrasts the scientific status of the cluster and cline constructs in human population genomics, and shows how clusters may be instrumentally produced. It also shows how statistical values of race vindicate Darwin’s argument that race is evolutionarily meaningless. Finally, the paper explains why, due to spatiotemporal parameters, evolutionary forces and socio-cultural factors influencing population structure, continental ancestry may be pragmatically relevant to global and public health genomics. Overall, this work demonstrates that, from a biological systematic and evolutionary taxonomical perspective, human races/continental groups or clusters have no natural meaning or objective biological reality. In fact, the utility of racial categorizations in research and in clinics can be explained by spatiotemporal parameters, socio-cultural factors and evolutionary forces affecting disease causation and treatment response.

  17. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models and their requirements, as well as tools and methods for building inflation models.

  18. Generating realistic images using Kray

    Science.gov (United States)

    Tanski, Grzegorz

    2004-07-01

Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing and photon mapping.

  19. "Statistical Techniques for Particle Physics" (2/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  20. "Statistical Techniques for Particle Physics" (1/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  1. "Statistical Techniques for Particle Physics" (4/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  2. "Statistical Techniques for Particle Physics" (3/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  3. Resolving conflict realistically in today's health care environment.

    Science.gov (United States)

    Smith, S B; Tutor, R S; Phillips, M L

    2001-11-01

    Conflict is a natural part of human interaction, and when properly addressed, results in improved interpersonal relationships and positive organizational culture. Unchecked conflict may escalate to verbal and physical violence. Conflict that is unresolved creates barriers for people, teams, organizational growth, and productivity, leading to cultural disintegration within the establishment. By relying on interdependence and professional collaboration, all parties involved grow and, in turn, benefit the organization and population served. When used in a constructive manner, conflict resolution can help all parties involved see the whole picture, thus allowing freedom for growth and change. Conflict resolution is accomplished best when emotions are controlled before entering into negotiation. Positive confrontation, problem solving, and negotiation are processes used to realistically resolve conflict. Everyone walks away a winner when conflict is resolved in a positive, professional manner (Stone, 1999).

  4. Generating Geospatially Realistic Driving Patterns Derived From Clustering Analysis Of Real EV Driving Data

    DEFF Research Database (Denmark)

    Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob

    2014-01-01

In order to provide a vehicle fleet that realistically represents the predicted Electric Vehicle (EV) penetration for the future, a model is required that mimics people's driving behaviour rather than simply playing back collected data. When the focus is broadened from a traditional user...... scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs, based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where
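The clustering step described in this record can be illustrated with a toy example. The sketch below runs plain k-means (a stand-in for the record's unspecified clustering method) with a deterministic spread-out initialization and invented 2-D coordinates standing in for two primary plug-in locations:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means on (x, y) points with a deterministic spread-out init."""
    step = max(1, len(points) // k)
    centroids = points[::step][:k]          # pick k points spread over the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two synthetic "plug-in location" groups (home-like and work-like).
home = [(0.1 * i % 1.0, 0.2) for i in range(10)]
work = [(5.0 + 0.1 * i % 1.0, 4.0) for i in range(10)]
centroids, clusters = kmeans(home + work, k=2)
```

With the separated synthetic groups above, the two centroids converge to the group means.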

  5. Urban renewal, gentrification and health equity: a realist perspective.

    Science.gov (United States)

    Mehdipanah, Roshanak; Marra, Giulia; Melis, Giulia; Gelormino, Elena

    2018-04-01

Up to now, research has focused on the effects of urban renewal programs and their impacts on health. While some of this research points to potential negative health effects due to gentrification, evidence that addresses the complexity of this relation is much needed. This paper seeks to better understand when, why and how health inequities arise from urban renewal interventions that result in gentrification. A realist review, a qualitative systematic review method aimed at better explaining the relation between context, mechanism and outcomes, was used. A literature search was done to identify theoretical models of how urban renewal programs can result in gentrification, which in turn could have negative impacts on health. A systematic approach was then used to identify peer-reviewed studies that provided evidence to support or refute the initial assumptions. Urban renewal programs that resulted in gentrification tended to have negative health effects, primarily for low-income residents. Urban renewal policies that were inclusive of vulnerable populations from the beginning were less likely to result in gentrification and more likely to positively impact health through physical and social improvements. Research has shown that urban renewal policies have significant impacts on vulnerable populations, and that policies resulting in gentrification can have negative health consequences for these populations. A better understanding of this relation is needed to inform future policies and to advocate for a community-participatory model that includes such populations in the early planning stages.

  6. A statistical investigation into the stability of iris recognition in diverse population sets

    Science.gov (United States)

    Howard, John J.; Etter, Delores M.

    2014-05-01

Iris recognition is increasingly being deployed on population-wide scales for important applications such as border security, social service administration, criminal identification and general population management. The error rates for this incredibly accurate form of biometric identification are established using well-known, laboratory-quality datasets. However, it has long been acknowledged in biometric theory that not all individuals have the same likelihood of being correctly serviced by a biometric system. Typically, techniques for identifying clients that are likely to experience a false non-match or a false match error are carried out on a per-subject basis. This research makes the novel hypothesis that certain ethnic groups are more or less likely to experience a biometric error. Through established statistical techniques, we demonstrate this hypothesis to be true and document the notable effect that the ethnicity of the client has on iris similarity scores. Understanding the expected impact of ethnic diversity on iris recognition accuracy is crucial to the future success of this technology as it is deployed in areas where the target population consists of clientele from a range of geographic backgrounds, such as border crossings and immigration checkpoints.
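A per-group comparison of similarity scores like the one described here can be sketched with a Welch t statistic. The scores below are invented for illustration; the record's actual statistical techniques and data are not reproduced:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples of similarity scores."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Hypothetical non-match similarity scores for two demographic groups.
group_a = [0.42, 0.45, 0.44, 0.47, 0.43, 0.46]
group_b = [0.45, 0.48, 0.47, 0.50, 0.46, 0.49]
t = welch_t(group_a, group_b)
```

A large |t| would suggest the group means differ beyond sampling noise; a full analysis would also need the Welch-Satterthwaite degrees of freedom and a p-value.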

  7. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Burgos, Juan; Mahadevan, Aditya; Manavi, Kasra; Murray, Luke; Kodochygov, Anton; Zourntos, Takis; Amato, Nancy M.

    2011-01-01

    be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility

  8. On Realistically Attacking Tor with Website Fingerprinting

    Directory of Open Access Journals (Sweden)

    Wang Tao

    2016-10-01

    Full Text Available Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.
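As a toy illustration of the page-splitting problem this record tackles, the sketch below splits a packet-timestamp stream wherever the inter-arrival gap is large. This is a crude heuristic stand-in, not the paper's classification-based splitting:

```python
def split_by_gap(timestamps, gap=2.0):
    """Split a sorted packet-timestamp stream into per-page sequences
    wherever the inter-arrival gap exceeds `gap` seconds."""
    pages, current = [], [timestamps[0]]
    for prev, t in zip(timestamps, timestamps[1:]):
        if t - prev > gap:          # large silence: assume a new page begins
            pages.append(current)
            current = []
        current.append(t)
    pages.append(current)
    return pages

# Hypothetical stream: three page loads separated by idle periods.
stream = [0.0, 0.1, 0.3, 5.0, 5.2, 5.3, 11.0]
pages = split_by_gap(stream)
```

A fixed gap threshold fails exactly in the noisy conditions the paper studies, which is why it replaces this kind of rule with trained classifiers.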

  9. Iterated interactions method. Realistic NN potential

    International Nuclear Information System (INIS)

    Gorbatov, A.M.; Skopich, V.L.; Kolganova, E.A.

    1991-01-01

The method of iterated potentials is tested in the case of realistic fermionic systems. As a basis for comparison, calculations of the 16O system (using various versions of realistic NN potentials) by means of the angular potential-function method, as well as operators of pairing correlation, were used. The convergence of the genealogical series is studied for the central Malfliet-Tjon potential. In addition, the mathematical technique of microscopic calculations is improved: new equations for correlators in odd states are suggested, and the technique of leading terms is applied for the first time to calculations of heavy p-shell nuclei in the basis of angular potential functions

  10. Realistic Material Appearance Modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Filip, Jiří; Hatka, Martin

    2010-01-01

Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords: bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf

  11. A Radiosity Approach to Realistic Image Synthesis

    Science.gov (United States)

    1992-12-01

AFIT/GCE/ENG/92D-09 (AD-A259 082). A Radiosity Approach to Realistic Image Synthesis. Thesis, Richard L. Remington, Captain, USAF, December 1992. Approved for public release; distribution unlimited. The thesis acknowledges assistance in creating the input geometry file for the AWACS aircraft interior, which served as a model for the diffuse radiosity implementation

  12. An investigation of the statistical power of neutrality tests based on comparative and population genetic data

    DEFF Research Database (Denmark)

    Zhai, Weiwei; Nielsen, Rasmus; Slatkin, Montgomery

    2009-01-01

In this report, we investigate the statistical power of several tests of selective neutrality based on patterns of genetic diversity within and between species. The goal is to compare tests based solely on population genetic data with tests using comparative data or a combination of comparative...... and population genetic data. We show that in the presence of repeated selective sweeps on a relatively neutral background, tests based on the d(N)/d(S) ratios in comparative data almost always have more power to detect selection than tests based on population genetic data, even if the overall level of divergence...... selection. The Hudson-Kreitman-Aguadé test is the most powerful test for detecting positive selection among the population genetic tests investigated, whereas the McDonald-Kreitman test typically has more power to detect negative selection. We discuss our findings in the light of the discordant results obtained
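The McDonald-Kreitman test mentioned in this record compares nonsynonymous/synonymous counts within and between species in a 2x2 table. A minimal Pearson chi-square version on invented counts (not the study's data) looks like:

```python
def mcdonald_kreitman_chi2(dn, ds, pn, ps):
    """Pearson chi-square statistic (1 d.f., no continuity correction)
    for the 2x2 McDonald-Kreitman table:
        divergence:   dn (nonsynonymous)  ds (synonymous)
        polymorphism: pn (nonsynonymous)  ps (synonymous)
    """
    table = [[dn, ds], [pn, ps]]
    n = dn + ds + pn + ps
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            row = sum(table[i])
            col = table[0][j] + table[1][j]
            expected = row * col / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Illustrative (hypothetical) counts showing excess nonsynonymous divergence.
chi2 = mcdonald_kreitman_chi2(dn=7, ds=17, pn=2, ps=42)
```

With small counts like these, Fisher's exact test or a G-test would usually be preferred over the chi-square approximation.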

  13. ddClone: joint statistical inference of clonal populations from single cell and bulk tumour sequencing data.

    Science.gov (United States)

    Salehi, Sohrab; Steif, Adi; Roth, Andrew; Aparicio, Samuel; Bouchard-Côté, Alexandre; Shah, Sohrab P

    2017-03-01

    Next-generation sequencing (NGS) of bulk tumour tissue can identify constituent cell populations in cancers and measure their abundance. This requires computational deconvolution of allelic counts from somatic mutations, which may be incapable of fully resolving the underlying population structure. Single cell sequencing (SCS) is a more direct method, although its replacement of NGS is impeded by technical noise and sampling limitations. We propose ddClone, which analytically integrates NGS and SCS data, leveraging their complementary attributes through joint statistical inference. We show on real and simulated datasets that ddClone produces more accurate results than can be achieved by either method alone.

  14. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

method provides a simple goodness of fit test by comparing the observed SFS with the expected SFS under a given model of population size changes. By the use of Monte Carlo estimation the expected time between coalescent events can be estimated and the expected SFS can thereby be evaluated. Using......). The OR is interpreted as the effect of an exposure on the probability of being diseased at the end of follow-up, while the interpretation of the IRR is the effect of an exposure on the probability of becoming diseased. Through a simulation study, the OR from a classical case-control study is shown to be an inconsistent...... the classical chi-square statistics we are able to infer single parameter models. Multiple parameter models, e.g. multiple epochs, are harder to identify. By introducing the inference of population size back in time as an inverse problem, the second procedure applies the theory of smoothing splines to infer

  15. Evolutionary approaches for the reverse-engineering of gene regulatory networks: A study on a biologically realistic dataset

    Directory of Open Access Journals (Sweden)

    Gidrol Xavier

    2008-02-01

Full Text Available Abstract Background Inferring gene regulatory networks from data requires the development of algorithms devoted to structure extraction. When only static data are available, gene interactions may be modelled by a Bayesian Network (BN) that represents the presence of direct interactions from regulators to regulees by conditional probability distributions. We used enhanced evolutionary algorithms to stochastically evolve a set of candidate BN structures and found the model that best fits the data without prior knowledge. Results We proposed various evolutionary strategies suitable for the task and tested our choices using simulated data drawn from a given bio-realistic network of 35 nodes, the so-called insulin network, which has been used in the literature for benchmarking. We assessed the inferred models against this reference to obtain statistical performance results. We then compared the performance of evolutionary algorithms using two kinds of recombination operators that operate at different scales in the graphs. We introduced a niching strategy that reinforces diversity through the population and avoids trapping of the algorithm in one local minimum in the early steps of learning. We show the limited effect of the mutation operator when niching is applied. Finally, we compared our best evolutionary approach with various well-known learning algorithms (MCMC, K2, greedy search, TPDA, MMHC) devoted to BN structure learning. Conclusion We studied the behaviour of an evolutionary approach enhanced by niching for the learning of gene regulatory networks with BN. We show that this approach outperforms classical structure learning methods in elucidating the original model. These results were obtained for the learning of a bio-realistic network and, more importantly, on various small datasets. This is a suitable approach for learning transcriptional regulatory networks from real datasets without prior knowledge.
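The evolutionary loop this record builds on (selection, recombination, mutation) can be sketched on a toy bitstring problem. The BN-structure encoding and the niching strategy are omitted here for brevity; everything below is an illustrative skeleton, not the authors' algorithm:

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=60, seed=1):
    """Minimal evolutionary loop: tournament selection, one-point
    crossover, single-locus bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            i = rng.randrange(n_bits)           # flip one bit with prob. 0.3
            child[i] ^= rng.random() < 0.3
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy OneMax problem: fitness is simply the number of 1 bits.
best = evolve(sum)
```

In the paper's setting, individuals would encode candidate BN structures and fitness would be a data-likelihood score; niching would then penalize crowding around one structure to keep the population diverse.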

  16. A heuristic statistical stopping rule for iterative reconstruction in emission tomography

    International Nuclear Information System (INIS)

    Ben Bouallegue, F.; Mariano-Goulart, D.; Crouzet, J.F.

    2013-01-01

We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for maximum likelihood expectation maximization (MLEM) reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidean distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the Geant4 application in emission tomography (GATE) platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time. (author)
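To make the idea concrete, the sketch below runs MLEM on a tiny invented system and stops once a normalized chi-square between the data and the reprojection drops below a threshold. This is a simplified stand-in for the authors' heuristic criterion, with a made-up system matrix and noise-free data:

```python
def mlem(A, y, iters=100, tol=0.01):
    """MLEM for y ~ Poisson(Ax), stopping when the per-bin chi-square
    between y and the reprojection Ax falls below `tol`."""
    n_rows, n_cols = len(A), len(A[0])
    x = [1.0] * n_cols                       # flat positive start image
    col_sum = [sum(A[i][j] for i in range(n_rows)) for j in range(n_cols)]
    it = 0
    for it in range(iters):
        proj = [sum(A[i][j] * x[j] for j in range(n_cols)) for i in range(n_rows)]
        chi2 = sum((y[i] - proj[i]) ** 2 / proj[i] for i in range(n_rows))
        if chi2 / n_rows < tol:              # heuristic stop: data well explained
            break
        # Standard multiplicative MLEM update.
        x = [x[j] * sum(A[i][j] * y[i] / proj[i] for i in range(n_rows)) / col_sum[j]
             for j in range(n_cols)]
    return x, it

A = [[1.0, 0.2], [0.2, 1.0], [0.5, 0.5]]    # toy 3x2 system matrix
y = [4.0, 10.4, 6.0]                         # exact data for x = (2, 10)
x, it = mlem(A, y)
```

With noisy data, iterating past the stopping point would start fitting the noise, which is exactly the behavior the stopping rule is designed to avoid.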

  17. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

Full Text Available A semi-classical approximation is applied to the calculations of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with the nucleon effective mass m* < m is used. The approach provides a correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations for sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. By use of a standard Woods-Saxon potential and nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated in good qualitative agreement with experimental data.

  18. The Influence of Spatial Configuration of Residential Area and Vector Populations on Dengue Incidence Patterns in an Individual-Level Transmission Model.

    Science.gov (United States)

    Kang, Jeon-Young; Aldstadt, Jared

    2017-07-15

Dengue is a mosquito-borne infectious disease that is endemic in tropical and subtropical countries. Many individual-level simulation models have been developed to test hypotheses about dengue virus transmission. Often these efforts assume that human host and mosquito vector populations are randomly or uniformly distributed in the environment. Although the movement of mosquitoes is affected by the spatial configuration of buildings and mosquito populations are highly clustered in key buildings, little research has focused on the influence of the local built environment in dengue transmission models. We developed an agent-based model of dengue transmission in a village setting to test the importance of using realistic environments in individual-level models of dengue transmission. The results from one-way ANOVA of the simulations indicated that the differences between scenarios in terms of infection rates as well as serotype-specific dominance are statistically significant. Specifically, the infection rates in scenarios with a realistic environment are more variable than those with a synthetic spatial configuration. With respect to dengue serotype-specific cases, we found that a single dengue serotype is more often dominant in realistic environments than in synthetic environments. An agent-based approach allows a fine-scaled analysis of simulated dengue incidence patterns. The results provide a better understanding of the influence of spatial heterogeneity on dengue transmission at a local scale.
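The one-way ANOVA comparison of scenarios can be reproduced in miniature. The infection rates below are invented; only the F-statistic computation itself is shown:

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across groups of observations."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)
    n = sum(len(g) for g in groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical infection rates (%) under two spatial configurations.
realistic = [12.1, 14.3, 9.8, 15.2, 11.0]
synthetic = [16.0, 16.4, 15.8, 16.2, 16.1]
f = one_way_anova_f([realistic, synthetic])
```

A large F (relative to the F(k-1, n-k) distribution) indicates the scenario means differ; note the invented realistic scenario is also visibly more variable, echoing the record's finding.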

  19. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these

  20. A Low-cost System for Generating Near-realistic Virtual Actors

    Science.gov (United States)

    Afifi, Mahmoud; Hussain, Khaled F.; Ibrahim, Hosny M.; Omar, Nagwa M.

    2015-06-01

Generating virtual actors is one of the most challenging fields in computer graphics. The reconstruction of realistic virtual actors has received attention from both academic research and the film industry, with the goal of generating human-like virtual actors. Many movies have featured human-like virtual actors, with audiences unable to distinguish between real and virtual actors. The synthesis of realistic virtual actors is considered a complex process. Many techniques are used to generate a realistic virtual actor; however, they usually require expensive hardware equipment. In this paper, a low-cost system that generates near-realistic virtual actors is presented. The facial features of the real actor are blended with a virtual head that is attached to the actor's body. Compared with other techniques that generate virtual actors, the proposed system is low-cost, requiring only one camera that records the scene and no expensive hardware equipment. The results show that the system generates good near-realistic virtual actors that can be used in many applications.

  1. Problem Posing with Realistic Mathematics Education Approach in Geometry Learning

    Science.gov (United States)

    Mahendra, R.; Slamet, I.; Budiyono

    2017-09-01

One of the difficulties students face in learning geometry is the topic of plane figures, which requires them to understand abstract material. The aim of this research is to determine the effect of the Problem Posing learning model with the Realistic Mathematics Education Approach on geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results of this research indicate that the Problem Posing learning model with the Realistic Mathematics Education Approach can significantly improve students' conceptual understanding in geometry learning, especially on plane topics. This is because students, under Problem Posing with the Realistic Mathematics Education Approach, become active in constructing their knowledge, posing problems, and solving them in realistic contexts, so it is easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with the Realistic Mathematics Education Approach is appropriate for mathematics learning, especially for geometry material. Furthermore, it can improve student achievement.

  2. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and how much of, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology

  3. Novel high-fidelity realistic explosion damage simulation for urban environments

    Science.gov (United States)

    Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya

    2010-04-01

Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these simulation systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems sufficiently realizes the criteria of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity and runtime-efficient explosion simulation system that realistically simulates destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also accounts for rubble pile formation and applies a generic and scalable multi-component-based object representation to describe scene entities, together with a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system can realistically simulate rubble generation, rubble flyout and their primary and secondary impacts on surrounding objects including buildings, constructions, vehicles and pedestrians in clusters of sequential and parallel damage events.

  4. A Realistic Seizure Prediction Study Based on Multiclass SVM.

    Science.gov (United States)

    Direito, Bruno; Teixeira, César A; Sales, Francisco; Castelo-Branco, Miguel; Dourado, António

    2017-05-01

A patient-specific algorithm for epileptic seizure prediction, based on multiclass support-vector machines (SVM) and using multi-channel high-dimensional feature sets, is presented. The feature sets, combined with multiclass classification and post-processing schemes, aim at the generation of alarms and reduced influence of false positives. This study considers 216 patients from the European Epilepsy Database, and includes 185 patients with scalp EEG recordings and 31 with intracranial data. The strategy was tested over a total of 16,729.80 h of inter-ictal data, including 1206 seizures. We found an overall sensitivity of 38.47% and a false positive rate of 0.20 per hour. The performance of the method achieved statistical significance in 24 patients (11% of the patients). Despite the encouraging results previously reported in specific datasets, prospective demonstration on long-term EEG recordings has been limited. Our study presents a prospective analysis of a large, heterogeneous, multicentric dataset. The statistical framework, based on conservative assumptions, reflects a realistic approach compared to constrained datasets and/or in-sample evaluations. The improvement of these results, with the definition of an appropriate set of features able to improve the distinction between the pre-ictal and non-pre-ictal states, hence minimizing the effect of confounding variables, remains a key aspect.
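The headline metrics in this record (sensitivity and false positives per hour) can be computed from alarm and seizure times as in the sketch below, using invented times and a hypothetical one-hour prediction horizon:

```python
def seizure_prediction_scores(alarms, seizures, horizon, total_hours):
    """Sensitivity and false-positive rate per hour for a list of alarm
    times: an alarm is true if a seizure follows within `horizon` hours."""
    predicted = {s for s in seizures
                 if any(0 <= s - a <= horizon for a in alarms)}
    false_alarms = sum(1 for a in alarms
                       if not any(0 <= s - a <= horizon for s in seizures))
    sensitivity = len(predicted) / len(seizures)
    return sensitivity, false_alarms / total_hours

# Hypothetical alarm/seizure times in hours over a 100 h recording.
sens, fph = seizure_prediction_scores(
    alarms=[2.0, 10.0, 40.0, 70.5], seizures=[2.5, 71.0, 90.0],
    horizon=1.0, total_hours=100.0)
```

Here two of three seizures are preceded by an alarm within the horizon and two alarms are false, giving a sensitivity of 2/3 and 0.02 false positives per hour.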

  5. Inferring Demographic History Using Two-Locus Statistics.

    Science.gov (United States)

    Ragsdale, Aaron P; Gutenkunst, Ryan N

    2017-06-01

    Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
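
The single-locus AFS that the abstract uses as a baseline can be illustrated with a short sketch: given derived-allele counts at a set of sites, the unfolded spectrum tallies how many sites carry each count, and folding handles unknown ancestral states. The function names and toy data are ours, not from the paper; the paper's two-locus statistics go beyond this single-locus summary.

```python
def site_frequency_spectrum(derived_counts, n_samples):
    """Unfolded AFS: entry i counts sites where the derived allele
    appears in exactly i of n_samples sampled chromosomes."""
    sfs = [0] * (n_samples + 1)
    for c in derived_counts:
        sfs[c] += 1
    return sfs

def fold(sfs):
    """Folded spectrum for when ancestral/derived states are unknown:
    entry i pools allele counts i and n - i."""
    n = len(sfs) - 1
    return [sfs[i] + sfs[n - i] if i != n - i else sfs[i]
            for i in range(n // 2 + 1)]

# toy data: derived-allele counts at 6 sites in a sample of 4 chromosomes
counts = [1, 1, 2, 3, 1, 2]
sfs = site_frequency_spectrum(counts, 4)   # [0, 3, 2, 1, 0]
folded = fold(sfs)                         # [0, 4, 2]
```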

  6. Progress in realistic LOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)

    1994-12-31

    While LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs

  7. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  8. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference is the drawing of conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose a statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
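
The three aspects the abstract lists (design, number of measurements/groups, measurement scale, plus the normality requirement for parametric tests) can be encoded as a small decision helper. This is a simplified sketch of the common textbook mapping, not the authors' exact protocol; the function name and category labels are ours.

```python
def choose_test(design, n_groups, scale, normal=False):
    """Map (design, number of groups, measurement scale, normality)
    to a conventional statistical test name.
    design: "independent" or "paired"; scale: "interval", "ordinal", "nominal".
    """
    if scale == "nominal":
        return "McNemar" if design == "paired" else "chi-squared"
    if scale == "interval" and normal:        # parametric branch
        if n_groups == 2:
            return "paired t-test" if design == "paired" else "independent t-test"
        return "repeated-measures ANOVA" if design == "paired" else "one-way ANOVA"
    # ordinal data, or interval data without normality: nonparametric branch
    if n_groups == 2:
        return "Wilcoxon signed-rank" if design == "paired" else "Mann-Whitney U"
    return "Friedman" if design == "paired" else "Kruskal-Wallis"
```

For example, two independent groups of normally distributed interval data map to the independent t-test, while three paired ordinal measurements map to the Friedman test.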

  9. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

    Full Text Available The statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference is the drawing of conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose a statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.

  10. Statistical Compression for Climate Model Output

    Science.gov (United States)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured while allowing for fast decompression and conditional emulation on modest computers.
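
The decompression step described above (conditional expectation, plus a conditional simulation that adds back realistic noise) has a closed form under a Gaussian model. A minimal sketch for a toy bivariate Gaussian, standing in for the paper's much richer nonstationary spatial model; all numbers are illustrative.

```python
import math
import random

def conditional_gaussian(mu1, mu2, v1, cov12, v2, x2):
    """For (X1, X2) jointly Gaussian, X1 | X2 = x2 is Gaussian with
    mean mu1 + cov12/v2 * (x2 - mu2) and variance v1 - cov12**2/v2:
    the regression of the full data on the stored summary."""
    return mu1 + cov12 / v2 * (x2 - mu2), v1 - cov12 ** 2 / v2

# stored summary x2 = 1.0; correlation 0.8 between summary and full field
m, v = conditional_gaussian(0.0, 0.0, 1.0, 0.8, 1.0, 1.0)
# m == 0.8 (conditional expectation: smooth best estimate)
# v == 0.36 (residual variance the conditional simulation restores)

rng = random.Random(0)
sim = m + math.sqrt(v) * rng.gauss(0.0, 1.0)  # one conditional simulation
```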

  11. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
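
A standard remedy for the kind of ill-conditioning mentioned above is shrinkage of the sample covariance matrix toward a well-conditioned target. The sketch below illustrates that general idea only; it is not the specific modification proposed by Chun et al., and the 2x2 matrix is a toy example.

```python
import math

def eig2(a, b, c):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    m = (a + c) / 2.0
    d = math.hypot((a - c) / 2.0, b)
    return m + d, m - d

def cond2(a, b, c):
    """Condition number (ratio of eigenvalues) of [[a, b], [b, c]]."""
    hi, lo = eig2(a, b, c)
    return hi / lo

def shrink2(a, b, c, lam):
    """Shrink toward a scaled identity: (1-lam)*S + lam*(tr(S)/2)*I."""
    t = (a + c) / 2.0
    return (1 - lam) * a + lam * t, (1 - lam) * b, (1 - lam) * c + lam * t

S = (4.0, 3.9, 4.0)          # nearly singular sample covariance
better = shrink2(*S, 0.2)    # shrunken version is far better conditioned
```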

  12. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models.
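
One minimal way to generate a synthetic hourly pollutant series of the kind described is an AR(1) process on the log scale, which yields positive, positively skewed, serially correlated concentrations. This is our illustrative stand-in, not the modelling approach actually used in the study; all parameter values are made up.

```python
import math
import random

def synth_so2(n_hours, mean=10.0, phi=0.9, sigma=0.5, seed=1):
    """Synthetic hourly SO2 concentrations: log-scale AR(1) with
    autocorrelation phi and innovation s.d. sigma (illustrative model)."""
    rng = random.Random(seed)
    log_mean = math.log(mean)
    x = log_mean
    out = []
    for _ in range(n_hours):
        x = log_mean + phi * (x - log_mean) + rng.gauss(0.0, sigma)
        out.append(math.exp(x))
    return out

series = synth_so2(24 * 365)                 # one simulated year
exceed_1h = sum(c > 30.0 for c in series)    # e.g. count of 1-h exceedances
```

Statistics such as exceedance counts or percentiles can then be computed from the simulated series exactly as a vegetation response model would demand.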

  13. Statistical Change Detection for Diagnosis of Buoyancy Element Defects on Moored Floating Vessels

    DEFF Research Database (Denmark)

    Blanke, Mogens; Fang, Shaoji; Galeazzi, Roberto

    2012-01-01

    After residual generation, a statistical change detection scheme is derived from mathematical models supported by experimental data. To experimentally verify the loss of an underwater buoyancy element, an underwater line breaker is designed to create a realistic replication of abrupt faults. The paper analyses the properties of the residuals and suggests a dedicated GLRT change detector based on a vector residual. Special attention is paid to threshold selection for non-ideal (non-IID) test statistics.
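
A scalar version of a GLRT change detector can be sketched as follows: for each candidate change point, the log-likelihood ratio compares a constant-mean model against a one-shift model, assuming IID Gaussian residuals with unit variance. The paper's detector acts on a vector residual with non-IID statistics; this toy scalar version only illustrates the principle, and the residual values are invented.

```python
def glrt_mean_change(x):
    """GLRT for a single mean shift in an IID unit-variance Gaussian
    sequence: for split k the log-likelihood ratio is
    k*(n-k)/(2n) * (mean(x[:k]) - mean(x[k:]))**2.
    Returns (most likely change index, GLRT statistic)."""
    n = len(x)
    best_k, best_g = None, 0.0
    for k in range(1, n):
        m1 = sum(x[:k]) / k
        m2 = sum(x[k:]) / (n - k)
        g = k * (n - k) / (2.0 * n) * (m1 - m2) ** 2
        if g > best_g:
            best_k, best_g = k, g
    return best_k, best_g

# residual with an abrupt level drop at sample 5 (buoyancy-loss analogue)
r = [0.1, -0.2, 0.0, 0.2, -0.1, -2.1, -1.9, -2.0, -2.2, -1.8]
k, g = glrt_mean_change(r)   # change located at k == 5
```

An alarm is raised when the statistic exceeds a threshold chosen for the desired false-alarm rate; as the abstract notes, that choice is the delicate step when the residuals are not IID.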

  14. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    Science.gov (United States)

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on unbiased estimation of the match probability of the profiles. For this reason, it is necessary to establish and expand databases that reflect the actual allele frequencies in the population concerned. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database representing the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. Population substructure caused by relatedness may influence the estimated profile frequencies. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published one. We found that FIS had less effect on frequency values in the 21,473 samples than the application of a minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding and the high number of samples, the new dataset provides unbiased and precise estimates of LR for the statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
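
Under Hardy-Weinberg assumptions, the profile (random match) probability discussed above is a product of per-locus genotype frequencies, optionally corrected for inbreeding with a parameter such as FIS. A hedged sketch using the simple p^2 + p(1-p)*F correction for homozygotes and 2pq(1-F) for heterozygotes; the allele frequencies are made up, and this is not necessarily the exact correction used in the study.

```python
def genotype_freq(p, q=None, theta=0.0):
    """Genotype frequency with a simple inbreeding correction theta;
    theta = 0 gives plain Hardy-Weinberg proportions."""
    if q is None:                      # homozygote with allele frequency p
        return p * p + p * (1 - p) * theta
    return 2 * p * q * (1 - theta)     # heterozygote with frequencies p, q

def match_probability(loci, theta=0.0):
    """Profile match probability: product over independent loci.
    Each entry is (p,) for a homozygote or (p, q) for a heterozygote."""
    pm = 1.0
    for alleles in loci:
        pm *= genotype_freq(*alleles, theta=theta)
    return pm

# two-locus toy profile: homozygote 0.1/0.1 and heterozygote 0.2/0.3
pm = match_probability([(0.1,), (0.2, 0.3)])   # 0.01 * 0.12 = 0.0012
```

Note the direction of the correction: a positive theta raises homozygote frequencies and lowers heterozygote frequencies, which is why ignoring inbreeding can understate the match probability for homozygous profiles.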

  15. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  16. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  17. Effects of the neonicotinoid pesticide thiamethoxam at field-realistic levels on microcolonies of Bombus terrestris worker bumble bees.

    Science.gov (United States)

    Laycock, Ian; Cotterell, Katie C; O'Shea-Wheller, Thomas A; Cresswell, James E

    2014-02-01

    Neonicotinoid pesticides are currently implicated in the decline of wild bee populations. Bumble bees, Bombus spp., are important wild pollinators that are detrimentally affected by ingestion of neonicotinoid residues. To date, imidacloprid has been the major focus of study into the effects of neonicotinoids on bumble bee health, but wild populations are increasingly exposed to alternative neonicotinoids such as thiamethoxam. To investigate whether environmentally realistic levels of thiamethoxam affect bumble bee performance over a realistic exposure period, we exposed queenless microcolonies of Bombus terrestris L. workers to a wide range of dosages up to 98 μg kg(-1) in dietary syrup for 17 days. Results showed that bumble bee workers survived fewer days when presented with syrup dosed at 98 μg thiamethoxam kg(-1), while production of brood (eggs and larvae) and consumption of syrup and pollen in microcolonies were significantly reduced by thiamethoxam only at the two highest concentrations (39 and 98 μg kg(-1)). In contrast, we found no detectable effect of thiamethoxam at levels typically found in the nectars of treated crops (between 1 and 11 μg kg(-1)). By comparison with published data, we demonstrate that during an exposure to field-realistic concentrations lasting approximately two weeks, brood production in worker bumble bees is more sensitive to imidacloprid than thiamethoxam. We speculate that differential sensitivity arises because imidacloprid produces a stronger repression of feeding in bumble bees than thiamethoxam, which imposes a greater nutrient limitation on production of brood. © 2013 Published by Elsevier Inc.

  18. Statistics and predictions of population, energy and environment problems

    International Nuclear Information System (INIS)

    Sobajima, Makoto

    1999-03-01

    While the world's population, especially in developing countries, is growing rapidly, humankind faces the global problem that it cannot live securely over the coming centuries unless people can find places to live, obtain food, and peacefully obtain the energy necessary for living. To this end, humankind must consider what behavior to adopt within a finite environment, and then discuss, agree and act. Energy has long been regarded as a symbol of improved living standards, and has been demanded and used accordingly, but its use has come to be limited as the load on the global environment grows more serious. The open questions are whether there is sufficient energy that imposes no cost on the environment; whether nuclear energy, regarded as such a source, can sustain its resource base over the long term and remain competitive in the market; what the realistic prospects are for compensating new energy sources if the use of nuclear energy is restricted by a society fearing radioactivity; and which options are promising for the future. Anyone engaged in the study of energy cannot proceed without knowing these things. The statistical materials compiled here are intended to be useful for that purpose, and are collected mainly from sources that make future predictions based on past records. Since such predictions are important for framing future measures, these databases are expected to be improved for better accuracy. (author)

  19. Statistical methods in nuclear material accountancy: Past, present and future

    International Nuclear Information System (INIS)

    Pike, D.J.; Woods, A.J.

    1983-01-01

    The analysis of nuclear material inventory data is motivated by the desire to detect any loss or diversion of nuclear material, insofar as such detection may be feasible by statistical analysis of repeated inventory and throughput measurements. The early regulations, which laid down the specifications for the analysis of inventory data, were framed without acknowledging the essentially sequential nature of the data. It is the broad aim of this paper to discuss the historical nature of statistical analysis of inventory data including an evaluation of why statistical methods should be required at all. If it is accepted that statistical techniques are required, then two main areas require extensive discussion. First, it is important to assess the extent to which stated safeguards aims can be met in practice. Second, there is a vital need for reassessment of the statistical techniques which have been proposed for use in nuclear material accountancy. Part of this reassessment must involve a reconciliation of the apparent differences in philosophy shown by statisticians; but, in addition, the techniques themselves need comparative study to see to what extent they are capable of meeting realistic safeguards aims. This paper contains a brief review of techniques with an attempt to compare and contrast the approaches. It will be suggested that much current research is following closely similar lines, and that national and international bodies should encourage collaborative research and practical in-plant implementations. The techniques proposed require credibility and power; but at this point in time statisticians require credibility and a greater level of unanimity in their approach. A way ahead is proposed based on a clear specification of realistic safeguards aims, and a development of a unified statistical approach with encouragement for the performance of joint research. (author)
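
A classic sequential technique of the kind this paper calls for is Page's CUSUM applied to a standardized material-unaccounted-for (MUF) sequence: it accumulates evidence of a sustained loss over repeated inventory periods rather than testing each period in isolation. The sketch below is a minimal one-sided CUSUM with conventional reference value k and threshold h; the parameter values and MUF data are illustrative only.

```python
def cusum_alarm(muf, k=0.5, h=4.0):
    """One-sided Page CUSUM on a standardized MUF sequence:
    g accumulates excess over the reference value k and resets at 0;
    returns the index of the first alarm (g > h), or None."""
    g = 0.0
    for i, x in enumerate(muf):
        g = max(0.0, g + x - k)
        if g > h:
            return i
    return None

# standardized MUF: in-control noise, then a sustained 1.5-sigma loss
muf = [0.2, -0.1, 0.3, -0.4, 0.1] + [1.5] * 8
alarm_at = cusum_alarm(muf)   # alarm a few periods after the loss begins
```

The detection delay (here, several periods after the loss starts) is the price paid for a low false-alarm rate, which is exactly the trade-off the safeguards aims must make explicit.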

  20. Statistical selection : a way of thinking !

    NARCIS (Netherlands)

    Laan, van der P.; Aarts, E.H.L.; Eikelder, ten H.M.M.; Hemerik, C.; Rem, M.

    1995-01-01

    Statistical selection of the best population is discussed in general terms and the principles of statistical selection procedures are presented. Advantages and disadvantages of Subset Selection, one of the main approaches, are indicated. The selection of an almost best population is considered and

  1. Statistical selection : a way of thinking!

    NARCIS (Netherlands)

    Laan, van der P.

    1995-01-01

    Statistical selection of the best population is discussed in general terms and the principles of statistical selection procedures are presented. Advantages and disadvantages of Subset Selection, one of the main approaches, are indicated. The selection of an almost best population is considered and

  2. Replicate This! Creating Individual-Level Data from Summary Statistics Using R

    Science.gov (United States)

    Morse, Brendan J.

    2013-01-01

    Incorporating realistic data and research examples into quantitative (e.g., statistics and research methods) courses has been widely recommended for enhancing student engagement and comprehension. One way to achieve these ends is to use a data generator to emulate the data in published research articles. "MorseGen" is a free data generator that…

  3. Applying mixed reality to simulate vulnerable populations for practicing clinical communication skills.

    Science.gov (United States)

    Chuah, Joon Hao; Lok, Benjamin; Black, Erik

    2013-04-01

    Health sciences students often practice and are evaluated on interview and exam skills by working with standardized patients (people that role play having a disease or condition). However, standardized patients do not exist for certain vulnerable populations such as children and the intellectually disabled. As a result, students receive little to no exposure to vulnerable populations before becoming working professionals. To address this problem and thereby increase exposure to vulnerable populations, we propose using virtual humans to simulate members of vulnerable populations. We created a mixed reality pediatric patient that allowed students to practice pediatric developmental exams. Practicing several exams is necessary for students to understand how to properly interact with and correctly assess a variety of children. Practice also increases a student's confidence in performing the exam. Effective practice requires students to treat the virtual child realistically. Treating the child realistically might be affected by how the student and virtual child physically interact, so we created two object interaction interfaces - a natural interface and a mouse-based interface. We tested the complete mixed reality exam and also compared the two object interaction interfaces in a within-subjects user study with 22 participants. Our results showed that the participants accepted the virtual child as a child and treated it realistically. Participants also preferred the natural interface, but the interface did not affect how realistically participants treated the virtual child.

  4. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    Science.gov (United States)

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

    We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and better integrated with other data sources than coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate. The close agreement between the vital statistics and coroner classifications of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.
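
The sensitivity figures above are simple proportions of coroner-confirmed suicides captured by the vital statistics definition, and an interval estimate can be attached with the Wilson score method. The true-positive/false-negative counts below are illustrative values chosen to reproduce the reported 96.88% against the 10,153 linked deaths; they are not counts taken from the study.

```python
import math

def sensitivity(tp, fn):
    """Proportion of reference-standard cases captured by the definition."""
    return tp / (tp + fn)

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a proportion; better behaved than
    the normal approximation when the proportion is near 0 or 1."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

sens = sensitivity(tp=9836, fn=317)   # 9836/10153 ~ 0.9688 (illustrative)
lo, hi = wilson_ci(9836, 10153)
```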

  5. Effects of realistic force feedback in a robotic assisted minimally invasive surgery system.

    Science.gov (United States)

    Moradi Dalvand, Mohsen; Shirinzadeh, Bijan; Nahavandi, Saeid; Smith, Julian

    2014-06-01

    Robotic assisted minimally invasive surgery systems not only have the advantages of traditional laparoscopic procedures but also restore the surgeon's hand-eye coordination and improve the surgeon's precision by filtering hand tremors. Unfortunately, these benefits have come at the expense of the surgeon's ability to feel. Several research efforts have already attempted to restore this feature and study the effects of force feedback in robotic systems. The proposed methods and studies have some shortcomings. The main focus of this research is to overcome some of these limitations and to study the effects of force feedback in palpation in a more realistic fashion. A parallel robot assisted minimally invasive surgery system (PRAMiSS) with force feedback capabilities was employed to study the effects of realistic force feedback in palpation of artificial tissue samples. PRAMiSS is capable of actually measuring the tip/tissue interaction forces directly from the surgery site. Four sets of experiments using only vision feedback, only force feedback, simultaneous force and vision feedback and direct manipulation were conducted to evaluate the role of sensory feedback from sideways tip/tissue interaction forces with a scale factor of 100% in characterising tissues of varying stiffness. Twenty human subjects were involved in the experiments for at least 1440 trials. Friedman and Wilcoxon signed-rank tests were employed to statistically analyse the experimental results. Providing realistic force feedback in robotic assisted surgery systems improves the quality of tissue characterization procedures. Force feedback capability also increases the certainty of characterizing soft tissues compared with direct palpation using the lateral sides of index fingers. The force feedback capability can improve the quality of palpation and characterization of soft tissues of varying stiffness by restoring sense of touch in robotic assisted minimally invasive surgery operations.

  6. The effect of problem posing and problem solving with realistic mathematics education approach to the conceptual understanding and adaptive reasoning

    Science.gov (United States)

    Mahendra, Rengga; Slamet, Isnandar; Budiyono

    2017-12-01

    One of the difficulties students have in learning mathematics is the subject of geometry, which requires students to understand abstract things. The aim of this research is to determine the effect of the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education approach on conceptual understanding and students' adaptive reasoning in learning mathematics. This research uses a quasi-experimental design. The population of this research is all seventh grade students of Junior High School 1 Jaten, Indonesia. The sample was taken using a stratified cluster random sampling technique. The research hypothesis was tested using a t-test. The results of this study indicate that the Problem Posing learning model with a Realistic Mathematics Education approach can significantly improve students' conceptual understanding in mathematics learning. In addition, the results also showed that the Problem Solving learning model with a Realistic Mathematics Education approach can significantly improve students' adaptive reasoning in learning mathematics. Therefore, the Problem Posing and Problem Solving learning models with a Realistic Mathematics Education approach are appropriate for mathematics learning, especially on the subject of geometry, so as to improve conceptual understanding and students' adaptive reasoning. Furthermore, the impact can improve student achievement.

  7. Age structure and mortality of walleyes in Kansas reservoirs: Use of mortality caps to establish realistic management objectives

    Science.gov (United States)

    Quist, M.C.; Stephen, J.L.; Guy, C.S.; Schultz, R.D.

    2004-01-01

    Age structure, total annual mortality, and mortality caps (maximum mortality thresholds established by managers) were investigated for walleye Sander vitreus (formerly Stizostedion vitreum) populations sampled from eight Kansas reservoirs during 1991-1999. We assessed age structure by examining the relative frequency of different ages in the population; total annual mortality of age-2 and older walleyes was estimated by use of a weighted catch curve. To evaluate the utility of mortality caps, we modeled threshold values of mortality by varying growth rates and management objectives. Estimated mortality thresholds were then compared with observed growth and mortality rates. The maximum age of walleyes varied from 5 to 11 years across reservoirs. Age structure was dominated (≥72%) by walleyes age 3 and younger in all reservoirs, corresponding to ages that were not yet vulnerable to harvest. Total annual mortality rates varied from 40.7% to 59.5% across reservoirs and averaged 51.1% overall (SE = 2.3). Analysis of mortality caps indicated that a management objective of 500 mm for the mean length of walleyes harvested by anglers was realistic for all reservoirs with a 457-mm minimum length limit but not for those with a 381-mm minimum length limit. For a 500-mm mean length objective to be realized for reservoirs with a 381-mm length limit, managers must either reduce mortality rates (e.g., through restrictive harvest regulations) or increase growth of walleyes. When the assumed objective was to maintain the mean length of harvested walleyes at current levels, the observed annual mortality rates were below the mortality cap for all reservoirs except one. Mortality caps also provided insight on management objectives expressed in terms of proportional stock density (PSD). Results indicated that a PSD objective of 20-40 was realistic for most reservoirs. This study provides important walleye mortality information that can be used for monitoring or for inclusion into
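
The catch-curve estimate of total annual mortality can be sketched briefly: regress log counts on age over the descending limb of the catch curve and convert the slope, since A = 1 - exp(slope). The study used a weighted fit; this unweighted version and the toy cohort are illustrative only.

```python
import math

def catch_curve_mortality(ages, counts):
    """Total annual mortality A from a catch curve: ordinary
    least-squares fit of ln(count) on age, then A = 1 - exp(slope)."""
    xs = [a for a, c in zip(ages, counts) if c > 0]
    ys = [math.log(c) for c in counts if c > 0]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 1.0 - math.exp(slope)

# toy cohort declining at exactly 50% per year from age 2
ages = [2, 3, 4, 5, 6]
counts = [1000, 500, 250, 125, 62.5]
A = catch_curve_mortality(ages, counts)   # recovers A = 0.5
```

A mortality cap would then simply be a threshold against which A is compared when judging whether an objective (e.g., a 500-mm mean harvested length) is attainable.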

  8. An example of population-level risk assessments for small mammals using individual-based population models

    DEFF Research Database (Denmark)

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn

    2016-01-01

    effects; recovery periods, then determined by recolonization, were also examined for any concern. Conclusions include recommendations on the most important input considerations, including the selection of exposure levels, the duration of simulations, a statistically robust number of replicates, and the endpoints to report for population-level risk assessments for small mammals, including consistent and transparent direct links to specific protection goals and the consideration of more realistic scenarios.

  9. Separable expansion for realistic multichannel scattering problems

    International Nuclear Information System (INIS)

    Canton, L.; Cattapan, G.; Pisent, G.

    1987-01-01

    A new approach to the multichannel scattering problem with realistic local or nonlocal interactions is developed. By employing the negative-energy solutions of uncoupled Sturmian eigenvalue problems referring to simple auxiliary potentials, the coupling interactions appearing in the original multichannel problem are approximated by finite-rank potentials. By resorting to integral-equation techniques, the coupled-channel equations are then reduced to linear algebraic equations which can be straightforwardly solved. Compact algebraic expressions for the relevant scattering matrix elements are thus obtained. The convergence of the method is tested in the single-channel case with realistic optical potentials. Excellent agreement is obtained with a few terms in the separable expansion for both real and absorptive interactions.

  10. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    Science.gov (United States)

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. 
© 2013

  11. Acute exposure to realistic acid fog: effects on respiratory function and airway responsiveness in asthmatics.

    OpenAIRE

    Leduc, Dimitri; Fally, Sophie; De Vuyst, Paul; Wollast, Roland; Yernault, Jean Claude

    1995-01-01

    Naturally occurring fogs in industrialized cities are contaminated by acidic air pollutants. In Brussels, Belgium, the pH of polluted fogwater may be as low as 3 with osmolarity as low as 30 mOsm. In order to explore short-term respiratory effects of a realistic acid-polluted fog, we collected samples of acid fog in Brussels, Belgium, which is a densely populated and industrialized city, we defined characteristics of this fog and exposed asthmatic volunteers at rest through a face mask to fog...

  12. Larval dispersal modeling of pearl oyster Pinctada margaritifera following realistic environmental and biological forcing in Ahe atoll lagoon.

    Directory of Open Access Journals (Sweden)

    Yoann Thomas

Studying the larval dispersal of bottom-dwelling species is necessary to understand their population dynamics and optimize their management. The black-lip pearl oyster (Pinctada margaritifera) is cultured extensively to produce black pearls, especially in French Polynesia's atoll lagoons. This aquaculture relies on spat collection, a process that can be optimized by understanding which factors influence larval dispersal. Here, we investigate the sensitivity of the P. margaritifera larval dispersal kernel to both physical and biological factors in the lagoon of Ahe atoll. Specifically, using a validated 3D larval dispersal model, the variability of lagoon-scale connectivity is investigated against wind forcing, depth and location of larval release, destination location, vertical swimming behavior and pelagic larval duration (PLD) factors. The potential connectivity was spatially weighted according to both the natural and cultivated broodstock densities to provide a realistic view of connectivity. We found that the mean pattern of potential connectivity was driven by the southwest and northeast main barotropic circulation structures, with high retention levels in both. Destination locations, spawning sites and PLD were the main drivers of potential connectivity, explaining respectively 26%, 59% and 5% of the variance. Differences between potential and realistic connectivity showed the significant contribution of the pearl oyster broodstock location to its own dynamics. Realistic connectivity showed larger larval supply in the western destination locations, which are preferentially used by farmers for spat collection. In addition, larval supply in the same sectors was enhanced during summer wind conditions. These results provide new cues to understanding the dynamics of bottom-dwelling populations in atoll lagoons, and show how to take advantage of numerical models for pearl oyster management.
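The weighting of potential connectivity by broodstock density described above can be sketched numerically; the site count, densities and transfer probabilities below are invented for illustration, not taken from the Ahe atoll model.

```python
import numpy as np

# Hypothetical example: weight a "potential" connectivity matrix
# (probability that a larva released at site i settles at site j)
# by the broodstock density at each source site to obtain a
# "realistic" view of larval supply. All values are illustrative.
rng = np.random.default_rng(0)

n_sites = 4
potential = rng.random((n_sites, n_sites))
potential /= potential.sum(axis=1, keepdims=True)   # each row sums to 1

broodstock = np.array([5.0, 1.0, 0.5, 2.0])         # spawners per site (assumed units)

# Realistic larval supply at each destination: sum over sources of
# (source density) x (transfer probability source -> destination).
supply = broodstock @ potential

retention = np.diag(potential)                      # self-recruitment probabilities
print("larval supply per destination:", supply)
print("retention:", retention)
```

Because each row of the transfer matrix sums to one, total larval supply equals total broodstock; only its spatial distribution changes.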

  13. CPT invariance and the spin-statistics connection

    CERN Document Server

    Bain, Jonathan

    2016-01-01

    This book seeks to answer the question "What explains CPT invariance and the spin-statistics connection?" These properties play foundational roles in relativistic quantum field theories (RQFTs), are supported by high-precision experiments, and figure into explanations of a wide range of phenomena, from antimatter, to the periodic table of the elements, to superconductors and superfluids. They can be derived in RQFTs by means of the famous CPT and Spin-Statistics theorems; but, the author argues, these theorems cannot be said to explain these properties, at least under standard philosophical accounts of scientific explanation. This is because there are multiple, in some cases incompatible, ways of deriving these theorems, and, secondly, because the theorems fail for the types of theories that underwrite the empirical evidence: non-relativistic quantum theories, and realistic interacting RQFTs. The goal of this book is to work towards an understanding of CPT invariance and the spin-statistics connection by firs...

  14. The effects of dynamics on statistical emission

    International Nuclear Information System (INIS)

    Friedman, W.A.

    1989-01-01

    The dynamical processes which occur during the disassembly of an excited nuclear system influence predictions arising from a statistical treatment of the decay of that system. Changes, during the decay period, in such collective properties as angular momentum, density, and kinetic energy of the emitting source affect both the mass and energy spectra of the emitted fragments. This influence will be examined. The author will explore the influence of nuclear compressibility on the decay process, in order to determine what information can be learned about this property from the products of decay. He will compare the relationship between disparate scenarios of decay: a succession of binary decays, each governed by statistics; and a full microcanonical distribution at a single freeze-out density. The author hopes to learn from the general nature of these two statistical predictions when one or the other might be more realistic, and what signatures resulting from the two models might be used to determine which accounts best for specific experimental results

  15. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

Deep Convolutional Neuronal Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  16. Reconciling Dwarf Galaxies with ΛCDM Cosmology: Simulating A Realistic Population of Satellites Around a Milky Way-Mass Galaxy

    OpenAIRE

    Wetzel, Andrew R.; Hopkins, Philip F.; Kim, Ji-Hoon; Faucher-Giguère, Claude-André; Kereš, Dušan; Quataert, Eliot

    2016-01-01

© 2016. The American Astronomical Society. All rights reserved. Low-mass "dwarf" galaxies represent the most significant challenges to the cold dark matter (CDM) model of cosmological structure formation. Because these faint galaxies are (best) observed within the Local Group (LG) of the Milky Way (MW) and Andromeda (M31), understanding their formation in such an environment is critical. We present first results from the Latte Project: the Milky Way on Feedback in Realistic Environments (FI...

  17. A Statistical Assessment of Demographic Bonus towards Poverty Alleviation

    Directory of Open Access Journals (Sweden)

    Jamal Abdul Nasir

    2011-09-01

The shift of birth and death rates from high to low levels in any population is referred to as the demographic transition. Mechanically, the transition of a society creates more working members of its own population, commonly called the demographic bonus. This article empirically explores the realistic soundness of the demographic bonus in reducing the poverty level of society. Three contrasting regions, namely Eastern Asia, Central America and Oceania, were selected for analytical purposes. The findings indicate that Eastern Asia and Oceania are currently facing the end of their transition whereas Central America lags behind in transition. Central America, as the last runner in the transition race, will remain the recipient of its own demographic bonus through the year 2030. On the basis of three mechanisms, namely labour supply, savings and human capital, the Eastern Asian region is found to be a successful beneficiary of its own demographic gift, which suggests that many million people have escaped from poverty. Under the right policy environment for the above three mechanisms, the Eastern Asian experience indicates the realistic contribution of the demographic bonus to reducing poverty.

  18. Neutron star models with realistic high-density equations of state

    International Nuclear Information System (INIS)

    Malone, R.C.; Johnson, M.B.; Bethe, H.A.

    1975-01-01

We calculate neutron star models using four realistic high-density models of the equation of state. We conclude that the maximum mass of a neutron star is unlikely to exceed 2 M_sun. All of the realistic models are consistent with current estimates of the moment of inertia of the Crab pulsar.

  19. Estimating demographic parameters from large-scale population genomic data using Approximate Bayesian Computation

    Directory of Open Access Journals (Sweden)

    Li Sen

    2012-03-01

Background The Approximate Bayesian Computation (ABC) approach has been used to infer demographic parameters for numerous species, including humans. However, most applications of ABC still use limited amounts of data, from a small number of loci, compared to the large amount of genome-wide population-genetic data which have become available in the last few years. Results We evaluated the performance of the ABC approach for three 'population divergence' models - similar to the 'isolation with migration' model - when the data consist of several hundred thousand SNPs typed for multiple individuals, by simulating data from known demographic models. The ABC approach was used to infer demographic parameters of interest and we compared the inferred values to the true parameter values that were used to generate hypothetical "observed" data. For all three case models, the ABC approach inferred most demographic parameters quite well with narrow credible intervals, for example, population divergence times and past population sizes, but some parameters were more difficult to infer, such as population sizes at present and migration rates. We compared the ability of different summary statistics to infer demographic parameters, including haplotype and LD based statistics, and found that the accuracy of the parameter estimates can be improved by combining summary statistics that capture different parts of the information in the data. Furthermore, our results suggest that poor choices of prior distributions can in some circumstances be detected using ABC. Finally, increasing the amount of data beyond some hundred loci will substantially improve the accuracy of many parameter estimates using ABC.
Conclusions We conclude that the ABC approach can accommodate realistic genome-wide population genetic data, which may be difficult to analyze with full likelihood approaches, and that the ABC can provide accurate and precise inference of demographic parameters from
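The core rejection-ABC loop behind this kind of inference can be sketched in a few lines; the toy model (inferring a Gaussian mean from a sample-mean summary statistic) and all numbers are illustrative, not the paper's SNP analysis.

```python
import numpy as np

# Minimal rejection ABC: draw parameters from the prior, simulate data,
# compute a summary statistic, and keep draws whose simulated summary
# lies within a tolerance of the observed one.
rng = np.random.default_rng(42)

observed = rng.normal(loc=2.0, scale=1.0, size=200)   # pretend field data
s_obs = observed.mean()                               # summary statistic

n_draws, tol = 20000, 0.05
theta = rng.uniform(-5.0, 5.0, size=n_draws)          # prior on the unknown mean
sims = rng.normal(loc=theta[:, None], scale=1.0, size=(n_draws, 200))
s_sim = sims.mean(axis=1)

accepted = theta[np.abs(s_sim - s_obs) < tol]         # approximate posterior sample
posterior_mean = accepted.mean()
print(f"accepted {accepted.size} draws, posterior mean near {posterior_mean:.2f}")
```

In a real application the summary would be a vector (e.g. SFS, haplotype and LD statistics), and the tolerance a distance in summary space; combining complementary summaries is exactly what the record reports as improving accuracy.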

  20. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    Science.gov (United States)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
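The statistical-optimization step itself, combining a background profile with an observed one using their error covariances, reduces to a standard optimal linear estimate. A minimal sketch with synthetic profiles (not RO data) follows; the formula a = b + B(B+O)^(-1)(o - b) is the generic estimator, not necessarily the exact OPS implementation.

```python
import numpy as np

# Generic statistical optimization: background profile b with error
# covariance B, observed profile o with error covariance O. All values
# here are synthetic placeholders for bending-angle anomalies.
n = 5
b = np.zeros(n)                          # background profile
o = np.ones(n)                           # observed profile
B = 0.5 * np.eye(n)                      # background error covariance
O = 0.5 * np.eye(n)                      # observation error covariance

gain = B @ np.linalg.inv(B + O)          # weight given to the observation
a = b + gain @ (o - b)                   # optimized profile

# With equal variances the optimized profile is the simple average.
print(a)
```

Geographically varying uncertainty profiles and realistic correlation matrices, as in the new algorithm, enter through non-diagonal, latitude-dependent B and O.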

  1. Bell Operator Method to Classify Local Realistic Theories

    International Nuclear Information System (INIS)

    Nagata, Koji

    2010-01-01

We review the historical fact of multipartite Bell inequalities with an arbitrary number of settings. An explicit local realistic model for the values of a correlation function, given in a two-setting Bell experiment (two-setting model), works only for the specific set of settings in the given experiment, and cannot be extended to a local realistic model for the values of a correlation function given in a continuous-infinite-settings Bell experiment (infinite-setting model), even though there exist two-setting models for all directions in space. Hence, the two-setting model does not have the property that the infinite-setting model has. Here, we show that an explicit two-setting model cannot be extended to a local realistic model for the values of a correlation function given in an M-setting Bell experiment (M-setting model), even though there exist two-setting models for the M measurement directions chosen in the given M-setting experiment. Hence, the two-setting model does not have the property that the M-setting model has.

  2. Agent-based modeling traction force mediated compaction of cell-populated collagen gels using physically realistic fibril mechanics.

    Science.gov (United States)

    Reinhardt, James W; Gooch, Keith J

    2014-02-01

    Agent-based modeling was used to model collagen fibrils, composed of a string of nodes serially connected by links that act as Hookean springs. Bending mechanics are implemented as torsional springs that act upon each set of three serially connected nodes as a linear function of angular deflection about the central node. These fibrils were evaluated under conditions that simulated axial extension, simple three-point bending and an end-loaded cantilever. The deformation of fibrils under axial loading varied <0.001% from the analytical solution for linearly elastic fibrils. For fibrils between 100 μm and 200 μm in length experiencing small deflections, differences between simulated deflections and their analytical solutions were <1% for fibrils experiencing three-point bending and <7% for fibrils experiencing cantilever bending. When these new rules for fibril mechanics were introduced into a model that allowed for cross-linking of fibrils to form a network and the application of cell traction force, the fibrous network underwent macroscopic compaction and aligned between cells. Further, fibril density increased between cells to a greater extent than that observed macroscopically and appeared similar to matrical tracks that have been observed experimentally in cell-populated collagen gels. This behavior is consistent with observations in previous versions of the model that did not allow for the physically realistic simulation of fibril mechanics. The significance of the torsional spring constant value was then explored to determine its impact on remodeling of the simulated fibrous network. Although a stronger torsional spring constant reduced the degree of quantitative remodeling that occurred, the inclusion of torsional springs in the model was not necessary for the model to reproduce key qualitative aspects of remodeling, indicating that the presence of Hookean springs is essential for this behavior. These results suggest that traction force mediated matrix
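The axial (Hookean) part of such a fibril model can be checked against its analytical solution, much as the abstract describes; the sketch below assembles the stiffness matrix for a short chain of identical springs and compares the tip displacement under an end load (stiffness and force values are arbitrary).

```python
import numpy as np

# Nodes connected in series by Hookean springs of stiffness k. Fix node 0,
# pull the last node with force F, and solve K u = f for nodal displacements.
# For n_links identical springs in series, the analytical tip displacement
# is F * n_links / k.
k, F, n_links = 10.0, 2.0, 4
n_nodes = n_links + 1

K = np.zeros((n_nodes, n_nodes))
for i in range(n_links):                 # assemble the 1D stiffness matrix
    K[i:i+2, i:i+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

f = np.zeros(n_nodes)
f[-1] = F                                # end load

# Apply the fixed boundary condition at node 0 by removing its row/column.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

print("tip displacement:", u[-1], "analytical:", F * n_links / k)
```

The torsional (bending) springs of the full model add three-node coupling terms to this matrix; the axial case alone already reproduces the <0.001% agreement regime the record reports for linear elasticity.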

  3. Full dose reduction potential of statistical iterative reconstruction for head CT protocols in a predominantly pediatric population

    Science.gov (United States)

Mirro, Amy E.; Brady, Samuel L.; Kaufman, Robert A.

    2016-01-01

Purpose To implement the maximum level of statistical iterative reconstruction that can be used to establish dose-reduced head CT protocols in a primarily pediatric population. Methods Select head examinations (brain, orbits, sinus, maxilla and temporal bones) were investigated. Dose-reduced head protocols using an adaptive statistical iterative reconstruction (ASiR) were compared for image quality with the original filtered back projection (FBP) reconstructed protocols in phantom using the following metrics: image noise frequency (change in perceived appearance of noise texture), image noise magnitude, contrast-to-noise ratio (CNR), and spatial resolution. Dose reduction estimates were based on computed tomography dose index (CTDIvol) values. Patient CTDIvol and image noise magnitude were assessed in 737 pre- and post-dose-reduction examinations. Results Image noise texture was acceptable up to 60% ASiR for Soft reconstruction kernel (at both 100 and 120 kVp), and up to 40% ASiR for Standard reconstruction kernel. Implementation of 40% and 60% ASiR led to an average reduction in CTDIvol of 43% for brain, 41% for orbits, 30% for maxilla, 43% for sinus, and 42% for temporal bone protocols for patients between 1 month and 26 years, while maintaining an average noise magnitude difference of 0.1% (range: −3% to 5%), improving CNR of low contrast soft tissue targets, and improving spatial resolution of high contrast bony anatomy, as compared to FBP. Conclusion This study demonstrates a methodology for maximizing patient dose reduction while maintaining image quality using statistical iterative reconstruction for a primarily pediatric population undergoing head CT examination. PMID:27056425
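The contrast-to-noise metric used to compare reconstructions can be computed directly from two regions of interest; the ROI statistics below are synthetic, and the CNR definition (difference of ROI means over background noise SD) is one common convention, not necessarily the exact one used in the study.

```python
import numpy as np

# Synthetic ROI pixel values (in HU) for a low-contrast soft-tissue target
# and its background, as might be drawn from a phantom reconstruction.
rng = np.random.default_rng(1)

roi_target = rng.normal(40.0, 5.0, size=500)      # target ROI
roi_background = rng.normal(30.0, 5.0, size=500)  # background ROI

noise_magnitude = roi_background.std(ddof=1)      # image noise SD
cnr = (roi_target.mean() - roi_background.mean()) / noise_magnitude
print(f"CNR = {cnr:.2f}, noise SD = {noise_magnitude:.2f} HU")
```

Iterative reconstruction improves CNR mainly by lowering the denominator (noise SD) at fixed contrast, which is why the study can cut dose while holding noise magnitude nearly constant.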

  4. Blend Shape Interpolation and FACS for Realistic Avatar

    Science.gov (United States)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural way of human interaction, facial animation systems have become more attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors is still a challenging issue. A facial expression carries the signature of happiness, sadness, anger or cheerfulness, etc. The mood of a particular person in the midst of a large group can immediately be identified via very subtle changes in facial expressions. Facial expressions, being a very complex as well as important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face while the FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The results in perceiving realistic facial expressions for virtual human emotions based on facial skin color and texture may contribute towards the development of virtual reality and game environments of computer-aided graphics animation systems.
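Linear blend shape interpolation of the kind BSI builds on can be sketched as a weighted sum of per-expression vertex offsets; the mesh data and expression names below are placeholders, not the authors' model.

```python
import numpy as np

# A face mesh is the neutral geometry plus weighted offsets toward each
# expression target: face = neutral + sum_i w_i * (target_i - neutral).
rng = np.random.default_rng(7)

n_vertices = 100
neutral = rng.random((n_vertices, 3))             # stand-in for a scanned mesh
targets = {                                       # expression basis shapes
    "happy": neutral + rng.normal(0, 0.01, (n_vertices, 3)),
    "angry": neutral + rng.normal(0, 0.01, (n_vertices, 3)),
}

def blend(weights):
    """Linear blend of expression targets onto the neutral face."""
    face = neutral.copy()
    for name, w in weights.items():
        face += w * (targets[name] - neutral)
    return face

face = blend({"happy": 0.7, "angry": 0.1})
print(face.shape)
```

FACS enters by mapping action units (muscle movements) to weight curves over time, so the same blend machinery drives anatomically grounded expressions.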

  5. Finite Time Blowup in a Realistic Food-Chain Model

    KAUST Repository

    Parshad, Rana; Ait Abderrahmane, Hamid; Upadhyay, Ranjit Kumar; Kumari, Nitu

    2013-01-01

We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates on its fractal dimension as well as provide numerical simulations to visualise the spatiotemporal chaos.
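A generic three-species food chain with a Leslie-Gower-type top-predator term can be integrated as follows; the equations and parameters are an illustrative stand-in, not the exact model analyzed in the paper.

```python
import numpy as np

# Prey u, middle predator v, generalist top predator w. The w equation uses
# a Leslie-Gower-type loss term, -d w^2 / (v + e); all coefficients are
# arbitrary illustrative choices.
def rhs(y):
    u, v, w = y
    du = u * (1.0 - u) - 1.0 * u * v
    dv = -0.4 * v + 0.8 * u * v - 0.5 * v * w
    dw = 0.1 * w - 0.2 * w * w / (v + 0.3)
    return np.array([du, dv, dw])

def rk4(y, dt):
    """One classical Runge-Kutta step."""
    k1 = rhs(y)
    k2 = rhs(y + 0.5 * dt * k1)
    k3 = rhs(y + 0.5 * dt * k2)
    k4 = rhs(y + dt * k3)
    return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

y = np.array([0.5, 0.3, 0.2])
for _ in range(5000):                     # integrate to t = 50
    y = rk4(y, 0.01)

print("state at t = 50:", y)
```

In a blowup study one would scan parameters and initial data for trajectories whose norm diverges in finite time; here the chosen parameters keep the solution bounded.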

  7. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
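A basic building block for assessing the output distribution of a complex code is Monte Carlo propagation of input uncertainties; the toy response function and input distributions below are invented for illustration.

```python
import numpy as np

# Propagate uncertain inputs through a code-like response function and
# summarize the output distribution with a conservative percentile.
rng = np.random.default_rng(3)

def model(power, flow):
    """Toy response: peak temperature rises with power, falls with flow."""
    return 300.0 + 2.0 * power - 1.5 * flow

power = rng.normal(100.0, 5.0, size=10000)   # uncertain input 1
flow = rng.normal(50.0, 2.0, size=10000)     # uncertain input 2

response = model(power, flow)
p95 = np.percentile(response, 95)            # bound used instead of worst-case stacking
print(f"mean = {response.mean():.1f}, 95th percentile = {p95:.1f}")
```

Comparing such a percentile with the value obtained by stacking conservative assumptions is one way to quantify the conservatism the report is concerned with; sensitivity analysis then ranks which inputs dominate the output spread.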

  8. Fatigue - determination of a more realistic usage factor

    International Nuclear Information System (INIS)

    Lang, H.

    2001-01-01

The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is of crucial importance for enabling determination of a realistic usage factor. Determination of the elastic-plastic strain range using the K_e factor from fictitiously elastically calculated loads is also important in the event of elastic behaviour being exceeded. This paper thus examines both points in detail. A fatigue module with additional options that functions on this basis is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application.
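Once a counting method has produced a stress-range spectrum, the usage factor is typically accumulated with Miner's rule, U = sum of n_i/N_i over the stress-range bins; the cycle counts and allowable cycles below are invented, and a real analysis would take them from the counted spectrum and the applicable design fatigue curve.

```python
# Miner's rule: n_i counted cycles in stress-range bin i, N_i allowable
# cycles from the design fatigue curve for that stress range. Bin names,
# counts and allowables here are purely illustrative.
cycles_counted = {"high": 100, "medium": 2000, "low": 50000}      # n_i
cycles_allowed = {"high": 1.0e4, "medium": 1.0e5, "low": 1.0e7}   # N_i

usage_factor = sum(n / cycles_allowed[bin_] for bin_, n in cycles_counted.items())
print(f"usage factor U = {usage_factor:.3f}")   # design acceptance requires U <= 1.0
```

A more realistic counting method changes the n_i (fewer and smaller counted ranges), which is exactly where the economic benefit of a lower U comes from.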

  9. DESCARTES' RULE OF SIGNS AND THE IDENTIFIABILITY OF POPULATION DEMOGRAPHIC MODELS FROM GENOMIC VARIATION DATA.

    Science.gov (United States)

    Bhaskar, Anand; Song, Yun S

    2014-01-01

    The sample frequency spectrum (SFS) is a widely-used summary statistic of genomic variation in a sample of homologous DNA sequences. It provides a highly efficient dimensional reduction of large-scale population genomic data and its mathematical dependence on the underlying population demography is well understood, thus enabling the development of efficient inference algorithms. However, it has been recently shown that very different population demographies can actually generate the same SFS for arbitrarily large sample sizes. Although in principle this nonidentifiability issue poses a thorny challenge to statistical inference, the population size functions involved in the counterexamples are arguably not so biologically realistic. Here, we revisit this problem and examine the identifiability of demographic models under the restriction that the population sizes are piecewise-defined where each piece belongs to some family of biologically-motivated functions. Under this assumption, we prove that the expected SFS of a sample uniquely determines the underlying demographic model, provided that the sample is sufficiently large. We obtain a general bound on the sample size sufficient for identifiability; the bound depends on the number of pieces in the demographic model and also on the type of population size function in each piece. In the cases of piecewise-constant, piecewise-exponential and piecewise-generalized-exponential models, which are often assumed in population genomic inferences, we provide explicit formulas for the bounds as simple functions of the number of pieces. Lastly, we obtain analogous results for the "folded" SFS, which is often used when there is ambiguity as to which allelic type is ancestral. Our results are proved using a generalization of Descartes' rule of signs for polynomials to the Laplace transform of piecewise continuous functions.
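For the simplest case, a constant-size population under the standard neutral coalescent, the expected SFS has the closed form E[xi_i] = theta/i, and the folded spectrum combines complementary allele-count classes; a short sketch:

```python
import numpy as np

# Expected unfolded SFS for a constant-size population: E[xi_i] = theta / i
# for derived-allele count i = 1 .. n-1 (standard neutral coalescent result).
theta, n = 10.0, 20                      # scaled mutation rate, sample size

i = np.arange(1, n)                      # allele counts 1 .. n-1
sfs = theta / i                          # expected unfolded SFS

# Folded SFS: eta_j = xi_j + xi_{n-j}, halved when j == n - j.
folded = np.array([
    (sfs[j - 1] + sfs[n - j - 1]) / (2.0 if j == n - j else 1.0)
    for j in range(1, n // 2 + 1)
])

print("expected singletons:", sfs[0], "expected segregating sites:", sfs.sum())
```

The identifiability results concern the inverse direction: which demographic histories produce distinguishable expected spectra, and how large n must be for the (folded or unfolded) SFS to pin the history down.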

  10. Characterization of Strong Light-Matter Coupling in Semiconductor Quantum-Dot Microcavities via Photon-Statistics Spectroscopy

    Science.gov (United States)

    Schneebeli, L.; Kira, M.; Koch, S. W.

    2008-08-01

It is shown that spectrally resolved photon-statistics measurements of the resonance fluorescence from realistic semiconductor quantum-dot systems allow for high contrast identification of the two-photon strong-coupling states. Using a microscopic theory, the second-rung resonance of the Jaynes-Cummings ladder is analyzed and optimum excitation conditions are determined. The computed photon-statistics spectrum displays gigantic, experimentally robust resonances at the energetic positions of the second-rung emission.

  11. Realistic rhetoric and legal decision

    Directory of Open Access Journals (Sweden)

    João Maurício Adeodato

    2017-06-01

The text aims to lay the foundations of a realistic rhetoric, from the descriptive perspective of how the legal decision actually takes place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective of judicial activism in complex societies.

  12. Asymmetric beams and CMB statistical anisotropy

    International Nuclear Information System (INIS)

    Hanson, Duncan; Lewis, Antony; Challinor, Anthony

    2010-01-01

Beam asymmetries result in statistically anisotropic cosmic microwave background (CMB) maps. Typically, they are studied for their effects on the CMB power spectrum; however, they more closely mimic anisotropic effects such as gravitational lensing and primordial power asymmetry. We discuss tools for studying the effects of beam asymmetry on general quadratic estimators of anisotropy, analytically for full-sky observations as well as in the analysis of realistic data. We demonstrate this methodology in application to a recently detected 9σ quadrupolar modulation effect in the WMAP data, showing that beams provide a complete and sufficient explanation for the anomaly.

  13. Realistic molecular model of kerogen's nanostructure.

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp²/sp³ hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  14. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.
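The measurement-error shrinkage at the heart of such hierarchical models can be illustrated with a two-level Gaussian toy model, where the posterior estimate of each object's true value is available in closed form; nothing below uses the CUDAHM or TSE APIs.

```python
import numpy as np

# Two-level model: true values drawn from a population distribution,
# measurements add noise. The posterior mean shrinks each noisy measurement
# toward the population mean, reducing mean squared error.
rng = np.random.default_rng(5)

mu_pop, tau = 0.0, 1.0                    # population mean and SD (known here)
sigma = 2.0                               # per-object measurement noise SD
truth = rng.normal(mu_pop, tau, size=1000)
measured = truth + rng.normal(0.0, sigma, size=1000)

# Conjugate Gaussian-Gaussian posterior mean for each object.
w = tau**2 / (tau**2 + sigma**2)
shrunk = mu_pop + w * (measured - mu_pop)

raw_err = np.mean((measured - truth) ** 2)
shrunk_err = np.mean((shrunk - truth) ** 2)
print(f"MSE raw = {raw_err:.2f}, MSE shrunk = {shrunk_err:.2f}")
```

In CUDAHM-style applications the population distribution (e.g. a luminosity function) is itself unknown and inferred jointly, with the per-object conditionals evaluated in parallel on the GPU.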

  15. On the Realistic Stochastic Model of GPS Observables: Implementation and Performance

    Science.gov (United States)

    Zangeneh-Nejad, F.; Amiri-Simkooei, A. R.; Sharifi, M. A.; Asgari, J.

    2015-12-01

    High-precision GPS positioning requires a realistic stochastic model of observables. A realistic GPS stochastic model should take into account different variances for different observation types, correlations among different observables, the satellite-elevation dependence of the observables' precision, and the temporal correlation of observables. Least-squares variance component estimation (LS-VCE) is applied to GPS observables using the geometry-based observation model (GBOM). To model the satellite-elevation dependence of the observables' precision, an exponential model depending on the elevation angles of the satellites is employed. Temporal correlation of the GPS observables is modelled using a first-order autoregressive noise model. An important step in high-precision GPS positioning is double-difference integer ambiguity resolution (IAR). The fraction of successful fixes among a number of integer ambiguity fixings is called the success rate. A realistic estimate of the GNSS observables' covariance matrix plays an important role in IAR. We consider the ambiguity resolution success rate for two cases, namely a nominal and a realistic stochastic model of the GPS observables, using two GPS data sets collected by the Trimble R8 receiver. The results confirm that applying a more realistic stochastic model can significantly improve the IAR success rate on individual frequencies, either on L1 or on L2; an improvement of 20% in the empirical success rate was achieved. The results also indicate that introducing the realistic stochastic model leads to a larger standard deviation for the baseline components, by a factor of about 2.6, on the data sets considered.
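An exponential elevation-dependent variance model of the kind described above is often written in the GNSS literature as σ²(θ) = a + b·exp(−θ/θ₀), so that low-elevation satellites receive larger variances and hence smaller weights. A sketch below; the coefficients are illustrative assumptions, not values from the paper:

```python
import math

def elev_variance(elev_deg, a=1.0, b=9.0, theta0=20.0):
    """Exponential elevation-dependent variance model (illustrative
    coefficients): variance decays toward `a` as elevation increases."""
    return a + b * math.exp(-elev_deg / theta0)

# In weighted least squares the weight is the inverse variance, so a
# satellite near the horizon is down-weighted relative to one near zenith.
w_low = 1.0 / elev_variance(10.0)   # low-elevation satellite
w_high = 1.0 / elev_variance(80.0)  # near-zenith satellite
print(w_low < w_high)  # True
```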

  16. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Full Text Available Statistical analysis draws conclusions about a population from a sample, using parameter estimation with confidence intervals and statistical hypothesis testing. The case study presented in this paper aims to highlight the importance of the sample size used in the study and how it is reflected in the results obtained from confidence intervals and significance tests. While statistical hypothesis testing gives only a "yes" or "no" answer to the questions posed, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and qualifies findings that are "marginally significant" or "almost significant" (p very close to 0.05).
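The paper's central point, that a confidence interval conveys the uncertainty a bare significant/not-significant verdict hides, can be seen in a small sketch (illustrative data, normal-approximation interval):

```python
import math
import statistics

def mean_ci_95(sample):
    """95% CI for the mean using the normal approximation (illustrative;
    a t-based interval would be more appropriate for tiny samples)."""
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(len(sample))
    return m - 1.96 * se, m + 1.96 * se

small = [4.8, 5.9, 5.1, 6.2, 4.5]  # n = 5
large = small * 20                 # same values, n = 100

lo_s, hi_s = mean_ci_95(small)
lo_l, hi_l = mean_ci_95(large)

# Both samples have the same mean, but the small sample's interval is
# much wider -- the uncertainty a bare accept/reject decision hides.
print(hi_s - lo_s > hi_l - lo_l)  # True
```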

  17. Heart Disease and Stroke Statistics

    Science.gov (United States)

    ... Media for Heart.org Heart and Stroke Association Statistics Each year, the American Heart Association, in conjunction ... health and disease in the population. Heart & Stroke Statistics FAQs What is Prevalence? Prevalence is an estimate ...

  18. Toxicity effects of an environmental realistic herbicide mixture on the seagrass Zostera noltei.

    Science.gov (United States)

    Diepens, Noël J; Buffan-Dubau, Evelyne; Budzinski, Hélène; Kallerhoff, Jean; Merlina, Georges; Silvestre, Jérome; Auby, Isabelle; Tapie, Nathalie; Elger, Arnaud

    2017-03-01

    Worldwide seagrass declines have been observed due to multiple stressors. One of them is the mixture of pesticides used in intensive agriculture and boat antifouling paints in coastal areas. Effects of mixture toxicity are complex and poorly understood. However, consideration of mixture toxicity is more realistic and ecologically relevant for environmental risk assessment (ERA). The first aim of this study was to determine short-term effects of realistic herbicide mixture exposure on physiological endpoints of Zostera noltei. The second aim was to assess the environmental risks of this mixture, by comparing the results to previously published data. Z. noltei was exposed to a mixture of four herbicides: atrazine, diuron, irgarol and S-metolachlor, simulating the composition of the typical cocktail of contaminants in the Arcachon bay (Atlantic coast, France). Three stress biomarkers were measured: enzymatic activity of glutathione reductase, effective quantum yield (EQY) and photosynthetic pigment composition after 6, 24 and 96 h. Short term exposure to realistic herbicide mixtures affected EQY, with almost 100% inhibition for the two highest concentrations, and photosynthetic pigments. Effect on pigment composition was detected after 6 h with a no observed effect concentration (NOEC) of 1 μg/L total mixture concentration. The lowest EQY effect concentration at 10% (EC10) (2 μg/L) and pigment composition NOEC with an assessment factor of 10 were above the maximal field concentrations along the French Atlantic coast, suggesting that there are no potential short term adverse effects of this particular mixture on Z. noltei. However, chronic effects on photosynthesis may lead to reduced energy reserves, which could thus lead to effects at whole plant and population level. Understanding the consequences of chemical mixtures could help to improve ERA and enhance management strategies to prevent further declines of seagrass meadows worldwide. Copyright © 2016

  19. Photo-Realistic Image Synthesis and Virtual Cinematography

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is strictly related to the grown popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer generated...... characters, "virtual actors", in the motion picture production increases every day. While the most known computer graphics techniques have largely been adopted successfully in nowadays fictions, it still remains very challenging to implement virtual actors which would resemble, visually, human beings....... Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept is then gaining consensus...

  20. Performance Analysis of Relays in LTE for a Realistic Suburban Deployment Scenario

    DEFF Research Database (Denmark)

    Coletti, Claudio; Mogensen, Preben; Irmer, Ralf

    2011-01-01

    Relays are likely to play an important role in the deployment of Beyond 3G networks, such as LTE-Advanced, thanks to the possibility of effectively extending Macro network coverage and fulfilling the expected high data-rate requirements. Up until now, the relay technology potential and its cost-effectiveness have been widely investigated in the literature, considering mainly statistical deployment scenarios, like regular networks with uniform traffic distribution. This paper is envisaged to illustrate the performances of different relay technologies (In-Band/Out-band) in a realistic suburban network...... scenario with real Macro site positions, user density map and spectrum band availability. Based on a proposed heuristic deployment algorithm, results show that deploying In-band relays can significantly reduce the user outage if high backhaul link quality is ensured, whereas Out-band relaying and the usage

  1. I-Love relations for incompressible stars and realistic stars

    Science.gov (United States)

    Chan, T. K.; Chan, AtMa P. O.; Leung, P. T.

    2015-02-01

    In spite of the diversity in the equations of state of nuclear matter, the recently discovered I-Love-Q relations [Yagi and Yunes, Science 341, 365 (2013), 10.1126/science.1236462], which relate the moment of inertia, tidal Love number (deformability), and the spin-induced quadrupole moment of compact stars, hold for various kinds of realistic neutron stars and quark stars. While the physical origin of such universality is still a current issue, the observation that the I-Love-Q relations of incompressible stars can well approximate those of realistic compact stars hints at a new direction to approach the problem. In this paper, by establishing recursive post-Minkowskian expansion for the moment of inertia and the tidal deformability of incompressible stars, we analytically derive the I-Love relation for incompressible stars and show that the so-obtained formula can be used to accurately predict the behavior of realistic compact stars from the Newtonian limit to the maximum mass limit.

  2. Introductory statistics for the behavioral sciences

    CERN Document Server

    Welkowitz, Joan; Cohen, Jacob

    1971-01-01

    Introductory Statistics for the Behavioral Sciences provides an introduction to statistical concepts and principles. This book emphasizes the robustness of parametric procedures wherein such significant tests as t and F yield accurate results even if such assumptions as equal population variances and normal population distributions are not well met.Organized into three parts encompassing 16 chapters, this book begins with an overview of the rationale upon which much of behavioral science research is based, namely, drawing inferences about a population based on data obtained from a samp

  3. Hyper-realistic face masks: a new challenge in person identification.

    Science.gov (United States)

    Sanders, Jet Gabrielle; Ueda, Yoshiyuki; Minemoto, Kazusa; Noyes, Eilidh; Yoshikawa, Sakiko; Jenkins, Rob

    2017-01-01

    We often identify people using face images. This is true in occupational settings such as passport control as well as in everyday social environments. Mapping between images and identities assumes that facial appearance is stable within certain bounds. For example, a person's apparent age, gender and ethnicity change slowly, if at all. It also assumes that deliberate changes beyond these bounds (i.e., disguises) would be easy to spot. Hyper-realistic face masks overturn these assumptions by allowing the wearer to look like an entirely different person. If unnoticed, these masks break the link between facial appearance and personal identity, with clear implications for applied face recognition. However, to date, no one has assessed the realism of these masks, or specified conditions under which they may be accepted as real faces. Herein, we examined incidental detection of unexpected but attended hyper-realistic masks in both photographic and live presentations. Experiment 1 (UK; n = 60) revealed no evidence for overt detection of hyper-realistic masks among real face photos, and little evidence of covert detection. Experiment 2 (Japan; n = 60) extended these findings to different masks, mask-wearers and participant pools. In Experiment 3 (UK and Japan; n = 407), passers-by failed to notice that a live confederate was wearing a hyper-realistic mask and showed limited evidence of covert detection, even at close viewing distance (5 vs. 20 m). Across all of these studies, viewers accepted hyper-realistic masks as real faces. Specific countermeasures will be required if detection rates are to be improved.

  4. Shot Group Statistics for Small Arms Applications

    Science.gov (United States)

    2017-06-01

    if its probability distribution is known with sufficient accuracy, then it can be used to make a sound statistical inference on the unknown population standard deviations of the x and y impact-point positions. The dispersion measures treated in this report...
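As a hedged illustration of the kind of inference described, assuming a circular normal (Rayleigh) impact distribution, the population standard deviation can be estimated from radial miss distances via E[r²] = 2σ². This is a generic moment estimator on simulated data, not necessarily the report's method:

```python
import math
import random

random.seed(7)
sigma_true = 2.0  # population std dev of each impact coordinate

# Simulate impact points (x, y) scattered around the aim point.
shots = [(random.gauss(0.0, sigma_true), random.gauss(0.0, sigma_true))
         for _ in range(500)]

# Under a circular normal model, E[r^2] = 2*sigma^2, so
# sigma_hat = sqrt(mean(r^2) / 2) is a simple moment estimator.
mean_r2 = sum(x * x + y * y for x, y in shots) / len(shots)
sigma_hat = math.sqrt(mean_r2 / 2.0)
print(round(sigma_hat, 2))  # close to sigma_true for a large group
```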

  5. Putting a Realistic Theory of Mind into Agency Theory

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stea, Diego

    2014-01-01

    Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant...... concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management...

  6. Realistic searches on stretched exponential networks

    Indian Academy of Sciences (India)

    We consider navigation or search schemes on networks which have a stretched exponential degree distribution of the form P(k) ∝ exp(−k^γ). In addition, the linking probability is taken to be dependent on social distances and is governed by a tunable parameter. The searches are realistic in the sense that not all search chains can be completed.

  7. Order-specific fertility estimates based on perinatal statistics and statistics on out-of-hospital births

    OpenAIRE

    Kreyenfeld, Michaela; Peters, Frederik; Scholz, Rembrandt; Wlosnewski, Ines

    2014-01-01

    Until 2008, German vital statistics did not provide information on biological birth order. We have tried to close part of this gap by providing order-specific fertility rates generated from Perinatal Statistics and statistics on out-of-hospital births for the period 2001-2008. This investigation has been published in Comparative Population Studies (CPoS) (see Kreyenfeld, Scholz, Peters and Wlosnewski 2010). The CPoS paper describes how data from the Perinatal Statistics and statistics on out...

  8. Results of recent calculations using realistic potentials

    International Nuclear Information System (INIS)

    Friar, J.L.

    1987-01-01

    Results of recent calculations for the triton using realistic potentials with strong tensor forces are reviewed, with an emphasis on progress made using the many different calculational schemes. Several test problems are suggested. 49 refs., 5 figs

  9. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

    This paper presents a generic, DBMS independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple...... to generate data even for large database schemas with complex inter- and intra table relationships. The model also makes it possible to generate data with very accurate characteristics....

  10. Hierarchical statistical modeling of xylem vulnerability to cavitation.

    Science.gov (United States)

    Ogle, Kiona; Barber, Jarrett J; Willson, Cynthia; Thompson, Brenda

    2009-01-01

    Cavitation of xylem elements diminishes the water transport capacity of plants, and quantifying xylem vulnerability to cavitation is important to understanding plant function. Current approaches to analyzing hydraulic conductivity (K) data to infer vulnerability to cavitation suffer from problems such as the use of potentially unrealistic vulnerability curves, difficulty interpreting parameters in these curves, a statistical framework that ignores sampling design, and an overly simplistic view of uncertainty. This study illustrates how two common curves (exponential-sigmoid and Weibull) can be reparameterized in terms of meaningful parameters: maximum conductivity (k(sat)), water potential (-P) at which percentage loss of conductivity (PLC) =X% (P(X)), and the slope of the PLC curve at P(X) (S(X)), a 'sensitivity' index. We provide a hierarchical Bayesian method for fitting the reparameterized curves to K(H) data. We illustrate the method using data for roots and stems of two populations of Juniperus scopulorum and test for differences in k(sat), P(X), and S(X) between different groups. Two important results emerge from this study. First, the Weibull model is preferred because it produces biologically realistic estimates of PLC near P = 0 MPa. Second, stochastic embolisms contribute an important source of uncertainty that should be included in such analyses.
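The Weibull vulnerability curve discussed above is commonly written PLC(P) = 100·(1 − exp(−(P/b)^c)), which satisfies PLC(0) = 0, the biologically realistic behavior near P = 0 MPa noted in the abstract. A sketch of the forward curve and its inversion to the interpretable quantity P_X (parameter values are illustrative, not estimates from the Juniperus data):

```python
import math

def plc_weibull(P, b, c):
    """Percentage loss of conductivity at water potential magnitude P (MPa),
    Weibull form: PLC = 100 * (1 - exp(-(P/b)**c)). Note PLC(0) = 0."""
    return 100.0 * (1.0 - math.exp(-((P / b) ** c)))

def p_x(x_percent, b, c):
    """Invert the curve: water potential magnitude at which PLC = X%."""
    return b * (-math.log(1.0 - x_percent / 100.0)) ** (1.0 / c)

b, c = 3.0, 2.5       # illustrative Weibull parameters
p50 = p_x(50, b, c)   # the familiar P50 summary statistic
print(abs(plc_weibull(p50, b, c) - 50.0) < 1e-9)  # True: inversion is exact
```

Reparameterizing the fit directly in terms of (k_sat, P_X, S_X), as the study does, amounts to solving for (b, c) from these interpretable quantities before evaluating the same curve.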

  11. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effectivemodel for optimal placement of phasor measurement units (PMUs) for complete observability of a power system considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU...... installation. Consideration of these hidden but significant and integral part of total PMU installation costs was inspired from practical experience on a real-life project. The proposedmodel focuses on the minimization of total realistic costs instead of a widely used theoretical concept of a minimal number...... of PMUs. The proposed model has been applied to IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, New England 39-bus, and large power system of 300 buses and real life Danish grid. A comparison of the presented results with those reported by traditionalmethods has also been shown to justify the effectiveness...

  12. Getting realistic; Endstation Demut

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2004-01-28

    The fuel cell hype of the turn of the millennium has reached its end. The industry is getting realistic. If at all, fuel cell systems for private single-family and multiple dwellings will not be available until the next decade. With a Europe-wide field test, Vaillant intends to advance the PEM technology. [German] Der Brennstoffzellen-Hype der Jahrtausendwende ist verflogen. Die Branche uebt sich in Bescheidenheit. Die Marktreife der Systeme fuer Ein- und Mehrfamilienhaeuser wird - wenn ueberhaupt - wohl erst im naechsten Jahrzehnt erreicht sein. Vaillant will durch einen europaweiten Feldtest die Entwicklung der PEM-Technologie vorantreiben. (orig.)

  13. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

    A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56 Fe and 60 Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than those of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression); basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  14. Two-Capacitor Problem: A More Realistic View.

    Science.gov (United States)

    Powell, R. A.

    1979-01-01

    Discusses the two-capacitor problem by considering the self-inductance of the circuit used and by determining how well the usual series RC circuit approximates the two-capacitor problem when realistic values of L, C, and R are chosen. (GA)

  15. Farm Population of the United States: 1972. Current Population Reports: Farm Population.

    Science.gov (United States)

    Bureau of the Census (DOC), Suitland, MD. Population Div.

    Based on data derived from the Current Population Survey of the Bureau of Census, this statistical report presents demographic and labor force characteristics of the U.S. farm population and comparisons of the farm and nonfarm populations. Tabular data are presented as follows: (1) U.S. Population, Total and Farm: April 1960 to 1972; (2) Persons…

  16. Farm Population of the United States: 1974. Current Population Reports, Farm Population.

    Science.gov (United States)

    Banks, Vera J.; And Others

    Based on data derived primarily from the Current Population Survey of the Bureau of the Census, this statistical report presents demographic and labor force characteristics of the U.S. farm population and a comparison of selected characteristics of the farm and nonfarm population. Tabular data are presented as follows: (1) Population of the U.S.,…

  17. Farm Population of the United States: 1971. Current Population Reports: Farm Population.

    Science.gov (United States)

    Bureau of the Census (DOC), Suitland, MD. Population Div.

    Based on data derived from the Current Population Survey of the Bureau of the Census, this statistical report presents demographic and labor force characteristics of the U.S. farm population and comparisons of the farm and nonfarm populations. Tabular data are presented as follows: (1) U.S. Population, Total and Farm: April 1960 and 1971; (2)…

  18. Magnus forces and statistics in 2 + 1 dimensions

    International Nuclear Information System (INIS)

    Davis, R.L.

    1990-01-01

    Spinning vortex solutions to the abelian Higgs model, not Nielsen-Olesen solutions, are appropriate to a Ginzburg-Landau description of superconductivity. The main physical distinction is that spinning vortices experience the Magnus force while Nielsen-Olesen vortices do not. In 2 + 1 dimensional superconductivity without a Chern-Simons interaction, the effect of the Magnus force is equivalent to that of a background fictitious magnetic field. Moreover, the phase obtained on interchanging two quasi-particles is always path-dependent. When a Chern-Simons term is added there is an additional localized Magnus flux at the vortex. For point-like vortices, the Chern-Simons interaction can be seen as defining their intrinsic statistics, but in realistic cases of vortices with finite size in strong Magnus fields the quasi-particle statistics are not well-defined.

  19. Satellite Maps Deliver More Realistic Gaming

    Science.gov (United States)

    2013-01-01

    When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.

  20. Implementing a generic method for bias correction in statistical models using random effects, with spatial and population dynamics examples

    DEFF Research Database (Denmark)

    Thorson, James T.; Kristensen, Kasper

    2016-01-01

    Statistical models play an important role in fisheries science when reconciling ecological theory with available data for wild populations or experimental studies. Ecological models increasingly include both fixed and random effects, and are often estimated using maximum likelihood techniques...... configurations of an age-structured population dynamics model. This simulation experiment shows that the epsilon-method and the existing bias-correction method perform equally well in data-rich contexts, but the epsilon-method is slightly less biased in data-poor contexts. We then apply the epsilon......-method to a spatial regression model when estimating an index of population abundance, and compare results with an alternative bias-correction algorithm that involves Markov-chain Monte Carlo sampling. This example shows that the epsilon-method leads to a biologically significant difference in estimates of average...
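The bias that the epsilon-method corrects arises whenever a nonlinear function of a random effect is reported: plugging the point estimate of a log-scale effect into exp(·) understates the natural-scale mean, since E[exp(u)] = exp(σ²/2) for u ~ N(0, σ²). A minimal Monte Carlo illustration of that lognormal case, not the TMB algorithm itself:

```python
import math
import random
import statistics

random.seed(3)
sigma = 1.0

# Log-scale random effect u ~ N(0, sigma^2); natural-scale index = exp(u).
u = [random.gauss(0.0, sigma) for _ in range(200_000)]
index = [math.exp(v) for v in u]

naive = math.exp(statistics.mean(u))      # plug-in: exp of the estimated mean
corrected = math.exp(sigma**2 / 2.0)      # lognormal bias correction
empirical = statistics.mean(index)        # Monte Carlo "truth"

# The plug-in estimate is biased low by a factor of about exp(1/2) ~ 1.65.
print(naive < empirical)
print(abs(empirical - corrected) < 0.05)
```

The epsilon-method generalizes this kind of correction to arbitrary nonlinear functions of estimated random effects, without requiring a closed form like the lognormal one above.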

  1. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult...... to keep abreast with the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  2. NASA Orbital Debris Baseline Populations

    Science.gov (United States)

    Krisko, Paula H.; Vavrin, A. B.

    2013-01-01

    The NASA Orbital Debris Program Office has created high fidelity populations of the debris environment. The populations include objects of 1 cm and larger in Low Earth Orbit through Geosynchronous Transfer Orbit. They were designed for the purpose of assisting debris researchers and sensor developers in planning and testing. This environment is derived directly from the newest ORDEM model populations which include a background derived from LEGEND, as well as specific events such as the Chinese ASAT test, the Iridium 33/Cosmos 2251 accidental collision, the RORSAT sodium-potassium droplet releases, and other miscellaneous events. It is the most realistic ODPO debris population to date. In this paper we present the populations in chart form. We describe derivations of the background population and the specific populations added on. We validate our 1 cm and larger Low Earth Orbit population against SSN, Haystack, and HAX radar measurements.

  3. Statistical methods in the mechanical design of fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    The mechanical design of a fuel assembly is still mainly performed in a deterministic way. This conservative approach is however not suitable to provide a realistic quantification of the design margins with respect to licensing criteria for more and more demanding operating conditions (power upgrades, burnup increase, ...). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) on the topic under consideration. During optimization, e.g. of the holddown system, certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as holddown forces sufficient to prevent fuel assembly lift-off versus reduced holddown forces that minimize axial loads on the fuel assembly structure and so avoid any negative effect on control rod movement. By using a statistical method the fuel assembly design can be optimized much better with respect to these objectives than would be possible with a deterministic approach. This leads to a more realistic assessment and a safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quantile that has to be maintained concerning the design limit requirements (e.g. a one-FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be defined based on the requirement that less than one fuel assembly (quantile > 192/193 [%] = 99.5 %) in the core violates the holddown force limit with a confidence of 95%. (orig.)
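The quantile-plus-confidence requirement quoted above (a 99.5% quantile held with 95% confidence) corresponds to a standard one-sided nonparametric tolerance bound: the maximum of n independent runs bounds the q-quantile with confidence 1 − qⁿ. A sketch of the resulting sample-size rule (the generic Wilks formula, not AREVA's specific procedure):

```python
import math

def min_samples(quantile, confidence):
    """Smallest n such that the maximum of n i.i.d. runs exceeds the given
    population quantile with the given confidence: 1 - quantile**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(quantile))

# 99.5% quantile (fewer than 1 of 193 fuel assemblies) at 95% confidence:
n = min_samples(0.995, 0.95)
print(n)  # number of statistical runs required
```

For comparison, the familiar 95%/95% criterion yields the classic 59-run rule, which is why `min_samples(0.95, 0.95)` returns 59.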

  4. From inverse problems to learning: a Statistical Mechanics approach

    Science.gov (United States)

    Baldassi, Carlo; Gerace, Federica; Saglietti, Luca; Zecchina, Riccardo

    2018-01-01

    We present a brief introduction to the statistical mechanics approaches for the study of inverse problems in data science. We then provide concrete new results on inferring couplings from sampled configurations in systems characterized by an extensive number of stable attractors in the low temperature regime. We also show how these results are connected to the problem of learning with realistic weak signals in computational neuroscience. Our techniques and algorithms rely on advanced mean-field methods developed in the context of disordered systems.

  5. An example of population-level risk assessments for small mammals using individual-based population models.

    Science.gov (United States)

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus

    2016-01-01

    This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first-tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher-tier risk assessment). Despite differing internal model design and scenarios, results indicated in all 3 cases low population sensitivity unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust number of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages to higher-tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios.
© 2015 SETAC.

  6. Population-based statistical inference for temporal sequence of somatic mutations in cancer genomes.

    Science.gov (United States)

    Rhee, Je-Keun; Kim, Tae-Min

    2018-04-20

    It is well recognized that accumulation of somatic mutations in cancer genomes plays a role in carcinogenesis; however, the temporal sequence and evolutionary relationship of somatic mutations remain largely unknown. In this study, we built a population-based statistical framework to infer the temporal sequence of acquisition of somatic mutations. Using the model, we analyzed the mutation profiles of 1954 tumor specimens across eight tumor types. As a result, we identified tumor type-specific directed networks composed of 2-15 cancer-related genes (nodes) and their mutational orders (edges). The most common ancestors identified in pairwise comparison of somatic mutations were TP53 mutations in breast, head/neck, and lung cancers. The known relationship of KRAS to TP53 mutations in colorectal cancers was identified, as well as potential ancestors of TP53 mutation such as NOTCH1, EGFR, and PTEN mutations in head/neck, lung and endometrial cancers, respectively. We also identified apoptosis-related genes enriched with ancestor mutations in lung cancers and a relationship between APC hotspot mutations and TP53 mutations in colorectal cancers. While evolutionary analysis of cancers has focused on clonal versus subclonal mutations identified in individual genomes, our analysis aims to further discriminate ancestor versus descendant mutations in population-scale mutation profiles that may help select cancer drivers with clinical relevance.

  7. Realist cinema as world cinema

    OpenAIRE

    Nagib, Lucia

    2017-01-01

    The idea that “realism” is the common denominator across the vast range of productions normally labelled as “world cinema” is widespread and seemingly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema’s creative peaks, with the physical and historical environment,...

  8. Applied statistics for civil and environmental engineers

    CERN Document Server

    Kottegoda, N T

    2009-01-01

    Civil and environmental engineers need an understanding of mathematical statistics and probability theory to deal with the variability that affects engineers' structures, soil pressures, river flows and the like. Students, too, need to get to grips with these rather difficult concepts. This book, written by engineers for engineers, tackles the subject in a clear, up-to-date manner using a process-orientated approach. It introduces the subjects of mathematical statistics and probability theory, and then addresses model estimation and testing, regression and multivariate methods, analysis of extreme events, simulation techniques, risk and reliability, and economic decision making. 325 examples and case studies from European and American practice are included and each chapter features realistic problems to be solved. For the second edition new sections have been added on Monte Carlo Markov chain modeling with details of practical Gibbs sampling, sensitivity analysis and aleatory and epistemic uncertainties, and co...

  9. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a type of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm, and can draw on conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier’s receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
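    The recommended reporting can be sketched as follows. The Gaussian score distributions and the `acceptance_at_grr` helper are illustrative assumptions standing in for real detector pulse-shape scores, not part of the paper.

```python
# Minimal sketch of the recommended reporting: build one point of a ROC
# curve from simulated PSD scores and read off the neutron acceptance at a
# chosen gamma rejection rate (GRR). Score distributions here are invented.
import random

random.seed(1)
gamma_scores = [random.gauss(0.10, 0.03) for _ in range(5000)]
neutron_scores = [random.gauss(0.25, 0.05) for _ in range(5000)]

def acceptance_at_grr(gammas, neutrons, grr):
    """Find the threshold that rejects a fraction `grr` of gammas; return
    the fraction of neutrons accepted (scoring above that threshold)."""
    threshold = sorted(gammas)[int(grr * len(gammas)) - 1]
    return sum(s > threshold for s in neutrons) / len(neutrons)

# Neutron acceptance at 99.9% gamma rejection: one ROC operating point.
print(acceptance_at_grr(gamma_scores, neutron_scores, 0.999))
```

    Sweeping `grr` over (0, 1) traces out the full ROC curve the paper recommends reporting.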

  10. Simulation of microarray data with realistic characteristics

    Directory of Open Access Journals (Sweden)

    Lehmussola Antti

    2006-07-01

    Full Text Available Abstract Background Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed. Results We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement technology specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples. Conclusion The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide based single-channel microarrays. All this makes the model a valuable tool for example in validation of data analysis algorithms.
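    The staged structure described (biological ground truth, then error models, then measurement) can be sketched in miniature. Everything below, including the fold change and noise levels, is an invented toy, not the authors' simulation model.

```python
# Toy staged microarray simulation in the spirit described: biological
# ground-truth expression, a fold-change applied to "differential" genes,
# then multiplicative measurement noise per channel. Parameters invented.
import math
import random

random.seed(7)
n_genes, n_diff = 1000, 50

truth = [random.lognormvariate(6, 1) for _ in range(n_genes)]
is_diff = [i < n_diff for i in range(n_genes)]  # first 50 genes differential

def measure(expression, fold, noise_sd=0.2):
    """Two-channel intensities (reference vs. treatment), each with
    log-normal measurement noise."""
    ref = expression * random.lognormvariate(0, noise_sd)
    trt = expression * fold * random.lognormvariate(0, noise_sd)
    return ref, trt

pairs = [measure(t, 4.0 if d else 1.0) for t, d in zip(truth, is_diff)]
ratios = [math.log2(trt / ref) for ref, trt in pairs]

# A simple analysis step to validate: call genes with log2 ratio > 1.
print(sum(r > 1 for r in ratios[:n_diff]), "of", n_diff, "recovered")
```

    Because the ground truth is known, any analysis algorithm run on the simulated slides can be scored exactly, which is the validation use case the abstract argues for.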

  11. The Effect of Realistic Mathematics Education Approach on Students' Achievement And Attitudes Towards Mathematics

    Directory of Open Access Journals (Sweden)

    Effandi Zakaria

    2017-02-01

    Full Text Available This study was conducted to determine the effect of the Realistic Mathematics Education Approach on mathematics achievement and student attitudes towards mathematics. This study also sought to determine the relationship between student achievement and attitudes towards mathematics. This study used a quasi-experimental design conducted on 61 high school students at SMA Unggul Sigli. Students were divided into two groups: the treatment group (n = 30), namely the Realistic Mathematics Approach group (PMR), and the control group (n = 31), namely the traditional group. This study was conducted for six weeks. The instruments used in this study were the achievement test and the attitudes towards mathematics questionnaires. Data were analyzed using SPSS. To determine the difference in mean achievement and attitudes between the two groups, data were analyzed using a one-way ANOVA test. The result showed significant differences between the Realistic Mathematics Approach and the traditional approach in terms of achievement. The study showed no significant difference between the Realistic Mathematics Approach and the traditional approach in terms of attitudes towards mathematics. It can be concluded that the use of the realistic mathematics education approach enhanced students' mathematics achievement, but not attitudes towards mathematics. The Realistic Mathematics Education Approach encourages students to participate actively in the teaching and learning of mathematics. Thus, the Realistic Mathematics Education Approach is an appropriate method to improve the quality of the teaching and learning process.
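    A one-way ANOVA for two groups reduces to a single F statistic, which can be computed in a few lines of pure Python. The achievement scores below are hypothetical, not the study's data.

```python
# One-way ANOVA F statistic for comparing group means, as used to compare
# the treatment (PMR) and control groups. Scores below are invented.
def f_oneway(*groups):
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

treatment = [72, 80, 68, 75, 83, 77, 70, 79]  # hypothetical achievement scores
control = [65, 70, 62, 68, 71, 66, 60, 69]
print(round(f_oneway(treatment, control), 2))  # F ≈ 15.76 for these toy data
```

    An F this large against F(1, 14) would indicate a significant difference in means, mirroring the achievement result reported in the abstract.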

  12. Descartes’ rule of signs and the identifiability of population demographic models from genomic variation data

    Science.gov (United States)

    Bhaskar, Anand; Song, Yun S.

    2016-01-01

    The sample frequency spectrum (SFS) is a widely-used summary statistic of genomic variation in a sample of homologous DNA sequences. It provides a highly efficient dimensional reduction of large-scale population genomic data and its mathematical dependence on the underlying population demography is well understood, thus enabling the development of efficient inference algorithms. However, it has been recently shown that very different population demographies can actually generate the same SFS for arbitrarily large sample sizes. Although in principle this nonidentifiability issue poses a thorny challenge to statistical inference, the population size functions involved in the counterexamples are arguably not so biologically realistic. Here, we revisit this problem and examine the identifiability of demographic models under the restriction that the population sizes are piecewise-defined where each piece belongs to some family of biologically-motivated functions. Under this assumption, we prove that the expected SFS of a sample uniquely determines the underlying demographic model, provided that the sample is sufficiently large. We obtain a general bound on the sample size sufficient for identifiability; the bound depends on the number of pieces in the demographic model and also on the type of population size function in each piece. In the cases of piecewise-constant, piecewise-exponential and piecewise-generalized-exponential models, which are often assumed in population genomic inferences, we provide explicit formulas for the bounds as simple functions of the number of pieces. Lastly, we obtain analogous results for the “folded” SFS, which is often used when there is ambiguity as to which allelic type is ancestral. Our results are proved using a generalization of Descartes’ rule of signs for polynomials to the Laplace transform of piecewise continuous functions. PMID:28018011

  13. Realistic Noise Assessment and Strain Analysis of Iranian Permanent GPS Stations

    Science.gov (United States)

    Razeghi, S. M.; Amiri Simkooei, A. A.; Sharifi, M. A.

    2012-04-01

    To assess noise characteristics of the Iranian Permanent GPS Stations (IPGS), the northwestern part of this network, namely the Azerbaijan Continuous GPS Station (ACGS), was selected. For a realistic noise assessment it is required to model all deterministic signals of the GPS time series by means of least squares harmonic estimation (LS-HE) and derive all periodic behavior of the series. After taking all deterministic signals into account, least squares variance component estimation (LS-VCE) is used to obtain a realistic noise model (white noise plus flicker noise) of the ACGS. For this purpose, one needs simultaneous GPS time series for which a multivariate noise assessment is applied. Having determined a realistic noise model, a realistic strain analysis of the network is obtained, for which one relies on finite element methods. The finite element formulation is considered to be the new functional model, and the new stochastic model is given based on the multivariate noise assessment using LS-VCE. The deformation rates of the components along with their full covariance matrices are input to the strain analysis. Further, the results are also provided using a pure white noise model. The normalized strains for these two models show that the strain parameters derived from a realistic noise model are less significant than those derived from the white noise model. This could be either due to the short time span of the time series used or due to the intrinsic behavior of the strain parameters in the ACGS. Longer time series are required to further elaborate this issue.

  14. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap; Bae, Sangwon; Knauer, Christian; Lee, Mira; Shin, Chansu; Vigneron, Antoine E.

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing

  15. DHLAS: A web-based information system for statistical genetic analysis of HLA population data.

    Science.gov (United States)

    Thriskos, P; Zintzaras, E; Germenis, A

    2007-03-01

    DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using JAVA and the R system; it runs on a Java Virtual Machine and its user interface is web-based, powered by the servlet engine TOMCAT. It utilizes STRUTS, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimations), (v) haplotype frequencies (estimated using the expectation-maximization algorithm) and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database; thus, the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
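    The asymptotic Hardy-Weinberg test such a system computes can be sketched for a single biallelic locus: compare observed genotype counts with the p², 2pq, q² expectations via a chi-square statistic. The genotype counts below are invented.

```python
# Asymptotic Hardy-Weinberg equilibrium test for a biallelic locus:
# compare observed genotype counts (AA, AB, BB) with the expected
# proportions p^2, 2pq, q^2. Counts below are invented.
def hwe_chi_square(n_aa, n_ab, n_bb):
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # frequency of allele A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# 1 degree of freedom; chi-square > 3.84 rejects HWE at the 5% level.
print(round(hwe_chi_square(50, 40, 10), 3))
```

    For highly polymorphic HLA loci with many rare genotypes the asymptotic test breaks down, which is why the system also offers an exact test.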

  16. Realistic Visualization of Virtual Views and Virtual Cinema

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer-generated characters, "virtual actors", in motion picture production increases every day. While the most known computer graphics techniques have largely been adopted successfully in nowadays fictions, it still remains very challenging to implement virtual actors which would resemble, visually, human beings. Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept...

  17. Realistic and efficient 2D crack simulation

    Science.gov (United States)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and efficient/runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination leading to realistic fractal-like rubble/fragments formation. The aforementioned parameters are used as variables of probabilistic models of cracks/shards formation, making the proposed solution highly adaptive by allowing machine learning mechanisms learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics based simulation solutions that produce accurate fractures/shards though at highly non-real time paste. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potentials in military and civil M&S applications such as training and mission planning.

  18. Analysis of Heterogeneous Networks with Dual Connectivity in a Realistic Urban Deployment

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Barcos, Sonia; Wang, Hua

    2015-01-01

    the performance in this realistic layout. Due to the uneven load distribution observed in realistic deployments, DC is able to provide fast load balancing gains also at relatively high load - and not only at low load as typically observed in 3GPP scenarios. For the same reason, the proposed cell selection...

  19. Monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm on permanent plots: sampling methods and statistical properties of data

    Science.gov (United States)

    A.R. Mason; H.G. Paul

    1994-01-01

    Procedures for monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm are recommended based on many years' experience in sampling these species in eastern Oregon and Washington. It is shown that statistically reliable estimates of larval density can be made for a population by sampling host trees in a series of permanent plots in a...

  20. The costs, effects and cost-effectiveness of counteracting overweight on a population level.

    NARCIS (Netherlands)

    Bemelmans, W.; van Baal, P; Wendel-Vos, W.; Schuit, A.J.; Feskens, E.; Ament, A.; Hoogenveen, R.

    2008-01-01

    Objectives: To gain insight into realistic policy targets for overweight at a population level and the accompanying costs. Therefore, the effect on overweight prevalence of large-scale implementation of a community intervention (applied to 90% of the general population) and an intensive

  1. Realistic respiratory motion margins for external beam partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, Leigh; Quirk, Sarah [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Smith, Wendy L., E-mail: wendy.smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Department of Oncology, University of Calgary, Calgary, Alberta T2N 1N4 (Canada)

    2015-09-15

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. 
Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was
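    The convolution step at the heart of the method can be sketched in miniature: blur a static AP dose profile with a respiratory-motion probability density and measure how far the 95% isodose edge shifts. The profile shape, motion probabilities, and grid spacing below are invented stand-ins, not the study's clinical data.

```python
# Sketch of the margin-derivation idea: convolve a static AP dose profile
# with a respiratory-motion probability density, then measure how far the
# 95% isodose edge moves. Profile and motion values are invented.

step = 0.5  # mm per sample (assumed grid spacing)
# idealized dose profile: flat 100% region with linear penumbrae
profile = [0.0] * 10 + [i / 10 for i in range(1, 10)] + [1.0] * 40 + \
          [i / 10 for i in range(9, 0, -1)] + [0.0] * 10

# motion PDF over displacements of -2..+2 samples (end-exhale weighted)
pdf = [0.30, 0.30, 0.20, 0.15, 0.05]

def convolve(signal, kernel):
    """Blur `signal` by `kernel` (centered), truncating at the edges."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += w * signal[j]
        out.append(acc)
    return out

blurred = convolve(profile, pdf)

def edge_95(signal):
    # first index from the left where the dose reaches 95%
    return next(i for i, d in enumerate(signal) if d >= 0.95)

margin_mm = (edge_95(blurred) - edge_95(profile)) * step
print(margin_mm)
```

    Repeating this over a population of measured respiratory traces, scaled to different amplitudes, yields the population-coverage margins and margin formulas the study reports.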

  2. Realistic respiratory motion margins for external beam partial breast irradiation

    International Nuclear Information System (INIS)

    Conroy, Leigh; Quirk, Sarah; Smith, Wendy L.

    2015-01-01

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. 
Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was

  3. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  4. Non-statistical behavior of coupled optical systems

    International Nuclear Information System (INIS)

    Perez, G.; Pando Lambruschini, C.; Sinha, S.; Cerdeira, H.A.

    1991-10-01

    We study globally coupled chaotic maps modeling an optical system, and find clear evidence of non-statistical behavior: the mean square deviation (MSD) of the mean field saturates with respect to increase in the number of elements coupled, after a critical value, and its distribution is clearly non-Gaussian. We also find that the power spectrum of the mean field displays well defined peaks, indicating a subtle coherence among different elements, even in the "turbulent" phase. This system is a physically realistic model that may be experimentally realizable. It is also a higher dimensional example (as each individual element is given by a complex map). Its study confirms that the phenomena observed in a wide class of coupled one-dimensional maps are present here as well. This gives more evidence to believe that such non-statistical behavior is probably generic in globally coupled systems. We also investigate the influence of parametric fluctuations on the MSD. (author). 10 refs, 7 figs, 1 tab
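    The quantity at issue can be probed with a small simulation. The globally coupled logistic maps below are a generic Kaneko-style sketch with invented parameters, not the authors' optical model: for statistical (law-of-large-numbers) behavior the mean-field MSD should fall off like 1/N, so saturation at large N signals the non-statistical regime.

```python
# Sketch of the diagnostic: mean-square deviation (MSD) of the mean field
# of N globally coupled logistic maps x' = 1 - a[(1-eps) x^2 + eps h],
# where h is the instantaneous mean of x^2. Parameters are illustrative.
import random

def mean_field_msd(n, coupling=0.1, a=1.99, steps=1000, burn=300):
    random.seed(0)
    x = [random.random() for _ in range(n)]
    fields = []
    for t in range(steps):
        h = sum(xi * xi for xi in x) / n  # mean field
        x = [1 - a * ((1 - coupling) * xi * xi + coupling * h) for xi in x]
        if t >= burn:
            fields.append(h)
    mean = sum(fields) / len(fields)
    return sum((f - mean) ** 2 for f in fields) / len(fields)

# If MSD scaled statistically it would drop ~100x across this range;
# saturation of the decrease indicates non-statistical behavior.
for n in (10, 100, 1000):
    print(n, mean_field_msd(n))
```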

  5. Statistical Methods for Population Genetic Inference Based on Low-Depth Sequencing Data from Modern and Ancient DNA

    DEFF Research Database (Denmark)

    Korneliussen, Thorfinn Sand

    Due to the recent advances in DNA sequencing technology genomic data are being generated at an unprecedented rate and we are gaining access to entire genomes at population level. The technology does, however, not give direct access to the genetic variation, and the many levels of preprocessing that are required before being able to make inferences from the data introduce multiple levels of uncertainty, especially for low-depth data. Therefore methods that take into account the inherent uncertainty are needed for being able to make robust inferences in the downstream analysis of such data. This poses a problem for a range of key summary statistics within population genetics where existing methods are based on the assumption that the true genotypes are known. Motivated by this I present: 1) a new method for the estimation of relatedness between pairs of individuals, 2) a new method for estimating...

  6. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  7. Low-wave-number statistics of randomly advected passive scalars

    International Nuclear Information System (INIS)

    Kerstein, A.R.; McMurtry, P.A.

    1994-01-01

    A heuristic analysis of the decay of a passive scalar field subject to statistically steady random advection predicts two low-wave-number spectral scaling regimes analogous to the similarity states previously identified by Chasnov [Phys. Fluids 6, 1036 (1994)]. Consequences of their predicted coexistence in a single flow are examined. The analysis is limited to the idealized case of narrow-band advection. To complement the analysis, and to extend the predictions to physically more realistic advection processes, advection-diffusion is simulated using a one-dimensional stochastic model. An experimental test of the predictions is proposed

  8. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore there is ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate the models of TMS due to some difficulties in realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head and the coils in 3D. The generated TMS model is importable to FE software packages such as ANSYS for further and efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  9. Realistic edge field model code REFC for designing and study of isochronous cyclotron

    International Nuclear Information System (INIS)

    Ismail, M.

    1989-01-01

    The focussing properties of and the requirements for isochronism in a cyclotron magnet configuration are well known in the hard edge field model. The fact that they quite often change considerably in a realistic field can be attributed mainly to the influence of the edge field. A solution to this problem requires a field model which allows a simple construction of equilibrium orbits and yields simple formulae. This can be achieved by using a fitted realistic edge field (Hudson et al 1975) in the region of the pole edge, and such a field model is therefore called a realistic edge field model. A code REFC based on the realistic edge field model has been developed to design the cyclotron sectors, and the code FIELDER has been used to study the beam properties. In this report the REFC code has been described along with some relevant explanation of the FIELDER code. (author). 11 refs., 6 figs

  10. Realistic terrain visualization based on 3D virtual world technology

    Science.gov (United States)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  11. Realistic Affective Forecasting: The Role of Personality

    Science.gov (United States)

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-01-01

    Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463

  12. Role of sufficient statistics in stochastic thermodynamics and its implication to sensory adaptation

    Science.gov (United States)

    Matsumoto, Takumi; Sagawa, Takahiro

    2018-04-01

    A sufficient statistic is a significant concept in statistics: a random variable that carries all of the information required for an inference task. We investigate the roles of sufficient statistics and related quantities in stochastic thermodynamics. Specifically, we prove that for general continuous-time bipartite networks, the existence of a sufficient statistic implies that an informational quantity called the sensory capacity takes its maximum. Since the maximal sensory capacity imposes a constraint that the energetic efficiency cannot exceed one-half, our result implies that the existence of a sufficient statistic is inevitably accompanied by energetic dissipation. We also show that, in a particular parameter region of linear Langevin systems, there exists an optimal noise intensity at which the sensory capacity, the information-thermodynamic efficiency, and the total entropy production are optimized at the same time. We apply our general result to a model of sensory adaptation of E. coli and find that the sensory capacity is nearly maximal with experimentally realistic parameters.

  13. Beyond the realist turn: a socio-material analysis of heart failure self-care.

    Science.gov (United States)

    McDougall, Allan; Kinsella, Elizabeth Anne; Goldszmidt, Mark; Harkness, Karen; Strachan, Patricia; Lingard, Lorelei

    2018-01-01

    For patients living with chronic illnesses, self-care has been linked with positive outcomes such as decreased hospitalisation, longer lifespan, and improved quality of life. However, despite calls for more and better self-care interventions, behaviour change trials have repeatedly fallen short on demonstrating effectiveness. The literature on heart failure (HF) stands as a case in point, and a growing body of HF studies advocate realist approaches to self-care research and policymaking. We label this trend the 'realist turn' in HF self-care. Realist evaluation and realist interventions emphasise that the relationship between self-care interventions and positive health outcomes is not fixed, but contingent on social context. This paper argues socio-materiality offers a productive framework to expand on the idea of social context in realist accounts of HF self-care. This study draws on 10 interviews as well as researcher reflections from a larger study exploring health care teams for patients with advanced HF. Leveraging insights from actor-network theory (ANT), this study provides two rich narratives about the contextual factors that influence HF self-care. These descriptions portray not self-care contexts but self-care assemblages, which we discuss in light of socio-materiality. © 2018 Foundation for the Sociology of Health & Illness.

  14. Realistic assessment of doses to members of the public according to council directive 96/29/Euratom and common practice in Germany

    International Nuclear Information System (INIS)

    Hornung-Lauxmann, L.; Steiner, M.

    2003-01-01

    According to Article 45 of Council Directive 96/29/Euratom, the competent authorities of member states are obliged to assess radiation doses to the population as a whole and to reference groups of the population as realistically as possible. In EU member states different approaches are in use, both to identify reference groups and to assess doses. In addition, realistic dose assessments are in general not systematically performed. Hence, the European Commission commissioned a study on the realistic assessment of radiation doses due to radioactive discharges from nuclear installations under normal operation. Based on this study, a guide including recommendations for dose assessment has been established by an Article 31 expert group, which is applicable to retrospective and prospective dose assessment alike. The most important recommendation is to use as much site-specific information as possible. Concerning the dose assessment of reference groups, it is recommended to use critical groups, which comprise the persons who receive the highest effective doses because of their living habits. In the opinion of the EU experts it is sufficient to take into account only the age groups of one-year-olds, ten-year-olds, and adults. Exposure pathways have been subdivided into three groups: those that have to be considered in any case, those that contribute significantly to dose depending on local conditions, and those that contribute only insignificantly to dose. It is recommended to use only dose-assessment models that are robust, appropriate, and validated against measurements. In Germany, (hypothetical) reference persons are used for the assessment of radiation exposure of the population due to radioactive discharges from nuclear installations under normal operation. They are assumed to live at the most disadvantageous location around the site and to eat foodstuff produced at the most disadvantageous locations. For this purpose models, generalized assumptions and

  15. A model independent safeguard against background mismodeling for statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro; Budnik, Ranny [Department of Particle Physics and Astrophysics, Weizmann Institute of Science, Herzl St. 234, Rehovot (Israel); Rauch, Ludwig, E-mail: nadav.priel@weizmann.ac.il, E-mail: rauch@mpi-hd.mpg.de, E-mail: hagar.landsman@weizmann.ac.il, E-mail: alessandro.manfredini@weizmann.ac.il, E-mail: ran.budnik@weizmann.ac.il [Teilchen- und Astroteilchenphysik, Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany)

    2017-05-01

    We propose a safeguard procedure for statistical inference that provides universal protection against mismodeling of the background. The method quantifies and incorporates the signal-like residuals of the background model into the likelihood function, using information available in a calibration dataset. This prevents possible false discovery claims that may arise through unknown mismodeling, and corrects the bias in limit setting created by overestimated or underestimated background. We demonstrate how the method removes the bias created by an incomplete background model using three realistic case studies.
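
    The safeguard idea described above can be sketched in a toy binned-likelihood setting. All shapes, yields, and parameter values below are invented for illustration: a flat assumed background model, a tilted true background, and a signal template in the last bins; the calibration histogram supplies the signal-like residuals that are added to the expectation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
nbins, n_events = 10, 50000

# Hypothetical ingredients: an assumed flat background model, the (unknown)
# tilted true background, and a signal template localized in the last bins.
bkg_model = np.full(nbins, 1.0 / nbins)
bkg_true = np.linspace(0.8, 1.2, nbins); bkg_true /= bkg_true.sum()
sig = np.zeros(nbins); sig[8:] = 0.5          # normalized to unit yield

# A signal-free calibration dataset exposes the model's signal-like residuals.
calib = rng.multinomial(n_events, bkg_true)
residual = calib / calib.sum() - bkg_model    # sums to ~0 by construction

def fit_signal(data, safeguard):
    expected_bkg = n_events * (bkg_model + (residual if safeguard else 0.0))
    def nll(s):
        mu = np.clip(s * sig + expected_bkg, 1e-9, None)
        return np.sum(mu - data * np.log(mu))  # Poisson NLL up to constants
    return minimize_scalar(nll, bounds=(-5000, 5000), method="bounded").x

data = rng.multinomial(n_events, bkg_true)     # background-only "science" run
s_plain = fit_signal(data, safeguard=False)    # biased by the mismodeled shape
s_safe = fit_signal(data, safeguard=True)      # residual term absorbs the bias
```

    In this toy, the unsafeguarded fit reports a spurious signal because the flat model underestimates the tilted background in the signal bins, while the safeguarded fit largely removes that bias.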

  16. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers.

  17. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests of the assumptions of randomness and normality, and provides nonparametric methods for use when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median, while also providing coverage of ratio estimation, randomness, and causal
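
    One topic mentioned above, a confidence interval for a population median, has a simple distribution-free construction based on order statistics. The `median_ci` helper below is an illustrative sketch of that standard recipe, not code from the book:

```python
import numpy as np
from scipy import stats

def median_ci(sample, conf=0.95):
    """Distribution-free CI for the median from binomial order statistics."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # Under the null, the count of observations below the median is Binomial(n, 1/2);
    # k is how many observations may fall outside the interval on each side.
    k = int(stats.binom.ppf((1 - conf) / 2, n, 0.5))
    return x[k], x[n - 1 - k]

low, high = median_ci(range(1, 101))  # sample 1..100, true median 50.5
```

    For the sample 1..100 the interval brackets the median 50.5 without assuming normality.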

  18. Role-playing for more realistic technical skills training.

    Science.gov (United States)

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training seemed neither very realistic nor very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  19. Toward realistic pursuit-evasion using a roadmap-based approach

    KAUST Repository

    Rodriguez, Samuel

    2011-05-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. Our approach can be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility computations that allow evading agents to hide in crowds or behind hills. We demonstrate the utility of this approach on mobile robots and in simulation for a variety of scenarios including pursuit-evasion and tag on terrains, in multi-level buildings, and in crowds. © 2011 IEEE.
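
    The core of a roadmap-based planner like the one described above is shortest-path search over a graph of waypoints. The toy roadmap and the pursuer's "move one waypoint along the shortest path to the evader" step below are illustrative, not the paper's implementation:

```python
from collections import deque

def shortest_path(graph, start, goal):
    # Breadth-first search over an unweighted roadmap graph; returns the
    # waypoint sequence from start to goal, or None if goal is unreachable.
    prev, frontier = {start: None}, deque([start])
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = [node]
            while prev[node] is not None:
                node = prev[node]
                path.append(node)
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in prev:
                prev[nxt] = node
                frontier.append(nxt)
    return None

# Tiny hypothetical roadmap: nodes are waypoints, edges are traversable links.
roadmap = {'A': ['B'], 'B': ['A', 'C', 'D'], 'C': ['B', 'E'],
           'D': ['B', 'E'], 'E': ['C', 'D']}
step = shortest_path(roadmap, 'A', 'E')[1]  # pursuer's next waypoint toward evader
```

    Replanning this step each time the evader's (visible) position changes yields the basic pursuit behaviour; visibility tests decide when the last known position must be used instead.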

  20. Impacts of Realistic Urban Heating, Part I: Spatial Variability of Mean Flow, Turbulent Exchange and Pollutant Dispersion

    Science.gov (United States)

    Nazarian, Negin; Martilli, Alberto; Kleissl, Jan

    2018-03-01

    As urbanization progresses, more realistic methods are required to analyze the urban microclimate. However, given the complexity and computational cost of numerical models, the effects of realistic representations should be evaluated to identify the level of detail required for an accurate analysis. We consider the realistic representation of surface heating in an idealized three-dimensional urban configuration, and evaluate the spatial variability of flow statistics (mean flow and turbulent fluxes) in urban streets. Large-eddy simulations coupled with an urban energy balance model are employed, and the heating distribution of urban surfaces is parametrized using sets of horizontal and vertical Richardson numbers, characterizing thermal stratification and heating orientation with respect to the wind direction. For all studied conditions, the thermal field is strongly affected by the orientation of heating with respect to the airflow. The modification of airflow by the horizontal heating is also pronounced for strongly unstable conditions. The formation of the canyon vortices is affected by the three-dimensional heating distribution in both spanwise and streamwise street canyons, such that the secondary vortex is seen adjacent to the windward wall. For the dispersion field, however, the overall heating of urban surfaces, and more importantly, the vertical temperature gradient, dominate the distribution of concentration and the removal of pollutants from the building canyon. Accordingly, the spatial variability of concentration is not significantly affected by the detailed heating distribution. The analysis is extended to assess the effects of three-dimensional surface heating on turbulent transfer. Quadrant analysis reveals that the differential heating also affects the dominance of ejection and sweep events and the efficiency of turbulent transfer (exuberance) within the street canyon and at the roof level, while the vertical variation of these parameters is less
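
    A Richardson number of the kind used to characterize thermal stratification can be sketched with the generic bulk definition below (buoyancy over shear production). The formula and the numerical values are a textbook illustration, not the paper's exact parametrization:

```python
def bulk_richardson(g, theta_ref, delta_theta, height, wind_speed):
    """Illustrative bulk Richardson number: (g/theta) * dtheta * h / U^2."""
    return (g / theta_ref) * delta_theta * height / wind_speed ** 2

# Heated surface: air 5 K cooler than the wall over a 20 m canyon, 3 m/s wind.
ri = bulk_richardson(9.81, 300.0, -5.0, 20.0, 3.0)  # negative -> unstable
```

    A negative value indicates unstable (convective) stratification; the paper's horizontal and vertical Richardson numbers play the analogous role for heating orientation relative to the wind.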

  1. Technical Basis Document: A Statistical Basis for Interpreting Urinary Excretion of Plutonium Based on Accelerator Mass Spectrometry (AMS) for Selected Atoll Populations in the Marshall Islands

    International Nuclear Information System (INIS)

    Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G

    2007-01-01

    We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from 239Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (239Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of 239Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both

  2. Statistical fluctuations of electromagnetic transition intensities in pf-shell nuclei

    International Nuclear Information System (INIS)

    Hamoudi, A.; Nazmitdinov, R.G.; Shakhaliev, E.; Alhassid, Y.

    2000-01-01

    We study the fluctuation properties of ΔT = 0 electromagnetic transition intensities in A ∼ 60 nuclei within the framework of the interacting shell model, using a realistic effective interaction for pf-shell nuclei with a 56Ni core. It is found that the B(E2) and the ΔJ ≠ 0 distributions are well described by the Gaussian orthogonal ensemble of random matrices (Porter-Thomas distribution) independently of the isobaric quantum number T_Z. However, the statistics of the B(M1) transitions with Δ = 0 are sensitive to T_Z: T_Z = 1 nuclei exhibit a Porter-Thomas distribution, while a significant deviation from the GOE statistics is observed for self-conjugate nuclei (T_Z = 0). Similar results are found for A = 22 sd-shell nuclei
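
    The Porter-Thomas prediction invoked above says transition intensities behave as the square of a single Gaussian amplitude, i.e. they follow a chi-square distribution with one degree of freedom (mean 1, variance 2 when normalized). This can be checked numerically with a generic sampling sketch, which is not the paper's shell-model calculation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Porter-Thomas: intensities are squared Gaussian amplitudes (chi^2 with 1 dof).
amplitudes = rng.normal(size=100_000)
intensities = amplitudes ** 2

mean, var = intensities.mean(), intensities.var()  # theory: mean = 1, var = 2
```

    Deviations from these moments (or from the full chi-square shape) are what signal non-GOE statistics, as seen for the self-conjugate nuclei.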

  3. DATA ON YOUTH, 1967, A STATISTICAL DOCUMENT.

    Science.gov (United States)

    SCHEIDER, GEORGE

    THE DATA IN THIS REPORT ARE STATISTICS ON YOUTH THROUGHOUT THE UNITED STATES AND IN NEW YORK STATE. INCLUDED ARE DATA ON POPULATION, SCHOOL STATISTICS, EMPLOYMENT, FAMILY INCOME, JUVENILE DELINQUENCY AND YOUTH CRIME (INCLUDING NEW YORK CITY FIGURES), AND TRAFFIC ACCIDENTS. THE STATISTICS ARE PRESENTED IN THE TEXT AND IN TABLES AND CHARTS. (NH)

  4. Entrepreneurial Education: A Realistic Alternative for Women and Minorities.

    Science.gov (United States)

    Steward, James F.; Boyd, Daniel R.

    1989-01-01

    Entrepreneurial education is a valid, realistic occupational training alternative for minorities and women in business. Entrepreneurship requires that one become involved with those educational programs that contribute significantly to one's success. (Author)

  5. Time-of-Flight Measurements as a Possible Method to Observe Anyonic Statistics

    Science.gov (United States)

    Umucalılar, R. O.; Macaluso, E.; Comparin, T.; Carusotto, I.

    2018-06-01

    We propose a standard time-of-flight experiment as a method for observing the anyonic statistics of quasiholes in a fractional quantum Hall state of ultracold atoms. The quasihole states can be stably prepared by pinning the quasiholes with localized potentials, and a measurement of the mean square radius of the freely expanding cloud, which is related to the average total angular momentum of the initial state, offers direct signatures of the statistical phase. Our proposed method is validated by Monte Carlo calculations for ν = 1/2 and 1/3 fractional quantum Hall liquids containing a realistic number of particles. Extensions to quantum Hall liquids of light and to non-Abelian anyons are briefly discussed.

  6. Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)

    Directory of Open Access Journals (Sweden)

    Westhorp Gill

    2011-08-01

    Abstract Background There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example, by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors 2. A 'RAMESES' (Realist And Meta-narrative Evidence Syntheses: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open

  7. HIV among immigrants living in high-income countries: a realist review of evidence to guide targeted approaches to behavioural HIV prevention

    Directory of Open Access Journals (Sweden)

    McMahon Tadgh

    2012-11-01

    Abstract Background Immigrants from developing and middle-income countries are an emerging priority in HIV prevention in high-income countries. This may be explained in part by accelerating international migration and population mobility. However, it may also be due to the vulnerabilities of immigrants, including social exclusion along with socioeconomic, cultural and language barriers to HIV prevention. Contemporary thinking on effective HIV prevention stresses the need for targeted approaches that adapt HIV prevention interventions according to the cultural context and population being addressed. This review of evidence sought to generate insights into targeted approaches in this emerging area of HIV prevention. Methods We undertook a realist review to answer the research question: ‘How are HIV prevention interventions in high-income countries adapted to suit immigrants’ needs?’ A key goal was to uncover underlying theories or mechanisms operating in behavioural HIV prevention interventions with immigrants, to uncover explanations as to how and why they work (or not) for particular groups in particular contexts, and thus to refine the underlying theories. The realist review mapped seven initial mechanisms underlying culturally appropriate HIV prevention with immigrants. Evidence from intervention studies and qualitative studies found in systematic searches was then used to test and refine these seven mechanisms. Results Thirty-four intervention studies and 40 qualitative studies contributed to the analysis and synthesis of evidence. The strongest evidence supported the role of ‘consonance’ mechanisms, indicating the pivotal need to incorporate cultural values into the intervention content. Moderate evidence was found to support the role of three other mechanisms – ‘understanding’, ‘specificity’ and ‘embeddedness’ – which indicated that using the language of immigrants, usually the ‘mother tongue’, targeting (in terms

  8. HIV among immigrants living in high-income countries: a realist review of evidence to guide targeted approaches to behavioural HIV prevention

    Science.gov (United States)

    2012-01-01

    Background Immigrants from developing and middle-income countries are an emerging priority in HIV prevention in high-income countries. This may be explained in part by accelerating international migration and population mobility. However, it may also be due to the vulnerabilities of immigrants, including social exclusion along with socioeconomic, cultural and language barriers to HIV prevention. Contemporary thinking on effective HIV prevention stresses the need for targeted approaches that adapt HIV prevention interventions according to the cultural context and population being addressed. This review of evidence sought to generate insights into targeted approaches in this emerging area of HIV prevention. Methods We undertook a realist review to answer the research question: ‘How are HIV prevention interventions in high-income countries adapted to suit immigrants’ needs?’ A key goal was to uncover underlying theories or mechanisms operating in behavioural HIV prevention interventions with immigrants, to uncover explanations as to how and why they work (or not) for particular groups in particular contexts, and thus to refine the underlying theories. The realist review mapped seven initial mechanisms underlying culturally appropriate HIV prevention with immigrants. Evidence from intervention studies and qualitative studies found in systematic searches was then used to test and refine these seven mechanisms. Results Thirty-four intervention studies and 40 qualitative studies contributed to the analysis and synthesis of evidence. The strongest evidence supported the role of ‘consonance’ mechanisms, indicating the pivotal need to incorporate cultural values into the intervention content. Moderate evidence was found to support the role of three other mechanisms – ‘understanding’, ‘specificity’ and ‘embeddedness’ – which indicated that using the language of immigrants, usually the ‘mother tongue’, targeting (in terms of ethnicity) and the use of

  9. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Statistical survey is an effective method of statistical investigation that involves gathering quantitative data; it is often preferred in statistical reports because of the information that can be obtained regarding an entire population by observing only a part of it. Because of the information they provide, surveys are used in many research areas. In economics, they are used in the decision-making process, in choosing competitive strategies, in the analysis of economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling can be used to analyze the existing parking-space situation in a given locality.
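
    The estimation step in such a study can be sketched as follows. The population size, occupancy rate, and sample size below are invented for illustration; the finite population correction reflects sampling without replacement from a known, finite population:

```python
import random

random.seed(1)
# Hypothetical locality: 10,000 parking spaces, 30% of which are occupied.
population = [1] * 3000 + [0] * 7000
N, n = len(population), 400

sample = random.sample(population, n)        # simple random sample, no replacement
p_hat = sum(sample) / n                      # estimated occupancy rate
fpc = ((N - n) / (N - 1)) ** 0.5             # finite population correction
se = (p_hat * (1 - p_hat) / n) ** 0.5 * fpc
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)  # approximate 95% confidence interval
```

    Observing 4% of the spaces yields an estimate of the whole locality's occupancy with a standard error of a couple of percentage points, which is the economy of effort the abstract describes.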

  10. Statistical and signal-processing concepts in surface metrology

    International Nuclear Information System (INIS)

    Church, E.L.; Takacs, P.Z.

    1986-03-01

    This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors
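
    The two quantities in this two-scale description, an RMS slope for figure and a power spectrum for finish, can be estimated from a measured profile roughly as below. The synthetic single-frequency profile and the sampling parameters are invented for illustration:

```python
import numpy as np

N, dx = 1024, 1e-3                             # 1024 samples, 1 mm spacing
x = np.arange(N) * dx
profile = 1e-6 * np.sin(2 * np.pi * 50 * x)    # 1 um amplitude, 50 cycles/m

slope = np.gradient(profile, dx)               # slope profile -> figure error
rms_slope = slope.std()                        # RMS slope (radians)

# One-sided periodogram estimate of the power spectral density -> finish error
psd = (np.abs(np.fft.rfft(profile)) ** 2) * dx / N
freqs = np.fft.rfftfreq(N, dx)                 # spatial frequencies (cycles/m)
peak_freq = freqs[1:][np.argmax(psd[1:])]      # dominant component, skipping DC
```

    In practice the PSD would be combined with the system point spread function, as the abstract notes, to predict image quality from the measured finish.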

  11. Statistical and signal-processing concepts in surface metrology

    Energy Technology Data Exchange (ETDEWEB)

    Church, E.L.; Takacs, P.Z.

    1986-03-01

    This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors.

  12. Nuclear Statistical Equilibrium for compact stars: modelling the nuclear energy functional

    International Nuclear Information System (INIS)

    Aymard, Francois

    2015-01-01

    The core-collapse supernova is one of the most powerful known phenomena in the universe. It results from the explosion of very massive stars after they have burnt all their fuel. The hot compact remnant, the so-called proto-neutron star, cools down to become an inert catalyzed neutron star. The dynamics and structure of compact stars, that is, core-collapse supernovae, proto-neutron stars and neutron stars, are still not fully understood and are currently under active research, in association with astrophysical observations and nuclear experiments. One of the key components for modelling compact stars is the Equation of State. The task of computing a complete, realistic, consistent Equation of State for all such stars is challenging because a wide range of densities, proton fractions and temperatures is spanned. This thesis deals with the microscopic modelling of the structure and internal composition of baryonic matter with nucleonic degrees of freedom in compact stars, in order to obtain a realistic unified Equation of State. In particular, we are interested in a formalism which can be applied both at sub-saturation and super-saturation densities, and which gives in the zero-temperature limit results compatible with the microscopic Hartree-Fock-Bogoliubov theory with modern realistic effective interactions constrained by experimental nuclear data. For this purpose, we present, for sub-saturated matter, a Nuclear Statistical Equilibrium model which corresponds to a statistical superposition of finite configurations, the so-called Wigner-Seitz cells. Each cell contains a nucleus, or cluster, embedded in a homogeneous electron gas as well as a homogeneous neutron and proton gas. Within each cell, we investigate the different components of the nuclear energy of clusters in interaction with gases. The use of the nuclear mean-field theory for the description of both the clusters and the nucleon gas allows a theoretical consistency with the treatment at saturation

  13. Principles of maximally classical and maximally realistic quantum ...

    Indian Academy of Sciences (India)

    Principles of maximally classical and maximally realistic quantum mechanics. S M ROY. Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India. Abstract. Recently Auberson, Mahoux, Roy and Singh have proved a long standing conjecture of Roy and Singh: In 2N-dimensional phase space, ...

  14. Impact of some types of mass gatherings on current suicide risk in an urban population: statistical and negative binominal regression analysis of time series.

    Science.gov (United States)

    Usenko, Vasiliy S; Svirin, Sergey N; Shchekaturov, Yan N; Ponarin, Eduard D

    2014-04-04

    Many studies have investigated the impact of a wide range of social events on suicide-related behaviour. However, these studies have predominantly examined national events. The aim of this study is to provide a statistical evaluation of the relationship between mass gatherings in some relatively small urban sub-populations and the general suicide rates of a major city. The data were gathered in the Ukrainian city of Dnipropetrovsk, with a population of 1 million people, in 2005-2010. Suicide attempts, suicides, and the total number of suicide-related behaviours were registered daily for each sex. Bivariate and multivariate statistical analyses, including negative binomial regression, were applied to assess the risk of suicide-related behaviour in the city's general population for 7 days before and after 427 mass gatherings, such as concerts, football games, and non-regular mass events organized by the Orthodox Church and new religious movements. The bivariate and multivariate statistical analyses found significant changes in some suicide-related behaviour rates in the city's population after certain kinds of mass gatherings. In particular, we observed an increased relative risk (RR) of male suicide-related behaviour after a home defeat of the local football team (RR = 1.32, p = 0.047; regression coefficient beta = 0.371, p = 0.002), as well as an increased risk of male suicides (RR = 1.29, p = 0.006; beta = 0.255, p = 0.002) and of male suicide-related behaviour (RR = 1.25, p = 0.019; beta = 0.251). Although football games and mass events organized by new religious movements involved a relatively small part of the urban population (1.6 and 0.3%, respectively), we observed a significant increase in some suicide-related behaviour rates in the whole population. It is likely that the observed effect on suicide-related behaviour is related to one's personal presence at the event rather than to its broadcast. Our findings can be explained largely in
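
    A minimal sketch of the kind of model used here, a negative binomial regression of daily counts on an event indicator, might look like the following. The data are synthetic and the baseline rate, dispersion, and effect size are all invented; the real study regressed registered daily suicide-related counts on windows around 427 gatherings:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(42)
n_days = 400
after_event = rng.integers(0, 2, n_days).astype(float)  # 1 = within event window
mu_true = np.exp(0.5 + 0.3 * after_event)               # true daily rate, RR = e^0.3

# Overdispersed daily counts; k is the (invented) negative binomial shape.
k = 5.0
counts = rng.negative_binomial(int(k), k / (k + mu_true))

def nll(beta):
    # Log-link negative binomial likelihood with known dispersion.
    mu = np.exp(beta[0] + beta[1] * after_event)
    return -stats.nbinom.logpmf(counts, k, k / (k + mu)).sum()

res = optimize.minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
rate_ratio = np.exp(res.x[1])   # estimated RR: event-window days vs. other days
```

    Exponentiating the event coefficient gives the relative risk reported in the abstract (e.g. RR = 1.32 for a home defeat); negative binomial rather than Poisson regression accommodates the day-to-day overdispersion typical of such counts.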

  15. Segmenting CT prostate images using population and patient-specific statistics for radiotherapy

    International Nuclear Information System (INIS)

    Feng, Qianjin; Foskey, Mark; Chen Wufan; Shen Dinggang

    2010-01-01

    Purpose: In the segmentation of sequential treatment-time CT prostate images acquired in image-guided radiotherapy, accurately capturing the intrapatient variation of the patient under therapy is more important than capturing interpatient variation. However, using the traditional deformable-model-based segmentation methods, it is difficult to capture intrapatient variation when the number of samples from the same patient is limited. This article presents a new deformable model, designed specifically for segmenting sequential CT images of the prostate, which leverages both population and patient-specific statistics to accurately capture the intrapatient variation of the patient under therapy. Methods: The novelty of the proposed method is twofold: First, a weighted combination of gradient and probability distribution function (PDF) features is used to build the appearance model that guides model deformation. The strengths of each feature type are emphasized by dynamically adjusting the weight between the profile-based gradient features and the local-region-based PDF features during the optimization process. An additional novel aspect of the gradient-based features is that, to alleviate the effect of feature inconsistency in the regions of gas and bone adjacent to the prostate, the optimal profile length at each landmark is calculated by statistically investigating the intensity profile in the training set. The resulting gradient-PDF combined feature produces more accurate and robust segmentations than general gradient features. Second, an online learning mechanism is used to build shape and appearance statistics for accurately capturing intrapatient variation. Results: The performance of the proposed method was evaluated on 306 images of 24 patients. Compared to traditional gradient features, the proposed gradient-PDF combination features brought a 5.2% increase in the segmentation success ratio (from 94.1% to 99.3%). To evaluate the effectiveness of online
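
    The weighted gradient-plus-PDF appearance term described above can be sketched generically. Everything here is a hypothetical stand-in for the paper's actual features: a mean-squared gradient-profile mismatch, a Bhattacharyya-style histogram distance for the PDF term, and a weight w that is adjusted during optimization:

```python
import numpy as np

def appearance_energy(grad_profile, model_grad, local_hist, model_hist, w):
    """Hypothetical sketch of a dynamically weighted gradient + PDF appearance term."""
    # Gradient term: mean-squared mismatch of the intensity-gradient profile
    # sampled along the boundary normal at a landmark.
    grad_term = float(np.mean((np.asarray(grad_profile) - np.asarray(model_grad)) ** 2))
    # PDF term: Bhattacharyya-style distance between normalized local and model
    # intensity histograms (0 when the histograms coincide).
    bc = float(np.sum(np.sqrt(np.asarray(local_hist) * np.asarray(model_hist))))
    pdf_term = -np.log(max(bc, 1e-12))
    # w in [0, 1] shifts trust between the two feature types during optimization.
    return (1 - w) * grad_term + w * pdf_term

e = appearance_energy([0.1, 0.2], [0.1, 0.2], [0.5, 0.5], [0.5, 0.5], 0.5)
```

    A perfectly matching landmark scores zero energy, and mismatched gradients or histograms raise it, so minimizing this term drives the deformable model toward the learned appearance.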

  16. Segmenting CT prostate images using population and patient-specific statistics for radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Qianjin; Foskey, Mark; Chen Wufan; Shen Dinggang [Biomedical Engineering College, South Medical University, Guangzhou (China) and Department of Radiology, University of North Carolina, Chapel Hill, North Carolina 27510 (United States); Department of Radiation Oncology, University of North Carolina, Chapel Hill, North Carolina 27599 (United States); Biomedical Engineering College, South Medical University, Guangzhou 510510 (China); Department of Radiology, University of North Carolina, Chapel Hill, North Carolina 27510 (United States)

    2010-08-15

Purpose: In the segmentation of sequential treatment-time CT prostate images acquired in image-guided radiotherapy, accurately capturing the intrapatient variation of the patient under therapy is more important than capturing interpatient variation. However, using traditional deformable-model-based segmentation methods, it is difficult to capture intrapatient variation when the number of samples from the same patient is limited. This article presents a new deformable model, designed specifically for segmenting sequential CT images of the prostate, which leverages both population and patient-specific statistics to accurately capture the intrapatient variation of the patient under therapy. Methods: The novelty of the proposed method is twofold: First, a weighted combination of gradient and probability distribution function (PDF) features is used to build the appearance model to guide model deformation. The strengths of each feature type are emphasized by dynamically adjusting the weight between the profile-based gradient features and the local-region-based PDF features during the optimization process. An additional novel aspect of the gradient-based features is that, to alleviate the effect of feature inconsistency in the regions of gas and bone adjacent to the prostate, the optimal profile length at each landmark is calculated by statistically investigating the intensity profile in the training set. The resulting gradient-PDF combined feature produces more accurate and robust segmentations than general gradient features. Second, an online learning mechanism is used to build shape and appearance statistics for accurately capturing intrapatient variation. Results: The performance of the proposed method was evaluated on 306 images of 24 patients. Compared to traditional gradient features, the proposed gradient-PDF combination features brought a 5.2% increase in the segmentation success ratio (from 94.1% to 99.3%). To evaluate the effectiveness of online

  17. Toward the M(F)--Theory Embedding of Realistic Free-Fermion Models

    CERN Document Server

    Berglund, P; Faraggi, A E; Nanopoulos, Dimitri V; Qiu, Z; Berglund, Per; Ellis, John; Faraggi, Alon E.; Qiu, Zongan

    1998-01-01

We construct a Landau-Ginzburg model with the same data and symmetries as a $Z_2\times Z_2$ orbifold that corresponds to a class of realistic free-fermion models. Within the class of interest, we show that this orbifolding connects different $Z_2\times Z_2$ orbifold models and commutes with the mirror symmetry. Our work suggests that duality symmetries previously discussed in the context of specific $M$ and $F$ theory compactifications may be extended to the special $Z_2\times Z_2$ orbifold that characterizes realistic free-fermion models.

  18. Ultra-realistic imaging advanced techniques in analogue and digital colour holography

    CERN Document Server

    Bjelkhagen, Hans

    2013-01-01

    Ultra-high resolution holograms are now finding commercial and industrial applications in such areas as holographic maps, 3D medical imaging, and consumer devices. Ultra-Realistic Imaging: Advanced Techniques in Analogue and Digital Colour Holography brings together a comprehensive discussion of key methods that enable holography to be used as a technique of ultra-realistic imaging.After a historical review of progress in holography, the book: Discusses CW recording lasers, pulsed holography lasers, and reviews optical designs for many of the principal laser types with emphasis on attaining th

  19. Statistical guidelines for detecting past population shifts using ancient DNA

    DEFF Research Database (Denmark)

    Mourier, Tobias; Ho, Simon Y. W.; Gilbert, Tom

    2012-01-01

    Populations carry a genetic signal of their demographic past, providing an opportunity for investigating the processes that shaped their evolution. Our ability to infer population histories can be enhanced by including ancient DNA data. Using serial-coalescent simulations and a range of both...... quantitative and temporal sampling schemes, we test the power of ancient mitochondrial sequences and nuclear single-nucleotide polymorphisms (SNPs) to detect past population bottlenecks. Within our simulated framework, mitochondrial sequences have only limited power to detect subtle bottlenecks and/or fast...... results provide useful guidelines for scaling sampling schemes and for optimizing our ability to infer past population dynamics. In addition, our results suggest that many ancient DNA studies may face power issues in detecting moderate demographic collapses and/or highly dynamic demographic shifts when...

  20. U.S. Population Data 1969-2016 - SEER Population Data

    Science.gov (United States)

    Download county population estimates used in SEER*Stat to calculate cancer incidence and mortality rates. The estimates are a modification of the U.S. Census Bureau's Population Estimates Program, in collaboration with National Center for Health Statistics.

  1. A robust statistical method for association-based eQTL analysis.

    Directory of Open Access Journals (Sweden)

    Ning Jiang

It has been well established that the theoretical kernel of the recently surging genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. Whilst many methods have been proposed to correct for its influence, either by predicting the structure parameters or by correcting the inflation in the test statistic due to stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS arising from population structure by incorporating a control marker into the test for significance of genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and accounts properly for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers improved statistical power for detecting genuine genetic association in subpopulations and effective control of spurious associations stemming from population structure when compared with two other popularly implemented methods in the GWAS literature.

  2. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms

    International Nuclear Information System (INIS)

    Tang Jie; Nett, Brian E; Chen Guanghong

    2009-01-01

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms for a constant undersampling factor comparing different algorithms at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.
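
The total-variation prior at the core of the CS method can be illustrated in isolation on a denoising toy problem. The sketch below (NumPy, with made-up parameter values) runs gradient descent on a smoothed TV objective; it is not the paper's statistical reconstruction framework, which fits measured projection data with a physical noise model:

```python
import numpy as np

def tv_denoise(noisy, lam=0.1, step=0.1, iters=200, eps=1e-2):
    """Gradient descent on the smoothed TV objective
    0.5*||u - f||^2 + lam * sum sqrt(|grad u|^2 + eps)."""
    u = noisy.astype(float).copy()
    for _ in range(iters):
        # forward differences; duplicating the last row/column gives a
        # zero gradient at the far boundary
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
        px, py = gx / mag, gy / mag
        # divergence (negative adjoint of the forward difference)
        div = np.diff(px, axis=1, prepend=0.0) + np.diff(py, axis=0, prepend=0.0)
        u -= step * ((u - noisy) - lam * div)
    return u

# piecewise-constant phantom with one edge, plus Gaussian noise
rng = np.random.default_rng(7)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0
noisy = clean + rng.normal(0.0, 0.2, clean.shape)
denoised = tv_denoise(noisy)
rmse = lambda img: float(np.sqrt(np.mean((img - clean) ** 2)))
print(rmse(denoised) < rmse(noisy))  # TV suppresses noise while keeping the edge
```

Because TV favors piecewise-constant images, over-weighting the penalty produces exactly the "patchy" appearance the study reports at very low dose.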

  3. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  4. Powerful Inference With the D-Statistic on Low-Coverage Whole-Genome Data

    DEFF Research Database (Denmark)

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2018-01-01

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness...... is assessed by evaluating specific coincidences of alleles between the groups. When working with high throughput sequencing data calling genotypes accurately is not always possible, therefore the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring...... much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction...
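
The allele-pattern counting behind the D-statistic can be sketched directly from derived-allele frequencies. This minimal version (hypothetical data, one frequency per population per site) shows the ABBA-BABA asymmetry test; the paper's improvement of using all reads from multiple individuals is not reproduced here:

```python
import numpy as np

def d_statistic(p1, p2, p3, p4):
    """ABBA-BABA D-statistic from derived-allele frequencies per site,
    for the tree (((H1, H2), H3), H4) with H4 as the outgroup."""
    abba = (1 - p1) * p2 * p3 * (1 - p4)  # H2 and H3 share the derived allele
    baba = p1 * (1 - p2) * p3 * (1 - p4)  # H1 and H3 share the derived allele
    return np.sum(abba - baba) / np.sum(abba + baba)

# hypothetical data: H1 and H2 have identical frequencies (no gene flow
# into either), outgroup fixed for the ancestral allele
rng = np.random.default_rng(0)
f = rng.uniform(0.05, 0.95, 10_000)
p3 = rng.uniform(0.05, 0.95, 10_000)
d = d_statistic(f, f, p3, np.zeros(10_000))
print(round(d, 3))  # → 0.0: no asymmetric allele sharing between H1 and H2
```

A significantly nonzero D (an excess of ABBA over BABA sites, or vice versa) is what signals ancient admixture between H3 and one of H1/H2.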

  5. On the impacts of coarse-scale models of realistic roughness on a forward-facing step turbulent flow

    International Nuclear Information System (INIS)

    Wu, Yanhua; Ren, Huiying

    2013-01-01

Highlights: ► Discrete wavelet transform was used to produce coarse-scale models of roughness. ► PIV measurements were performed in a forward-facing step flow with roughness of different scales. ► Impacts of roughness scales on various turbulence statistics were studied. -- Abstract: The present work explores the impacts of coarse-scale models of realistic roughness on the turbulent boundary layers over forward-facing steps. The surface topographies of different scale resolutions were obtained from a novel multi-resolution analysis using the discrete wavelet transform. PIV measurements are performed in the streamwise–wall-normal (x–y) planes at two different spanwise positions in turbulent boundary layers at Re_h = 3450 and δ/h = 8, where h is the mean step height and δ is the incoming boundary layer thickness. It was observed that large-scale but low-amplitude roughness scales had small effects on the forward-facing step turbulent flow. For the higher-resolution model of the roughness, the turbulence characteristics within 2h downstream of the steps are observed to be distinct from those over the original realistic rough step at a measurement position where the roughness profile possesses a positive slope immediately after the step’s front. On the other hand, much smaller differences exist in the flow characteristics at the other measurement position, whose roughness profile possesses a negative slope following the step’s front.

  6. Realistic retrospective dose assessments to members of the public around Spanish nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, M.A., E-mail: majg@csn.es [Consejo de Seguridad Nuclear (CSN), Pedro Justo Dorado Dellmans 11, E-28040 Madrid (Spain); Martin-Valdepenas, J.M.; Garcia-Talavera, M.; Martin-Matarranz, J.L.; Salas, M.R.; Serrano, J.I.; Ramos, L.M. [Consejo de Seguridad Nuclear (CSN), Pedro Justo Dorado Dellmans 11, E-28040 Madrid (Spain)

    2011-11-15

In the frame of an epidemiological study carried out in the influence areas around the Spanish nuclear facilities (ISCIII-CSN, 2009. Epidemiological Study of The Possible Effect of Ionizing Radiations Deriving from The Operation of Spanish Nuclear Fuel Cycle Facilities on The Health of The Population Living in Their Vicinity. Final report December 2009. Ministerio de Ciencia e Innovacion, Instituto de Salud Carlos III, Consejo de Seguridad Nuclear. Madrid. Available from: (http://www.csn.es/images/stories/actualidad_datos/especiales/epidemiologico/epidemiological_study.pdf)), annual effective doses to the public have been assessed by the Spanish Nuclear Safety Council (CSN) for over 45 years using a retrospective realistic-dose methodology. These values are compared with data from natural radiation exposure. For the affected population, natural radiation effective doses are on average 2300 times higher than effective doses due to the operation of nuclear installations (nuclear power stations and fuel cycle facilities). When considering the impact on the whole Spanish population, effective doses attributable to nuclear facilities represent on average 3.5 x 10^-5 mSv/y, in contrast to 1.6 mSv/y from natural radiation or 1.3 mSv/y from medical exposures. - Highlights: > Most comprehensive dose assessment to the public by nuclear facilities ever done in Spain. > Dose to the public is dominated by liquid effluent pathways for the power stations. > Dose to the public is dominated by Rn inhalation for milling and mining facilities. > Average annual doses to the public in influence areas are negligible (10 μSv/y or less). > Doses from facilities average 3.5 x 10^-2 μSv/y per person over the whole Spanish population.

  7. Realistic retrospective dose assessments to members of the public around Spanish nuclear facilities

    International Nuclear Information System (INIS)

    Jimenez, M.A.; Martin-Valdepenas, J.M.; Garcia-Talavera, M.; Martin-Matarranz, J.L.; Salas, M.R.; Serrano, J.I.; Ramos, L.M.

    2011-01-01

In the frame of an epidemiological study carried out in the influence areas around the Spanish nuclear facilities (ISCIII-CSN, 2009. Epidemiological Study of The Possible Effect of Ionizing Radiations Deriving from The Operation of Spanish Nuclear Fuel Cycle Facilities on The Health of The Population Living in Their Vicinity. Final report December 2009. Ministerio de Ciencia e Innovacion, Instituto de Salud Carlos III, Consejo de Seguridad Nuclear. Madrid. Available from: (http://www.csn.es/images/stories/actualidad_datos/especiales/epidemiologico/epidemiological_study.pdf)), annual effective doses to the public have been assessed by the Spanish Nuclear Safety Council (CSN) for over 45 years using a retrospective realistic-dose methodology. These values are compared with data from natural radiation exposure. For the affected population, natural radiation effective doses are on average 2300 times higher than effective doses due to the operation of nuclear installations (nuclear power stations and fuel cycle facilities). When considering the impact on the whole Spanish population, effective doses attributable to nuclear facilities represent on average 3.5 x 10^-5 mSv/y, in contrast to 1.6 mSv/y from natural radiation or 1.3 mSv/y from medical exposures. - Highlights: → Most comprehensive dose assessment to the public by nuclear facilities ever done in Spain. → Dose to the public is dominated by liquid effluent pathways for the power stations. → Dose to the public is dominated by Rn inhalation for milling and mining facilities. → Average annual doses to the public in influence areas are negligible (10 μSv/y or less). → Doses from facilities average 3.5 x 10^-2 μSv/y per person over the whole Spanish population.

  8. Characteristics of level-spacing statistics in chaotic graphene billiards.

    Science.gov (United States)

    Huang, Liang; Lai, Ying-Cheng; Grebogi, Celso

    2011-03-01

    A fundamental result in nonrelativistic quantum nonlinear dynamics is that the spectral statistics of quantum systems that possess no geometric symmetry, but whose classical dynamics are chaotic, are described by those of the Gaussian orthogonal ensemble (GOE) or the Gaussian unitary ensemble (GUE), in the presence or absence of time-reversal symmetry, respectively. For massless spin-half particles such as neutrinos in relativistic quantum mechanics in a chaotic billiard, the seminal work of Berry and Mondragon established the GUE nature of the level-spacing statistics, due to the combination of the chirality of Dirac particles and the confinement, which breaks the time-reversal symmetry. A question is whether the GOE or the GUE statistics can be observed in experimentally accessible, relativistic quantum systems. We demonstrate, using graphene confinements in which the quasiparticle motions are governed by the Dirac equation in the low-energy regime, that the level-spacing statistics are persistently those of GOE random matrices. We present extensive numerical evidence obtained from the tight-binding approach and a physical explanation for the GOE statistics. We also find that the presence of a weak magnetic field switches the statistics to those of GUE. For a strong magnetic field, Landau levels become influential, causing the level-spacing distribution to deviate markedly from the random-matrix predictions. Issues addressed also include the effects of a number of realistic factors on level-spacing statistics such as next nearest-neighbor interactions, different lattice orientations, enhanced hopping energy for atoms on the boundary, and staggered potential due to graphene-substrate interactions.
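
The GOE spacing statistics described above can be checked numerically without any graphene model: diagonalize a random real symmetric matrix and compare the nearest-neighbor spacings with the Wigner surmise P(s) = (pi/2) s exp(-pi s^2/4), whose hallmark is level repulsion, P(0) = 0. A rough NumPy sketch with crude unfolding (rescaling bulk spacings to unit mean), not the paper's tight-binding calculation:

```python
import numpy as np

def goe_spacings(n, rng):
    """Nearest-neighbor eigenvalue spacings of one GOE matrix,
    rescaled to unit mean over the spectral bulk (crude unfolding)."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / 2                     # real symmetric -> GOE
    spacings = np.diff(np.linalg.eigvalsh(h))
    bulk = spacings[n // 4 : 3 * n // 4]  # drop the spectrum edges
    return bulk / bulk.mean()

rng = np.random.default_rng(1)
s = np.concatenate([goe_spacings(400, rng) for _ in range(20)])

# Wigner surmise for GOE: P(s) = (pi/2) s exp(-pi s^2/4), so P(0) = 0.
# Poisson spacings would put ~9.5% of the mass below s = 0.1; the GOE
# surmise puts only ~0.8% there.
print(round(s.mean(), 2))  # → 1.0 (unit mean by construction)
print(np.mean(s < 0.1))    # small fraction: level repulsion
```

Repeating the experiment with complex Hermitian matrices (GUE) would show even stronger repulsion, P(s) ~ s^2, matching the magnetic-field case discussed in the abstract.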

  9. Realistic electricity market simulator for energy and economic studies

    International Nuclear Information System (INIS)

    Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul

    2007-01-01

    Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)
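
The clearing step described above (producers and consumers submit quantity-price bids; the Market Operator produces a single price per hour) can be sketched as a merit-order intersection. The convention below (marginal price taken from the last accepted supply bid) is a generic textbook choice, not the actual day-ahead rules of the Spanish market:

```python
def clear_market(supply_bids, demand_bids):
    """Uniform-price clearing of one hour: match the cheapest supply with
    the highest-value demand until the bids no longer cross.

    Bids are (quantity_MW, price_per_MWh) tuples; returns (cleared_MW, price).
    """
    supply = sorted(supply_bids, key=lambda b: b[1])                 # ascending price
    demand = sorted(demand_bids, key=lambda b: b[1], reverse=True)   # descending price
    cleared, price = 0.0, None
    si = di = 0
    s_left = d_left = 0.0
    while True:
        if s_left == 0:
            if si == len(supply):
                break
            s_left, s_price = supply[si]; si += 1
        if d_left == 0:
            if di == len(demand):
                break
            d_left, d_price = demand[di]; di += 1
        if s_price > d_price:      # bids no longer cross
            break
        q = min(s_left, d_left)    # trade the overlapping quantity
        cleared += q; s_left -= q; d_left -= q
        price = s_price            # convention: price of last accepted supply bid
    return cleared, price

# toy hour: three generators, two consumers (hypothetical bids)
supply = [(100, 20.0), (50, 35.0), (80, 60.0)]
demand = [(120, 50.0), (60, 30.0)]
result = clear_market(supply, demand)
print(result)  # → (120.0, 35.0): 120 MW trade at the marginal supply price
```

The expensive 60 EUR/MWh generator and the 30 EUR/MWh consumer are left out of the trade, which is the merit-order effect the simulator's bidding rules are built around.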

  10. Estimation of population dose from all sources in Japan

    International Nuclear Information System (INIS)

    Kusama, Tomoko; Nakagawa, Takeo; Kai, Michiaki; Yoshizawa, Yasuo

    1988-01-01

The purposes of estimating population doses are to understand the per-caput doses to members of the public from each artificial radiation source and to determine the proportion that each individual source contributes to the total population dose. We divided population doses into two categories: individual-related and source-related. The individual-related population dose is estimated on the basis of maximal assumptions, for use in allocating the dose limits for members of the public. The source-related population dose is estimated both to justify sources and practices and to optimize radiation protection; it should therefore be estimated as realistically as possible. From these points of view, we investigated all sources that cause exposure of the population in Japan.

  11. Fully Realistic Multi-Criteria Multi-Modal Routing

    OpenAIRE

    Gündling, Felix; Keyhani, Mohammad Hossein; Schnee, Mathias; Weihe, Karsten

    2014-01-01

    We report on a multi-criteria search system, in which the German long- and short-distance trains, local public transport, walking, private car, private bike, and taxi are incorporated. The system is fully realistic. Three optimization criteria are addressed: travel time, travel cost, and convenience. Our algorithmic approach computes a complete Pareto set of reasonable connections. The computational study demonstrates that, even in such a large-scale, highly complex scenario, approp...

  12. Quantum cryptography: towards realization in realistic conditions

    International Nuclear Information System (INIS)

    Imoto, M.; Koashi, M.; Shimizu, K.; Huttner, B.

    1997-01-01

Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography that takes realistic conditions into account. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author)

  13. Using Microsoft Excel to teach statistics in a graduate advanced practice nursing program.

    Science.gov (United States)

    DiMaria-Ghalili, Rose Ann; Ostrow, C Lynne

    2009-02-01

    This article describes the authors' experiences during 3 years of using Microsoft Excel to teach graduate-level statistics, as part of the research core required by the American Association of Colleges of Nursing for all professional graduate nursing programs. The advantages to using this program instead of specialized statistical programs are ease of accessibility, increased transferability of skills, and reduced cost for students. The authors share their insight about realistic goals for teaching statistics to master's-level students and the resources that are available to faculty to help them to learn and use Excel in their courses. Several online sites that are excellent resources for both faculty and students are discussed. Detailed attention is given to an online course (Carnegie-Mellon University Open Learning Initiative, n.d.), which the authors have incorporated into their graduate-level research methods course.

  14. A possible definition of a {\it Realistic} Physics Theory

    OpenAIRE

    Gisin, Nicolas

    2014-01-01

A definition of a {\it Realistic} Physics Theory is proposed based on the idea that, at all times, the set of physical properties possessed (at that time) by a system should unequivocally determine the probabilities of outcomes of all possible measurements.

  15. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  16. Student Work Experience: A Realistic Approach to Merchandising Education.

    Science.gov (United States)

    Horridge, Patricia; And Others

    1980-01-01

    Relevant and realistic experiences are needed to prepare the student for a future career. Addresses the results of a survey of colleges and universities in the United States in regard to their student work experience (SWE) in fashion merchandising. (Author)

  17. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard image analysis methods non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
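
A first-order version of such a speckle simulation can be sketched with multiplicative Rayleigh noise on an anatomical template. This is a common simplification, not the phantom's actual texture generator (which would convolve a scatterer map with the system's point-spread function); the geometry and tissue values below are made up:

```python
import numpy as np

def add_speckle(image, rng, scale=0.5):
    """Fully developed multiplicative speckle: each pixel is multiplied by
    an independent Rayleigh-distributed factor (first-order model only; a
    real simulator would convolve scatterers with the system PSF)."""
    return image * rng.rayleigh(scale=scale, size=image.shape)

# toy anatomy: a brighter ellipse ("muscle") on a darker background ("fat")
rng = np.random.default_rng(42)
y, x = np.mgrid[0:128, 0:128]
ellipse = ((x - 64) / 40.0) ** 2 + ((y - 64) / 25.0) ** 2 < 1
phantom = np.where(ellipse, 0.8, 0.3)
speckled = add_speckle(phantom, rng)

# Rayleigh speckle has a constant point SNR (mean/std = sqrt(pi/(4-pi)),
# about 1.91) independent of echogenicity; checking it is one of the
# quantitative validations of simulated speckle statistics
inside = speckled[ellipse]
print(round(float(inside.mean() / inside.std()), 2))
```

A measured region-wise SNR near 1.91 is the classic signature of fully developed speckle; deviations indicate a different scattering regime (and hence a different PDF model).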

  18. Building Realistic Mobility Models for Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Adrian Pullin

    2018-04-01

A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs with the aim of producing more realistic simulation models and thereby more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied. To

  19. The New Migration Statistics: A Good Choice made by the INE (Spanish Institute for National Statistics) [ENG]

    Directory of Open Access Journals (Sweden)

    Carmen Ródenas

    2013-01-01

The Spanish Institute for National Statistics (INE) has decided to create new Migration Statistics (Estadística de Migraciones) based upon the Residential Variation Statistics (Estadística de Variaciones Residenciales). This article presents arguments to support this decision, in view of the continued lack of consistency found among the sources of the Spanish statistics system for measuring population mobility. Specifically, an insight is provided into the problems of underestimation and internal inconsistency in the Spanish Labour Force Survey when measuring immigration rates, based upon discrepancies identified in the three international immigration flow series produced by this survey.

  20. Diversity of Poissonian populations.

    Science.gov (United States)

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

Populations represented by collections of points scattered randomly on the real line are ubiquitous in science and engineering. The statistical modeling of such populations leads naturally to Poissonian populations: Poisson processes on the real line with a distinguished maximal point. Poissonian populations are infinite objects underlying key issues in statistical physics, probability theory, and random fractals. Due to their infiniteness, measuring the diversity of Poissonian populations depends on the lower-bound cut-off applied. This research characterizes the classes of Poissonian populations whose diversities are invariant with respect to the cut-off level applied and establishes an elemental connection between these classes and extreme-value theory. The measures of diversity considered are variance and dispersion, Simpson's index and inverse participation ratio, Shannon's entropy and Rényi's entropy, and Gini's index.
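
For a finite sample, the diversity measures listed above reduce to simple formulas over normalized abundances. A minimal sketch (finite counts only; the paper's cut-off analysis for infinite Poissonian populations is not reproduced):

```python
import numpy as np

def diversity_indices(abundances):
    """Diversity measures for a finite population given per-species counts:
    Simpson's index, inverse participation ratio (effective species number),
    Shannon entropy (nats), and Renyi entropy of order 2."""
    p = np.asarray(abundances, dtype=float)
    p = p / p.sum()
    simpson = float(np.sum(p ** 2))  # probability that two draws coincide
    ipr = 1.0 / simpson              # effective number of species
    pos = p[p > 0]
    shannon = float(-np.sum(pos * np.log(pos)))
    renyi2 = float(-np.log(simpson))  # Renyi entropy of order 2
    return simpson, ipr, shannon, renyi2

# uniform population of 4 equally abundant species: every measure reports
# "4 effective species" (both entropies equal log 4)
si, ipr, h, r2 = diversity_indices([5, 5, 5, 5])
print(round(si, 2), round(ipr, 2), round(h, 3), round(r2, 3))  # → 0.25 4.0 1.386 1.386
```

Skewing the counts (e.g. `[17, 1, 1, 1]`) drives Simpson's index up and all entropies down, which is the qualitative behavior whose cut-off invariance the paper characterizes.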

  1. Non realist tendencies in new Turkish cinema

    OpenAIRE

    Can, İclal

    2016-01-01

    http://hdl.handle.net/11693/29111 Thesis (M.S.): Bilkent University, Department of Communication and Design, İhsan Doğramacı Bilkent University, 2016. Includes bibliographical references (leaves 113-123). The realist tendency which had been dominant in cinema became more apparent with Italian neorealism affecting other national cinemas to a large extent. With the changing and developing socio economic and cultural dynamics, realism gradually has stopped being a natural const...

  2. Security of quantum cryptography with realistic sources

    International Nuclear Information System (INIS)

    Lutkenhaus, N.

    1999-01-01

The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need for a precise security statement that adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting, within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results showing that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)

  3. Quantum cryptography: towards realization in realistic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Imoto, M; Koashi, M; Shimizu, K [NTT Basic Research Laboratories, 3-1 Morinosato-Wakamiya, Atsugi-shi, Kanagawa 243-01 (Japan)]; Huttner, B [Universite de Geneve, GAP-optique, 20, Rue de l'Ecole de Medecine CH1211, Geneve 4 (Switzerland)]

    1997-05-11

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography that takes realistic conditions into account. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author) 15 refs., 1 fig., 1 tab.

  4. Security of quantum cryptography with realistic sources

    Energy Technology Data Exchange (ETDEWEB)

    Lutkenhaus, N [Helsinki Institute of Physics, P.O. Box 9, 00014 Helsingin yliopisto (Finland)

    1999-08-01

    The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need for a precise security statement that adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting, within the scenario of security against individual attacks by an eavesdropper. It illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)

  5. Rethinking Mathematics Teaching in Liberia: Realistic Mathematics Education

    Science.gov (United States)

    Stemn, Blidi S.

    2017-01-01

    In some African cultures, the concept of division does not necessarily mean sharing money or an item equally. How an item is shared might depend on the ages of the individuals involved. This article describes the use of the Realistic Mathematics Education (RME) approach to teach division word problems involving money in a 3rd-grade class in…

  6. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...

  7. Realistic full wave modeling of focal plane array pixels.

    Energy Technology Data Exchange (ETDEWEB)

    Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.

    2017-11-01

    Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.

  8. Facilities upgrade for natural forces: traditional vs. realistic approach

    International Nuclear Information System (INIS)

    Terkun, V.

    1985-01-01

    The traditional method utilized for upgrading existing buildings and equipment involves the following steps: perform a structural study using finite element analysis and some in situ testing; compare predicted member forces/stresses to material code allowables; determine strengthening schemes for those structural members judged to be weak; and estimate the cost of the required upgrades. This approach will result in structural modifications that are not only conservative but very expensive as well. The realistic structural evaluation approach uses traditional data to predict structural weaknesses as a final step. Next, using the considerable information now available for buildings and equipment exposed to natural hazards, engineering judgments about the structures being evaluated can be made with a great deal of confidence. This approach does not eliminate conservatism entirely, but it does reduce it to a reasonable and realistic level. As a result, the upgrade cost goes down without compromising the low risk necessary for vital facilities.

  9. SEER Statistics | DCCPS/NCI/NIH

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  10. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  11. STATISTICAL LITERACY: EXCESSIVE REQUIREMENT OR TODAY'S NECESSITY

    Directory of Open Access Journals (Sweden)

    M. Potapova

    2014-04-01

    The purpose of this paper is to investigate the concept of population literacy and its evolution in line with present-day requirements. The approaches of scholars to the multifaceted literacy of a population, and its necessity, are considered. Special attention is paid to the statistical literacy of the population and its necessity in the life of every modern person.

  12. Gauge coupling unification in realistic free-fermionic string models

    International Nuclear Information System (INIS)

    Dienes, K.R.; Faraggi, A.E.

    1995-01-01

    We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)

  13. Statistical representation of quantum states

    Energy Technology Data Exchange (ETDEWEB)

    Montina, A [Dipartimento di Fisica, Universita di Firenze, Via Sansone 1, 50019 Sesto Fiorentino (Italy)

    2007-05-15

    In the standard interpretation of quantum mechanics, the state is described by an abstract wave function in the representation space. Conversely, in a realistic interpretation, the quantum state is replaced by a probability distribution of physical quantities. Bohm mechanics is a consistent example of realistic theory, where the wave function and the particle positions are classically defined quantities. Recently, we proved that the probability distribution in a realistic theory cannot be a quadratic function of the quantum state, in contrast to the apparently obvious suggestion given by the Born rule for transition probabilities. Here, we provide a simplified version of this proof.

  14. The Electrostatic Instability for Realistic Pair Distributions in Blazar/EBL Cascades

    Science.gov (United States)

    Vafin, S.; Rafighi, I.; Pohl, M.; Niemiec, J.

    2018-04-01

    This work revisits the electrostatic instability for blazar-induced pair beams propagating through the intergalactic medium (IGM) using linear analysis and PIC simulations. We study the impact of the realistic distribution function of pairs resulting from the interaction of high-energy gamma-rays with the extragalactic background light. We present analytical and numerical calculations of the linear growth rate of the instability for the arbitrary orientation of wave vectors. Our results explicitly demonstrate that the finite angular spread of the beam dramatically affects the growth rate of the waves, leading to the fastest growth for wave vectors quasi-parallel to the beam direction and a growth rate at oblique directions that is only a factor of 2–4 smaller than the maximum. To study the nonlinear beam relaxation, we performed PIC simulations that take into account a realistic wide-energy distribution of beam particles. The parameters of the simulated beam-plasma system provide an adequate physical picture that can be extrapolated to realistic blazar-induced pairs. In our simulations, the beam loses only 1% of its energy, and we analytically estimate that the beam would lose its total energy over about 100 simulation times. An analytical scaling is then used to extrapolate the parameters of realistic blazar-induced pair beams. We find that they can dissipate their energy slightly faster by the electrostatic instability than through inverse-Compton scattering. The uncertainties arising from, e.g., details of the primary gamma-ray spectrum are too large to make firm statements for individual blazars, and an analysis based on their specific properties is required.

  15. Global aesthetic surgery statistics: a closer look.

    Science.gov (United States)

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
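
The normalisation the authors describe, converting raw procedure counts into procedures per 100,000 inhabitants and procedures per surgeon, is a simple per-capita calculation. A sketch with hypothetical illustrative figures, not ISAPS data:

```python
def per_capita_rates(records):
    """For each country, convert a raw procedure count into procedures
    per 100,000 population and procedures per surgeon.
    records: {country: (procedures, population, surgeons)}"""
    out = {}
    for country, (procedures, population, surgeons) in records.items():
        out[country] = {
            "per_100k": 100_000 * procedures / population,
            "per_surgeon": procedures / surgeons,
        }
    return out

# Hypothetical figures: country B performs fewer procedures in total
# but has a much smaller population, so its per-capita rate is higher.
sample = {
    "A": (1_000_000, 300_000_000, 5_000),
    "B": (900_000, 50_000_000, 4_500),
}
rates = per_capita_rates(sample)
```

This is exactly why the ranking changes when the underlying national base population is taken into account: the totals and the per-100,000 rates order countries differently.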

  16. Game statistics for the island of Olkiluoto in 2005-2006

    International Nuclear Information System (INIS)

    Oja, S.

    2006-11-01

    The game statistics for the island of Olkiluoto were updated in February 2006. The game populations in Olkiluoto were estimated on the basis of interviews with local hunters and available statistical materials. The collected data were compared to earlier studies of game animals in Olkiluoto. The populations of Elk and White-tailed Deer are stable, and the population of Roe Deer is increasing significantly. The populations of small mammal predators (American Mink, Raccoon Dog, Red Fox) are at a very high level, despite intensive hunting. Other game animals, such as waterfowl, are hunted moderately and the catches are small. (orig.)

  17. Imaginary populations

    Directory of Open Access Journals (Sweden)

    A. Martínez–Abraín

    2010-01-01

    A few years ago, Camus & Lima (2002) wrote an essay to stimulate ecologists to think about how we define and use a fundamental concept in ecology: the population. They concluded, concurring with Berryman (2002), that a population is "a group of individuals of the same species that live together in an area of sufficient size to permit normal dispersal and/or migration behaviour and in which population changes are largely the results of birth and death processes". They pointed out that ecologists often forget "to acknowledge that many study units are neither natural nor even units in terms of constituting a population system", and hence claimed that we "require much more accuracy than in past decades in order to be more effective to characterize populations and predict their behaviour". They stated that this is especially necessary "in disciplines such as conservation biology or resource pest management, to avoid reaching wrong conclusions or making inappropriate decisions". As a population ecologist and conservation biologist I totally agree with these authors and, like them, I believe that greater precision and care is needed in the use and definition of ecological terms. The point I wish to stress here is that we ecologists tend to forget that when we use statistical tools to infer results from our sample to a population, we work with what statisticians term "imaginary", "hypothetical" or "potential" populations. As Zar (1999) states, if our sample data consist of 40 measurements of growth rate in guinea pigs, "the population about which conclusions might be drawn is the growth rates of all the guinea pigs that conceivably might have been administered the same food supplement under identical conditions". Such a population does not really exist, and hence it is considered a hypothetical or imaginary population. Compare that definition with the population concept that would be in our minds when performing such measurements. We would probably

  18. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    JU Yang; YANG YongMing; SONG ZhenDuo; XU WenJing

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores, the statistical characteristics of pore distance, quantity and size, and their probability density functions were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distribution of pore position, distance and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on the pore position and distribution that the series of random numbers defined. On the basis of modelling, Brazil split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, failure mode of material elements and the inosculation of failed elements.

  19. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores, the statistical characteristics of pore distance, quantity and size, and their probability density functions were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distribution of pore position, distance and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on the pore position and distribution that the series of random numbers defined. On the basis of modelling, Brazil split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, failure mode of material elements and the inosculation of failed elements.
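
The Monte Carlo sampling step described in the two records above, drawing pore positions and sizes with prescribed statistics, can be sketched in a few lines. The uniform-box centroids and lognormal radii below are illustrative stand-ins for the fitted probability density functions, not the paper's actual distributions, and all parameter values are hypothetical:

```python
import math
import random

def generate_pores(n, box=(100.0, 100.0, 100.0), mean_radius=1.5,
                   sigma=0.4, seed=42):
    """Draw n pore centroids uniformly inside a box and pore radii from
    a lognormal distribution; returns a list of (centre, radius) pairs."""
    rng = random.Random(seed)
    mu = math.log(mean_radius)  # location parameter of the lognormal
    pores = []
    for _ in range(n):
        centre = tuple(rng.uniform(0.0, side) for side in box)
        radius = rng.lognormvariate(mu, sigma)
        pores.append((centre, radius))
    return pores

pores = generate_pores(500)
```

A pore list of this form is what would then be handed to a solver such as FLAC3D to build the porous structural model.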

  20. Water Fluoridation Statistics - Percent of PWS population receiving fluoridated water

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2000-2014. Water Fluoridation Statistics is a biennial report of the percentage and number of people receiving fluoridated water from 2000 through 2014, originally...

  1. Water Fluoridation Statistics - Percent of PWS population receiving fluoridated water

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2000-2014 Water Fluoridation Statistics is a biennial report of the percentage and number of people receiving fluoridated water from 2000 through 2014, originally...

  2. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic implementation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is the limitation of detailed coverage, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of stereo images. The panoramic function makes 3D maps more interactive for users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in the form of photos; the topographic and terrain attributes, such as shapes and heights, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image 3D mapping, and evaluates the positioning accuracy that a measurable

  3. Neutron dosemeter responses in workplace fields and the implications of using realistic neutron calibration fields

    International Nuclear Information System (INIS)

    Thomas, D.J.; Horwood, N.; Taylor, G.C.

    1999-01-01

    The use of realistic neutron calibration fields to overcome some of the problems associated with the response functions of presently available dosemeters, both area survey instruments and personal dosemeters, has been investigated. Realistic calibration fields have spectra which, compared to conventional radionuclide source based calibration fields, more closely match those of the workplace fields in which dosemeters are used. Monte Carlo simulations were performed to identify laboratory systems which would produce appropriate workplace-like calibration fields. A detailed analysis was then undertaken of the predicted under- and over-responses of dosemeters in a wide selection of measured workplace field spectra assuming calibration in a selection of calibration fields. These included both conventional radionuclide source calibration fields, and also several proposed realistic calibration fields. The present state of the art for dosemeter performance, and the possibilities of improving accuracy by using realistic calibration fields are both presented. (author)

  4. The doubly conditioned frequency spectrum does not distinguish between ancient population structure and hybridization

    KAUST Repository

    Eriksson, Anders

    2014-03-13

    Distinguishing between hybridization and population structure in the ancestral species is a key challenge in our understanding of how permeable species boundaries are to gene flow. The doubly conditioned frequency spectrum (dcfs) has been argued to be a powerful metric to discriminate between these two explanations, and it was used to argue for hybridization between Neandertal and anatomically modern humans. The shape of the observed dcfs for these two species cannot be reproduced by a model that represents ancient population structure in Africa with two populations, while adding hybridization produces realistic shapes. In this letter, we show that this result is a consequence of the spatial coarseness of the demographic model and that a spatially structured stepping stone model can generate realistic dcfs without hybridization. This result highlights how inferences on hybridization between recently diverged species can be strongly affected by the choice of how population structure is represented in the underlying demographic model. We also conclude that the dcfs has limited power in distinguishing between the signals left by hybridization and ancient structure. 2014 The Author.

  5. The doubly conditioned frequency spectrum does not distinguish between ancient population structure and hybridization

    KAUST Repository

    Eriksson, Anders; Manica, Andrea

    2014-01-01

    Distinguishing between hybridization and population structure in the ancestral species is a key challenge in our understanding of how permeable species boundaries are to gene flow. The doubly conditioned frequency spectrum (dcfs) has been argued to be a powerful metric to discriminate between these two explanations, and it was used to argue for hybridization between Neandertal and anatomically modern humans. The shape of the observed dcfs for these two species cannot be reproduced by a model that represents ancient population structure in Africa with two populations, while adding hybridization produces realistic shapes. In this letter, we show that this result is a consequence of the spatial coarseness of the demographic model and that a spatially structured stepping stone model can generate realistic dcfs without hybridization. This result highlights how inferences on hybridization between recently diverged species can be strongly affected by the choice of how population structure is represented in the underlying demographic model. We also conclude that the dcfs has limited power in distinguishing between the signals left by hybridization and ancient structure. 2014 The Author.

  6. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.
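
The counting statistics behind such quantitative immunostaining can be illustrated with a simple Poisson model of probes per analyte particle. This is a generic sketch, not the paper's refined model (which adds terms for the physical sizes of particles, molecules, and antibodies); the function names are illustrative:

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson random variable with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def fit_poisson(counts):
    """Maximum-likelihood estimate of the mean number of probe particles
    per analyte particle, plus the predicted fraction of unlabelled
    particles, from per-particle probe counts observed by AFM."""
    lam = sum(counts) / len(counts)
    return lam, poisson_pmf(0, lam)
```

Under this idealised model, the sample mean of the per-particle counts estimates the labelling density, and the zero-class probability predicts how many particles should appear undecorated in the images.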

  7. International Management: Creating a More Realistic Global Planning Environment.

    Science.gov (United States)

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  8. Predicting the sensitivity of populations from individual exposure to chemicals: the role of ecological interactions.

    Science.gov (United States)

    Gabsi, Faten; Schäffer, Andreas; Preuss, Thomas G

    2014-07-01

    Population responses to chemical stress exposure are influenced by nonchemical, environmental processes such as species interactions. A realistic quantification of chemical toxicity to populations calls for the use of methodologies that integrate these multiple stress effects. The authors used an individual-based model for Daphnia magna as a virtual laboratory to determine the influence of ecological interactions on population sensitivity to chemicals with different modes of action on individuals. In the model, hypothetical chemical toxicity targeted different vital individual-level processes: reproduction, survival, feeding rate, or somatic growth rate. As for species interactions, predatory and competition effects on daphnid populations were implemented following a worst-case approach. The population abundance was simulated at different food levels and exposure scenarios, assuming exposure to chemical stress solely or in combination with either competition or predation. The chemical always targeted one vital endpoint. Equal toxicity-inhibition levels differently affected the population abundance with and without species interactions. In addition, population responses to chemicals were highly sensitive to the environmental stressor (predator or competitor) and to the food level. Results show that population resilience cannot be attributed to chemical stress only. Accounting for the relevant ecological interactions would reduce uncertainties when extrapolating effects of chemicals from individuals to the population level. Validated population models should be used for a more realistic risk assessment of chemicals. © 2014 SETAC.
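
The qualitative point, that chemical and ecological stressors combine to depress population abundance, can be illustrated with a toy deterministic model. This is emphatically not the authors' individual-based Daphnia model; it is a logistic sketch with added per-capita mortalities, and every parameter value is hypothetical:

```python
def simulate(days, r=0.3, K=1000.0, chem_mort=0.0, pred_mort=0.0, n0=50.0):
    """Toy logistic population with extra per-capita mortality from a
    chemical stressor and a predator, iterated daily."""
    n = n0
    for _ in range(days):
        n = max(0.0, n + r * n * (1.0 - n / K) - (chem_mort + pred_mort) * n)
    return n

base = simulate(200)                                # no stressors
chem = simulate(200, chem_mort=0.1)                 # chemical only
both = simulate(200, chem_mort=0.1, pred_mort=0.1)  # chemical + predator
```

Even this crude sketch reproduces the abstract's message: the same chemical mortality yields a very different equilibrium abundance depending on whether an additional ecological stressor is present.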

  9. Ultra-realistic 3-D imaging based on colour holography

    International Nuclear Information System (INIS)

    Bjelkhagen, H I

    2013-01-01

    A review of recent progress in colour holography is provided, with new applications. Colour holography recording techniques in silver-halide emulsions are discussed. Both analogue colour holograms, mainly of the Denisyuk type, and digitally-printed colour holograms are described, along with their recent improvements. An alternative to silver-halide materials are the panchromatic photopolymers, such as the DuPont and Bayer photopolymers, which are also covered. The light sources used to illuminate the recorded holograms are very important for obtaining ultra-realistic 3-D images. In particular, the new light sources based on RGB LEDs are described; they show improved image quality over today's commonly used halogen lights. Recent work in colour holography by holographers and companies in different countries around the world is included. Recording and displaying ultra-realistic 3-D images with perfect colour rendering depends strongly on the correct recording technique using the optimal recording laser wavelengths, and on the availability of improved panchromatic recording materials combined with new display light sources.

  10. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
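    The diagnostic-test quantities covered in the review (sensitivity, specificity, accuracy, and likelihood ratios) follow directly from the four cells of a 2x2 table. A minimal sketch, with invented counts for illustration:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Core diagnostic-test statistics from a 2x2 table of
    true/false positives and negatives."""
    sensitivity = tp / (tp + fn)                 # P(test+ | disease present)
    specificity = tn / (tn + fp)                 # P(test- | disease absent)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)     # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity     # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

# Hypothetical study: 90 true positives, 20 false positives,
# 10 false negatives, 80 true negatives.
sens, spec, acc, lrp, lrn = diagnostic_stats(tp=90, fp=20, fn=10, tn=80)
print(f"sens={sens:.2f} spec={spec:.2f} acc={acc:.2f} LR+={lrp:.2f} LR-={lrn:.3f}")
# → sens=0.90 spec=0.80 acc=0.85 LR+=4.50 LR-=0.125
```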

  11. Please don't misuse the museum: 'declines' may be statistical.

    Science.gov (United States)

    Campbell Grant, Evan H

    2015-03-01

    Detecting declines in populations at broad spatial scales takes enormous effort, and long-term data are often sparser than is desired for estimating trends, identifying drivers of population change, framing conservation decisions, or taking management actions. Museum records and historic data can be available at large scales across multiple decades, and are therefore an attractive source of information on the comparative status of populations. However, apparent changes in populations may be real (e.g. in response to environmental covariates) or may result from variation in our ability to observe the true population response (also possibly related to environmental covariates); the latter is a (statistical) nuisance in understanding the true status of a population. Evaluating statistical hypotheses alongside more interesting ecological ones is therefore important in the appropriate use of museum data. Two statistical considerations apply generally to the use of museum records: first, without initial random sampling, comparison with contemporary results cannot provide inference to the entire range of a species; and second, the availability of individuals in a population may itself respond to environmental changes. Changes in availability may reduce the proportion of the population that is present and able to be counted on a given survey, producing an apparent decline even when population size is stable. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
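    The availability effect described above is easy to demonstrate with a toy simulation (all numbers invented): the true population is held fixed at 100 individuals, and only the probability that an individual is present and countable on a survey changes, yet the counts alone suggest a decline.

```python
import random

random.seed(1)

def mean_count(n_pop, p_avail, n_surveys=200):
    """Mean survey count when each of n_pop individuals is available
    for counting with probability p_avail on any given survey."""
    counts = [sum(random.random() < p_avail for _ in range(n_pop))
              for _ in range(n_surveys)]
    return sum(counts) / len(counts)

historic = mean_count(100, p_avail=0.8)  # assumed museum-era availability
modern = mean_count(100, p_avail=0.4)    # same true N, lower availability
print(round(historic), round(modern))    # an apparent 'decline' with stable N
```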

  12. Understanding how appraisal of doctors produces its effects: a realist review protocol.

    Science.gov (United States)

    Brennan, Nicola; Bryce, Marie; Pearson, Mark; Wong, Geoff; Cooper, Chris; Archer, Julian

    2014-06-23

    UK doctors are now required to participate in revalidation to maintain their licence to practise. Appraisal is a fundamental component of revalidation. However, objective evidence of appraisal changing doctors' behaviour and directly resulting in improved patient care is limited. In particular, it is not clear how the process of appraisal is supposed to change doctors' behaviour and improve clinical performance. The aim of this research is to understand how and why appraisal of doctors is supposed to produce its effect. Realist review is a theory-driven interpretive approach to evidence synthesis. It applies realist logic of inquiry to produce an explanatory analysis of an intervention: that is, what works, for whom, in what circumstances, and in what respects. Using a realist review approach, an initial programme theory of appraisal will be developed by consulting with key stakeholders in doctors' appraisal in expert panels (ethical approval is not required), and by searching the literature to identify relevant existing theories. The search strategy will have a number of phases, including a combination of: (1) electronic database searching, for example, EMBASE, MEDLINE, the Cochrane Library, ASSIA; (2) 'cited by' articles search; (3) citation searching; (4) contacting authors; and (5) grey literature searching. The search for evidence will be iteratively extended and refocused as the review progresses. Studies will be included based on their ability to provide data that enable testing of the programme theory. Data extraction will be conducted, for example, by note taking and annotation at different review stages, as is consistent with the realist approach. The evidence will be synthesised using realist logic to interrogate the final programme theory of the impact of appraisal on doctors' performance. The synthesis results will be written up according to RAMESES guidelines and disseminated through peer-reviewed publication and presentations. The protocol is registered with

  13. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements are presented. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. Topics include the concept of error and its classification into systematic and random errors; statistical fundamentals and probability theory; population distributions (Bernoulli, Poisson, Gauss, and Student's t); the χ2 test; and error propagation based on analysis of variance. A bibliography and z, t-test, Poisson-index, and χ2 tables are appended.
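    In the simplest counting case, the error-propagation material summarized above amounts to Poisson statistics with independent variances added in quadrature. A brief sketch (the counts and counting times are invented):

```python
import math

def net_count_rate(gross, t_g, background, t_b):
    """Net count rate and its standard deviation.  Gross and background
    counts are Poisson distributed (variance = counts), and the two
    independent variances propagate in quadrature."""
    rate = gross / t_g - background / t_b
    sigma = math.sqrt(gross / t_g ** 2 + background / t_b ** 2)
    return rate, sigma

rate, sigma = net_count_rate(gross=4000, t_g=100.0, background=900, t_b=100.0)
print(f"net rate = {rate:.1f} +/- {sigma:.2f} counts/s")
# → net rate = 31.0 +/- 0.70 counts/s
```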

  14. Statistics of Electron Avalanches and Streamers

    Directory of Open Access Journals (Sweden)

    T. Ficker

    2007-01-01

    We have studied the severe systematic deviations of populations of electron avalanches from the Furry distribution, which has been held to be the statistical law governing them, and have sought a possible explanation. A new theoretical concept based on fractal avalanche multiplication has been proposed and is shown to be a convenient candidate for explaining these deviations from Furry statistics.
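    For reference, the Furry distribution mentioned above is the geometric law P(n) = (1/m)(1 - 1/m)^(n-1), with m the mean avalanche size. A toy simulation of avalanche multiplication (all parameters invented) reproduces its signature strongly overdispersed sizes:

```python
import random

random.seed(0)

def avalanche_size(steps=150, p=0.02):
    """Size of one avalanche: each electron may ionize (adding one
    electron) with probability p at every step, a discrete caricature
    of exponential multiplication that yields Furry-like statistics."""
    n = 1
    for _ in range(steps):
        n += sum(random.random() < p for _ in range(n))
    return n

sizes = [avalanche_size() for _ in range(1000)]
mean = sum(sizes) / len(sizes)
var = sum((s - mean) ** 2 for s in sizes) / len(sizes)
print(round(mean, 1), round(var / mean, 1))  # variance >> mean, unlike Poisson
```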

  15. Optimizing human activity patterns using global sensitivity analysis.

    Science.gov (United States)

    Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M

    2014-12-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
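    The sample entropy statistic at the heart of this tuning procedure is straightforward to compute. A plain-Python sketch (O(N^2), and with an illustrative absolute tolerance r = 0.2 rather than the usual fraction of the series' standard deviation):

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    template vectors matching for m points (Chebyshev distance <= r)
    also match for m + 1 points."""
    def match_pairs(mm):
        templates = [series[i:i + mm] for i in range(len(series) - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    return -math.log(match_pairs(m + 1) / match_pairs(m))

regular = [float(i % 4) for i in range(60)]       # strictly periodic schedule
random.seed(7)
irregular = [random.random() for _ in range(60)]  # unstructured activity
print(sample_entropy(regular), sample_entropy(irregular))
# low SampEn indicates a regular schedule; higher SampEn, an irregular one
```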

  16. Derivation of cell population kinetic parameters from clinical statistical data (program RAD3)

    International Nuclear Information System (INIS)

    Cohen, L.

    1978-01-01

    Cellular lethality models generally require up to 6 parameters to simulate a clinical course of fractionated radiation therapy and to derive an estimate of the cellular surviving fraction for a given treatment scheme. These parameters are the mean cellular lethal dose, the extrapolation number, the ratio of sublethal to irreparable events, the regeneration rate, the repopulation limit (cell cycles), and a field-size or tumor-volume factor. A computer program (RAD3) was designed to derive best-fitting values for these parameters in relation to available clinical data based on the assumption that if a number of different fractionation schemes yield similar reactions, the cellular surviving fractions will be about equal in each instance. Parameters were derived for a variety of human tissues from which realistic iso-effect functions could be generated
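    RAD3's full six-parameter fit is not reproduced here, but two of the parameters listed above, the mean cellular lethal dose and the extrapolation number, already define the classical multi-target single-hit survival curve. A minimal sketch (regeneration, repopulation, and volume terms omitted; parameter values invented):

```python
import math

def surviving_fraction(dose_per_fx, n_fx, d0, n_extrap):
    """Multi-target single-hit survival per fraction, compounded over
    n_fx equal fractions; d0 is the mean lethal dose and n_extrap the
    extrapolation number."""
    s_single = 1.0 - (1.0 - math.exp(-dose_per_fx / d0)) ** n_extrap
    return s_single ** n_fx

# 30 fractions of 2 Gy with illustrative cellular parameters
sf = surviving_fraction(dose_per_fx=2.0, n_fx=30, d0=1.3, n_extrap=3)
print(f"{sf:.2e}")  # a very small surviving fraction, as intended clinically
```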

  17. Improving Mathematics Teaching in Kindergarten with Realistic Mathematical Education

    Science.gov (United States)

    Papadakis, Stamatios; Kalogiannakis, Michail; Zaranis, Nicholas

    2017-01-01

    The present study investigates and compares the influence of teaching Realistic Mathematics on the development of mathematical competence in kindergarten. The sample consisted of 231 Greek kindergarten students. For the implementation of the survey, we conducted an intervention, which included one experimental and one control group. Children in…

  18. Development of realistic thermal hydraulic system analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, B. D.; Kim, K. D. [and others]

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry, and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, based on an integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance the user's convenience. To develop a coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial, and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications.

  19. Development of realistic thermal hydraulic system analysis code

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, B. D; Kim, K. D.

    2002-05-01

    The realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry, and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, based on an integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance the user's convenience. To develop a coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial, and regulatory organizations and universities. MARS has been widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications.

  20. An inexpensive yet realistic model for teaching vasectomy

    Directory of Open Access Journals (Sweden)

    Taylor M. Coe

    2015-04-01

    Purpose: Teaching the no-scalpel vasectomy is important, since vasectomy is a safe, simple, and cost-effective method of contraception. This minimally invasive vasectomy technique involves delivering the vas through the skin with specialized tools, and is associated with fewer complications than the traditional incisional vasectomy [1]. One of the most challenging steps is the delivery of the vas through a small puncture in the scrotal skin, and there is a need for a realistic and inexpensive scrotal model on which beginning learners can practice this step. Materials and Methods: After careful observation using several scrotal models while teaching residents and senior trainees, we developed a simplified scrotal model that uses only three components: a bicycle inner tube, latex tubing, and a Penrose drain. Results: This model is remarkably realistic and allows learners to practice a challenging step in the no-scalpel vasectomy. The low cost and simple construction of the model allow wide dissemination of training in this important technique. Conclusions: We propose a simple, inexpensive model that will enable learners to master the hand movements involved in delivering the vas through the skin while mitigating the risks of learning on patients.

  1. Towards an agential realist concept of learning

    DEFF Research Database (Denmark)

    Plauborg, Helle

    2018-01-01

    Drawing on agential realism, this article explores how learning can be understood. An agential realist way of thinking about learning is sensitive to the complexity that characterises learning as a phenomenon. Thus, learning is seen as a dynamic and emergent phenomenon, constantly undergoing...... processes of becoming and expanding the range of components involved in such constitutive processes. With inspiration from Barad’s theorisation of spatiality, temporality and the interdependence of discourse and materiality, this article focuses on timespacemattering and material-discursivity. Concepts...

  2. Protocol: a realist review of user fee exemption policies for health services in Africa.

    Science.gov (United States)

    Robert, Emilie; Ridde, Valéry; Marchal, Bruno; Fournier, Pierre

    2012-01-01

    Background: Four years prior to the Millennium Development Goals (MDGs) deadline, low- and middle-income countries and international stakeholders are looking for evidence-based policies to improve access to healthcare for the most vulnerable populations. User fee exemption policies are one of the potential solutions. However, the evidence is disparate, and systematic reviews have failed to provide valuable lessons. The authors propose to produce an innovative synthesis of the available evidence on user fee exemption policies in Africa to feed the policy-making process. Methods: The authors will carry out a realist review to answer the following research questions: what are the outcomes of user fee exemption policies implemented in Africa? Why do they produce such outcomes? And what contextual elements come into play? This type of review aims to understand how contextual elements influence the production of outcomes through the activation of specific mechanisms, in the form of context-mechanism-outcome configurations. The review will be conducted in five steps: (1) identifying with key stakeholders the mechanisms underlying user fee exemption policies to develop the analytical framework, (2) searching for and selecting primary data, (3) assessing the quality of evidence using the Mixed-Method Appraisal Tool, (4) extracting the data using the analytical framework, and (5) synthesising the data in the form of context-mechanism-outcome configurations. The output will be a middle-range theory specifying how user fee exemption policies work, for what populations, and under what circumstances. Ethics and dissemination: The two main target audiences are researchers who are looking for examples of how to implement a realist review, and policy-makers and international stakeholders looking for lessons learnt on user fee exemption. For the latter, a knowledge-sharing strategy involving local scientific and policy networks will be implemented. The study has been approved by the ethics

  3. INDONESIA’S DEATH PENALTY EXECUTION FROM THE REALIST VIEW OF INTERNATIONAL LAW

    Directory of Open Access Journals (Sweden)

    Alia Azmi

    2015-06-01

    During the first half of 2015, Indonesia executed fourteen prisoners who had been convicted of smuggling drugs to and from Indonesia; twelve of them were foreigners. The executions led to the withdrawal of the ambassadors of Brazil, the Netherlands, and Australia, whose citizens were among those executed. Criticism came from around the world, and from a small number of Indonesians. Most critics cited human rights abuse and argued that the death penalty is against international law, but the lack of further explanation can leave such statements misunderstood. The distinctive nature of international law is one factor that keeps the death penalty issue debatable; another is the world's inconsistent reaction to human rights issues, which reflects realist behavior in international relations. It is therefore important to understand the nature of international law from the realist perspective of international relations when explaining the death penalty in Indonesia. The purpose of this paper is to elaborate Indonesia's death penalty from the realist perspective of international law. Keywords: realism, international law, international relations, death penalty

  4. IBM parameters derived from realistic shell-model Hamiltonian via Hn-cooling method

    International Nuclear Information System (INIS)

    Nakada, Hitoshi

    1997-01-01

    There is a certain influence of non-collective degrees-of-freedom even in lowest-lying states of medium-heavy nuclei. This influence seems to be significant for some of the IBM parameters. In order to take it into account, several renormalization approaches have been applied. It has been shown in the previous studies that the influence of the G-pairs is important, but does not fully account for the fitted values. The influence of the non-collective components may be more serious when we take a realistic effective nucleonic interaction. To incorporate this influence into the IBM parameters, we employ the recently developed H n -cooling method. This method is applied to renormalize the wave functions of the states consisting of the SD-pairs, for the Cr-Fe nuclei. On this ground, the IBM Hamiltonian and transition operators are derived from corresponding realistic shell-model operators, for the Cr-Fe nuclei. Together with some features of the realistic interaction, the effects of the non-SD degrees-of-freedom are presented. (author)

  5. Realist review and synthesis of retention studies for health workers in rural and remote areas

    NARCIS (Netherlands)

    Dieleman, M.A.; Kane, Sumit; Zwanikken, Prisca A C; Gerretsen, Barend

    2011-01-01

    This report uses a realist review, which is a theory-based method, to address the questions of “why” and “how” certain rural retention interventions work better in some contexts and fail in others. Through applying a realist perspective to the review of these retention studies, a greater

  6. A realistic neural mass model of the cortex with laminar-specific connections and synaptic plasticity - evaluation with auditory habituation.

    Directory of Open Access Journals (Sweden)

    Peng Wang

    In this work we propose a biologically realistic local cortical circuit model (LCCM), based on neural masses, that incorporates important aspects of the functional organization of the brain not covered by previous models: (1) activity-dependent plasticity of excitatory synaptic couplings via depletion and recycling of neurotransmitters, and (2) realistic inter-laminar dynamics via laminar-specific distribution of, and connections between, neural populations. The potential of the LCCM was demonstrated by accounting for the process of auditory habituation. The model parameters were specified using Bayesian inference. It was found that: (1) besides the major serial excitatory information pathway (layer 4 to layer 2/3 to layer 5/6), there exists a parallel "short-cut" pathway (layer 4 to layer 5/6); (2) the excitatory signal flow from the pyramidal cells to the inhibitory interneurons seems to be mostly intra-laminar whereas, in contrast, the inhibitory signal flow from inhibitory interneurons to the pyramidal cells seems to be both intra- and inter-laminar; and (3) the habituation rates of the connections are asymmetrical: forward connections (from layer 4 to layer 2/3) are more strongly habituated than backward connections (from layer 5/6 to layer 4). Our evaluation demonstrates that the novel features of the LCCM are of crucial importance for mechanistic explanations of brain function. The incorporation of these features into a mass model makes them applicable to modeling based on macroscopic data (like EEG or MEG), which are usually available in human experiments. Our LCCM is therefore a valuable building block for future realistic models of human cognitive function.

  7. Realistic Gamow shell model for resonance and continuum in atomic nuclei

    Science.gov (United States)

    Xu, F. R.; Sun, Z. H.; Wu, Q.; Hu, B. S.; Dai, S. J.

    2018-02-01

    The Gamow shell model can describe resonance and continuum for atomic nuclei. The model is established in the complex-momentum (complex-k) plane of the Berggren coordinates, in which bound, resonant, and continuum states are treated on an equal footing self-consistently. In the present work, the realistic nuclear force CD-Bonn has been used. We have developed the full \hat{Q}-box folded-diagram method to derive the realistic effective interaction in the model space, which is nondegenerate and contains resonance and continuum channels. The CD-Bonn potential is renormalized using the V_{low-k} method. Choosing 16O as the inert core, we have applied the Gamow shell model to oxygen isotopes.

  8. A statistical dynamics approach to the study of human health data: Resolving population scale diurnal variation in laboratory data

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2010-01-01

    Statistical physics and information theory are applied to the clinical chemistry measurements present in a patient database containing 2.5 million patients' data over a 20-year period. Despite the seemingly naive approach of aggregating all patients over all times (with respect to particular clinical chemistry measurements), both a diurnal signal in the decay of the time-delayed mutual information and the presence of two sub-populations with differing health are detected. This provides a proof of principle that the highly fragmented data in electronic health records have the potential to be useful in defining disease and human phenotypes.
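    The time-delayed mutual information used in such studies can be estimated with a simple histogram approach. A sketch on a synthetic "diurnal" series (24-sample period; all parameters invented, not the paper's clinical data):

```python
import math
import random

def delayed_mutual_information(x, lag, bins=8):
    """I(x_t ; x_{t+lag}) in nats, from a 2-D histogram of lagged pairs."""
    pairs = list(zip(x[:-lag], x[lag:]))
    lo, hi = min(x), max(x)
    def to_bin(v):
        return min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)
    joint, px, py = {}, {}, {}
    for u, v in pairs:
        bu, bv = to_bin(u), to_bin(v)
        joint[(bu, bv)] = joint.get((bu, bv), 0) + 1
        px[bu] = px.get(bu, 0) + 1
        py[bv] = py.get(bv, 0) + 1
    n = len(pairs)
    return sum(c / n * math.log((c / n) / ((px[bu] / n) * (py[bv] / n)))
               for (bu, bv), c in joint.items())

random.seed(2)
diurnal = [math.sin(2 * math.pi * t / 24) + 0.3 * random.gauss(0, 1)
           for t in range(2000)]
noise = [random.gauss(0, 1) for _ in range(2000)]
print(delayed_mutual_information(diurnal, lag=24),
      delayed_mutual_information(noise, lag=24))
# the periodic series retains information at one full period; noise does not
```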

  9. Realistic tissue visualization using photoacoustic image

    Science.gov (United States)

    Cho, Seonghee; Managuli, Ravi; Jeon, Seungwan; Kim, Jeesu; Kim, Chulhong

    2018-02-01

    Visualization methods are very important in biomedical imaging. As a technology for understanding life, biomedical imaging has the unique advantage of providing the most intuitive information in the image, and this advantage can be greatly improved by choosing a suitable visualization method. This is more complicated for volumetric data. Volume data have the advantage of containing 3D spatial information, but the data themselves cannot directly represent their potential value: because images are always displayed in 2D space, visualization is the key that creates the real value of volume data. However, image processing of 3D data requires complicated algorithms and a high computational burden, so specialized algorithms and computing optimization are important issues for volume data. Photoacoustic imaging is a unique imaging modality that can visualize the optical properties of deep tissue. Because the color of an organism is mainly determined by its light-absorbing components, photoacoustic data can provide color information of tissue that is closer to real tissue color. In this research, we developed realistic tissue visualization using acoustic-resolution photoacoustic volume data. To achieve realistic visualization, we designed a specialized color transfer function that depends on the depth of the tissue from the skin. We used a direct ray-casting method and processed color while computing the shader parameters. In the rendering results, we succeeded in obtaining realistic texture from the photoacoustic data: surface-reflected rays were visualized in white, and light returning from deep tissue was visualized in red, like skin tissue. We also implemented the CUDA algorithm in an OpenGL environment for real-time interactive imaging.
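    The paper's depth-dependent color transfer function is not published in detail; the sketch below illustrates the idea only (the linear ramp and the 10 mm depth scale are assumptions): strongly reflecting shallow voxels stay white, while returns from deeper tissue shift toward red.

```python
def depth_color(intensity, depth_mm, depth_scale_mm=10.0):
    """Map a photoacoustic voxel to RGB: white at the surface, shifting
    toward red with depth (hypothetical linear transfer function)."""
    t = min(depth_mm / depth_scale_mm, 1.0)  # 0 at skin, 1 at full depth
    return (intensity,                       # red channel unattenuated
            intensity * (1.0 - 0.8 * t),     # green fades with depth
            intensity * (1.0 - 0.9 * t))     # blue fades fastest

print(depth_color(1.0, 0.0))   # surface voxel: white
print(depth_color(1.0, 10.0))  # deep voxel: red-dominant
```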

  10. Creating a Realistic Context for Team Projects in HCI

    NARCIS (Netherlands)

    Koppelman, Herman; van Dijk, Betsy

    2006-01-01

    Team projects are nowadays common practice in HCI education. This paper focuses on the role of clients and users in team projects in introductory HCI courses. In order to provide projects with a realistic context we invite people from industry to serve as clients for the student teams. Some of them

  11. Creating photo-realistic works in a 3D scene using layers styles to create an animation

    Science.gov (United States)

    Avramescu, A. M.

    2015-11-01

    Creating realistic objects in a 3D scene is not easy: the creation must be very detailed. Even without prior experience in photo-realistic work, the right techniques and a good reference photo make it possible to create an amazing amount of detail and realism. This article presents some of these detailed methods, from which the reader can learn the techniques necessary to make beautiful and realistic objects in a scene. More precisely, we present how to create a 3D animated scene, mainly using the Pen Tool and Blending Options. The work is based on teaching some simple ways of using Layer Styles to create convincing shadows, lights, and textures and a realistic sense of three dimensions, and it shows how interesting uses of the illumination and rendering options can create a realistic effect in a scene. Moreover, the article shows how to create photo-realistic 3D models from a digital image: how to use Illustrator paths, texturing, basic lighting and rendering, how to apply textures, and how to parent the building and object components. We also propose to use this approach to recreate smaller details or 3D objects from a 2D image. After a critical art-review stage, we present the architecture of a design method for creating an animation. The aim is to provide a conceptual and methodological tutorial that addresses this issue both scientifically and in practice, including proposing, on a strong scientific basis, a model that gives a better understanding of the techniques necessary to create a realistic animation.

  12. A statistical survey of heat input parameters into the cusp thermosphere

    Science.gov (United States)

    Moen, J. I.; Skjaeveland, A.; Carlson, H. C.

    2017-12-01

    Based on three winters of observational data, we present those ionosphere parameters deemed most critical to realistic space weather ionosphere and thermosphere representation and prediction in regions impacted by variability in the cusp. The CHAMP spacecraft revealed large variability in cusp thermosphere densities, measuring frequent satellite drag enhancements of up to a factor of two. The community recognizes a clear need for more realistic representation of plasma flows and electron densities near the cusp: existing average-value models produce order-of-magnitude errors in these parameters, resulting in large underestimations of predicted drag. We fill this knowledge gap with statistics-based specification of these key parameters over their range of observed values. The EISCAT Svalbard Radar (ESR) tracks plasma flow Vi, electron density Ne, and electron and ion temperatures Te and Ti with consecutive 2-3 minute windshield-wiper scans of 1000 x 500 km areas. This allows mapping the maximum Ti of a large area within or near the cusp with high temporal resolution. In magnetic field-aligned mode the radar can measure high-resolution profiles of these plasma parameters. By deriving statistics for Ne and Ti, we enable derivation of thermosphere heating deposition under background and frictional-drag-dominated magnetic reconnection conditions. We separate our Ne and Ti profiles into quiescent and enhanced states, which are not closely correlated due to the spatial structure of the reconnection foot point. Use of our data-based parameter inputs can make order-of-magnitude corrections to the input data driving thermosphere models, enabling removal of previous twofold drag errors.

  13. Modeling of ultrasonic wave propagation in composite laminates with realistic discontinuity representation.

    Science.gov (United States)

    Zelenyak, Andreea-Manuela; Schorer, Nora; Sause, Markus G R

    2018-02-01

    This paper presents a method for embedding realistic defect geometries of a fiber-reinforced material in a finite element modeling environment in order to simulate active ultrasonic inspection. When ultrasonic inspection is used experimentally to investigate the presence of defects in composite materials, the microscopic defect geometry may cause signal characteristics that are difficult to interpret. Hence, modeling of this interaction is key to improving our understanding and interpretation of the acquired ultrasonic signals. To model the true interaction of the ultrasonic wave field with defect structures such as pores, cracks, or delaminations, a realistic three-dimensional geometry reconstruction is required. We present a 3D-image-based reconstruction process which converts computed tomography data into adequate surface representations ready to be embedded for processing with finite element methods. Subsequent modeling using these geometries uses a multi-scale and multi-physics simulation approach which results in quantitative A-scan ultrasonic signals that can be directly compared with experimental signals. Therefore, besides the properties of the composite material, a full transducer implementation, piezoelectric conversion, and simultaneous modeling of the attached circuit are applied. Comparison between simulated and experimental signals shows very good agreement in electrical voltage amplitude and signal arrival time, and thus validates the proposed modeling approach. Simulating ultrasound wave propagation in a medium with a realistic geometry clearly shows a difference in how the waves are disturbed, and ultimately allows more realistic modeling of A-scans. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Visualization and modeling of sub-populations of compositional data: statistical methods illustrated by means of geochemical data from fumarolic fluids

    Science.gov (United States)

    Pawlowsky-Glahn, Vera; Buccianti, Antonella

In the investigation of fluid samples of a volcanic system, collected during a given period of time, one of the main goals is to discover cause-effect relationships that allow us to explain changes in the chemical composition. They might be caused by physicochemical factors, such as temperature, pressure, or non-conservative behavior of some chemical constituents (addition or subtraction of material), among others. The presence of subgroups of observations showing different behavior is evidence of unusually complex situations, which might make the analysis and interpretation of the observed phenomena even more difficult. These cases require appropriate statistical techniques as well as sound a priori hypotheses concerning the underlying geological processes. The purpose of this article is to present the state of the art in the methodology for a better visualization of compositional data, as well as for detecting statistically significant sub-populations. The scheme of this article is to present first the application, and then the underlying methodology, so that the first motivates the second. Thus, the goal of the first part is to illustrate how to understand and interpret results, whereas the second is devoted to showing how to perform a study of this kind. The case study is related to the chemical composition of a fumarole of Vulcano Island (southern Italy), called F14. The volcanic activity at Vulcano Island has been subject to a continuous program of geochemical surveillance from 1978 up to now, and the large data set of observations contains the main chemical composition of volcanic gases as well as trace element concentrations in the condensates of fumarolic gases. Out of the complete set of measured components, the variables H2S, HF and As, determined in samples collected from 1978 to 1993 (As is not available in recent samples), are used to characterize two groups in the original population, which proved to be statistically distinct. 
The choice of the variables is

  15. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

The booming electronic book (e-book), as an extension of the paper book, is popular with readers. Recently, much effort has been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.

  16. A realist evaluation of the management of a well-performing regional hospital in Ghana.

    Science.gov (United States)

    Marchal, Bruno; Dedzo, McDamien; Kegels, Guy

    2010-01-25

    Realist evaluation offers an interesting approach to evaluation of interventions in complex settings, but has been little applied in health care. We report on a realist case study of a well performing hospital in Ghana and show how such a realist evaluation design can help to overcome the limited external validity of a traditional case study. We developed a realist evaluation framework for hypothesis formulation, data collection, data analysis and synthesis of the findings. Focusing on the role of human resource management in hospital performance, we formulated our hypothesis around the high commitment management concept. Mixed methods were used in data collection, including individual and group interviews, observations and document reviews. We found that the human resource management approach (the actual intervention) included induction of new staff, training and personal development, good communication and information sharing, and decentralised decision-making. We identified 3 additional practices: ensuring optimal physical working conditions, access to top managers and managers' involvement on the work floor. Teamwork, recognition and trust emerged as key elements of the organisational climate. Interviewees reported high levels of organisational commitment. The analysis unearthed perceived organisational support and reciprocity as underlying mechanisms that link the management practices with commitment. Methodologically, we found that realist evaluation can be fruitfully used to develop detailed case studies that analyse how management interventions work and in which conditions. Analysing the links between intervention, mechanism and outcome increases the explaining power, while identification of essential context elements improves the usefulness of the findings for decision-makers in other settings (external validity). We also identified a number of practical difficulties and priorities for further methodological development. This case suggests that a well

  17. A realist evaluation of the management of a well- performing regional hospital in Ghana

    Directory of Open Access Journals (Sweden)

    Kegels Guy

    2010-01-01

Background: Realist evaluation offers an interesting approach to evaluation of interventions in complex settings, but has been little applied in health care. We report on a realist case study of a well performing hospital in Ghana and show how such a realist evaluation design can help to overcome the limited external validity of a traditional case study. Methods: We developed a realist evaluation framework for hypothesis formulation, data collection, data analysis and synthesis of the findings. Focusing on the role of human resource management in hospital performance, we formulated our hypothesis around the high commitment management concept. Mixed methods were used in data collection, including individual and group interviews, observations and document reviews. Results: We found that the human resource management approach (the actual intervention) included induction of new staff, training and personal development, good communication and information sharing, and decentralised decision-making. We identified 3 additional practices: ensuring optimal physical working conditions, access to top managers and managers' involvement on the work floor. Teamwork, recognition and trust emerged as key elements of the organisational climate. Interviewees reported high levels of organisational commitment. The analysis unearthed perceived organisational support and reciprocity as underlying mechanisms that link the management practices with commitment. Methodologically, we found that realist evaluation can be fruitfully used to develop detailed case studies that analyse how management interventions work and in which conditions. Analysing the links between intervention, mechanism and outcome increases the explaining power, while identification of essential context elements improves the usefulness of the findings for decision-makers in other settings (external validity). We also identified a number of practical difficulties and priorities for further

  18. Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity

    Science.gov (United States)

    Ingber, Lester

    1984-06-01

    A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.

  19. Development and application of KEPRI realistic evaluation methodology (KREM) for LB-LOCA

    International Nuclear Information System (INIS)

    Ban, Chang-Hwan; Lee, Sang-Yong; Sung, Chang-Kyung

    2004-01-01

A realistic evaluation method for the LB-LOCA of a PWR, KREM, is developed and its applicability is confirmed for a 3-loop Westinghouse plant in Korea. The method uses a combined code of CONTEMPT4/MOD5 and a modified RELAP5/MOD3.1. The RELAP5 code calculates the system thermal hydraulics with the containment backpressure calculated by CONTEMPT4, exchanging the mass/energy release and backpressure in every RELAP5 time step. The method is developed strictly following the philosophy of CSAU, with a few improvements and differences. The elements and steps of KREM are shown in a figure in this paper. The three elements of CSAU are maintained, and the first element has no differences. An additional step, 'Check of Experimental Data Covering (EDC)', is embedded in element 2 in order to confirm the validity of code uncertainty parameters before applying them to plant calculations. The main idea behind the EDC is to extrapolate the code accuracy determined in step 8 to the uncertainties of the plant calculations. The EDC is described in detail elsewhere, and its basic concepts are explained in a later section of this paper. KREM adopts nonparametric statistics to quantify the overall uncertainty of a LB-LOCA at the 95% probability and 95% confidence level from 59 plant calculations, according to the Wilks formula. These 59 calculations are performed in step 12 using code parameters determined in steps 8 and 9 and operation parameters from step 11. Scale biases are also evaluated in this step using the information of step 10. Uncertainties of code models and operation conditions are reflected in the 59 plant calculations as multipliers to relevant parameters in the code or simply as input values. This paper explains the overall structure of KREM and emphasizes its unique features. In addition, its applicability is confirmed for a 3-loop plant in Korea. 
KREM is developed for the realistic evaluation of LB-LOCA and its applicability is successfully demonstrated for the 3-loop power plants in
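
    The quoted figure of 59 runs follows from the first-order, one-sided Wilks criterion: the largest of n independent code runs bounds the 95th percentile of the output with 95% confidence once 1 - 0.95^n >= 0.95, and 59 is the smallest such n. A minimal sketch of this sizing rule (illustrative only, not part of KREM; the function name is ours):

    ```python
    def wilks_sample_size(prob=0.95, conf=0.95):
        """Smallest number of runs n such that the largest of n independent
        code results bounds the `prob` quantile of the output distribution
        with confidence `conf` (first-order, one-sided Wilks criterion:
        1 - prob**n >= conf)."""
        n = 1
        while 1.0 - prob ** n < conf:
            n += 1
        return n

    print(wilks_sample_size())            # 59 runs for a 95%/95% one-sided limit
    print(wilks_sample_size(0.95, 0.99))  # 90 runs if 99% confidence is required
    ```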

  20. Statistical model of exotic rotational correlations in emergent space-time

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan

    2017-06-06

    A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.

  1. Population Synthesis of Radio & Gamma-Ray Millisecond Pulsars

    Science.gov (United States)

    Frederick, Sara; Gonthier, P. L.; Harding, A. K.

    2014-01-01

    In recent years, the number of known gamma-ray millisecond pulsars (MSPs) in the Galactic disk has risen substantially thanks to confirmed detections by Fermi Gamma-ray Space Telescope (Fermi). We have developed a new population synthesis of gamma-ray and radio MSPs in the galaxy which uses Markov Chain Monte Carlo techniques to explore the large and small worlds of the model parameter space and allows for comparisons of the simulated and detected MSP distributions. The simulation employs empirical radio and gamma-ray luminosity models that are dependent upon the pulsar period and period derivative with freely varying exponents. Parameters associated with the birth distributions are also free to vary. The computer code adjusts the magnitudes of the model luminosities to reproduce the number of MSPs detected by a group of ten radio surveys, thus normalizing the simulation and predicting the MSP birth rates in the Galaxy. Computing many Markov chains leads to preferred sets of model parameters that are further explored through two statistical methods. Marginalized plots define confidence regions in the model parameter space using maximum likelihood methods. A secondary set of confidence regions is determined in parallel using Kuiper statistics calculated from comparisons of cumulative distributions. These two techniques provide feedback to affirm the results and to check for consistency. Radio flux and dispersion measure constraints have been imposed on the simulated gamma-ray distributions in order to reproduce realistic detection conditions. The simulated and detected distributions agree well for both sets of radio and gamma-ray pulsar characteristics, as evidenced by our various comparisons.
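
    The Markov Chain Monte Carlo exploration described above can be illustrated with a generic random-walk Metropolis sampler. This is a toy sketch on a one-dimensional standard-normal "posterior", not the authors' population-synthesis code; all names in it are hypothetical:

    ```python
    import math
    import random

    def metropolis(log_post, x0, step, n_samples, seed=0):
        """Random-walk Metropolis: propose x' = x + N(0, step); accept with
        probability min(1, exp(log_post(x') - log_post(x)))."""
        rng = random.Random(seed)
        x, samples = x0, []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            if math.log(rng.random()) < log_post(proposal) - log_post(x):
                x = proposal
            samples.append(x)
        return samples

    # Toy "model parameter" whose posterior is standard normal.
    chain = metropolis(lambda x: -0.5 * x * x, x0=5.0, step=1.0, n_samples=20000)
    burned = chain[2000:]                                        # discard burn-in
    post_mean = sum(burned) / len(burned)                        # close to 0
    post_var = sum((x - post_mean) ** 2 for x in burned) / len(burned)  # close to 1
    ```

    In a real application the marginalized confidence regions mentioned in the abstract would be read off histograms of such chains, one per model parameter.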

  2. Statistics of financial markets an introduction

    CERN Document Server

    Franke, Jürgen; Hafner, Christian Matthias

    2015-01-01

    Now in its fourth edition, this book offers a detailed yet concise introduction to the growing field of statistical applications in finance. The reader will learn the basic methods of evaluating option contracts, analyzing financial time series, selecting portfolios and managing risks based on realistic assumptions about market behavior. The focus is both on the fundamentals of mathematical finance and financial time series analysis, and on applications to given problems concerning financial markets, thus making the book the ideal basis for lectures, seminars and crash courses on the topic. For this new edition the book has been updated and extensively revised and now includes several new aspects, e.g. new chapters on long memory models, copulae and CDO valuation. Practical exercises with solutions have also been added. Both R and Matlab Code, together with the data, can be downloaded from the book’s product page and www.quantlet.de

  3. Predicting population extinction or disease outbreaks with stochastic models

    Directory of Open Access Journals (Sweden)

    Linda J. S. Allen

    2017-01-01

Models of exponential growth, logistic growth and epidemics are common applications in undergraduate differential equation courses. The corresponding stochastic models are not part of these courses, although when population sizes are small their behaviour is often more realistic and distinctly different from deterministic models. For example, the randomness associated with births and deaths may lead to population extinction even in an exponentially growing population. Some background in continuous-time Markov chains and applications to populations, epidemics and cancer are presented with a goal to introduce this topic into the undergraduate mathematics curriculum that will encourage further investigation into problems on conservation, infectious diseases and cancer therapy. MATLAB programs for graphing sample paths of stochastic models are provided in the Appendix.
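
    The extinction-by-randomness effect can be reproduced in a few lines. The sketch below (written in Python rather than the authors' MATLAB, and not their code) simulates the embedded jump chain of a linear birth-death process; branching-process theory predicts extinction with probability (death/birth)**n0 even though the population grows exponentially on average:

    ```python
    import random

    def goes_extinct(n0, birth, death, cap=200, rng=random):
        """One path of a linear birth-death chain (rates n*birth and n*death).
        Because both rates are proportional to n, each jump is a birth with
        probability birth/(birth+death); holding times do not affect which
        state is hit first. Returns True if the population reaches 0 before
        reaching `cap` (taken as a proxy for long-term survival)."""
        n = n0
        p_birth = birth / (birth + death)
        while 0 < n < cap:
            n += 1 if rng.random() < p_birth else -1
        return n == 0

    random.seed(1)
    trials = 4000
    p_hat = sum(goes_extinct(3, birth=1.0, death=0.5) for _ in range(trials)) / trials
    # Theory: extinction probability = (0.5 / 1.0) ** 3 = 0.125, despite
    # a deterministic growth rate of birth - death = 0.5 > 0.
    ```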

  4. The matchmaking paradox: a statistical explanation

    International Nuclear Information System (INIS)

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

Medical surveys regarding the number of heterosexual partners per person yield different female and male averages, a result which, from a physical standpoint, is impossible. In this paper we term this puzzle the 'matchmaking paradox', and establish a statistical model explaining it. We consider a bipartite graph with N male and N female nodes (N >> 1), and B bonds connecting them (B >> 1). Each node is assigned a random 'attractiveness level', and the bonds connect to the nodes randomly, with probabilities which are proportionate to the nodes' attractiveness levels. The population's average bonds-per-node B/N is estimated via a sample average calculated from a survey of size n (n >> 1). A comprehensive statistical analysis of this model is carried out, asserting that (i) the sample average well estimates the population average if and only if the attractiveness levels possess a finite mean; (ii) if the attractiveness levels are governed by a 'fat-tailed' probability law then the sample average displays wild fluctuations and strong skew, thus providing a statistical explanation of the matchmaking paradox.
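
    A toy version of the bipartite model makes the point concrete: because every bond has exactly one male and one female endpoint, the two population averages are both exactly B/N, while any finite survey reports only a fluctuating sample average. This is an illustrative sketch, not the authors' analysis:

    ```python
    import random

    random.seed(0)
    N, B = 1000, 5000
    # Random "attractiveness" levels (finite-mean case); each bond attaches to
    # a node with probability proportional to its attractiveness.
    attract_m = [random.expovariate(1.0) for _ in range(N)]
    attract_f = [random.expovariate(1.0) for _ in range(N)]
    deg_m, deg_f = [0] * N, [0] * N
    for i in random.choices(range(N), weights=attract_m, k=B):
        deg_m[i] += 1
    for j in random.choices(range(N), weights=attract_f, k=B):
        deg_f[j] += 1

    # Population averages coincide by construction: both equal B/N = 5 partners.
    assert sum(deg_m) / N == sum(deg_f) / N == B / N

    # A survey of n = 100 individuals sees only a noisy estimate of that average.
    survey_avg = sum(random.sample(deg_m, 100)) / 100
    ```

    Replacing `expovariate` with a fat-tailed law (e.g. a Pareto with infinite mean) is what makes the survey averages wildly skewed in the paper's case (ii).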

  5. The Effect of Realistic Mathematics Education Approach on Students' Achievement And Attitudes Towards Mathematics

    OpenAIRE

    Effandi Zakaria; Muzakkir Syamaun

    2017-01-01

This study was conducted to determine the effect of the Realistic Mathematics Education Approach on mathematics achievement and student attitudes towards mathematics. This study also sought to determine the relationship between student achievement and attitudes towards mathematics. This study used a quasi-experimental design conducted on 61 high school students at SMA Unggul Sigli. Students were divided into two groups, the treatment group $(n = 30)$ namely, the Realistic Mathematics Approach group ...

  6. Spectroscopy of light nuclei with realistic NN interaction JISP

    International Nuclear Information System (INIS)

    Shirokov, A. M.; Vary, J. P.; Mazur, A. I.; Weber, T. A.

    2008-01-01

Recent results of our systematic ab initio studies of the spectroscopy of s- and p-shell nuclei in fully microscopic large-scale (up to a few hundred million basis functions) no-core shell-model calculations are presented. A new high-quality realistic nonlocal NN interaction JISP is used. This interaction is obtained in the J-matrix inverse-scattering approach (JISP stands for the J-matrix inverse-scattering potential) and has the form of a small-rank matrix in the oscillator basis in each of the NN partial waves, providing very fast convergence in shell-model studies. The current purely two-body JISP model of the nucleon-nucleon interaction, JISP16, provides not only an excellent description of two-nucleon data (deuteron properties and np scattering) with χ²/datum = 1.05 but also a better description of a wide range of observables (binding energies, spectra, rms radii, quadrupole moments, electromagnetic-transition probabilities, etc.) in all s- and p-shell nuclei than the best modern interaction models combining realistic nucleon-nucleon and three-nucleon interactions.

  7. Powerful Inference with the D-Statistic on Low-Coverage Whole-Genome Data.

    Science.gov (United States)

    Soraggi, Samuele; Wiuf, Carsten; Albrechtsen, Anders

    2018-02-02

    The detection of ancient gene flow between human populations is an important issue in population genetics. A common tool for detecting ancient admixture events is the D-statistic. The D-statistic is based on the hypothesis of a genetic relationship that involves four populations, whose correctness is assessed by evaluating specific coincidences of alleles between the groups. When working with high-throughput sequencing data, calling genotypes accurately is not always possible; therefore, the D-statistic currently samples a single base from the reads of one individual per population. This implies ignoring much of the information in the data, an issue especially striking in the case of ancient genomes. We provide a significant improvement to overcome the problems of the D-statistic by considering all reads from multiple individuals in each population. We also apply type-specific error correction to combat the problems of sequencing errors, and show a way to correct for introgression from an external population that is not part of the supposed genetic relationship, and how this leads to an estimate of the admixture rate. We prove that the D-statistic is approximated by a standard normal distribution. Furthermore, we show that our method outperforms the traditional D-statistic in detecting admixtures. The power gain is most pronounced for low and medium sequencing depth (1-10×), and performances are as good as with perfectly called genotypes at a sequencing depth of 2×. We show the reliability of error correction in scenarios with simulated errors and ancient data, and correct for introgression in known scenarios to estimate the admixture rates. Copyright © 2018 Soraggi et al.
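
    For reference, the classical site-count form of the D-statistic that the article improves upon is a one-liner. The counts below are made-up illustration values, not data from the paper:

    ```python
    def d_statistic(n_abba, n_baba):
        """ABBA-BABA D-statistic for four populations (((H1, H2), H3), Outgroup):
        D = (nABBA - nBABA) / (nABBA + nBABA). Under the null of no gene flow
        the two discordant site patterns are equally likely, so D is close to
        0; an excess of ABBA sites (D > 0) suggests admixture between H2 and H3."""
        return (n_abba - n_baba) / (n_abba + n_baba)

    print(d_statistic(30, 10))  # 0.5 -> strong ABBA excess
    print(d_statistic(20, 20))  # 0.0 -> symmetric, consistent with no admixture
    ```

    The article's contribution is to feed this comparison with all reads from multiple individuals per population, with error correction, rather than a single sampled base per population.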

  8. Biochemical transport modeling, estimation, and detection in realistic environments

    Science.gov (United States)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g. release time, intensity and location). We compute a bound on the expected delay before false detection in order to decide the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.
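
    Sequential ("quickest") change detection of the kind sketched above is commonly built on the CUSUM statistic, which accumulates a clipped log-likelihood ratio and alarms when a threshold is crossed. The snippet below is a generic textbook one-sided CUSUM for a shift in the mean of Gaussian data, not the authors' transport-model detector, and the signal is a deterministic toy:

    ```python
    def cusum_alarm(samples, mu0, mu1, sigma, threshold):
        """One-sided CUSUM: accumulate the log-likelihood ratio of N(mu1, sigma)
        against N(mu0, sigma), clipped at zero; return the index of the first
        sample where the statistic exceeds `threshold`, or None if it never does."""
        s = 0.0
        for i, x in enumerate(samples):
            llr = (mu1 - mu0) / sigma ** 2 * (x - (mu0 + mu1) / 2.0)
            s = max(0.0, s + llr)
            if s > threshold:
                return i
        return None

    # Mean shifts from 0 to 2 at index 20; with threshold 8 and a per-sample
    # log-likelihood gain of 2, the alarm fires 5 samples after the change.
    signal = [0.0] * 20 + [2.0] * 10
    print(cusum_alarm(signal, mu0=0.0, mu1=2.0, sigma=1.0, threshold=8.0))  # 24
    ```

    The threshold trades off detection delay against the false-alarm rate, which is exactly the bound the abstract computes to tune its test.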

  9. Realistic shell-model calculations for Sn isotopes

    International Nuclear Information System (INIS)

    Covello, A.; Andreozzi, F.; Coraggio, L.; Gargano, A.; Porrino, A.

    1997-01-01

    We report on a shell-model study of the Sn isotopes in which a realistic effective interaction derived from the Paris free nucleon-nucleon potential is employed. The calculations are performed within the framework of the seniority scheme by making use of the chain-calculation method. This provides practically exact solutions while cutting down the amount of computational work required by a standard seniority-truncated calculation. The behavior of the energy of several low-lying states in the isotopes with A ranging from 122 to 130 is presented and compared with the experimental one. (orig.)

  10. MANAJEMEN LABA: PERILAKU MANAJEMEN OPPORTUNISTIC ATAU REALISTIC ?

    Directory of Open Access Journals (Sweden)

    I Nyoman Wijana Asmara Putra

    2011-01-01

Earnings management is still an attractive issue. It is often associated with negative behavior conducted by management for its own interest. In fact, it also has a different side to be examined: there are other motivations for doing so, such as improving the company's operations. This literature study aims to review management's motivations for earnings management, whether opportunistic or realistic: what conflicts earnings management brings, the pros and cons about it, what would happen if earnings were not managed, and whether the company would be better off or worse off.

  11. Environmental accounting and statistics

    International Nuclear Information System (INIS)

    Bartelmus, P.L.P.

    1992-01-01

    The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs

  12. Realistic minimum accident source terms - Evaluation, application, and risk acceptance

    International Nuclear Information System (INIS)

    Angelo, P. L.

    2009-01-01

The evaluation, application, and risk acceptance for realistic minimum accident source terms can represent a complex and arduous undertaking. This effort poses a very high impact on design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tennessee, two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)

  13. Game statistics for the island of Olkiluoto in 2010-2011

    International Nuclear Information System (INIS)

    Nieminen, M.; Niemi, M.; Jussila, I.

    2011-10-01

    The game statistics for the island of Olkiluoto were updated in the summer 2011 and compared with earlier statistics. Population size estimates are based on interviews of the local hunters. No moose or deer inventories were made in the winter 2010-2011. The moose population is stable when compared with the previous year. The white-tailed deer population is stable or slightly increasing when compared with the previous year. The changes in the roe deer population are not accurately known, but population size varies somewhat from year to year. The number of hunted raccoon dogs approximately doubled in the latest hunting season. Altogether two waterfowl were hunted in 2010 (17 in the previous year). The populations of mountain hare and red squirrel are abundant, and the number of hunted mountain hares approximately doubled when compared with the previous hunting season. The brown hare population is still small. In the winter, there were observations of one lynx spending time in the area. (orig.)

  14. Game statistics for the island of Olkiluoto in 2010-2011

    Energy Technology Data Exchange (ETDEWEB)

    Nieminen, M. [Faunatica Oy, Espoo (Finland); Niemi, M. [Helsinki Univ. (Finland), Dept. of Forest Sciences; Jussila, I. [Turku Univ. (Finland), Satakunta Environmental Research Inst.

    2011-10-15

    The game statistics for the island of Olkiluoto were updated in the summer 2011 and compared with earlier statistics. Population size estimates are based on interviews of the local hunters. No moose or deer inventories were made in the winter 2010-2011. The moose population is stable when compared with the previous year. The white-tailed deer population is stable or slightly increasing when compared with the previous year. The changes in the roe deer population are not accurately known, but population size varies somewhat from year to year. The number of hunted raccoon dogs approximately doubled in the latest hunting season. Altogether two waterfowl were hunted in 2010 (17 in the previous year). The populations of mountain hare and red squirrel are abundant, and the number of hunted mountain hares approximately doubled when compared with the previous hunting season. The brown hare population is still small. In the winter, there were observations of one lynx spending time in the area. (orig.)

  15. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)

  17. Realistic diversity loss and variation in soil depth independently affect community-level plant nitrogen use.

    Science.gov (United States)

    Selmants, Paul C; Zavaleta, Erika S; Wolf, Amelia A

    2014-01-01

    Numerous experiments have demonstrated that diverse plant communities use nitrogen (N) more completely and efficiently, with implications for how species conservation efforts might influence N cycling and retention in terrestrial ecosystems. However, most such experiments have randomly manipulated species richness and minimized environmental heterogeneity, two design aspects that may reduce applicability to real ecosystems. Here we present results from an experiment directly comparing how realistic and randomized plant species losses affect plant N use across a gradient of soil depth in a native-dominated serpentine grassland in California. We found that the strength of the species richness effect on plant N use did not increase with soil depth in either the realistic or randomized species loss scenarios, indicating that the increased vertical heterogeneity conferred by deeper soils did not lead to greater complementarity among species in this ecosystem. Realistic species losses significantly reduced plant N uptake and altered N-use efficiency, while randomized species losses had no effect on plant N use. Increasing soil depth positively affected plant N uptake in both loss order scenarios but had a weaker effect on plant N use than did realistic species losses. Our results illustrate that realistic species losses can have functional consequences that differ distinctly from those of randomized losses, and that species diversity effects can be independent of and outweigh those of environmental heterogeneity on ecosystem functioning. Our findings also support the value of conservation efforts aimed at maintaining biodiversity to help buffer ecosystems against increasing anthropogenic N loading.

  18. Nuclear properties with realistic Hamiltonians through spectral distribution theory

    International Nuclear Information System (INIS)

    Vary, J.P.; Belehrad, R.; Dalton, B.J.

    1979-01-01

    Motivated by the need for non-perturbative methods for utilizing realistic nuclear Hamiltonians H, the authors use spectral distribution theory, based on calculated moments of H, to obtain specific bulk and valence properties of finite nuclei. The primary emphasis here is to present results for the binding energies of nuclei obtained with and without an assumed core. (Auth.)

  19. Automated Finger Spelling by Highly Realistic 3D Animation

    Science.gov (United States)

    Adamo-Villani, Nicoletta; Beni, Gerardo

    2004-01-01

    We present the design of a new 3D animation tool for self-teaching (signing and reading) finger spelling, the first basic component in learning any sign language. We have designed a highly realistic hand with natural animation of the finger motions. Smoothness of motion (in real time) is achieved via programmable blending of animation segments. The…

  20. Elements of a realistic 17 GHz FEL/TBA design

    International Nuclear Information System (INIS)

    Hopkins, D.B.; Halbach, K.; Hoyer, E.H.; Sessler, A.M.; Sternbach, E.J.

    1989-01-01

    Recently, renewed interest in an FEL version of a two-beam accelerator (TBA) has prompted a study of practical system and structure designs for achieving the specified physics goals. This paper presents elements of a realistic design for an FEL/TBA suitable for a 1 TeV, 17 GHz linear collider. 13 refs., 8 figs., 2 tabs

  1. Realistic-contact-induced enhancement of rectifying in carbon-nanotube/graphene-nanoribbon junctions

    International Nuclear Information System (INIS)

    Zhang, Xiang-Hua; Li, Xiao-Fei; Wang, Ling-Ling; Xu, Liang; Luo, Kai-Wu

    2014-01-01

    Carbon-nanotube/graphene-nanoribbon junctions were recently fabricated by the controllable etching of single-walled carbon-nanotubes [Wei et al., Nat. Commun. 4, 1374 (2013)], and their electronic transport properties are studied here. First-principles results reveal that the transmission function of the junctions shows a strong dependence on the shape of the contacts, but rectifying is an inherent property that is insensitive to the details of the contacts. Interestingly, the rectifying ratio is largely enhanced in the junction with a realistic contact, and the enhancement is insensitive to the details of the contact structures. The robustness of the rectifying behavior suggests that realistic all-carbon rectifiers are feasible in nanoelectronics

  2. Travel for the 2004 American Statistical Association Biannual Radiation Meeting: "Radiation in Realistic Environments: Interactions Between Radiation and Other Factors

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, David J.

    2009-07-21

    The 16th ASA Conference on Radiation and Health, held June 27-30, 2004 in Beaver Creek, CO, offered a unique forum for discussing research related to the effects of radiation exposures on human health in a multidisciplinary setting. The Conference furnishes investigators in health-related disciplines the opportunity to learn about new quantitative approaches to their problems and furnishes statisticians the opportunity to learn about new applications for their discipline. The Conference was attended by about 60 scientists including statisticians, epidemiologists, biologists and physicists interested in radiation research. For the first time, ten recipients of Young Investigator Awards participated in the conference. The Conference began with a debate on the question: “Do radiation doses below 1 cGy increase cancer risks?” The keynote speaker was Dr. Martin Lavin, who gave a banquet presentation on the timely topic “How important is ATM?” The focus of the 2004 Conference on Radiation and Health was Radiation in Realistic Environments: Interactions Between Radiation and Other Risk Modifiers. The sessions of the conference included: Radiation, Smoking, and Lung Cancer; Interactions of Radiation with Genetic Factors: ATM; Radiation, Genetics, and Epigenetics; and Radiotherapeutic Interactions. The Conference on Radiation and Health is held bi-annually, and participants are looking forward to the 17th conference to be held in 2006.

  3. The relative greenhouse gas impacts of realistic dietary choices

    International Nuclear Information System (INIS)

    Berners-Lee, M.; Hoolohan, C.; Cammack, H.; Hewitt, C.N.

    2012-01-01

    The greenhouse gas (GHG) emissions embodied in 61 different categories of food are used, with information on the diet of different groups of the population (omnivorous, vegetarian and vegan), to calculate the embodied GHG emissions in different dietary scenarios. We calculate that the embodied GHG content of the current UK food supply is 7.4 kg CO₂e person⁻¹ day⁻¹, or 2.7 t CO₂e person⁻¹ y⁻¹. This gives total food-related GHG emissions of 167 Mt CO₂e (1 Mt = 10⁶ metric tonnes; CO₂e being the mass of CO₂ that would have the same global warming potential, when measured over 100 years, as a given mixture of greenhouse gases) for the entire UK population in 2009. This is 27% of total direct GHG emissions in the UK, or 19% of total GHG emissions from the UK, including those embodied in goods produced abroad. We calculate that potential GHG savings of 22% and 26% can be made by changing from the current UK-average diet to a vegetarian or vegan diet, respectively. Taking the average GHG saving from six vegetarian or vegan dietary scenarios compared with the current UK-average diet gives a potential national GHG saving of 40 Mt CO₂e y⁻¹. This is equivalent to a 50% reduction in current exhaust pipe emissions from the entire UK passenger car fleet. Hence realistic choices about diet can make substantial differences to embodied GHG emissions. - Highlights: ► We calculate the greenhouse gas emissions embodied in different diets. ► The embodied GHG content of the current UK food supply is 7.4 kg CO₂e person⁻¹ day⁻¹. ► Changing to a vegetarian or vegan diet reduces GHG emissions by 22–26%. ► Changing to a vegetarian or vegan diet would reduce UK GHG emissions by 40 Mt CO₂e y⁻¹.
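    The per-capita and national figures quoted above can be cross-checked with a few lines of arithmetic. The UK population value used below (~62 million in 2009) is an assumption inferred from the quoted totals, not a number stated in the record.

```python
# Cross-check of the GHG arithmetic quoted in the abstract above.
# ASSUMPTION: UK population of ~62 million in 2009 (implied by the totals,
# not stated explicitly in the record).
PER_DAY_KG = 7.4            # kg CO2e per person per day
POPULATION = 62e6           # assumed 2009 UK population

per_year_t = PER_DAY_KG * 365 / 1000          # tonnes CO2e per person per year
national_mt = per_year_t * POPULATION / 1e6   # Mt CO2e for the whole population

print(f"{per_year_t:.1f} t CO2e/person/y")    # 2.7, matching the record
print(f"{national_mt:.0f} Mt CO2e/y")         # 167, matching the record

# A 22-26% dietary saving on the national total (midpoint 24%):
print(f"{0.24 * national_mt:.0f} Mt CO2e/y saved")  # 40, matching the record
```

Agreement with all three quoted totals suggests the record's internal arithmetic is consistent.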

  4. Evaluation of Highly Realistic Training for Independent Duty Corpsmen Students

    Science.gov (United States)

    2015-05-21

    that he or she can perform desired actions or behaviors (Bandura, 1977). In the present study, three types of self-efficacy were assessed: general...such as resilience. References: Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral

  5. Framing the Real: Lefèbvre and NeoRealist Cinematic Space as Practice

    OpenAIRE

    Brancaleone, David

    2014-01-01

    In 1945 Roberto Rossellini's Neo-realist Rome, Open City set in motion an approach to cinema and its representation of real life – and by extension real spaces – that was to have international significance in film theory and practice. However, the re-use of the real spaces of the city, and elsewhere, as film sets in Neo-realist film offered (and offers) more than an influential aesthetic and set of cinematic theories. Through Neo-realism, it can be argued that we gain access to a cinematic re...

  6. Estimation of Potential Population Level Effects of Contaminants on Wildlife; FINAL

    International Nuclear Information System (INIS)

    Loar, J.M.

    2001-01-01

    The objective of this project is to provide DOE with improved methods to assess risks from contaminants to wildlife populations. The current approach for wildlife risk assessment consists of comparison of contaminant exposure estimates for individual animals to literature-derived toxicity test endpoints. These test endpoints are assumed to estimate thresholds for population-level effects. Moreover, species sensitivity to contaminants is one of several criteria to be considered when selecting assessment endpoints (EPA 1997 and 1998), yet data on the sensitivities of many birds and mammals are lacking. The uncertainties associated with this approach are considerable. First, because toxicity data are not available for most potential wildlife endpoint species, extrapolation of toxicity data from test species to the species of interest is required. There is no consensus on the most appropriate extrapolation method. Second, toxicity data are represented as statistical measures (e.g., NOAELs or LOAELs) that provide no information on the nature or magnitude of effects. The level of effect is an artifact of the replication and dosing regime employed, and does not indicate how effects might increase with increasing exposure. Consequently, slight exceedance of a LOAEL is not distinguished from greatly exceeding it. Third, the relationship of toxic effects on individuals to effects on populations is poorly estimated by existing methods. It is assumed that if the exposure of individuals exceeds levels associated with impaired reproduction, then population-level effects are likely. Uncertainty associated with this assumption is large because, depending on the reproductive strategy of a given species, comparable levels of reproductive impairment may result in dramatically different population-level responses.
This project included several tasks to address these problems: (1) investigation of the validity of the current allometric scaling approach for interspecies extrapolation

  7. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of pins expected to avoid DNB
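    The uncertainty propagation described above can be sketched as a toy Monte Carlo over a response surface. The quadratic surface, input distributions, and DNB limit below are invented placeholders, not the LYNX-derived model or the report's actual values.

```python
import random

random.seed(1)

# Hypothetical quadratic response surface for minimum DNBR as a function of
# two normalized input uncertainties (e.g., power and flow). Coefficients
# are illustrative placeholders, not LYNX-derived values.
def min_dnbr(power_err, flow_err):
    return 1.50 - 0.30 * power_err + 0.20 * flow_err - 0.05 * power_err**2

DNB_LIMIT = 1.30  # placeholder design limit
N = 100_000

# Propagate the input uncertainty distributions by direct Monte Carlo sampling.
failures = 0
for _ in range(N):
    p = random.gauss(0.0, 0.5)   # assumed input uncertainty distributions
    f = random.gauss(0.0, 0.5)
    if min_dnbr(p, f) < DNB_LIMIT:
        failures += 1

prob_dnb = failures / N
print(f"Estimated probability of min DNBR < {DNB_LIMIT}: {prob_dnb:.3f}")
```

The same sampling loop, run per pin, is what turns a response surface into the "statistical population of pins" the abstract refers to.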

  8. Bayesian inversion using a geologically realistic and discrete model space

    Science.gov (United States)

    Jaeggli, C.; Julien, S.; Renard, P.

    2017-12-01

    Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient handling of data by ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool. Nonetheless, it is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
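    As background for the McMC side of the comparison above, a minimal Metropolis sampler for a toy one-parameter inverse problem is sketched below. This is a generic illustration of the baseline technique, not the MPS-based algorithm of the record; the forward model, observation, and noise level are hypothetical.

```python
import math
import random

random.seed(3)

# Generic Metropolis sampler for a toy 1-D Bayesian inverse problem:
# infer a parameter k (say, log-permeability) from one noisy observation
# d = g(k) + noise. Illustrates the McMC baseline only, not the MPS method.
def forward(k):
    return 2.0 * k + 1.0          # placeholder linear forward model

d_obs, noise_sd = 4.0, 0.5        # hypothetical observation and noise level

def log_posterior(k):
    # Standard-normal prior on k plus a Gaussian data likelihood.
    return -0.5 * k**2 - 0.5 * ((forward(k) - d_obs) / noise_sd) ** 2

k, chain = 0.0, []
for _ in range(20_000):
    proposal = k + random.gauss(0.0, 0.3)        # random-walk proposal
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(k):
        k = proposal                             # Metropolis accept
    chain.append(k)

burned = chain[5000:]                            # discard burn-in
print(f"posterior mean k ~ {sum(burned) / len(burned):.2f}")
```

Replacing the random-walk proposal with conditional resimulation of a geostatistical model is, loosely, where MPS-based schemes depart from this textbook loop.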

  9. Measurement of time delays in gated radiotherapy for realistic respiratory motions

    International Nuclear Information System (INIS)

    Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L.

    2014-01-01

    Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients
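    The streak-length method described above reduces to a one-line calculation. The numbers below are hypothetical, chosen only to illustrate the 100 ms tolerance, not measured values from the study.

```python
# Beam-on/beam-off delay inferred from a film streak, as described above.
# All numbers are hypothetical illustrations, not measured values.
def gating_delay_ms(expected_mm, observed_mm, speed_mm_per_s):
    """Time delay implied by the shortfall between the expected and observed
    streak lengths at a known phantom speed."""
    return (expected_mm - observed_mm) / speed_mm_per_s * 1000.0

# Example: a 2 mm shortfall at 10 mm/s implies a 200 ms combined delay,
# exceeding the 100 ms AAPM TG-142 tolerance mentioned in the abstract.
delay = gating_delay_ms(expected_mm=40.0, observed_mm=38.0, speed_mm_per_s=10.0)
print(f"{delay:.0f} ms")  # 200 ms
```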

  10. Measurement of time delays in gated radiotherapy for realistic respiratory motions

    Energy Technology Data Exchange (ETDEWEB)

    Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L., E-mail: Wendy.Smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)

    2014-09-15

    Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients

  11. Response of secondary production and its components to multiple stressors in nematode field populations.

    NARCIS (Netherlands)

    Doroszuk, A.; Brake, te E.; Crespo-Gonzalez, D.; Kammenga, J.E.

    2007-01-01

    Realistic measures of the impact of individual or multiple stressors are important for ecological risk assessment. Although multiple anthropogenic stressors are common in human-dominated environments, knowledge of their influence on functional population parameters such as secondary production (P)

  12. Planetary populations in the mass-period diagram: A statistical treatment of exoplanet formation and the role of planet traps

    International Nuclear Information System (INIS)

    Hasegawa, Yasuhiro; Pudritz, Ralph E.

    2013-01-01

    The rapid growth of observed exoplanets has revealed the existence of several distinct planetary populations in the mass-period diagram. Two of the most surprising are (1) the concentration of gas giants around 1 AU and (2) the accumulation of a large number of low-mass planets with tight orbits, also known as super-Earths and hot Neptunes. We have recently shown that protoplanetary disks have multiple planet traps that are characterized by orbital radii in the disks and halt rapid type I planetary migration. By coupling planet traps with the standard core accretion scenario, we showed that one can account for the positions of planets in the mass-period diagram. In this paper, we demonstrate quantitatively that most gas giants formed at planet traps tend to end up around 1 AU, with most of these being contributed by dead zones and ice lines. We also show that a large fraction of super-Earths and hot Neptunes are formed as 'failed' cores of gas giants—this population being constituted by comparable contributions from dead zone and heat transition traps. Our results are based on the evolution of forming planets in an ensemble of disks where we vary only the lifetimes of disks and their mass accretion rates onto the host star. We show that a statistical treatment of the evolution of a large population of planetary cores caught in planet traps accounts for the existence of three distinct exoplanetary populations—the hot Jupiters, the more massive planets around r = 1 AU, and the short-period super-Earths and hot Neptunes. There are very few populations that feed into the large orbital radii characteristic of the imaged Jovian planets, which agrees with recent surveys. Finally, we find that low-mass planets in tight orbits become the dominant planetary population for low-mass stars (M* ≤ 0.7 M☉).

  13. Differing effects of attention in single-units and populations are well predicted by heterogeneous tuning and the normalization model of attention.

    Science.gov (United States)

    Hara, Yuko; Pestilli, Franco; Gardner, Justin L

    2014-01-01

    Single-unit measurements have reported many different effects of attention on contrast-response (e.g., contrast-gain, response-gain, additive-offset dependent on visibility), while functional imaging measurements have more uniformly reported increases in response across all contrasts (additive-offset). The normalization model of attention elegantly predicts the diversity of effects of attention reported in single-units well-tuned to the stimulus, but what predictions does it make for more realistic populations of neurons with heterogeneous tuning? Are predictions in accordance with population-scale measurements? We used functional imaging data from humans to determine a realistic ratio of attention-field to stimulus-drive size (a key parameter for the model) and predicted effects of attention in a population of model neurons with heterogeneous tuning. We found that within the population, neurons well-tuned to the stimulus showed a response-gain effect, while less-well-tuned neurons showed a contrast-gain effect. Averaged across the population, these disparate effects of attention gave rise to additive-offsets in contrast-response, similar to reports in human functional imaging as well as population averages of single-units. Differences in predictions for single-units and populations were observed across a wide range of model parameters (ratios of attention-field to stimulus-drive size and the amount of baseline response modifiable by attention), offering an explanation for disparity in physiological reports. Thus, by accounting for heterogeneity in tuning of realistic neuronal populations, the normalization model of attention can not only predict responses of well-tuned neurons, but also the activity of large populations of neurons. 
More generally, computational models can unify physiological findings across different scales of measurement, and make links to behavior, but only if factors such as heterogeneous tuning within a population are properly accounted for.
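    The normalization model referred to above (in the spirit of Reynolds and Heeger, 2009) can be sketched for a single orientation-tuned unit. The tuning width, attention gain, and semisaturation constant below are illustrative assumptions, not the fitted parameters of the study.

```python
import math

# Minimal sketch of the normalization model of attention for one
# orientation-tuned unit (in the spirit of Reynolds & Heeger, 2009).
# Tuning width, attention gain, and sigma are illustrative assumptions.

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def response(pref_orientation, stim_orientation, contrast,
             attended=True, attn_gain=2.0, tuning_width=30.0, sigma=0.1):
    drive = contrast * gauss(pref_orientation, stim_orientation, tuning_width)
    attn = attn_gain if attended else 1.0
    excitatory = attn * drive
    # Suppressive drive: crude approximation -- total stimulus energy
    # (contrast) scaled by attention; the full model pools over a population.
    suppressive = attn * contrast
    return excitatory / (suppressive + sigma)

# A well-tuned unit (pref = stimulus) vs. a poorly tuned one (pref 60 deg
# away), with and without attention, at mid contrast.
for pref in (0.0, 60.0):
    r_att = response(pref, 0.0, contrast=0.5, attended=True)
    r_un = response(pref, 0.0, contrast=0.5, attended=False)
    print(f"pref={pref:>4}: attended={r_att:.3f} unattended={r_un:.3f}")
```

Sweeping `contrast` for units with different `pref_orientation` values, then averaging, is the kind of heterogeneous-population exercise the abstract describes.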

  14. Realistic Planning Scenarios.

    Science.gov (United States)

    1987-07-01

    independent multiracial government, dominated primarily by the Zulu tribe and the local Asian population, had been proclaimed and aspired to control all of the...concentrated most of South Africa’s remaining English-speaking population, and by the reigning Chief of the Zulu tribe, speaking for the self-styled...Africa. Facilities in one or more northern African countries--Morocco, Egypt, Sudan, Kenya, Somalia--could be critical to U.S. military actions in the

  15. A Statistical Model for Generating a Population of Unclassified Objects and Radiation Signatures Spanning Nuclear Threats

    International Nuclear Information System (INIS)

    Nelson, K.; Sokkappa, P.

    2008-01-01

    This report describes an approach for generating a simulated population of plausible nuclear threat radiation signatures spanning a range of variability that could be encountered by radiation detection systems. In this approach, we develop a statistical model for generating random instances of smuggled nuclear material. The model is based on physics principles and bounding cases rather than on intelligence information or actual threat device designs. For this initial stage of work, we focus on random models using fissile material and do not address scenarios using non-fissile materials. The model has several uses. It may be used as a component in a radiation detection system performance simulation to generate threat samples for injection studies. It may also be used to generate a threat population to be used for training classification algorithms. In addition, we intend to use this model to generate an unclassified 'benchmark' threat population that can be openly shared with other organizations, including vendors, for use in radiation detection systems performance studies and algorithm development and evaluation activities. We assume that a quantity of fissile material is being smuggled into the country for final assembly and that shielding may have been placed around the fissile material. In terms of radiation signature, a nuclear weapon is basically a quantity of fissile material surrounded by various layers of shielding. Thus, our model of smuggled material is expected to span the space of potential nuclear weapon signatures as well. For computational efficiency, we use a generic 1-dimensional spherical model consisting of a fissile material core surrounded by various layers of shielding. The shielding layers and their configuration are defined such that the model can represent the potential range of attenuation and scattering that might occur. The materials in each layer and the associated parameters are selected from probability distributions that span the
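    The generic 1-D spherical model described above can be sketched as random sampling of a core mass plus layered shielding. The material list and distribution bounds below are invented placeholders, not the report's actual probability distributions.

```python
import random

random.seed(42)

# Sketch of the generic 1-D spherical threat model described above: a fissile
# core surrounded by randomly chosen shielding layers. Material names and
# parameter ranges are invented placeholders, not the report's distributions.
MATERIALS = ["lead", "steel", "polyethylene", "tungsten"]

def random_configuration(max_layers=3):
    config = {"core_mass_kg": random.uniform(4.0, 25.0), "layers": []}
    for _ in range(random.randint(0, max_layers)):
        config["layers"].append({
            "material": random.choice(MATERIALS),
            "thickness_cm": round(random.uniform(0.5, 10.0), 2),
        })
    return config

# Generate a small simulated population of smuggled-material configurations,
# e.g. for injection studies or classifier training as the abstract suggests.
population = [random_configuration() for _ in range(5)]
for cfg in population:
    print(cfg)
```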

  16. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.

  17. Non-Gaussianity and statistical anisotropy from vector field populated inflationary models

    CERN Document Server

    Dimastrogiovanni, Emanuela; Matarrese, Sabino; Riotto, Antonio

    2010-01-01

    We present a review of vector field models of inflation and, in particular, of the statistical anisotropy and non-Gaussianity predictions of models with SU(2) vector multiplets. Non-Abelian gauge groups introduce a richer amount of predictions compared to the Abelian ones, mostly because of the presence of vector fields self-interactions. Primordial vector fields can violate isotropy leaving their imprint in the comoving curvature fluctuations zeta at late times. We provide the analytic expressions of the correlation functions of zeta up to fourth order and an analysis of their amplitudes and shapes. The statistical anisotropy signatures expected in these models are important and, potentially, the anisotropic contributions to the bispectrum and the trispectrum can overcome the isotropic parts.

  18. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for the reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal-hydraulic analysis. The uncertainties of the correlation are represented by statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value, adjusted with penalties and deterministic factors. The search for a realistic value of the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) in strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
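    One standard nonparametric route to a 95/95 limit is Wilks' formula; this is a generic illustration of such criteria, not necessarily the TEE procedure described above. With n runs, the smallest observed DNBR bounds the 5th percentile with confidence 1 − 0.95ⁿ, so n = 59 suffices.

```python
import random

# Generic nonparametric 95/95 tolerance limit via Wilks' formula; a common
# reading of such criteria, not necessarily the TEE procedure above.
# With n runs, the smallest observed DNBR bounds the 5th percentile of the
# DNBR distribution with confidence 1 - 0.95**n.
n = 59
confidence = 1.0 - 0.95 ** n
print(f"n = {n}: confidence = {confidence:.3f}")  # exceeds 0.95, so 59 runs suffice

# Toy demonstration with a hypothetical DNBR distribution.
random.seed(0)
samples = [random.gauss(1.8, 0.15) for _ in range(n)]  # placeholder DNBR values
wilks_95_95_limit = min(samples)
print(f"95/95 lower tolerance limit estimate: {wilks_95_95_limit:.3f}")
```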

  19. A new statistical scission-point model fed with microscopic ingredients to predict fission fragments distributions

    International Nuclear Information System (INIS)

    Heinrich, S.

    2006-01-01

    The nuclear fission process is a very complex phenomenon and, even today, no realistic model describing the overall process is available. The work presented here deals with a theoretical description of fission-fragment distributions in mass, charge, energy and deformation. We have reconsidered and updated the B.D. Wilkins scission-point model. Our purpose was to test whether this statistical model, applied at the scission point and fed with the results of modern microscopic calculations, allows a quantitative description of the fission-fragment distributions. We calculate the energy surface available at the scission point as a function of the fragment deformations. This surface is obtained from a Hartree-Fock-Bogoliubov microscopic calculation, which guarantees a realistic description of the potential's dependence on the deformation of each fragment. The statistical balance is described by the level densities of the fragments. We have tried to avoid as much as possible the input of empirical parameters in the model. Our only parameter, the distance between the fragments at the scission point, is discussed by comparison with scission configurations obtained from fully dynamical microscopic calculations. The comparison between our results and experimental data is very satisfying and allows us to discuss the successes and limitations of our approach. We finally propose ideas to improve the model, in particular by applying dynamical corrections. (author)
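    The statistical balance via level densities can be illustrated with the Fermi-gas form ρ(E) ∝ exp(2√(aE)). The excitation energies and level-density parameters below are illustrative placeholders, not the microscopic HFB inputs described in the record.

```python
import math

# Illustration of a scission-point statistical balance: relative yields of two
# hypothetical fragment configurations weighted by Fermi-gas level densities,
# rho(E) ~ exp(2*sqrt(a*E)). Parameters are illustrative placeholders, not
# the microscopic HFB inputs used in the work described above.
def level_density(E_mev, a_per_mev):
    return math.exp(2.0 * math.sqrt(a_per_mev * E_mev))

# Two competing configurations with different available excitation energies
# (E* in MeV) and a common level-density parameter a (MeV^-1).
configs = {"symmetric": (15.0, 13.0), "asymmetric": (20.0, 13.0)}  # (E*, a)
weights = {name: level_density(E, a) for name, (E, a) in configs.items()}
total = sum(weights.values())
for name, w in weights.items():
    print(f"{name}: relative yield {w / total:.3f}")
```

The exponential sensitivity to available energy is what lets small differences in the scission-point energy surface produce strongly asymmetric yields.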

  20. Realistic modeling of chamber transport for heavy-ion fusion

    International Nuclear Information System (INIS)

    Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.

    2003-01-01

    Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions

  1. Towards a Realist Sociology of Education: A Polyphonic Review Essay

    Science.gov (United States)

    Grenfell, Michael; Hood, Susan; Barrett, Brian D.; Schubert, Dan

    2017-01-01

    This review essay evaluates Karl Maton's "Knowledge and Knowers: Towards a Realist Sociology of Education" as a recent examination of the sociological causes and effects of education in the tradition of the French social theorist Pierre Bourdieu and the British educational sociologist Basil Bernstein. Maton's book synthesizes the…

  2. Place of a Realistic Teacher Education Pedagogy in an ICT ...

    African Journals Online (AJOL)

    This article is based on a study undertaken to examine the impact of introducing a realistic teacher education pedagogy (RTEP) oriented learning environment supported by ICT on distance teacher education in Uganda. It gives an overview of the quality, quantity and training of teachers in primary and secondary schools

  3. Microcomputer package for statistical analysis of microbial populations.

    Science.gov (United States)

    Lacroix, J M; Lavoie, M C

    1987-11-01

    We have developed a Pascal system to compare microbial populations from different ecological sites using microcomputers. The values calculated are: the coverage value and its standard error, the minimum similarity and the geometric similarity between two biological samples, and the Lambda test, which consists of calculating the ratio of the mean similarity between two subsets to the mean similarity within subsets. This system is written for Apple II, IBM or compatible computers, but it can work on any computer that runs CP/M if the programs are recompiled for that system.
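
    The record names its comparison statistics without giving formulas. As a rough illustration only (not the Pascal system's actual code), the sketch below implements two standard population-comparison measures of the same flavour: Good's coverage estimate and a minimum-similarity (Renkonen) index. The taxon names and counts are invented.

```python
from collections import Counter

def goods_coverage(counts):
    """Good's coverage estimate: 1 - (number of singletons / total observations)."""
    n1 = sum(1 for c in counts if c == 1)
    return 1.0 - n1 / sum(counts)

def minimum_similarity(counts_a, counts_b):
    """Renkonen-style minimum similarity: sum over taxa of the minimum relative
    abundance in the two samples (0 = disjoint populations, 1 = identical)."""
    na, nb = sum(counts_a.values()), sum(counts_b.values())
    taxa = set(counts_a) | set(counts_b)
    return sum(min(counts_a.get(t, 0) / na, counts_b.get(t, 0) / nb) for t in taxa)

# Hypothetical counts from two oral sites
site1 = Counter({"strep": 40, "lacto": 10, "veillonella": 1})
site2 = Counter({"strep": 30, "lacto": 25, "neisseria": 5})
print(round(goods_coverage(list(site1.values())), 3))   # → 0.98
print(round(minimum_similarity(site1, site2), 3))       # → 0.696
```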

  4. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
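
    The "modelling kit" idea, building a new distribution by combining component quantile functions, can be sketched in a few lines. The specific components and weights below are illustrative choices, not examples taken from the book.

```python
import math
import random

# Component quantile functions (inverse CDFs)
def q_uniform(p):        # standard uniform: Q(p) = p
    return p

def q_exponential(p):    # standard exponential: Q(p) = -ln(1 - p)
    return -math.log(1.0 - p)

def add_quantiles(*qs):
    """Kit operation: the sum of non-decreasing quantile functions is itself
    a valid quantile function, defining a new composite distribution."""
    return lambda p: sum(q(p) for q in qs)

# A new right-skewed model from simple parts: Q(p) = 2*U(p) + Exp(p)
q_model = add_quantiles(lambda p: 2 * q_uniform(p), q_exponential)

# Inverse-transform sampling comes for free: push uniform p's through Q
random.seed(0)
sample = [q_model(random.random()) for _ in range(1000)]
print(q_model(0.5))  # the composite model's median, read straight off Q
```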

  5. Statistical intervals a guide for practitioners

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    Presents a detailed exposition of statistical intervals and emphasizes applications in industry. The discussion differentiates at an elementary level among different kinds of statistical intervals and gives instruction, with numerous examples and simple math, on how to construct such intervals from sample data. This includes confidence intervals to contain a population percentile, confidence intervals on the probability of meeting a specified threshold value, and prediction intervals to include an observation in a future sample. Also has an appendix containing computer subroutines for nonparametric stati
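
    As a small illustration of the book's subject matter (not an example from the book itself), here is one of the simplest statistical intervals: the distribution-free prediction interval built from the sample extremes, which needs no distributional assumption at all.

```python
import random

def nonparametric_prediction_interval(data):
    """Distribution-free prediction interval for one future observation:
    for a sample of n continuous i.i.d. values, the interval [min, max]
    contains the next observation with probability (n - 1)/(n + 1),
    whatever the parent distribution."""
    n = len(data)
    return min(data), max(data), (n - 1) / (n + 1)

random.seed(1)
sample = [random.gauss(100, 15) for _ in range(19)]   # n = 19 gives 90% coverage
lo, hi, cov = nonparametric_prediction_interval(sample)
print(f"[{lo:.1f}, {hi:.1f}] covers a future value with probability {cov:.0%}")
```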

  6. Statistical methods for including two-body forces in large system calculations

    International Nuclear Information System (INIS)

    Grimes, S.M.

    1980-07-01

    Large systems of interacting particles are often treated by assuming that the effect on any one particle of the remaining N-1 may be approximated by an average potential. This approach reduces the problem to that of finding the bound-state solutions for a particle in a potential; statistical mechanics is then used to obtain the properties of the many-body system. In some physical systems this approach may not be acceptable, because the two-body force component cannot be treated in this one-body limit. A technique for incorporating two-body forces in such calculations in a more realistic fashion is described. 1 figure

  7. A Statistical Framework for Microbial Source Attribution

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P; Allen, J E; Cunningham, C T

    2009-04-28

    This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node, thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them, possess certain compelling general properties in the context of microbial forensics. These include the ability to quantify the significance of a sequence 'match' or…
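
    A toy version of the second ingredient, the distribution of the separation M between two randomly sampled nodes of a transmission network, can be sketched as follows. The random-recursive-tree outbreak model below is an illustrative assumption, not the report's method.

```python
import random

def random_transmission_tree(n, rng):
    """Toy outbreak: each new case is infected by a uniformly chosen earlier
    case, giving a random recursive tree (node 0 is the index case)."""
    parent = {0: None}
    for node in range(1, n):
        parent[node] = rng.randrange(node)
    return parent

def transmission_distance(parent, a, b):
    """M = number of transmission events on the path between two nodes."""
    def chain(x):                       # ancestors of x, starting at x itself
        out = []
        while x is not None:
            out.append(x)
            x = parent[x]
        return out
    depth_b = {node: i for i, node in enumerate(chain(b))}
    for steps_a, node in enumerate(chain(a)):
        if node in depth_b:             # first shared ancestor = LCA
            return steps_a + depth_b[node]
    raise ValueError("nodes are not in the same tree")

# Empirical distribution of M over random node pairs
rng = random.Random(7)
tree = random_transmission_tree(50, rng)
dists = [transmission_distance(tree, rng.randrange(50), rng.randrange(50))
         for _ in range(2000)]
print(sum(dists) / len(dists))  # mean separation in transmission events
```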

  8. Music therapy for palliative care: A realist review.

    Science.gov (United States)

    McConnell, Tracey; Porter, Sam

    2017-08-01

    Music therapy has experienced a rising demand as an adjunct therapy for symptom management among palliative care patients. We conducted a realist review of the literature to develop a greater understanding of how music therapy might benefit palliative care patients and the contextual mechanisms that promote or inhibit its successful implementation. We searched electronic databases (CINAHL, Embase, Medline, and PsychINFO) for literature containing information on music therapy for palliative care. In keeping with the realist approach, we examined all relevant literature to develop theories that could explain how music therapy works. A total of 51 articles were included in the review. Music therapy was found to have a therapeutic effect on the physical, psychological, emotional, and spiritual suffering of palliative care patients. We also identified program mechanisms that help explain music therapy's therapeutic effects, along with facilitating contexts for implementation. Music therapy may be an effective nonpharmacological approach to managing distressing symptoms in palliative care patients. The findings also suggest that group music therapy may be a cost-efficient and effective way to support staff caring for palliative care patients. We encourage others to continue developing the evidence base in order to expand our understanding of how music therapy works, with the aim of informing and improving the provision of music therapy for palliative care patients.

  9. A Case Study in Elementary Statistics: The Florida Panther Population

    Science.gov (United States)

    Lazowski, Andrew; Stopper, Geffrey

    2013-01-01

    We describe a case study that was created to intertwine the fields of biology and mathematics. This project is given in an elementary probability and statistics course for non-math majors. Some goals of this case study include: to expose students to biology in a math course, to apply probability to real-life situations, and to display how far a…

  10. Phenomenology of a realistic accelerating universe using tracker fields

    Indian Academy of Sciences (India)

    We present a realistic scenario of tracking of scalar fields with varying equation of state. The astrophysical constraints on the evolution of scalar fields in the physical universe are discussed. The nucleosynthesis and the galaxy formation constraints have been used to put limits on and estimate during cosmic evolution.

  11. Empirical Evidence for Niss' "Implemented Anticipation" in Mathematising Realistic Situations

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2012-01-01

    Mathematisation of realistic situations is an on-going focus of research. Classroom data from a Year 9 class participating in a program of structured modelling of real situations was analysed for evidence of Niss's theoretical construct, implemented anticipation, during mathematisation. Evidence was found for two of three proposed aspects. In…

  12. Rehand: Realistic electric prosthetic hand created with a 3D printer.

    Science.gov (United States)

    Yoshikawa, Masahiro; Sato, Ryo; Higashihara, Takanori; Ogasawara, Tsukasa; Kawashima, Noritaka

    2015-01-01

    Myoelectric prosthetic hands provide an appearance with five fingers and a grasping function to forearm amputees. However, they have problems in weight, appearance, and cost. This paper reports on the Rehand, a realistic electric prosthetic hand created with a 3D printer. It provides a realistic appearance, the same as that of a cosmetic prosthetic hand, and a grasping function. A simple link mechanism with one linear actuator for grasping and 3D printed parts achieve low cost, light weight, and ease of maintenance. An operating system based on a distance sensor provides natural operability equivalent to a myoelectric control system. A supporter socket allows users to wear the prosthetic hand easily. An evaluation using the Southampton Hand Assessment Procedure (SHAP) demonstrated that an amputee was able to operate various objects and perform everyday activities with the Rehand.

  13. A scan for models with realistic fermion mass patterns

    International Nuclear Information System (INIS)

    Bijnens, J.; Wetterich, C.

    1986-03-01

    We consider models which have no small Yukawa couplings unrelated to symmetry. This situation is generic in higher dimensional unification where Yukawa couplings are predicted to have strength similar to the gauge couplings. Generations have then to be differentiated by symmetry properties and the structure of fermion mass matrices is given in terms of quantum numbers alone. We scan possible symmetries leading to realistic mass matrices. (orig.)

  14. Stochastic population dynamics under resource constraints

    Energy Technology Data Exchange (ETDEWEB)

    Gavane, Ajinkya S., E-mail: ajinkyagavane@gmail.com; Nigam, Rahul, E-mail: rahul.nigam@hyderabad.bits-pilani.ac.in [BITS Pilani Hyderabad Campus, Shameerpet, Hyd - 500078 (India)

    2016-06-02

    This paper investigates the population growth of a certain species in which every generation reproduces three times over a predefined period, under certain constraints on the resources needed for survival of the population. We study the survival period of a species by randomizing the reproduction probabilities within a window at the same predefined ages, while the resources are produced by the working force of the population at a variable rate. This randomness in the reproduction rate makes the population growth stochastic in nature, and one cannot predict the exact form of the evolution. Hence we study the growth by running simulations for such a population and taking an ensemble average over 500 to 5000 such simulations, as needed. While the population reproduces in a stochastic manner, we have implemented a constraint on the amount of resources available to the population. This is important to make the simulations more realistic. The rate of resource production is then tuned to find the rate which suits the survival of the species. We also compute the mean lifetime of the species corresponding to different resource production rates. A study of these outcomes in the parameter space defined by the reproduction probabilities and the rate of resource production is carried out.
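
    A minimal sketch of this kind of simulation, stochastic reproduction at fixed ages under a resource constraint, with an ensemble average over many runs, might look as follows. All parameters (reproduction ages, probability window, initial population, lifespan) are illustrative guesses, not the paper's values.

```python
import random

def simulate(resource_rate, horizon=200, seed=None):
    """One stochastic run: each individual may reproduce at ages 2, 4 and 6
    with a probability drawn from a window, consumes one resource unit per
    step, dies at age 8, and the population produces resources at
    `resource_rate` per capita. Returns the extinction time (or `horizon`)."""
    rng = random.Random(seed)
    ages, resources = [0] * 10, 50.0
    for t in range(horizon):
        resources += resource_rate * len(ages) - len(ages)  # production - consumption
        if resources <= 0 or not ages:
            return t                                        # species dies out
        newborns = sum(1 for a in ages if a in (2, 4, 6)
                       and rng.random() < rng.uniform(0.2, 0.5))
        ages = [a + 1 for a in ages if a < 8] + [0] * newborns
    return horizon                                          # survived the whole horizon

# Ensemble average of the survival time over many stochastic runs
runs = [simulate(resource_rate=0.9, seed=s) for s in range(500)]
print(sum(runs) / len(runs))  # mean lifetime at this resource production rate
```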

  15. Reinventing Sex: The Construction of Realistic Definitions of Sex and Gender.

    Science.gov (United States)

    Small, Chanley M.

    1998-01-01

    Presents a set of criteria for constructing a fair and realistic understanding of sex. Recognizes the impact that science can have on social policies and values and recommends that the definitions of sex and gender be carefully crafted. (DDR)

  16. Computational investigation of nonlinear microwave tomography on anatomically realistic breast phantoms

    DEFF Research Database (Denmark)

    Jensen, P. D.; Rubæk, Tonny; Mohr, J. J.

    2013-01-01

    The performance of a nonlinear microwave tomography algorithm is tested using simulated data from anatomically realistic breast phantoms. These tests include several different anatomically correct breast models from the University of Wisconsin-Madison repository with and without tumors inserted....

  17. Micromechanical Modeling of Fiber-Reinforced Composites with Statistically Equivalent Random Fiber Distribution

    Directory of Open Access Journals (Sweden)

    Wenzhi Wang

    2016-07-01

    Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a statistically equivalent fiber distribution against the actual material microstructure. Realistic statistical data are utilized as inputs of the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. It is found that the proposed method presents a good match with experimental results in all aspects, including the nearest neighbor distance, nearest neighbor orientation, Ripley’s K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insights into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs for not only fiber-reinforced composites but also other materials such as foam materials and particle-reinforced composites.
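
    Two of the ingredients mentioned, non-overlapping fiber placement in an RVE and the nearest-neighbor distance descriptor, can be sketched as below. Plain random sequential adsorption is used as a stand-in generator, not the paper's probability-equation algorithm, and all dimensions are made up.

```python
import math
import random

def generate_rve(n_fibers, size, radius, rng):
    """Random sequential adsorption sketch: drop fiber centres uniformly in a
    square RVE, rejecting any placement that overlaps an existing fiber."""
    centres = []
    while len(centres) < n_fibers:
        x = rng.uniform(radius, size - radius)
        y = rng.uniform(radius, size - radius)
        if all(math.hypot(x - cx, y - cy) >= 2 * radius for cx, cy in centres):
            centres.append((x, y))
    return centres

def nearest_neighbour_distances(centres):
    """First nearest-neighbour distance for each fiber: one of the statistical
    descriptors used to compare a generated RVE with the real microstructure."""
    return [min(math.hypot(x - cx, y - cy)
                for j, (cx, cy) in enumerate(centres) if j != i)
            for i, (x, y) in enumerate(centres)]

rng = random.Random(42)
centres = generate_rve(n_fibers=30, size=100.0, radius=3.5, rng=rng)
d = nearest_neighbour_distances(centres)
print(min(d) >= 7.0, sum(d) / len(d))  # no overlaps by construction; mean spacing
```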

  18. Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models

    NARCIS (Netherlands)

    Koppert, M.M.J.; Kalitzin, S.; Lopes da Silva, F.H.; Viergever, M.A.

    2011-01-01

    In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models eliciting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations. This

  19. Using realist synthesis to understand the mechanisms of interprofessional teamwork in health and social care.

    Science.gov (United States)

    Hewitt, Gillian; Sims, Sarah; Harris, Ruth

    2014-11-01

    Realist synthesis offers a novel and innovative way to interrogate the large literature on interprofessional teamwork in health and social care teams. This article introduces realist synthesis and its approach to identifying and testing the underpinning processes (or "mechanisms") that make an intervention work, the contexts that trigger those mechanisms and their subsequent outcomes. A realist synthesis of the evidence on interprofessional teamwork is described. Thirteen mechanisms were identified in the synthesis, and findings for one mechanism, called "Support and value", are presented in this paper. The evidence for the other twelve mechanisms ("collaboration and coordination", "pooling of resources", "individual learning", "role blurring", "efficient, open and equitable communication", "tactical communication", "shared responsibility and influence", "team behavioural norms", "shared responsibility and influence", "critically reviewing performance and decisions", "generating and implementing new ideas" and "leadership") is reported in a further three papers in this series. The "support and value" mechanism referred to the ways in which team members supported one another, respected each other's skills and abilities and valued each other's contributions. "Support and value" was present in some, but far from all, teams, and a number of contexts that explained this variation were identified. The article concludes with a discussion of the challenges and benefits of undertaking this realist synthesis.

  20. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI). The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used in a wide range of methods of analysis, such as the finite element method (FEM), the boundary element method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  1. Blending critical realist and emancipatory practice development methodologies: making critical realism work in nursing research.

    LENUS (Irish Health Repository)

    Parlour, Randal

    2012-12-01

    This paper examines the efficacy of facilitation as a practice development intervention in changing practice within an Older Person setting and in implementing evidence into practice. It outlines the influences exerted by the critical realist paradigm in guiding emancipatory practice development activities and, in particular, how the former may be employed within an emancipatory practice development study to elucidate and increase understanding pertinent to causation and outcomes. The methodology is based upon an emancipatory practice development approach set within a realistic evaluation framework. This allows for systematic analysis of the social and contextual elements that influence the explication of outcomes associated with facilitation. The study is concentrated upon five practice development cycles, within which a sequence of iterative processes is integrated. The authors assert that combining critical realist and emancipatory processes offers a robust and practical method for translating evidence and implementing changes in practice, as the former affirms or falsifies the influence that emancipatory processes exert on attaining culture shift, and enabling transformation towards effective clinical practice. A new framework for practice development is proposed that establishes methodological coherency between emancipatory practice development and realistic evaluation. This augments the existing theoretical bases for both these approaches by contributing new theoretical and methodological understandings of causation.

  2. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have attracted a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of sample size or related statistical information, and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  3. ObamaNet: Photo-realistic lip-sync from text

    OpenAIRE

    Kumar, Rithesh; Sotelo, Jose; Kumar, Kundan; de Brebisson, Alexandre; Bengio, Yoshua

    2017-01-01

    We present ObamaNet, the first architecture that generates both audio and synchronized photo-realistic lip-sync videos from any new text. Contrary to other published lip-sync approaches, ours is only composed of fully trainable neural modules and does not rely on any traditional computer graphics methods. More precisely, we use three main modules: a text-to-speech network based on Char2Wav, a time-delayed LSTM to generate mouth-keypoints synced to the audio, and a network based on Pix2Pix to ...

  4. Ultra-Reliable Communications in Failure-Prone Realistic Networks

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Lauridsen, Mads; Alvarez, Beatriz Soret

    2016-01-01

    We investigate the potential of different diversity and interference management techniques to achieve the required downlink SINR outage probability for ultra-reliable communications. The evaluation is performed in a realistic network deployment based on site-specific data from a European capital. Micro and macroscopic diversity techniques are proved to be important enablers of ultra-reliable communications. Particularly, it is shown how a 4x4 MIMO scheme with three orders of macroscopic diversity can achieve the required SINR outage performance. Smaller gains are obtained from interference

  5. Realist Stronghold in the Land of Thucydides? - Appraising and Resisting a Realist Tradition in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos Mikelis

    2015-10-01

    Given the integration of the discipline of International Relations in Greece into the global discipline over the past few decades, the article addresses the reflection of the ‘realism in and for the globe’ question in this specific case. Although the argument doesn’t go as far as to ‘recover’ forgotten IR theorists or self-proclaimed realists, a geopolitical dimension of socio-economic thought during the interwar period addressed concerns which could be related to the intricacies of realpolitik. Then again, at current times, certain scholars have been eager to maintain a firm stance in favor of realism, focusing on the work of ancient figures, especially Thucydides or Homer, and on questions of the offensive-defensive realism debate as well as on the connection with the English School, while others have offered fruitful insights matching the broad constructivist agenda. Overall, certain genuine arguments have appeared, reflecting diversified views about sovereignty and its function or mitigation.

  6. Realistically Rendering SoC Traffic Patterns with Interrupt Awareness

    DEFF Research Database (Denmark)

    Angiolini, Frederico; Mahadevan, Sharkar; Madsen, Jan

    2005-01-01

    to generate realistic test traffic. This paper presents a selection of applications using interrupt-based synchronization; a reference methodology to split such applications into execution subflows and to adjust the overall execution stream based upon hardware events; and a reactive simulation device capable of correctly replicating such software behaviours in the MPSoC design phase. Additionally, we validate the proposed concept by showing cycle-accurate reproduction of a previously traced application flow.

  7. Statistical properties of antisymmetrized molecular dynamics for non-nucleon-emission and nucleon-emission processes

    International Nuclear Information System (INIS)

    Ono, A.; Horiuchi, H.

    1996-01-01

    Statistical properties of antisymmetrized molecular dynamics (AMD) are classical in the case of nucleon-emission processes, while they are quantum mechanical for the processes without nucleon emission. In order to understand this situation, we first clarify that there coexist two mutually opposite statistics in the AMD framework: one is the classical statistics of the motion of wave packet centroids and the other is the quantum statistics of the motion of wave packets, which is described by the AMD wave function. We prove the classical statistics of wave packet centroids by using the framework of the microcanonical ensemble of the nuclear system with a realistic effective two-nucleon interaction. We show that the relation between the classical statistics of wave packet centroids and the quantum statistics of wave packets can be obtained by taking into account the effects of the wave packet spread. This relation clarifies how the quantum statistics of wave packets emerges from the classical statistics of wave packet centroids. It is emphasized that the temperature of the classical statistics of wave packet centroids is different from the temperature of the quantum statistics of wave packets. We then explain that the statistical properties of AMD for nucleon-emission processes are classical because nucleon-emission processes in AMD are described by the motion of wave packet centroids. We further show that when we improve the description of the nucleon-emission process so as to take into account the momentum fluctuation due to the wave packet spread, the AMD statistical properties for nucleon-emission processes change drastically into quantum statistics. Our study of nucleon-emission processes can be conversely regarded as giving another kind of proof of the fact that the statistics of wave packets is quantum mechanical while that of wave packet centroids is classical. copyright 1996 The American Physical Society

  8. Realistic Simulations of Coronagraphic Observations with WFIRST

    Science.gov (United States)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.

  9. UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Mueller, Matthias; Casser, Vincent; Lahoud, Jean; Smith, Neil; Ghanem, Bernard

    2017-01-01

    We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full featured physics based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  10. UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Mueller, Matthias

    2017-08-19

    We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full featured physics based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  11. Sim4CV: A Photo-Realistic Simulator for Computer Vision Applications

    KAUST Repository

    Müller, Matthias

    2018-03-24

We present a photo-realistic training and evaluation simulator (Sim4CV) (http://www.sim4cv.org) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full-featured, physics-based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.

  12. Operator representation for effective realistic interactions

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Dennis; Feldmeier, Hans; Neff, Thomas [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany)

    2013-07-01

We present a method to derive an operator representation from the partial wave matrix elements of effective realistic nucleon-nucleon potentials. This method makes it possible to employ modern effective interactions, which are mostly given in matrix element representation, also in nuclear many-body methods that explicitly require the operator representation, for example ''Fermionic Molecular Dynamics'' (FMD). We present results for the operator representation of effective interactions obtained from the Argonne V18 potential with the ''Unitary Correlation Operator Method'' (UCOM) and the ''Similarity Renormalization Group'' (SRG). Moreover, the operator representation allows better insight into the nonlocal structure of the potential: while the UCOM-transformed potential shows only a quadratic momentum dependence, the momentum dependence of SRG-transformed potentials goes beyond such a simple polynomial form.

  13. Design of a digital phantom population for myocardial perfusion SPECT imaging research

    International Nuclear Information System (INIS)

    Ghaly, Michael; Du, Yong; Fung, George S K; Tsui, Benjamin M W; Frey, Eric; Links, Jonathan M

    2014-01-01

    Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48–184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as
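The post-simulation organ summing described above can be sketched as follows. This is a minimal illustration, not the study's code: the organ names, the 64x64 projection size, and the uptake values are all placeholder assumptions.

```python
import numpy as np

# Sketch of post-simulation organ summing: each organ is simulated once
# per unit activity, then its projection is scaled by organ uptake and the
# scaled projections are summed, so uptake variations need no new Monte
# Carlo runs. All names and values below are illustrative.
rng = np.random.default_rng(42)
ORGANS = ["myocardium", "liver", "lungs", "body"]
# Stand-ins for noise-free per-unit-activity projections of each organ.
unit_proj = {o: rng.random((64, 64)) for o in ORGANS}

def summed_projection(uptakes, counts_target=None):
    """Scale each organ's unit projection by its uptake and sum them.
    If counts_target is given, add Poisson noise at that total count level."""
    total = sum(uptakes[o] * unit_proj[o] for o in ORGANS)
    if counts_target is not None:
        scale = counts_target / total.sum()
        total = rng.poisson(total * scale) / scale
    return total

proj = summed_projection({"myocardium": 5.0, "liver": 3.0, "lungs": 1.0, "body": 0.5})
print(proj.shape)
```

Because the noise-free summation is linear in the uptakes, a whole phantom population with varying organ uptakes can be produced from one set of simulated organ projections.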

  14. First-Generation Transgenic Plants and Statistics

    NARCIS (Netherlands)

    Nap, Jan-Peter; Keizer, Paul; Jansen, Ritsert

    1993-01-01

    The statistical analyses of populations of first-generation transgenic plants are commonly based on mean and variance and generally require a test of normality. Since in many cases the assumptions of normality are not met, analyses can result in erroneous conclusions. Transformation of data to

  15. A Generalized Pyramid Matching Kernel for Human Action Recognition in Realistic Videos

    Directory of Open Access Journals (Sweden)

    Wenjun Zhang

    2013-10-01

Human action recognition is an increasingly important research topic in the fields of video sensing, analysis and understanding. Owing to unconstrained sensing conditions, there exist large intra-class variations and inter-class ambiguities in realistic videos, which hinder the improvement of recognition performance for recent vision-based action recognition systems. In this paper, we propose a generalized pyramid matching kernel (GPMK) for recognizing human actions in realistic videos, based on a multi-channel “bag of words” representation constructed from local spatial-temporal features of video clips. As an extension of the spatial-temporal pyramid matching (STPM) kernel, the GPMK leverages heterogeneous visual cues across multiple feature descriptor types and spatial-temporal grid granularity levels to build a valid similarity metric between two video clips for kernel-based classification. Instead of the predefined, fixed weights used in STPM, we present a simple yet effective method to compute adaptive channel weights for the GPMK based on kernel target alignment on the training data. It incorporates prior knowledge and the data-driven information of different channels in a principled way. The experimental results on three challenging video datasets (i.e., Hollywood2, YouTube and HMDB51) validate the superiority of our GPMK w.r.t. the traditional STPM kernel for realistic human action recognition and outperform the state-of-the-art results in the literature.
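The alignment-based channel weighting can be sketched in a few lines. This is a minimal illustration using the standard kernel-target alignment formula; the toy kernels below stand in for real STPM channel kernels and are not the paper's features.

```python
import numpy as np

# Sketch of adaptive channel weighting via kernel-target alignment:
# each channel kernel is weighted by how well it matches the label
# structure yy^T on training data, then the channels are combined.
def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F ||yy^T||_F)."""
    Y = np.outer(y, y)
    return float((K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y)))

def combine_channels(kernels, y):
    """Weight channels by non-negative, normalized alignment and sum."""
    w = np.array([max(alignment(K, y), 0.0) for K in kernels])
    w = w / w.sum()
    return sum(wi * Ki for wi, Ki in zip(w, kernels)), w

rng = np.random.default_rng(0)
y = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
informative = np.outer(y, y) + 0.1 * rng.standard_normal((8, 8))  # label-aligned channel
noise = rng.standard_normal((8, 8))                               # uninformative channel
K_combined, weights = combine_channels([informative, noise], y)
print(weights)  # the informative channel receives the larger weight
```

The combined kernel can then be passed to any kernel-based classifier in place of a single fixed-weight pyramid kernel.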

  16. Improving UWB-Based Localization in IoT Scenarios with Statistical Models of Distance Error.

    Science.gov (United States)

    Monica, Stefania; Ferrari, Gianluigi

    2018-05-17

Interest in the Internet of Things (IoT) is rapidly increasing, as the number of connected devices is exponentially growing. One of the application scenarios envisaged for IoT technologies involves indoor localization and context awareness. In this paper, we focus on a localization approach that relies on a particular type of communication technology, namely Ultra Wide Band (UWB). UWB technology is an attractive choice for indoor localization, owing to its high accuracy. Since localization algorithms typically rely on estimated inter-node distances, the goal of this paper is to evaluate the improvement brought by a simple (linear) statistical model of the distance error. On the basis of an extensive experimental measurement campaign, we propose a general analytical framework, based on a Least Square (LS) method, to derive a novel statistical model for the range estimation error between a pair of UWB nodes. The proposed statistical model is then applied to improve the performance of a few illustrative localization algorithms in various realistic scenarios. The obtained experimental results show that the use of the proposed statistical model improves the accuracy of the considered localization algorithms, reducing the localization error by up to 66%.
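A minimal sketch of the idea of fitting an affine range-error model by least squares and inverting it to correct raw range estimates. The calibration data below are synthetic stand-ins, not the paper's measurement campaign, and the coefficients are illustrative.

```python
import numpy as np

# Sketch of a linear (affine) range-error model fitted by least squares:
# measured = a * true + b + noise. All numbers are synthetic assumptions.
rng = np.random.default_rng(0)
true_d = np.linspace(1.0, 10.0, 50)                      # calibration distances [m]
measured = 1.03 * true_d + 0.15 + rng.normal(0, 0.02, true_d.size)

# Fit the model on the calibration pairs ...
A = np.vstack([true_d, np.ones_like(true_d)]).T
(a, b), *_ = np.linalg.lstsq(A, measured, rcond=None)

# ... then invert it to correct new range estimates before localization.
def corrected(raw_range):
    return (raw_range - b) / a

residual = abs(corrected(1.03 * 5.0 + 0.15) - 5.0)
print(a, b, residual)  # a near 1.03, b near 0.15, residual near zero
```

The corrected ranges would then be fed to a localization algorithm (e.g. trilateration) in place of the raw UWB range estimates.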

  17. Statistical methods with applications to demography and life insurance

    CERN Document Server

    Khmaladze, Estáte V

    2013-01-01

    Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Itô integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addi...

  18. Report of the workshop on realistic SSC lattices

    International Nuclear Information System (INIS)

    1985-10-01

    A workshop was held at the SSC Central Design Group from May 29 to June 4, 1985, on topics relating to the lattice of the SSC. The workshop marked a shift of emphasis from the investigation of simplified test lattices to the development of a realistic lattice suitable for the conceptual design report. The first day of the workshop was taken up by reviews of accelerator system requirements, of the reference design solutions for these requirements, of lattice work following the reference design, and of plans for the workshop. The work was divided among four working groups. The first, chaired by David Douglas, concerned the arcs of regular cells. The second group, which studied the utility insertions, was chaired by Beat Leemann. The third group, under David E. Johnson, concerned itself with the experimental insertions, dispersion suppressors, and phase trombones. The fourth group, responsible for global lattice considerations and the design of a new realistic lattice example, was led by Ernest Courant. The papers resulting from this workshop are roughly divided into three sets: those relating to specific lattice components, to complete lattices, and to other topics. Among the salient accomplishments of the workshop were additions to and optimization of lattice components, especially those relating to lattices using 1-in-1 magnets, either horizontally or vertically separated, and the design of complete lattice examples. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database

  19. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. In contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
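The classical homogeneously mixing (mean-field) baseline that such reviews depart from can be sketched as an SIR model with a vaccinated fraction moved directly into the recovered/immune class. All parameter values below are illustrative assumptions, not figures from the report.

```python
# Mean-field SIR with vaccination coverage p: a fraction p starts immune.
# beta: transmission rate, gamma: recovery rate, so R0 = beta / gamma.
def sir_with_vaccination(p, beta=0.3, gamma=0.1, days=500, dt=0.1, i0=1e-3):
    """Return the final epidemic size (fraction ever infected)."""
    s, i, r = (1.0 - p) - i0, i0, p   # vaccinated individuals start in R
    for _ in range(int(days / dt)):   # simple Euler integration
        rec = gamma * i * dt
        new_inf = beta * s * i * dt
        s -= new_inf
        i += new_inf - rec
        r += rec
    return 1.0 - p - s                # everyone who left S, plus the seed

# With R0 = 3, the herd-immunity threshold is 1 - 1/R0 ~ 0.67, so
# coverage p = 0.7 should suppress the outbreak almost entirely.
print(sir_with_vaccination(0.0), sir_with_vaccination(0.7))
```

Lattice, network, and behavioral-feedback models discussed in the review replace the homogeneous-mixing assumption baked into the `beta * s * i` term.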

  20. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan

    2012-01-01

    , wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided in a low- frequency component (describing long term variations...
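The low-frequency-plus-turbulence decomposition can be sketched as follows. The diurnal amplitude, period, and AR(1) gust model are illustrative assumptions, not the paper's fitted parameters.

```python
import math
import random

# Sketch of a two-component wind speed profile: a slowly varying
# low-frequency mean plus a high-frequency turbulence term.
def wind_profile(hours, mean=8.0, dt_s=1.0, seed=1):
    """Return wind speeds [m/s] sampled every dt_s seconds."""
    rng = random.Random(seed)
    n = int(hours * 3600 / dt_s)
    speeds, turb = [], 0.0
    for k in range(n):
        t = k * dt_s
        # Low-frequency component: slow swell with a 6-hour period.
        low = mean + 2.0 * math.sin(2 * math.pi * t / (6 * 3600))
        # High-frequency component: AR(1) gusts around zero.
        turb = 0.95 * turb + rng.gauss(0.0, 0.3)
        speeds.append(max(0.0, low + turb))
    return speeds

profile = wind_profile(hours=1.0)
print(len(profile), sum(profile) / len(profile))
```

In a measurement-based version, the low-frequency component and the turbulence statistics would both be estimated from recorded wind speed data rather than assumed.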

  1. Planetary populations in the mass-period diagram: A statistical treatment of exoplanet formation and the role of planet traps

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Yasuhiro [Currently EACOA Fellow at Institute of Astronomy and Astrophysics, Academia Sinica (ASIAA), Taipei 10641, Taiwan. (China); Pudritz, Ralph E., E-mail: yasu@asiaa.sinica.edu.tw, E-mail: pudritz@physics.mcmaster.ca [Also at Origins Institute, McMaster University, Hamilton, ON L8S 4M1, Canada. (Canada)

    2013-11-20

The rapid growth of observed exoplanets has revealed the existence of several distinct planetary populations in the mass-period diagram. Two of the most surprising are (1) the concentration of gas giants around 1 AU and (2) the accumulation of a large number of low-mass planets with tight orbits, also known as super-Earths and hot Neptunes. We have recently shown that protoplanetary disks have multiple planet traps that are characterized by orbital radii in the disks and halt rapid type I planetary migration. By coupling planet traps with the standard core accretion scenario, we showed that one can account for the positions of planets in the mass-period diagram. In this paper, we demonstrate quantitatively that most gas giants formed at planet traps tend to end up around 1 AU, with most of these being contributed by dead zones and ice lines. We also show that a large fraction of super-Earths and hot Neptunes are formed as 'failed' cores of gas giants—this population being constituted by comparable contributions from dead zone and heat transition traps. Our results are based on the evolution of forming planets in an ensemble of disks where we vary only the lifetimes of disks and their mass accretion rates onto the host star. We show that a statistical treatment of the evolution of a large population of planetary cores caught in planet traps accounts for the existence of three distinct exoplanetary populations—the hot Jupiters, the more massive planets around r = 1 AU, and the short-period super-Earths and hot Neptunes. There are very few populations that feed into the large orbital radii characteristic of the imaged Jovian planet, which agrees with recent surveys. Finally, we find that low-mass planets in tight orbits become the dominant planetary population for low-mass stars (M* ≤ 0.7 M☉).

  2. Neural Correlates of Realistic and Unrealistic Auditory Space Perception

    Directory of Open Access Journals (Sweden)

    Akiko Callan

    2011-10-01

Binaural recordings can simulate externalized auditory space perception over headphones. However, if the orientation of the recorder's head and the orientation of the listener's head are incongruent, the simulated auditory space is not realistic. For example, if a person lying flat on a bed listens to an environmental sound that was recorded by microphones inserted in the ears of a person who was in an upright position, the sound simulates an auditory space rotated 90 degrees from the real-world horizontal axis. Our question is whether brain activation patterns differ between the unrealistic auditory space (i.e., the orientation of the listener's head and the orientation of the recorder's head are incongruent) and the realistic auditory space (i.e., the orientations are congruent). River sounds that were binaurally recorded either in a supine or in an upright body position served as auditory stimuli. During fMRI experiments, participants listened to the stimuli and pressed one of two buttons indicating the direction of the water flow (horizontal/vertical). Behavioral results indicated that participants could not differentiate between the congruent and incongruent conditions. However, neuroimaging results showed that the congruent condition activated the planum temporale significantly more than the incongruent condition.

  3. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    Science.gov (United States)

    Liu, Wei; Ding, Jinhui

    2018-04-01

The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing-data mechanisms that holds for all clinical trials, because the body's reactions to drugs are mediated by complex biological networks, so data may be missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  4. Statistical methods for spatio-temporal systems

    CERN Document Server

    Finkenstadt, Barbel

    2006-01-01

Statistical Methods for Spatio-Temporal Systems presents current statistical research issues on spatio-temporal data modeling and will promote advances in research and a greater understanding between the mechanistic and the statistical modeling communities. Contributed by leading researchers in the field, each self-contained chapter starts with an introduction of the topic and progresses to recent research results. Presenting specific examples of epidemic data of bovine tuberculosis, gastroenteric disease, and the U.K. foot-and-mouth outbreak, the first chapter uses stochastic models, such as point process models, to provide the probabilistic backbone that facilitates statistical inference from data. The next chapter discusses the critical issue of modeling random growth objects in diverse biological systems, such as bacteria colonies, tumors, and plant populations. The subsequent chapter examines data transformation tools using examples from ecology and air quality data, followed by a chapter on space-time co...

  5. Constraining the Statistics of Population III Binaries

    Science.gov (United States)

    Stacy, Athena; Bromm, Volker

    2012-01-01

We perform a cosmological simulation in order to model the growth and evolution of Population III (Pop III) stellar systems in a range of host minihalo environments. A Pop III multiple system forms in each of the ten minihaloes, and the overall mass function is top-heavy compared to the currently observed initial mass function in the Milky Way. Using a sink particle to represent each growing protostar, we examine the binary characteristics of the multiple systems, resolving orbits on scales as small as 20 AU. We find a binary fraction of approx. 36%, with semi-major axes as large as 3000 AU. The distribution of orbital periods is slightly peaked at approx. < 900 yr, while the distribution of mass ratios is relatively flat. Of all sink particles formed within the ten minihaloes, approx. 50% are lost to mergers with larger sinks, and 50% of the remaining sinks are ejected from their star-forming disks. The large binary fraction may have important implications for Pop III evolution and nucleosynthesis, as well as the final fate of the first stars.

  6. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada.

    Science.gov (United States)

    Smylie, Janet; Firestone, Michelle

Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is, however, a double standard with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous-specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges, including misclassification errors and non-response bias, systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous-specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations.

  7. Gene flow analysis method, the D-statistic, is robust in a wide parameter space.

    Science.gov (United States)

    Zheng, Yichen; Janke, Axel

    2018-01-08

We evaluated the sensitivity of the D-statistic, a parsimony-like method widely used to detect gene flow between closely related species. This method has been applied to a variety of taxa with a wide range of divergence times. However, its parameter space and thus its applicability to a wide taxonomic range has not been systematically studied. Divergence time, population size, time of gene flow, distance of outgroup and number of loci were examined in a sensitivity analysis. The sensitivity study shows that the primary determinant of the D-statistic is the relative population size, i.e. the population size scaled by the number of generations since divergence. This is consistent with the fact that the main confounding factor in gene flow detection is incomplete lineage sorting by diluting the signal. The sensitivity of the D-statistic is also affected by the direction of gene flow, size and number of loci. In addition, we examined the ability of the f-statistics, [Formula: see text] and [Formula: see text], to estimate the fraction of a genome affected by gene flow; while these statistics are difficult to apply to practical questions in biology due to lack of knowledge of when the gene flow happened, they can be used to compare datasets with identical or similar demographic background. The D-statistic, as a method to detect gene flow, is robust against a wide range of genetic distances (divergence times) but it is sensitive to population size. The D-statistic should only be applied with critical reservation to taxa where population sizes are large relative to branch lengths in generations.
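The site-pattern counting behind the D-statistic can be sketched as follows, using the standard formula D = (nABBA − nBABA)/(nABBA + nBABA). The allele patterns below are toy data, not from the study.

```python
# Sketch of the ABBA-BABA test underlying the D-statistic. For a
# four-taxon tree (((P1, P2), P3), O), biallelic sites where the derived
# allele (1) is shared by P2 and P3 ("ABBA") or by P1 and P3 ("BABA")
# are informative; under incomplete lineage sorting alone both patterns
# are equally likely, so D is expected to be near zero without gene flow.

def d_statistic(sites):
    """sites: iterable of (p1, p2, p3, outgroup) alleles, 0=ancestral, 1=derived."""
    abba = baba = 0
    for p1, p2, p3, o in sites:
        if o != 0:                       # polarize against the outgroup
            continue
        if (p1, p2, p3) == (0, 1, 1):    # ABBA pattern
            abba += 1
        elif (p1, p2, p3) == (1, 0, 1):  # BABA pattern
            baba += 1
    if abba + baba == 0:
        return 0.0
    return (abba - baba) / (abba + baba)

# Toy data: an excess of ABBA sites suggests P2-P3 gene flow.
toy = [(0, 1, 1, 0)] * 30 + [(1, 0, 1, 0)] * 10 + [(1, 1, 0, 0)] * 5
print(d_statistic(toy))  # (30 - 10) / (30 + 10) = 0.5
```

In practice the counts come from genome-scale alignments and significance is assessed with a block jackknife, which is where the sensitivity to population size studied above enters.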

  8. Effect of geometry on concentration polarization in realistic heterogeneous permselective systems

    Science.gov (United States)

    Green, Yoav; Shloush, Shahar; Yossifon, Gilad

    2014-04-01

This study extends previous analytical solutions of concentration polarization occurring solely in the depleted region, to the more realistic geometry consisting of a three-dimensional (3D) heterogeneous ion-permselective medium connecting two opposite microchambers (i.e., a three-layer system). Under the local electroneutrality approximation, the separation of variables method is used to derive an analytical solution of the electrodiffusive problem for the two opposing asymmetric microchambers. The assumption of an ideal permselective medium allows for the analytic calculation of the 3D concentration and electric potential distributions as well as a current-voltage relation. It is shown that any asymmetry in the microchamber geometries will result in current rectification. Moreover, it is demonstrated that for non-negligible microchamber resistances, the conductance does not exhibit the expected saturation at low concentrations but instead shows a continuous decrease. The results are intended to facilitate a more direct comparison between theory and experiments, as now the voltage drop is across a realistic 3D and three-layer system.

  9. Realistic nuclear shell theory and the doubly-magic 132Sn region

    International Nuclear Information System (INIS)

    Vary, J.P.

    1978-01-01

    After an introduction discussing the motivation and interest in results obtained with isotope separators, the fundamental problem in realistic nuclear shell theory is posed in the context of renormalization theory. Then some of the important developments that have occurred over the last fifteen years in the derivation of the effective Hamiltonian and application of realistic nuclear shell theory are briefly reviewed. Doubly magic regions of the periodic table and the unique advantages of the 132 Sn region are described. Then results are shown for the ground-state properties of 132 Sn as calculated from the density-dependent Hartree-Fock approach with the Skyrme Hamiltonian. A single theoretical Hamiltonian for all nuclei from doubly magic 132 Sn to doubly magic 208 Pb is presented; single-particle energies are graphed. Finally, predictions of shell-model level-density distributions obtained with spectral distribution methods are discussed; calculated level densities are shown for 136 Xe. 10 figures

  10. A Realistic Human Exposure Assessment of Indoor Radon released from Groundwater

    International Nuclear Information System (INIS)

    Yu, Dong Han; Han, Moon Hee

    2002-01-01

This work presents a realistic human exposure assessment of indoor radon released from groundwater in a house. First, a two-compartment model is developed to describe the generation and transfer of radon from groundwater into indoor air. The model is used to estimate indoor-air radon concentration profiles in a house arising from showering, washing clothes, and flushing toilets. The study then performs an uncertainty analysis of the model input parameters to quantify the uncertainty in the radon concentration profile. To estimate the daily internal dose to a specific tissue group in an adult through inhalation of such indoor radon, a PBPK (Physiologically-Based Pharmaco-Kinetic) model is developed. The indoor radon profile is then combined with the PBPK model to perform a realistic human exposure assessment. The results obtained from this study can be used to evaluate the human inhalation risk associated with indoor radon released from groundwater.
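A minimal sketch of the indoor-air side of such a compartment model: radon enters from water use and is removed by ventilation and radioactive decay. The source strength, ventilation rate, and house volume below are illustrative assumptions, not values from the study.

```python
# Mass balance for the indoor-air compartment:
#   dC/dt = S/V - (vent + decay) * C     [C in Bq/m^3]
LAMBDA_RN = 7.56e-3   # radon decay constant [1/h]
VENT = 0.5            # air-exchange (ventilation) rate [1/h]
VOLUME = 250.0        # house air volume [m^3]

def indoor_radon(source_bq_per_h, hours, c0=0.0, dt=0.01):
    """Euler-integrate the indoor radon concentration [Bq/m^3]."""
    c = c0
    for _ in range(int(hours / dt)):
        c += dt * (source_bq_per_h / VOLUME - (VENT + LAMBDA_RN) * c)
    return c

# A constant source approaches the steady state S / (V * (vent + decay)).
steady = 5000.0 / (VOLUME * (VENT + LAMBDA_RN))
print(indoor_radon(5000.0, hours=48.0), steady)
```

A full two-compartment version adds a water (or water-use) compartment whose transfer term supplies the time-varying source during showering, clothes washing, and toilet flushing.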

  11. Realistic Paleobathymetry of the Cenomanian–Turonian (94 Ma Boundary Global Ocean

    Directory of Open Access Journals (Sweden)

    Arghya Goswami

    2018-01-01

At present, global paleoclimate simulations are prepared with bathtub-like, flat, featureless and steep-walled ocean bathymetry, which is neither realistic nor suitable. In this article, we present the first enhanced version of a reconstructed paleobathymetry for Cenomanian–Turonian (94 Ma) time at 0.1° × 0.1° resolution that is both realistic and suitable for use in paleoclimate studies. This reconstruction is an extrapolation of a parameterized modern ocean bathymetry that combines simple geophysical models (the standard plate cooling model for the oceanic lithosphere based on ocean crustal age), global modern oceanic sediment thicknesses, and generalized shelf-slope-rise structures calibrated from a published global relief model of the modern world (ETOPO1) at active and passive continental margins. The base version of this Cenomanian–Turonian paleobathymetry reconstruction is then updated with known submarine large igneous provinces, plateaus, and seamounts to minimize the difference between the reconstructed paleobathymetry and the real bathymetry that once existed.

  12. Book Review: A Liberal Actor in a Realist World the European Union ...

    African Journals Online (AJOL)

    Abstract. Book Title: A Liberal Actor in a Realist World the European Union Regulatory State and the Global Political Economy of Energy. Book Author: Andreas Goldthau & Nick Sitter. Oxford University Press Oxford 2015. ISBN 9780198719595 ...

  13. Highly Realistic Training for Navy Corpsmen: A Follow-up Assessment

    Science.gov (United States)

    2017-10-12

based training for military medical providers. One such training is highly realistic training. Based on the success of the Infantry Immersion ... observation with minimal participation improves paediatric emergency medicine knowledge, skills and confidence. Emergency Medicine Journal, 32(3), 195... immersive training for Navy corpsmen: Preliminary results. Military Medicine, 179(12), 1439–1443. Booth-Kewley, S., McWhorter, S. K., Dell'Acqua, R

  14. Realistic microscopic level densities for spherical nuclei

    International Nuclear Information System (INIS)

    Cerf, N.

    1994-01-01

Nuclear level densities play an important role in nuclear reactions such as the formation of the compound nucleus. We develop a microscopic calculation of the level density based on a combinatorial evaluation from a realistic single-particle level scheme. This calculation makes use of a fast Monte Carlo algorithm allowing us to consider large shell model spaces which could not be treated previously in combinatorial approaches. Since our model relies on a microscopic basis, it can be applied to exotic nuclei with more confidence than the commonly used semiphenomenological formulas. An exhaustive comparison of our predicted neutron s-wave resonance spacings with experimental data for a wide range of nuclei is presented

  15. Statistical basis for positive identification in forensic anthropology.

    Science.gov (United States)

    Steadman, Dawnie Wolfe; Adams, Bradley J; Konigsberg, Lyle W

    2006-09-01

    Forensic scientists are often expected to present the likelihood of DNA identifications in US courts based on comparative population data, yet forensic anthropologists tend not to quantify the strength of an osteological identification. Because forensic anthropologists are trained first and foremost as physical anthropologists, they emphasize estimation problems at the expense of evidentiary problems, but this approach must be reexamined. In this paper, the statistical bases for presenting osteological and dental evidence are outlined, using a forensic case as a motivating example. A brief overview of Bayesian statistics is provided, and methods to calculate likelihood ratios for five aspects of the biological profile are demonstrated. This paper emphasizes the definition of appropriate reference samples and of the "population at large," and points out the conceptual differences between them. Several databases are introduced for both reference information and to characterize the "population at large," and new data are compiled to calculate the frequency of specific characters, such as age or fractures, within the "population at large." Despite small individual likelihood ratios for age, sex, and stature in the case example, the power of this approach is that, assuming each likelihood ratio is independent, the product rule can be applied. In this particular example, it is over three million times more likely to obtain the observed osteological and dental data if the identification is correct than if the identification is incorrect. This likelihood ratio is a convincing statistic that can support the forensic anthropologist's opinion on personal identity in court. 2006 Wiley-Liss, Inc.
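The product-rule combination of independent likelihood ratios described above can be illustrated in a few lines. The individual LR values below are hypothetical stand-ins, not the case data.

```python
from math import prod

# Combining independent likelihood ratios by the product rule: individually
# modest LRs for traits such as age, sex, and stature can multiply with a
# strong trait (e.g. a dental pattern) into a large joint LR. All values
# below are hypothetical illustrations.
def combined_lr(likelihood_ratios):
    """Joint LR under independence of the individual characteristics."""
    return prod(likelihood_ratios)

lrs = [2.1, 1.8, 3.5, 240000.0]  # e.g. age, sex, stature, dental pattern
joint = combined_lr(lrs)
print(joint)  # on the order of millions: strong support for the identification
```

The joint LR is then read as "the observed osteological and dental data are this many times more likely if the identification is correct than if it is incorrect", which is the form of statement the paper argues should be presented in court.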

  16. On the spatio-temporal and energy-dependent response of riometer absorption to electron precipitation: drift-time and conjunction analyses in realistic electric and magnetic fields

    Science.gov (United States)

    Kellerman, Adam; Shprits, Yuri; Makarevich, Roman; Donovan, Eric; Zhu, Hui

    2017-04-01

    Riometers are low-cost passive radiowave instruments located in both northern and southern hemispheres that are capable of operating during quiet and disturbed conditions. Many instruments have been operating continuously for multiple solar cycles, making them a useful tool for long-term statistical studies and for real-time analysis and forecasting of space weather. Here we present recent and new analyses of the relationship between the riometer-measured cosmic noise absorption and electron precipitation into the D-region and lower E-region ionosphere. We utilize two techniques: a drift-time analysis in realistic electric and magnetic field models, where a particle is traced from one location to another, and the energy determined by the time delay between similar observations; and a conjunction analysis, where we directly compare precipitated fluxes from THEMIS and Van Allen Probes with the riometer absorption. In both cases we present a statistical analysis of the response of riometer absorption to electron precipitation as a function of MLAT, MLT, and geomagnetic conditions.

  17. A study of statistics anxiety levels of graduate dental hygiene students.

    Science.gov (United States)

    Welch, Paul S; Jacks, Mary E; Smiley, Lynn A; Walden, Carolyn E; Clark, William D; Nguyen, Carol A

    2015-02-01

    In light of increased emphasis on evidence-based practice in the profession of dental hygiene, it is important that today's dental hygienist comprehend statistical measures to fully understand research articles, and thereby apply scientific evidence to practice. Therefore, the purpose of this study was to investigate statistics anxiety among graduate dental hygiene students in the U.S. A web-based self-report, anonymous survey was emailed to directors of 17 MSDH programs in the U.S. with a request to distribute to graduate students. The survey collected data on statistics anxiety, sociodemographic characteristics and evidence-based practice. Statistics anxiety was assessed using the Statistical Anxiety Rating Scale. Study significance level was α=0.05. Only 8 of the 17 invited programs participated in the study. Statistical Anxiety Rating Scale data revealed graduate dental hygiene students experience low to moderate levels of statistics anxiety. Specifically, the level of anxiety on the Interpretation Anxiety factor indicated this population could struggle with making sense of scientific research. A decisive majority (92%) of students indicated statistics is essential for evidence-based practice and should be a required course for all dental hygienists. This study served to identify statistics anxiety in a previously unexplored population. The findings should be useful in both theory building and in practical applications. Furthermore, the results can be used to direct future research. Copyright © 2015 The American Dental Hygienists’ Association.

  18. Lessons learned in using realist evaluation to assess maternal and newborn health programming in rural Bangladesh.

    Science.gov (United States)

    Adams, Alayne; Sedalia, Saroj; McNab, Shanon; Sarker, Malabika

    2016-03-01

    Realist evaluation furnishes valuable insight to public health practitioners and policy makers about how and why interventions work or don't work. Moving beyond binary measures of success or failure, it provides a systematic approach to understanding what goes on in the 'Black Box' and how implementation decisions in real life contexts can affect intervention effectiveness. This paper reflects on an experience in applying the tenets of realist evaluation to identify optimal implementation strategies for scale-up of Maternal and Newborn Health (MNH) programmes in rural Bangladesh. Supported by UNICEF, the three MNH programmes under consideration employed different implementation models to deliver similar services and meet similar MNH goals. Programme targets included adoption of recommended antenatal, post-natal and essential newborn care practices; health systems strengthening through improved referral, accountability and administrative systems, and increased community knowledge. Drawing on focused examples from this research, seven steps for operationalizing the realist evaluation approach are offered, while emphasizing the need to iterate and innovate in terms of methods and analysis strategies. The paper concludes by reflecting on lessons learned in applying realist evaluation, and the unique insights it yields regarding implementation strategies for successful MNH programming. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  19. Armageddon polytropos. Realistic thinking and nuclear fact, views on a half-century of inter-paradigmatic debates

    International Nuclear Information System (INIS)

    Zajec, Olivier

    2017-01-01

    At the end of the Second World War, nuclear weapons immediately became a privileged topic of study for the nascent discipline of International Relations (IR). Within it, the realist school has attempted to integrate nuclear weapons into a general strategic reflection on the evolution of interstate foreign policy. So deeply, indeed, that according to a common opinion a kind of fusion occurred between the realist paradigm and nuclear weapons: in short, 'defending' nuclear deterrence would mean professing realism. However, the link between realism and nuclear weapons is more complex than is commonly thought. To grasp its nuances, it is necessary to draw a parallel between the concrete, practical evolution of Western military nuclear doctrines on the one hand and, on the other, the theoretical and conceptual positions taken by realist thinkers from 1945 to the present day. That is the object of this article, which recalls the articulations between realist thought in IR and the nuclear fact and places them in historical perspective, so as to suggest that the 'interlocking web of thought' that is realism, far from being monolithic, has had, and continues to have, multiple nuclear facets.

  20. Using remotely sensed data and stochastic models to simulate realistic flood hazard footprints across the continental US

    Science.gov (United States)

    Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.

    2017-12-01

    Remotely sensed data have transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event `footprint'. Despite this, many national scale risk analyses still use `constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
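
    The footprint idea above (return periods varying from gauge to gauge within a single event) can be illustrated with a toy simulation. The abstract describes a conditional multivariate extreme-value model fitted to USGS records; the sketch below substitutes a much simpler Gaussian copula, and the inter-gauge correlation matrix is invented for illustration.

```python
# Toy sketch of spatially correlated flood-event "footprints": peak flows at
# several gauges are drawn from a Gaussian copula, so the return period of
# one event differs from gauge to gauge. The correlation matrix is invented;
# the study instead fits a conditional multivariate extreme-value model to
# observed gauge-to-gauge statistics.
import math
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical correlations: nearby gauges respond more similarly
corr = np.array([[1.0, 0.8, 0.4],
                 [0.8, 1.0, 0.6],
                 [0.4, 0.6, 1.0]])

def simulate_event_footprints(corr, n_events):
    """Return period (years) at each gauge for each simulated event."""
    z = rng.multivariate_normal(np.zeros(len(corr)), corr, size=n_events)
    u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))  # Phi(z)
    return 1.0 / (1.0 - u)  # uniform marginal -> return period

events = simulate_event_footprints(corr, n_events=10000)
print(events[0])  # one event: the return period varies across the three gauges
```

    Each simulated event is a row of gauge-wise return periods, which is exactly the "footprint" a constant-in-space hazard layer cannot represent.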

  1. Impact of muscular uptake and statistical noise on tumor quantification based on simulated FDG-PET studies

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro

    2017-01-01

    Purpose: The aim of this work is to study the effect of physiological muscular uptake variations and statistical noise on tumor quantification in FDG-PET studies. Methods: We designed a realistic framework based on simulated FDG-PET acquisitions from an anthropomorphic phantom that included different muscular uptake levels and three spherical lung lesions with diameters of 31, 21 and 9 mm. A distribution of muscular uptake levels was obtained from 136 patients remitted to our center for whole-body FDG-PET. Simulated FDG-PET acquisitions were obtained by using the Simulation System for Emission Tomography (SimSET) Monte Carlo package. Simulated data were reconstructed by using an iterative Ordered Subset Expectation Maximization (OSEM) algorithm implemented in the Software for Tomographic Image Reconstruction (STIR) library. Tumor quantification was carried out by using estimations of SUVmax, SUV50 and SUVmean from different noise realizations, lung lesions and multiple muscular uptakes. Results: Our analysis provided quantification variability values of 17–22% (SUVmax), 11–19% (SUV50) and 8–10% (SUVmean) when muscular uptake variations and statistical noise were included. Meanwhile, quantification variability due only to statistical noise was 7–8% (SUVmax), 3–7% (SUV50) and 1–2% (SUVmean) for large tumors (>20 mm) and 13% (SUVmax), 16% (SUV50) and 8% (SUVmean) for small tumors (<10 mm), thus showing that the variability in tumor quantification is mainly affected by muscular uptake variations when large enough tumors are considered. In addition, our results showed that quantification variability is strongly dominated by statistical noise when the injected dose decreases below 222 MBq. Conclusions: Our study revealed that muscular uptake variations between patients who are totally relaxed should be considered as an uncertainty source of tumor quantification values. - Highlights: • Distribution of muscular uptake from 136 PET
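
    The three uptake summaries compared in the study can be sketched on a toy set of lesion voxel values. The numbers are invented, and SUV50 is taken here to mean the mean over voxels above 50% of the lesion maximum, which is an assumption about the paper's exact definition.

```python
# Sketch of the three lesion-uptake summaries compared in the study,
# computed over a toy array of lesion voxel values (invented numbers).
import numpy as np

def suv_metrics(voxels):
    v = np.asarray(voxels, dtype=float)
    suv_max = v.max()
    suv_50 = v[v >= 0.5 * suv_max].mean()  # mean within 50%-of-max isocontour
    suv_mean = v.mean()
    return suv_max, suv_50, suv_mean

lesion = np.array([2.1, 3.4, 5.0, 4.8, 1.2, 4.9])
print(suv_metrics(lesion))
```

    The ordering of the reported variabilities (SUVmax most variable, SUVmean least) is intuitive from this definition: SUVmax depends on a single noisy voxel, while SUVmean averages noise over the whole region.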

  2. A linear evolution for non-linear dynamics and correlations in realistic nuclei

    International Nuclear Information System (INIS)

    Levin, E.; Lublinsky, M.

    2004-01-01

    A new approach to high energy evolution based on a linear equation for QCD generating functional is developed. This approach opens a possibility for systematic study of correlations inside targets, and, in particular, inside realistic nuclei. Our results are presented as three new equations. The first one is a linear equation for QCD generating functional (and for scattering amplitude) that sums the 'fan' diagrams. For the amplitude this equation is equivalent to the non-linear Balitsky-Kovchegov equation. The second equation is a generalization of the Balitsky-Kovchegov non-linear equation to interactions with realistic nuclei. It includes a new correlation parameter which incorporates, in a model-dependent way, correlations inside the nuclei. The third equation is a non-linear equation for QCD generating functional (and for scattering amplitude) that in addition to the 'fan' diagrams sums the Glauber-Mueller multiple rescatterings

  3. Per Aspera ad Astra: Through Complex Population Modeling to Predictive Theory.

    Science.gov (United States)

    Topping, Christopher J; Alrøe, Hugo Fjelsted; Farrell, Katharine N; Grimm, Volker

    2015-11-01

    Population models in ecology are often not good at predictions, even if they are complex and seem to be realistic enough. The reason for this might be that Occam's razor, which is key for minimal models exploring ideas and concepts, has been too uncritically adopted for more realistic models of systems. This can tie models too closely to certain situations, thereby preventing them from predicting the response to new conditions. We therefore advocate a new kind of parsimony to improve the application of Occam's razor. This new parsimony balances two contrasting strategies for avoiding errors in modeling: avoiding inclusion of nonessential factors (false inclusions) and avoiding exclusion of sometimes-important factors (false exclusions). It involves a synthesis of traditional modeling and analysis, used to describe the essentials of mechanistic relationships, with elements that are included in a model because they have been reported to be or can arguably be assumed to be important under certain conditions. The resulting models should be able to reflect how the internal organization of populations change and thereby generate representations of the novel behavior necessary for complex predictions, including regime shifts.

  4. Multivariate statistical analysis for x-ray photoelectron spectroscopy spectral imaging: Effect of image acquisition time

    International Nuclear Information System (INIS)

    Peebles, D.E.; Ohlhausen, J.A.; Kotula, P.G.; Hutton, S.; Blomfield, C.

    2004-01-01

    The acquisition of spectral images for x-ray photoelectron spectroscopy (XPS) is a relatively new approach, although it has been used with other analytical spectroscopy tools for some time. This technique provides full spectral information at every pixel of an image, in order to provide a complete chemical mapping of the imaged surface area. Multivariate statistical analysis techniques applied to the spectral image data allow the determination of chemical component species, and their distribution and concentrations, with minimal data acquisition and processing times. Some of these statistical techniques have proven to be very robust and efficient methods for deriving physically realistic chemical components without input by the user other than the spectral matrix itself. The benefits of multivariate analysis of the spectral image data include significantly improved signal to noise, improved image contrast and intensity uniformity, and improved spatial resolution - which are achieved due to the effective statistical aggregation of the large number of often noisy data points in the image. This work demonstrates the improvements in chemical component determination and contrast, signal-to-noise level, and spatial resolution that can be obtained by the application of multivariate statistical analysis to XPS spectral images
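
    One common multivariate technique for spectral-image data is PCA via the SVD; the paper's exact algorithm may differ, so the sketch below is only illustrative, and the synthetic two-phase datacube stands in for real XPS data.

```python
# PCA via SVD on a spectral image: unfold the (rows x cols x channels)
# datacube to (pixels x channels), mean-centre, decompose, and keep a few
# components as chemical-contrast maps. Averaging over many noisy pixels is
# what yields the signal-to-noise improvement described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def pca_spectral_image(cube, n_components):
    rows, cols, channels = cube.shape
    x = cube.reshape(rows * cols, channels)
    x = x - x.mean(axis=0)  # centre each energy channel
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]  # component score maps
    return scores.reshape(rows, cols, n_components), vt[:n_components]

# Two chemical phases: left half peaks at channel 10, right half at channel 30
spec_a = np.zeros(50); spec_a[10] = 1.0
spec_b = np.zeros(50); spec_b[30] = 1.0
cube = np.empty((8, 8, 50))
cube[:, :4] = spec_a
cube[:, 4:] = spec_b
cube += 0.05 * rng.standard_normal(cube.shape)

maps, loadings = pca_spectral_image(cube, n_components=2)
print(maps.shape, loadings.shape)  # (8, 8, 2) (2, 50)
```

    Methods such as multivariate curve resolution add non-negativity constraints on top of this decomposition to make the components physically interpretable as spectra.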

  5. Modeling the Impact of Baryons on Subhalo Populations with Machine Learning

    Science.gov (United States)

    Nadler, Ethan O.; Mao, Yao-Yuan; Wechsler, Risa H.; Garrison-Kimmel, Shea; Wetzel, Andrew

    2018-06-01

    We identify subhalos in dark matter–only (DMO) zoom-in simulations that are likely to be disrupted due to baryonic effects by using a random forest classifier trained on two hydrodynamic simulations of Milky Way (MW)–mass host halos from the Latte suite of the Feedback in Realistic Environments (FIRE) project. We train our classifier using five properties of each disrupted and surviving subhalo: pericentric distance and scale factor at first pericentric passage after accretion and scale factor, virial mass, and maximum circular velocity at accretion. Our five-property classifier identifies disrupted subhalos in the FIRE simulations with an 85% out-of-bag classification score. We predict surviving subhalo populations in DMO simulations of the FIRE host halos, finding excellent agreement with the hydrodynamic results; in particular, our classifier outperforms DMO zoom-in simulations that include the gravitational potential of the central galactic disk in each hydrodynamic simulation, indicating that it captures both the dynamical effects of a central disk and additional baryonic physics. We also predict surviving subhalo populations for a suite of DMO zoom-in simulations of MW-mass host halos, finding that baryons impact each system consistently and that the predicted amount of subhalo disruption is larger than the host-to-host scatter among the subhalo populations. Although the small size and specific baryonic physics prescription of our training set limits the generality of our results, our work suggests that machine-learning classification algorithms trained on hydrodynamic zoom-in simulations can efficiently predict realistic subhalo populations.

  6. Development of Contextual Mathematics teaching Material integrated related sciences and realistic for students grade xi senior high school

    Science.gov (United States)

    Helma, H.; Mirna, M.; Edizon, E.

    2018-04-01

    Mathematics is often applied in physics, chemistry, economics, engineering, and other fields. Besides that, mathematics is also used in everyday life. Learning mathematics in school should be associated with other sciences and with everyday life; in this way, the learning of mathematics becomes more realistic, interesting, and meaningful. A needs analysis shows that contextual mathematics teaching materials, integrated with related sciences and realistic contexts, are required for learning mathematics. The purpose of this research is to produce a valid and practical contextual mathematics teaching material of this kind. This is development research, and its result is a contextual mathematics teaching material, integrated with related sciences and realistic contexts, that was found to be valid and practical.

  7. Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention

    National Research Council Canada - National Science Library

    Itti, L; Dhavale, N; Pighin, F

    2003-01-01

    We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained...

  8. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

    ... images is performed. Then triangular, quadratic meshes are formed for the interfaces of the tissues. Thus, realistic meshes, representing scalp, skull, CSF, brain and eye tissues, are formed. At least...

  9. Protocol for a realist review of workplace learning in postgraduate medical education and training.

    Science.gov (United States)

    Wiese, Anel; Kilty, Caroline; Bergin, Colm; Flood, Patrick; Fu, Na; Horgan, Mary; Higgins, Agnes; Maher, Bridget; O'Kane, Grainne; Prihodova, Lucia; Slattery, Dubhfeasa; Bennett, Deirdre

    2017-01-19

    Postgraduate medical education and training (PGMET) is a complex social process which happens predominantly during the delivery of patient care. The clinical learning environment (CLE), the context for PGMET, shapes the development of the doctors who learn and work within it, ultimately impacting the quality and safety of patient care. Clinical workplaces are complex, dynamic systems in which learning emerges from non-linear interactions within a network of related factors and activities. Those tasked with the design and delivery of postgraduate medical education and training need to understand the relationship between the processes of medical workplace learning and these contextual elements in order to optimise conditions for learning. We propose to conduct a realist synthesis of the literature to address the overarching questions; how, why and in what circumstances do doctors learn in clinical environments? This review is part of a funded projected with the overall aim of producing guidelines and recommendations for the design of high quality clinical learning environments for postgraduate medical education and training. We have chosen realist synthesis as a methodology because of its suitability for researching complexity and producing answers useful to policymakers and practitioners. This realist synthesis will follow the steps and procedures outlined by Wong et al. in the RAMESES Publication Standards for Realist Synthesis and the Realist Synthesis RAMESES Training Materials. The core research team is a multi-disciplinary group of researchers, clinicians and health professions educators. The wider research group includes experts in organisational behaviour and human resources management as well as the key stakeholders; doctors in training, patient representatives and providers of PGMET. 
This study will draw from the published literature, and from programme and substantive theories of workplace learning, to describe context, mechanism and outcome configurations for

  10. Sensitivity of the Hydrogen Epoch of Reionization Array and its build-out stages to one-point statistics from redshifted 21 cm observations

    Science.gov (United States)

    Kittiwisit, Piyanat; Bowman, Judd D.; Jacobs, Daniel C.; Beardsley, Adam P.; Thyagarajan, Nithyanandan

    2018-03-01

    We present a baseline sensitivity analysis of the Hydrogen Epoch of Reionization Array (HERA) and its build-out stages to one-point statistics (variance, skewness, and kurtosis) of redshifted 21 cm intensity fluctuation from the Epoch of Reionization (EoR) based on realistic mock observations. By developing a full-sky 21 cm light-cone model, taking into account the proper field of view and frequency bandwidth, utilizing a realistic measurement scheme, and assuming perfect foreground removal, we show that HERA will be able to recover statistics of the sky model with high sensitivity by averaging over measurements from multiple fields. All build-out stages will be able to detect variance, while skewness and kurtosis should be detectable for HERA128 and larger. We identify sample variance as the limiting constraint of the measurements at the end of reionization. The sensitivity can also be further improved by performing frequency windowing. In addition, we find that strong sample variance fluctuation in the kurtosis measured from an individual field of observation indicates the presence of outlying cold or hot regions in the underlying fluctuations, a feature that can potentially be used as an EoR bubble indicator.
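
    The three one-point statistics named in this abstract are simply the second, third, and fourth standardised moments of the fluctuation field, and can be computed directly from samples. The sketch below uses synthetic Gaussian numbers and population (1/N) moment estimators for simplicity.

```python
# Variance, skewness, and kurtosis of a set of fluctuation samples,
# the one-point statistics discussed in the abstract.
import numpy as np

def one_point_statistics(x):
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    var = ((x - mu) ** 2).mean()
    sigma = var ** 0.5
    skew = ((x - mu) ** 3).mean() / sigma ** 3
    kurt = ((x - mu) ** 4).mean() / sigma ** 4  # equals 3 for a Gaussian field
    return var, skew, kurt

rng = np.random.default_rng(1)
samples = rng.standard_normal(100_000)
print(one_point_statistics(samples))  # close to (1, 0, 3) for Gaussian input
```

    Departures of skewness from 0 and kurtosis from 3 are precisely the non-Gaussian signatures (e.g. outlying hot or cold regions) that the abstract proposes as an EoR bubble indicator.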

  11. Dynamic aperture in damping rings with realistic wigglers

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yunhai; /SLAC

    2005-05-04

    The International Linear Collider based on superconducting RF cavities requires damping rings with extremely small equilibrium emittance, huge circumference, fast damping time, and large acceptance. Achieving all of these requirements is a very challenging task. In this paper, we present a systematic approach to designing the damping rings using simple cells and non-interlaced sextupoles. Designs of the damping rings with various circumferences and shapes, including the dogbone, are presented. To model realistic wigglers, we have developed a new hybrid symplectic integrator for fast and accurate evaluation of the dynamic aperture of the lattices.

  12. Do absorption and realistic distraction influence performance of component task surgical procedure?

    NARCIS (Netherlands)

    Pluyter, J.R.; Buzink, S.N.; Rutkowski, A.F.; Jakimowicz, J.J.

    2009-01-01

    Background. Surgeons perform complex tasks while exposed to multiple distracting sources that may increase stress in the operating room (e.g., music, conversation, and unadapted use of sophisticated technologies). This study aimed to examine whether such realistic social and technological

  13. Fractional populations in multiple gene inheritance.

    Science.gov (United States)

    Chung, Myung-Hoon; Kim, Chul Koo; Nahm, Kyun

    2003-01-22

    With complete knowledge of the human genome sequence, one of the most interesting tasks remaining is to understand the functions of individual genes and how they communicate. Using the information about genes (locus, allele, mutation rate, fitness, etc.), we attempt to explain population demographic data. This population evolution study could complement and enhance biologists' understanding about genes. We present a general approach to study population genetics in complex situations. In the present approach, multiple allele inheritance, multiple loci inheritance, natural selection and mutations are allowed simultaneously in order to consider a more realistic situation. A simulation program is presented so that readers can readily carry out studies with their own parameters. It is shown that the multiplicity of the loci greatly affects the demographic results of fractional population ratios. Furthermore, the study indicates that some high infant mortality rates due to congenital anomalies can be attributed to multiple loci inheritance. The simulation program can be downloaded from http://won.hongik.ac.kr/~mhchung/index_files/yapop.htm. In order to run this program, one needs Visual Studio.NET platform, which can be downloaded from http://msdn.microsoft.com/netframework/downloads/default.asp.
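
    The per-generation update of such a population-genetics simulation can be sketched in a few lines. The downloadable program described above handles multiple loci and multiple alleles simultaneously; the toy version below shows only a single biallelic locus evolving under selection, symmetric mutation, and drift, with invented parameter values.

```python
# Minimal Wright-Fisher-style update for one biallelic locus:
# selection reweights the allele frequency, mutation mixes the alleles,
# and drift resamples 2n gene copies binomially.
import random

random.seed(7)

def next_generation(freq_a, n, fitness_a=1.0, fitness_b=0.95, mu=1e-4):
    """Frequency of allele A in the next generation of n diploids."""
    # Selection: reweight the allele frequency by relative fitness
    w_bar = freq_a * fitness_a + (1.0 - freq_a) * fitness_b
    p = freq_a * fitness_a / w_bar
    # Symmetric mutation A <-> B
    p = p * (1.0 - mu) + (1.0 - p) * mu
    # Drift: binomial sampling of the 2n gene copies
    copies = sum(1 for _ in range(2 * n) if random.random() < p)
    return copies / (2 * n)

freq = 0.5
for _ in range(100):
    freq = next_generation(freq, n=500)
print(freq)  # the fitter allele A tends toward fixation
```

    Running several such loci jointly, with fitness assigned to multi-locus genotypes, is what makes the fractional population ratios in the paper sensitive to the multiplicity of loci.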

  14. Development of Realistic Head Models for Electromagnetic Source Imaging of the Human Brain

    National Research Council Canada - National Science Library

    Akalin, Z

    2001-01-01

    In this work, a methodology is developed to solve the forward problem of electromagnetic source imaging using realistic head models, For this purpose, first segmentation of the 3 dimensional MR head...

  15. Level density from realistic nuclear potentials

    International Nuclear Information System (INIS)

    Calboreanu, A.

    2006-01-01

    Nuclear level density of some nuclei is calculated using a realistic set of single-particle states (sps). These states are derived from the parameterization of nuclear potentials that describes the observed sps over a large number of nuclei. This approach has the advantage that one can infer the level density of nuclei that are inaccessible to direct study but are very important in astrophysical processes, such as those close to the drip lines. Level densities at high excitation energies are very sensitive to the actual set of sps. The fact that the sps spectrum is finite has extraordinary consequences for nuclear reaction yields, due to the leveling-off of the level density at extremely high excitation energies, wrongly attributed so far to other nuclear effects. The single-particle level density parameter a is extracted by fitting the calculated densities to the standard Bethe formula.
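
    For reference, the standard Bethe formula mentioned in the abstract, whose fit yields the level density parameter a, reads in its simplest form (E is the excitation energy; back-shifted variants replace E by E minus a pairing shift):

```latex
\rho(E) \simeq \frac{\sqrt{\pi}}{12\, a^{1/4} E^{5/4}}\, e^{2\sqrt{aE}}
```

    The exponential growth with sqrt(aE) is why a finite single-particle spectrum, which caps the density at high E, has the dramatic effect on reaction yields described above.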

  16. Falling in the elderly: Do statistical models matter for performance criteria of fall prediction? Results from two large population-based studies.

    Science.gov (United States)

    Kabeshova, Anastasiia; Launay, Cyrille P; Gromov, Vasilii A; Fantino, Bruno; Levinoff, Elise J; Allali, Gilles; Beauchet, Olivier

    2016-01-01

    To compare performance criteria (i.e., sensitivity, specificity, positive predictive value, negative predictive value, area under receiver operating characteristic curve and accuracy) of linear and non-linear statistical models for fall risk in older community-dwellers. Participants were recruited in two large population-based studies, "Prévention des Chutes, Réseau 4" (PCR4, n=1760, cross-sectional design, retrospective collection of falls) and "Prévention des Chutes Personnes Agées" (PCPA, n=1765, cohort design, prospective collection of falls). Six linear statistical models (i.e., logistic regression, discriminant analysis, Bayes network algorithm, decision tree, random forest, boosted trees), three non-linear statistical models corresponding to artificial neural networks (multilayer perceptron, genetic algorithm and neuroevolution of augmenting topologies [NEAT]) and the adaptive neuro-fuzzy inference system (ANFIS) were used. Falls ≥1 characterizing fallers and falls ≥2 characterizing recurrent fallers were used as outcomes. Data of studies were analyzed separately and together. NEAT and ANFIS had better performance criteria compared to other models. The highest performance criteria were reported with NEAT when using PCR4 database and falls ≥1, and with both NEAT and ANFIS when pooling data together and using falls ≥2. However, sensitivity and specificity were unbalanced. Sensitivity was higher than specificity when identifying fallers, whereas the converse was found when predicting recurrent fallers. Our results showed that NEAT and ANFIS were non-linear statistical models with the best performance criteria for the prediction of falls but their sensitivity and specificity were unbalanced, underscoring that models should be used respectively for the screening of fallers and the diagnosis of recurrent fallers. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
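
    Most of the performance criteria compared in this study derive directly from the four confusion-matrix counts, as the short sketch below shows. AUC is omitted because it needs ranked prediction scores rather than counts, and the counts used here are invented.

```python
# Performance criteria from confusion-matrix counts: true positives (tp),
# false positives (fp), true negatives (tn), false negatives (fn).

def performance_criteria(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

print(performance_criteria(tp=120, fp=80, tn=400, fn=30))
```

    The screening-versus-diagnosis conclusion of the paper follows from these definitions: a high-sensitivity model misses few true fallers (good for screening), while a high-specificity model yields few false alarms (good for confirming recurrent fallers).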

  17. Genetic structure of populations and differentiation in forest trees

    Science.gov (United States)

    Raymond P. Guries; F. Thomas Ledig

    1981-01-01

    Electrophoretic techniques permit population biologists to analyze genetic structure of natural populations by using large numbers of allozyme loci. Several methods of analysis have been applied to allozyme data, including chi-square contingency tests, F-statistics, and genetic distance. This paper compares such statistics for pitch pine (Pinus rigida...
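
    Of the statistics this paper compares, Wright's FST is the most compact to illustrate. The sketch below uses the simple variance-in-allele-frequency form for one biallelic locus, FST = var(p) / (p_bar * (1 - p_bar)); the stand allele frequencies are invented, and the paper's analysis would repeat this per allozyme locus.

```python
# Wright's FST for a biallelic locus across subpopulations, from the
# among-population variance in allele frequency.

def fst(freqs):
    p_bar = sum(freqs) / len(freqs)
    var_p = sum((p - p_bar) ** 2 for p in freqs) / len(freqs)
    return var_p / (p_bar * (1.0 - p_bar))

print(fst([0.2, 0.4, 0.3]))  # modest differentiation among three stands
```

    FST ranges from 0 (identical allele frequencies everywhere) to 1 (populations fixed for alternative alleles), which makes it directly comparable across loci and species.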

  18. Electron spin polarization in realistic trajectories around the magnetic node of two counter-propagating, circularly polarized, ultra-intense lasers

    Science.gov (United States)

    Del Sorbo, D.; Seipt, D.; Thomas, A. G. R.; Ridgers, C. P.

    2018-06-01

    It has recently been suggested that two counter-propagating, circularly polarized, ultra-intense lasers can induce a strong electron spin polarization at the magnetic node of the electromagnetic field that they set up (Del Sorbo et al 2017 Phys. Rev. A 96 043407). We confirm these results by considering a more sophisticated description that integrates over realistic trajectories. The electron dynamics is weakly affected by the variation of power radiated due to the spin polarization. The degree of spin polarization differs by approximately 5% if considering electrons initially at rest or already in a circular orbit. The instability of trajectories at the magnetic node induces a spin precession associated with the electron migration that establishes an upper temporal limit to the polarization of the electron population of about one laser period.

  19. Statistical and quantitative research

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Environmental impacts may escape detection if the statistical tests used to analyze data from field studies are inadequate or the field design is not appropriate. To alleviate this problem, PNL scientists are doing theoretical research which will provide the basis for new sampling schemes or better methods to analyze and present data. Such efforts have resulted in recommendations about the optimal size of study plots, sampling intensity, field replication, and program duration. Costs associated with any of these factors can be substantial if, for example, attention is not paid to the adequacy of a sampling scheme. In the study of dynamics of large-mammal populations, the findings are sometimes surprising. For example, the survival of a grizzly bear population may hinge on the loss of one or two adult females per year

  20. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

statement is true. In some cases the statistical aspects of safety are misused (for instance, a number of runs chosen for several outputs is correct only if those outputs are statistically independent) or misinterpreted. We do not know the probability distribution of the output variables subject to safety limitations. Moreover, for some asymmetric distributions the 0.95/0.95 methodology simply fails: if we repeated the calculations, in many cases we would obtain a value higher than the basic value, meaning that a limit violation becomes more and more probable in the repeated analysis. Consistent application of order statistics, or application of the sign test, may offer a way out of the present situation. The authors are also convinced that efforts should be made to study the statistics of the output variables, and to study the occurrence of chaos in the analyzed cases. All these observations should influence the application of best-estimate methods in safety analysis, and underline the opinion that any realistic modeling and simulation of complex systems must include the probabilistic features of the system and the environment
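
The order-statistics route mentioned above can be made concrete. As a minimal sketch (our illustration, not the authors' method) of the non-parametric Wilks argument behind 0.95/0.95 criteria: if the largest of n independent code runs is taken as the bound, no assumption about the output distribution is needed, and the required n follows from a one-line inequality.

```python
# Non-parametric (Wilks) sample size for a one-sided 95%/95% tolerance
# bound. Taking the maximum of n independent code runs as the bound,
# the probability that it exceeds the 95th percentile of the (unknown)
# output distribution is 1 - 0.95**n; we need the smallest n with
# 1 - 0.95**n >= 0.95, whatever the distribution's shape.

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the max of n i.i.d. runs bounds the
    `coverage` quantile with probability at least `confidence`."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # -> 59, the classic 95/95 run count
print(wilks_sample_size(0.95, 0.99))  # -> 90
```

Higher-order variants (e.g. taking the second-largest run as the bound) trade more runs for a less conservative limit.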

  1. Digital communication between clinician and patient and the impact on marginalised groups: a realist review in general practice.

    Science.gov (United States)

    Huxley, Caroline J; Atherton, Helen; Watkins, Jocelyn Anstey; Griffiths, Frances

    2015-12-01

    Increasingly, the NHS is embracing the use of digital communication technology for communication between clinicians and patients. Policymakers deem digital clinical communication as presenting a solution to the capacity issues currently faced by general practice. There is some concern that these technologies may exacerbate existing inequalities in accessing health care. It is not known what impact they may have on groups who are already marginalised in their ability to access general practice. To assess the potential impact of the availability of digital clinician-patient communication on marginalised groups' access to general practice in the UK. Realist review in general practice. A four-step realist review process was used: to define the scope of the review; to search for and scrutinise evidence; to extract and synthesise evidence; and to develop a narrative, including hypotheses. Digital communication has the potential to overcome the following barriers for marginalised groups: practical access issues, previous negative experiences with healthcare service/staff, and stigmatising reactions from staff and other patients. It may reduce patient-related barriers by offering anonymity and offers advantages to patients who require an interpreter. It does not impact on inability to communicate with healthcare professionals or on a lack of candidacy. It is likely to work best in the context of a pre-existing clinician-patient relationship. Digital communication technology offers increased opportunities for marginalised groups to access health care. However, it cannot remove all barriers to care for these groups. It is likely that they will remain disadvantaged relative to other population groups after their introduction. © British Journal of General Practice 2015.

  2. Simplified realistic human head model for simulating Tumor Treating Fields (TTFields).

    Science.gov (United States)

    Wenger, Cornelia; Bomzon, Ze'ev; Salvador, Ricardo; Basser, Peter J; Miranda, Pedro C

    2016-08-01

Tumor Treating Fields (TTFields) are alternating electric fields in the intermediate frequency range (100-300 kHz) and of low intensity (1-3 V/cm). TTFields are an anti-mitotic treatment against solid tumors, approved for Glioblastoma Multiforme (GBM) patients. These electric fields are induced non-invasively by transducer arrays placed directly on the patient's scalp. Cell culture experiments showed that treatment efficacy is dependent on the induced field intensity. In clinical practice, software called NovoTal™ uses head measurements to estimate the optimal array placement to maximize the electric field delivery to the tumor. Computational studies predict an increase in the tumor's electric field strength when adapting transducer arrays to its location. Ideally, a personalized head model could be created for each patient, to calculate the electric field distribution for the specific situation. Thus, the optimal transducer layout could be inferred from field calculation rather than distance measurements. Nonetheless, creating realistic head models of patients is time-consuming and often needs user interaction, because automated image segmentation is prone to failure. This study presents a first approach to creating simplified head models consisting of convex hulls of the tissue layers. The model is able to account for anisotropic conductivity in the cortical tissues by using a tensor representation estimated from Diffusion Tensor Imaging. The induced electric field distribution is compared in the simplified and realistic head models. The average field intensities in the brain and tumor are generally slightly higher in the realistic head model, with a maximal ratio of 114% for a simplified model with reasonable layer thicknesses. Thus, the present pipeline is a fast and efficient means towards personalized head models with less complexity involved in characterizing tissue interfaces, while enabling accurate predictions of electric field distribution.

  3. [Analysis of the definitive statistics of the 11th General Population and Housing Census].

    Science.gov (United States)

    Aguayo Hernandez, J R

    1992-01-01

The 11th General Census of Population and Housing conducted in March 1990 enumerated 2,204,054 inhabitants in Sinaloa, for a density of 37.9 per sq. km. Sinaloa's population thus increased sevenfold from 297,000 in 1900. The proportion of Sinaloans in Mexico's population increased from 2.2% in 1900 to 2.7% in 1990. 38.4% of the population was under age 14, 57.0% was 14-64, and 4.6% was over 65. The greatest challenge for the year 2010 will be to meet the demand for educational facilities, employment, and services for the growing elderly population. Sinaloa's population grew at an annual rate of 1.1 between 1980-90. 17 of its 18 municipios showed slowing growth rates between 1980-90, with only Escuinapa increasing its rate. Sinaloa's growth rate of 1.8% is still relatively high, and the population in the year 2000 is projected at 2.6 million. Population distribution and migration present problems that should be more actively addressed. Urban-urban migration is increasing in importance. In 1990, Sinaloa had 5247 localities of which only 85 had more than 2500 inhabitants and 4717 had fewer than 500. Growth of midsize localities with 500-2499 inhabitants may constitute an alternative allowing the demographic deconcentration and decentralization that Sinaloa urgently requires. The lack of jobs, infrastructure, educational and health services, housing, and food in the dispersed 4717 communities with fewer than 500 inhabitants makes them sources of emigration. Sinaloa's population is concentrated along the coast and in the 3 valleys of the north and central regions, which contain 80.8% of the population. One-third of the population lives on 12.1% of the territory in 2 municipios, while 12 municipios covering 67% of the territory contain just 24% of the population. Sinaloa's growth rate has declined from 4.3% between 1960-70 to 3.7% from 1970-80 and 1.8% in 1980-90.

  4. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, its extended experimental database, and the assessment of GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of GALILEO model uncertainties is of the utmost importance for accurate evaluation of fuel design margins, as illustrated by some application examples. With the submittal of Topical Report GALILEO to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)

  5. Realistic modelling of the seismic input: Site effects and parametric studies

    International Nuclear Information System (INIS)

    Romanelli, F.; Vaccari, F.; Panza, G.F.

    2002-11-01

We illustrate the work done in the framework of a large international cooperation, showing very recent numerical experiments carried out within the EC project 'Advanced methods for assessing the seismic vulnerability of existing motorway bridges' (VAB) to assess the importance of non-synchronous seismic excitation of long structures. The definition of the seismic input at the Warth bridge site, i.e. the determination of the seismic ground motion due to an earthquake with a given magnitude and epicentral distance from the site, has been done following a theoretical approach. In order to perform an accurate and realistic estimate of site effects and of differential motion it is necessary to make a parametric study that takes into account the complex combination of the source and propagation parameters, in realistic geological structures. The computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different sources and structural models, allows the construction of damage scenarios that are out of reach of stochastic models, at a very low cost/benefit ratio. (author)

  6. The costs, effects and cost-effectiveness of counteracting overweight on a population level. A scientific base for policy targets for the Dutch national plan for action.

    NARCIS (Netherlands)

    Bemelmans, W.; Baal, van P.; Wendel-Vos, G.C.W.; Schuit, J.; Feskens, E.J.M.; Ament, A.; Hoogenveen, R.

    2008-01-01

Objectives. To gain insight into realistic policy targets for overweight at a population level and the accompanying costs. Therefore, the effect on overweight prevalence was estimated of large scale implementation of a community intervention (applied to 90% of general population) and an intensive

  7. EFFECT OF MEASUREMENT ERRORS ON PREDICTED COSMOLOGICAL CONSTRAINTS FROM SHEAR PEAK STATISTICS WITH LARGE SYNOPTIC SURVEY TELESCOPE

    Energy Technology Data Exchange (ETDEWEB)

    Bard, D.; Chang, C.; Kahn, S. M.; Gilmore, K.; Marshall, S. [KIPAC, Stanford University, 452 Lomita Mall, Stanford, CA 94309 (United States); Kratochvil, J. M.; Huffenberger, K. M. [Department of Physics, University of Miami, Coral Gables, FL 33124 (United States); May, M. [Physics Department, Brookhaven National Laboratory, Upton, NY 11973 (United States); AlSayyad, Y.; Connolly, A.; Gibson, R. R.; Jones, L.; Krughoff, S. [Department of Astronomy, University of Washington, Seattle, WA 98195 (United States); Ahmad, Z.; Bankert, J.; Grace, E.; Hannel, M.; Lorenz, S. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Haiman, Z.; Jernigan, J. G., E-mail: djbard@slac.stanford.edu [Department of Astronomy and Astrophysics, Columbia University, New York, NY 10027 (United States); and others

    2013-09-01

We study the effect of galaxy shape measurement errors on predicted cosmological constraints from the statistics of shear peak counts with the Large Synoptic Survey Telescope (LSST). We use the LSST Image Simulator in combination with cosmological N-body simulations to model realistic shear maps for different cosmological models. We include both galaxy shape noise and, for the first time, measurement errors on galaxy shapes. We find that the measurement errors considered have relatively little impact on the constraining power of shear peak counts for LSST.
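
As a toy illustration of the peak statistic itself (not of the LSST pipeline), counting shear peaks reduces to finding local maxima above a signal-to-noise threshold in a reconstructed convergence map. The small map below is invented for illustration:

```python
# Toy sketch of shear-peak counting: count local maxima above a
# threshold in a small "convergence map" (a 2D grid). Real analyses
# use smoothed maps reconstructed from measured galaxy shapes; the
# values here are made up.

def count_peaks(grid, threshold):
    """Count cells above `threshold` that exceed all of their 4-neighbours."""
    rows, cols = len(grid), len(grid[0])
    peaks = 0
    for i in range(rows):
        for j in range(cols):
            v = grid[i][j]
            if v < threshold:
                continue
            neighbours = []
            if i > 0:
                neighbours.append(grid[i - 1][j])
            if i < rows - 1:
                neighbours.append(grid[i + 1][j])
            if j > 0:
                neighbours.append(grid[i][j - 1])
            if j < cols - 1:
                neighbours.append(grid[i][j + 1])
            if all(v > n for n in neighbours):
                peaks += 1
    return peaks

kappa = [
    [0.1, 0.2, 0.1, 0.0],
    [0.2, 0.9, 0.2, 0.1],
    [0.1, 0.2, 0.1, 0.6],
    [0.0, 0.1, 0.2, 0.1],
]
print(count_peaks(kappa, 0.5))  # -> 2
```

The cosmological signal lives in how the peak counts vary with threshold and cosmology, which is what the measurement errors studied above can degrade.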

  8. The statistics of Pearce element diagrams and the Chayes closure problem

    Science.gov (United States)

    Nicholls, J.

    1988-05-01

Pearce element ratios are defined as having a constituent in their denominator that is conserved in a system undergoing change. The presence of a conserved element in the denominator simplifies the statistics of such ratios and renders them subject to statistical tests, especially tests of significance of the correlation coefficient between Pearce element ratios. Pearce element ratio diagrams provide unambiguous tests of petrologic hypotheses because they are based on the stoichiometry of rock-forming minerals. There are three ways to recognize a conserved element: 1. The petrologic behavior of the element can be used to select conserved ones. They are usually the incompatible elements. 2. The ratio of two conserved elements will be constant in a comagmatic suite. 3. An element ratio diagram that is not constructed with a conserved element in the denominator will have a trend with a near zero intercept. The last two criteria can be tested statistically. The significance of the slope, intercept and correlation coefficient can be tested by estimating the probability of obtaining the observed values from a random population of arrays. This population of arrays must satisfy two criteria: 1. The population must contain at least one array that has the means and variances of the array of analytical data for the rock suite. 2. Arrays with the means and variances of the data must not be so abundant in the population that nearly every array selected at random has the properties of the data. The population of random closed arrays can be obtained from a population of open arrays whose elements are randomly selected from probability distributions. The means and variances of these probability distributions are themselves selected from probability distributions which have means and variances equal to a hypothetical open array that would give the means and variances of the data on closure. This hypothetical open array is called the Chayes array. Alternatively, the population of
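
The closure effect that motivates conserved-element denominators can be demonstrated numerically. In this hedged toy Monte Carlo (our illustration, not the paper's actual test), two ratios sharing a random denominator correlate strongly even though all three "elements" are statistically independent:

```python
import random
import statistics

# Toy demonstration of spurious (closure-induced) correlation: ratios
# X/Z and Y/Z correlate when the shared denominator Z varies, even
# though X, Y, Z are mutually independent. A conserved element in the
# denominator avoids mistaking such induced correlation for a
# petrologic process. Distributions below are invented for illustration.

random.seed(42)

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / (len(a) * statistics.pstdev(a) * statistics.pstdev(b))

n = 5000
x = [random.gauss(10, 1) for _ in range(n)]
y = [random.gauss(10, 1) for _ in range(n)]
z = [random.gauss(10, 2) for _ in range(n)]

r_raw = pearson(x, y)  # independent numerators: near zero
r_ratio = pearson([xi / zi for xi, zi in zip(x, z)],
                  [yi / zi for yi, zi in zip(y, z)])  # shared denominator

print(round(r_raw, 2), round(r_ratio, 2))
```

With the relative variances chosen here, the shared-denominator correlation comes out large and positive while the raw correlation stays near zero, which is exactly the kind of artifact the random-array significance tests above are designed to guard against.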

  9. The effects of interventions targeting multiple health behaviors on smoking cessation outcomes: a rapid realist review protocol.

    Science.gov (United States)

    Minian, Nadia; deRuiter, Wayne K; Lingam, Mathangee; Corrin, Tricia; Dragonetti, Rosa; Manson, Heather; Taylor, Valerie H; Zawertailo, Laurie; Ebnahmady, Arezoo; Melamed, Osnat C; Rodak, Terri; Hahn, Margaret; Selby, Peter

    2018-03-01

Health behaviors directly impact the health of individuals and populations. Since individuals tend to engage in multiple unhealthy behaviors such as smoking, excessive alcohol use, physical inactivity, and eating an unhealthy diet simultaneously, many large community-based interventions have been implemented to reduce the burden of disease through the modification of multiple health behaviors. Smoking cessation can be particularly challenging as the odds of becoming dependent on nicotine increase with every unhealthy behavior a smoker exhibits. This paper presents a protocol for a rapid realist review which aims to identify factors associated with interventions that effectively change tobacco use while also targeting two or more additional unhealthy behaviors. An electronic literature search will be conducted using the following bibliographic databases: MEDLINE, Embase, PsycINFO, Cumulative Index to Nursing and Allied Health Literature (CINAHL), The Cochrane Library, Social Science Abstracts, Social Work Abstracts, and Web of Science. Two reviewers will screen titles and abstracts for relevant research, and the selected full papers will be used to extract data and assess the quality of evidence. Throughout this process, the rapid realist approach proposed by Saul et al., 2013 will be used to refine our initial program theory and identify contextual factors and mechanisms that are associated with successful multiple health behavior change. This review will provide evidence-based research on the context and mechanisms that may drive the success or failure of interventions designed to support multiple health behavior change. This information will be used to guide curriculum and program development for a government-funded project on improving smoking cessation by addressing multiple health behaviors in people in Canada. PROSPERO CRD42017064430.

  10. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  11. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  12. Statistical aspects of tumor registries, Hiroshima and Nagasaki

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M

    1961-02-24

Statistical considerations are presented on the tumor registries established for the purpose of studying radiation-induced carcinoma in Hiroshima and Nagasaki by observing tumors developing in the survivors of these cities. In addition to describing the background and purpose of the tumor registries, the report consists of two parts: (1) accuracy of reported tumor cases and (2) statistical aspects of the incidence of tumors based both on a current population and on a fixed sample. Under the heading background, discussion includes the difficulties in attaining complete registration; the various problems associated with the tumor registries; and the special characteristics of tumor registries in Hiroshima and Nagasaki. Bayes' a posteriori probability formula was applied to the Type I and Type II errors in the autopsy data of Hiroshima ABCC. (Type I, diagnosis of what is not cancer as cancer; Type II, diagnosis of what is cancer as noncancer.) Finally, the report discussed the difficulties in estimating a current population of survivors; the advantages and disadvantages of analyses based on a fixed sample and on an estimated current population; the comparison of incidence rates based on these populations using the 20 months' data of the tumor registry in Hiroshima; and the sample size required for studying radiation-induced carcinoma. 10 references, 1 figure, 8 tables.

  13. Developing a learning environment on realistic mathematics education for Indonesian student teachers

    NARCIS (Netherlands)

    Zulkardi, Z.

    2002-01-01

    The CASCADE-IMEI study was started to explore the role of a learning environment (LE) in assisting mathematics student teachers learning Realistic Mathematics Education (RME) as a new instructional approach in mathematics education in Indonesia. The LE for this study has been developed and evaluated

  14. Fabrication of a set of realistic torso phantoms for calibration of transuranic nuclide lung counting facilities

    International Nuclear Information System (INIS)

    Griffith, R.V.; Anderson, A.L.; Sundbeck, C.W.; Alderson, S.W.

    1983-01-01

    A set of 16 tissue equivalent torso phantoms has been fabricated for use by major laboratories involved in counting transuranic nuclides in the lung. These phantoms, which have bone equivalent plastic rib cages, duplicate the performance of the DOE Realistic Phantom set. The new phantoms (and their successors) provide the user laboratories with a highly realistic calibration tool. Moreover, use of these phantoms will allow participating laboratories to intercompare calibration information, both on formal and informal bases. 3 refs., 2 figs

  15. Process informed accurate compact modelling of 14-nm FinFET variability and application to statistical 6T-SRAM simulations

    OpenAIRE

    Wang, Xingsheng; Reid, Dave; Wang, Liping; Millar, Campbell; Burenkov, Alex; Evanschitzky, Peter; Baer, Eberhard; Lorenz, Juergen; Asenov, Asen

    2016-01-01

    This paper presents a TCAD based design technology co-optimization (DTCO) process for 14nm SOI FinFET based SRAM, which employs an enhanced variability aware compact modeling approach that fully takes process and lithography simulations and their impact on 6T-SRAM layout into account. Realistic double patterned gates and fins and their impacts are taken into account in the development of the variability-aware compact model. Finally, global process induced variability and local statistical var...

  16. Dynamic Enhanced Inter-Cell Interference Coordination for Realistic Networks

    DEFF Research Database (Denmark)

    Pedersen, Klaus I.; Alvarez, Beatriz Soret; Barcos, Sonia

    2016-01-01

    Enhanced Inter-Cell Interference Coordination (eICIC) is a key ingredient to boost the performance of co-channel Heterogeneous Networks (HetNets). eICIC encompasses two main techniques: Almost Blank Subframes (ABS), during which the macro cell remains silent to reduce the interference, and biased...... and an opportunistic approach exploiting the varying cell conditions. Moreover, an autonomous fast distributed muting algorithm is presented, which is simple, robust, and well suited for irregular network deployments. Performance results for realistic network deployments show that the traditional semi-static e...

  17. The large break LOCA evaluation method with the simplified statistic approach

    International Nuclear Information System (INIS)

    Kamata, Shinya; Kubo, Kazuo

    2004-01-01

USNRC published the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology for large break LOCA, which supported the revised rule for Emergency Core Cooling System performance, in 1989. USNRC Regulatory Guide 1.157 requires that the peak cladding temperature (PCT) not exceed 2200 °F with high probability (the 95th percentile). In recent years, overseas countries have developed statistical methodologies and best estimate codes with models that can provide more realistic simulation of the phenomena, based on the CSAU evaluation methodology. In order to calculate the PCT probability distribution by Monte Carlo trials, there are approaches such as the response surface technique using polynomials, the order statistics method, etc. For the purpose of performing a rational statistical analysis, Mitsubishi Heavy Industries, Ltd. (MHI) set out to develop a statistical LOCA method using the best estimate LOCA code MCOBRA/TRAC and the simplified code HOTSPOT. HOTSPOT is a Monte Carlo heat conduction solver used to evaluate the uncertainties of the significant fuel parameters at the PCT positions of the hot rod. Direct uncertainty sensitivity studies can be performed without a response surface because the Monte Carlo simulation for key parameters can be performed in a short time using HOTSPOT. With regard to the parameter uncertainties, MHI established a treatment in which bounding conditions are given for the LOCA boundary and plant initial conditions, while the Monte Carlo simulation using HOTSPOT is applied to the significant fuel parameters. The paper describes the large break LOCA evaluation method with the simplified statistical approach and the results of applying the method to a representative four-loop nuclear power plant. (author)
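
The direct Monte Carlo approach described above (sample the uncertain fuel parameters, evaluate a cheap model many times, and read the 95th percentile off the empirical distribution, with no response surface) can be sketched as follows. The response function and input distributions are invented stand-ins, not HOTSPOT's models:

```python
import random
import statistics

# Toy stand-in for a HOTSPOT-style direct Monte Carlo: sample uncertain
# inputs of a made-up PCT response, run the cheap model many times, and
# take the empirical 95th percentile. Everything numeric here is an
# illustrative assumption, not data from the actual codes.

random.seed(7)

def pct_model(gap_conductance, power_peaking, oxide_thickness):
    """Invented monotone response returning a peak cladding temperature (K)."""
    return 1200 + 150 * power_peaking - 40 * gap_conductance + 300 * oxide_thickness

samples = []
for _ in range(10000):
    g = random.gauss(1.0, 0.1)     # normalized gap conductance
    p = random.gauss(1.0, 0.05)    # power peaking factor
    o = random.uniform(0.0, 0.1)   # normalized oxide thickness
    samples.append(pct_model(g, p, o))

pct95 = statistics.quantiles(samples, n=100)[94]  # empirical 95th percentile
print(round(pct95, 1))
```

Because the model is cheap, no polynomial response surface is needed: the percentile comes straight from the sampled distribution, which is the advantage the abstract attributes to HOTSPOT.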

  18. Realistic Simulation of Rice Plant

    Directory of Open Access Journals (Sweden)

    Wei-long DING

    2011-09-01

Full Text Available Existing research on the virtual modeling of the rice plant, however, is far from perfect compared to that of other crops, due to the plant's complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented based on an analysis of the morphological characteristics at different stages. Firstly, simulations of the geometrical shape, bending status and structural distortion of rice leaves are conducted. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and various types of panicle branches are generated, and the spatial shape of the rice panicle is thereby created. A parametric L-system is employed to generate the topological structures, and a finite-state automaton is adopted to describe the development of the geometrical structures. Finally, computer visualization of the three-dimensional morphologies of the rice plant at both organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and the generated three-dimensional images are realistic.

  19. Coupling population dynamics with earth system models: the POPEM model.

    Science.gov (United States)

    Navarro, Andrés; Moreno, Raúl; Jiménez-Alcázar, Alfonso; Tapiador, Francisco J

    2017-09-16

Precise modeling of CO2 emissions is important for environmental research. This paper presents a new model of human population dynamics that can be embedded into ESMs (Earth System Models) to improve climate modeling. Through a system dynamics approach, we develop a cohort-component model that successfully simulates historical population dynamics with fine spatial resolution (about 1°×1°). The population projections are used to improve the estimates of CO2 emissions, thus transcending the bulk approach of existing models and allowing more realistic non-linear effects to feature in the simulations. The module, dubbed POPEM (from Population Parameterization for Earth Models), is compared with current emission inventories and validated against UN aggregated data. Finally, it is shown that the module can be used to advance toward fully coupling the social and natural components of the Earth system, an emerging research path for environmental science and pollution research.
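
The bookkeeping of a single cohort-component projection step (survive each cohort forward one period, add births from age-specific fertility) can be sketched with three broad age groups. POPEM itself works per grid cell with much finer age resolution, so this is only a schematic, and all rates below are invented:

```python
# Minimal one-step cohort-component projection: age the population with
# survival rates and add births from age-specific fertility. Three broad
# cohorts (young, adult, old) keep it readable; the rates and sizes are
# illustrative assumptions, not POPEM's calibrated inputs.

def project_one_step(pop, survival, fertility):
    """pop, survival, fertility: lists indexed by age group (young, adult, old)."""
    births = sum(p * f for p, f in zip(pop, fertility))
    new_pop = [0.0] * len(pop)
    new_pop[0] = births
    for age in range(1, len(pop)):
        new_pop[age] = pop[age - 1] * survival[age - 1]
    # survivors of the oldest group stay in the last (open-ended) cohort
    new_pop[-1] += pop[-1] * survival[-1]
    return new_pop

pop = [100.0, 200.0, 50.0]      # thousands of people per cohort
survival = [0.99, 0.95, 0.70]   # probability of surviving one step
fertility = [0.0, 0.08, 0.0]    # births per person per step

print(project_one_step(pop, survival, fertility))
```

Iterating this step, with rates that vary by grid cell and over time, is what lets a population module feed spatially resolved emission estimates to an Earth System Model.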

  20. Establishing statistical models of manufacturing parameters

    International Nuclear Information System (INIS)

    Senevat, J.; Pape, J.L.; Deshayes, J.F.

    1991-01-01

    This paper reports on the effect of pilgering and cold-work parameters on contractile strain ratio and mechanical properties that were investigated using a large population of Zircaloy tubes. Statistical models were established between: contractile strain ratio and tooling parameters, mechanical properties (tensile test, creep test) and cold-work parameters, and mechanical properties and stress-relieving temperature

  1. Active and realistic passive marijuana exposure tested by three immunoassays and GC/MS in urine

    International Nuclear Information System (INIS)

    Mule, S.J.; Lomax, P.; Gross, S.J.

    1988-01-01

    Human urine samples obtained before and after active and passive exposure to marijuana were analyzed by immune kits (Roche, Amersham, and Syva) and gas chromatography/mass spectrometry (GC/MS). Seven of eight subjects were positive for the entire five-day test period with one immune kit. The latter correlated with GC/MS in 98% of the samples. Passive inhalation experiments under conditions likely to reflect realistic exposure resulted consistently in less than 10 ng/mL of cannabinoids. The 10-100-ng/mL cannabinoid concentration range essential for detection of occasional and moderate marijuana users is thus unaffected by realistic passive inhalation

  2. Active and realistic passive marijuana exposure tested by three immunoassays and GC/MS in urine

    Energy Technology Data Exchange (ETDEWEB)

    Mule, S.J.; Lomax, P.; Gross, S.J.

    1988-05-01

    Human urine samples obtained before and after active and passive exposure to marijuana were analyzed by immune kits (Roche, Amersham, and Syva) and gas chromatography/mass spectrometry (GC/MS). Seven of eight subjects were positive for the entire five-day test period with one immune kit. The latter correlated with GC/MS in 98% of the samples. Passive inhalation experiments under conditions likely to reflect realistic exposure resulted consistently in less than 10 ng/mL of cannabinoids. The 10-100-ng/mL cannabinoid concentration range essential for detection of occasional and moderate marijuana users is thus unaffected by realistic passive inhalation.

  3. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    Science.gov (United States)

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  4. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993-2010.

    Science.gov (United States)

    Bjerregaard, Peter; Becker, Ulrik

    2013-01-01

    Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been published from circumpolar indigenous communities. The purpose of the study is to compare information on the consumption of tobacco and alcohol obtained from 3 population surveys in Greenland with import statistics. Estimates of consumption of cigarettes and alcohol using several different survey instruments in cross-sectional population studies from 1993-1994, 1999-2001 and 2005-2010 were compared with import statistics from the same years. For cigarettes, survey results accounted for virtually the total import. Alcohol consumption was significantly under-reported with reporting completeness ranging from 40% to 51% for different estimates of habitual weekly consumption in the 3 study periods. Including an estimate of binge drinking increased the estimated total consumption to 78% of the import. Compared with import statistics, questionnaire-based population surveys capture the consumption of cigarettes well in Greenland. Consumption of alcohol is under-reported, but asking about binge episodes in addition to the usual intake considerably increased the reported intake in this population and made it more in agreement with import statistics. It is unknown to what extent these findings at the population level can be inferred to population subgroups.
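    The completeness figures quoted in the abstract are ratios of survey-estimated consumption to import-based consumption. A minimal sketch of that calculation follows; the quantities used here are made up for illustration and are not the Greenland survey data.

```python
# Reporting completeness: survey-estimated consumption as a fraction of the
# consumption implied by import statistics. All figures below are
# illustrative placeholders, not values from the Greenland studies.

def completeness(survey_total, import_total):
    """Fraction of the imported quantity accounted for by survey reports."""
    return survey_total / import_total

# Habitual weekly consumption alone captures roughly half of the import;
# adding an estimate of binge episodes raises the reported total.
habitual = completeness(4.1e6, 8.2e6)    # e.g. litres of pure alcohol/year
with_binge = completeness(6.4e6, 8.2e6)  # habitual plus binge estimate

print(f"habitual only: {habitual:.0%}, with binge episodes: {with_binge:.0%}")
```

    The design point is simply that the same denominator (imports) anchors both estimates, so the two completeness figures are directly comparable.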

  5. The Statistical Fermi Paradox

    Science.gov (United States)

    Maccone, C.

    In this paper the statistical generalization of the Fermi paradox is provided. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
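    The core of the SEH argument, that a product of many independent positive factors is approximately log-normal because its logarithm is a CLT sum, can be checked numerically. The sketch below uses arbitrary uniform factor distributions and an illustrative leading constant; neither is taken from Dole's or Maccone's actual factor choices.

```python
# Product of ten independent positive random factors -> approximately
# log-normal, because log(product) is a sum of independent terms (CLT).
# The factor distributions and the 1e11 prefactor are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_factors = 100_000, 10

factors = rng.uniform(0.1, 1.0, size=(n_trials, n_factors))
n_hab = 1e11 * factors.prod(axis=1)  # samples of the random variable NHab

# If NHab were exactly log-normal, log(NHab) would be Gaussian (zero skew);
# with only ten factors the residual skewness is small but nonzero.
log_n = np.log(n_hab)
skew = np.mean(((log_n - log_n.mean()) / log_n.std()) ** 3)
print(f"sample skewness of log(NHab): {skew:.3f}")

# Point 3 of the abstract: at fixed Galactic volume the mean neighbour
# distance D scales as NHab**(-1/3), so doubling NHab shrinks D by 2**(1/3).
```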

  6. Game statistics for the Island of Olkiluoto in 2011-2012

    Energy Technology Data Exchange (ETDEWEB)

    Niemi, M.; Nieminen, M. [Faunatica Oy, Espoo (Finland); Jussila, I. [Turku Univ. (Finland)

    2012-11-15

    The game statistics for the island of Olkiluoto were updated in the summer of 2012 and compared with earlier statistics. Population size estimates are based on interviews with the local hunters. No moose or deer inventories were made in the winter 2011-2012. The moose population has been decreasing slightly during the past ten years. The increasing lynx population has a decreasing effect on the small ungulate (white-tailed deer and roe deer) populations. The number of hunted mountain hares and European brown hares decreased compared with the previous year. In addition, the number of hunted raccoon dogs was about 50 per cent lower than in the year 2010. Altogether 27 waterfowl were hunted in 2011. The mountain hare population is abundant, even though lynx were living on the eastern part of the island during the winter 2011. Based on track observations, there are pine martens living in the area as well. In addition, there were some observations of wolves visiting the area. The winter 2011-2012 was milder than the previous one, and it seemed that young swans wintering in the area survived better than in the previous winter. (orig.)

  7. Game statistics for the Island of Olkiluoto in 2011-2012

    International Nuclear Information System (INIS)

    Niemi, M.; Nieminen, M.; Jussila, I.

    2012-11-01

    The game statistics for the island of Olkiluoto were updated in the summer of 2012 and compared with earlier statistics. Population size estimates are based on interviews with the local hunters. No moose or deer inventories were made in the winter 2011-2012. The moose population has been decreasing slightly during the past ten years. The increasing lynx population has a decreasing effect on the small ungulate (white-tailed deer and roe deer) populations. The number of hunted mountain hares and European brown hares decreased compared with the previous year. In addition, the number of hunted raccoon dogs was about 50 per cent lower than in the year 2010. Altogether 27 waterfowl were hunted in 2011. The mountain hare population is abundant, even though lynx were living on the eastern part of the island during the winter 2011. Based on track observations, there are pine martens living in the area as well. In addition, there were some observations of wolves visiting the area. The winter 2011-2012 was milder than the previous one, and it seemed that young swans wintering in the area survived better than in the previous winter. (orig.)

  8. Context problems in realistic mathematics education: A calculus course as an example

    NARCIS (Netherlands)

    Gravemeijer, K.P.E.; Doorman, L.M.

    1999-01-01

    This article discusses the role of context problems, as they are used in the Dutch approach that is known as realistic mathematics education (RME). In RME, context problems are intended for supporting a reinvention process that enables students to come to grips with formal mathematics. This approach

  9. Critical reflections on realist review: insights from customizing the methodology to the needs of participatory research assessment.

    Science.gov (United States)

    Jagosh, Justin; Pluye, Pierre; Wong, Geoff; Cargo, Margaret; Salsberg, Jon; Bush, Paula L; Herbert, Carol P; Green, Lawrence W; Greenhalgh, Trish; Macaulay, Ann C

    2014-06-01

    Realist review has increased in popularity as a methodology for complex intervention assessment. Our experience suggests that the process of designing a realist review requires its customization to areas under investigation. To elaborate on this idea, we first describe the logic underpinning realist review and then present critical reflections on our application experience, organized in seven areas. These are the following: (1) the challenge of identifying middle range theory; (2) addressing heterogeneity and lack of conceptual clarity; (3) the challenge of appraising the quality of complex evidence; (4) the relevance of capturing unintended outcomes; (5) understanding the process of context, mechanism, and outcome (CMO) configuring; (6) incorporating middle-range theory in the CMO configuration process; and (7) using middle range theory to advance the conceptualization of outcomes - both visible and seemingly 'hidden'. One conclusion from our experience is that the degree of heterogeneity of the evidence base will determine whether theory can drive the development of review protocols from the outset, or will follow only after an intense period of data immersion. We hope that presenting a critical reflection on customizing realist review will convey how the methodology can be tailored to the often complex and idiosyncratic features of health research, leading to innovative evidence syntheses. Copyright © 2013 John Wiley & Sons, Ltd.

  10. HackAttack: Game-Theoretic Analysis of Realistic Cyber Conflicts

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Brady, Andrew C [Jefferson Middle School, Oak Ridge, TN; Brady, Ethan J [Oak Ridge High School, Oak Ridge, TN; Ferragut, Jacob M [Oak Ridge High School, Oak Ridge, TN; Ferragut, Nathan M [Oak Ridge High School, Oak Ridge, TN; Wildgruber, Max C [ORNL

    2016-01-01

    Game theory is appropriate for studying cyber conflict because it allows for an intelligent and goal-driven adversary. Applications of game theory have led to a number of results regarding optimal attack and defense strategies. However, the overwhelming majority of applications explore overly simplistic games, often ones in which each participant's actions are visible to every other participant. These simplifications strip away the fundamental properties of real cyber conflicts: probabilistic alerting, hidden actions, and unknown opponent capabilities. In this paper, we demonstrate that it is possible to analyze a more realistic game, one in which different resources have different weaknesses, players have different exploits, and moves occur in secrecy, but they can be detected. Certainly, more advanced and complex games are possible, but the game presented here is more realistic than any other game we know of in the scientific literature. While optimal strategies can be found for simpler games using calculus, case-by-case analysis, or, for stochastic games, Q-learning, our more complex game is more naturally analyzed using the same methods used to study other complex games, such as checkers and chess. We define a simple evaluation function and employ multi-step searches to create strategies. We show that such scenarios can be analyzed, and find that in cases of extreme uncertainty, it is often better to ignore one's opponent's possible moves. Furthermore, we show that a simple evaluation function in a complex game can lead to interesting and nuanced strategies.
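    The "simple evaluation function plus multi-step search" approach mentioned in the abstract is the same machinery used for checkers and chess: depth-limited negamax with a heuristic value at the cutoff. The sketch below applies it to a toy take-away game, not to the paper's cyber game, purely to illustrate the search structure.

```python
# Depth-limited negamax with a trivial evaluation function, illustrated on
# a toy Nim variant: players alternately take 1-3 stones, and whoever takes
# the last stone wins. This is NOT the cyber game from the paper; it only
# shows the search-plus-evaluation pattern the abstract describes.

def negamax(pile, depth):
    """Value of the position for the player to move: +1 win, -1 loss."""
    if pile == 0:
        return -1  # the previous player took the last stone: we lost
    if depth == 0:
        return 0   # search cutoff: neutral evaluation of unknown positions
    return max(-negamax(pile - take, depth - 1)
               for take in (1, 2, 3) if take <= pile)

def best_move(pile, depth=10):
    """Move maximizing the searched value for the player to move."""
    return max((t for t in (1, 2, 3) if t <= pile),
               key=lambda t: -negamax(pile - t, depth - 1))

# With perfect play, any pile that is a multiple of 4 is lost for the mover;
# from 5 stones the winning move is to take 1, leaving 4.
print(negamax(4, 10), best_move(5))  # -1 1
```

    Replacing the zero-valued cutoff with a richer state evaluation, and the move list with attack/defense actions, gives the shape of the analysis the paper performs.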

  11. MONITORING OF GENETIC DIVERSITY IN FARMED DEER POPULATIONS USING MICROSATELLITE MARKERS

    Directory of Open Access Journals (Sweden)

    Pavol Bajzík

    2011-12-01

    Full Text Available Deer (Cervidae) belong to the most important species used as farmed animals. We focused on assessing the genetic diversity among five deer populations. The analysis was performed on a total of 183 animals originating from the Czech Republic, Hungary, New Zealand, Poland and the Slovak Republic. Genetic variability was investigated using 8 microsatellite markers established for deer. Statistical data for all populations were obtained on the basis of Nei's statistics using the POWERMARKER 3.23 programme. A graphical view of the relationships among populations, and among individuals within the populations, was obtained using the Dendroscope software. Molecular genetic data combined with evaluation in statistical programmes can provide a comprehensive view of populations and the differences among them. doi:10.5219/172
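    The Nei statistics mentioned in the abstract are built on per-locus gene diversity (expected heterozygosity) computed from allele frequencies. A minimal sketch follows; the frequencies are invented for illustration and are not from the deer data set.

```python
# Nei's gene diversity (expected heterozygosity) for one locus:
# H = 1 - sum(p_i^2) over allele frequencies p_i. The frequencies below
# are made-up examples, not values from the five deer populations.

def nei_diversity(freqs):
    """Expected heterozygosity for one locus given allele frequencies."""
    assert abs(sum(freqs) - 1.0) < 1e-9, "allele frequencies must sum to 1"
    return 1.0 - sum(p * p for p in freqs)

# A microsatellite locus with four equally common alleles is more diverse
# than a two-allele locus dominated by one allele.
print(nei_diversity([0.25, 0.25, 0.25, 0.25]))  # 0.75
print(nei_diversity([0.9, 0.1]))                 # approximately 0.18
```

    Averaging H over the 8 loci, and comparing within- and between-population values, is what underlies the distance matrices that tree-drawing tools such as Dendroscope then visualize.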

  12. Radiative neutron capture: Hauser Feshbach vs. statistical resonances

    Energy Technology Data Exchange (ETDEWEB)

    Rochman, D., E-mail: dimitri-alexandre.rochman@psi.ch [Reactor Physics and Systems Behavior Laboratory, Paul Scherrer Institute, Villigen (Switzerland); Goriely, S. [Institut d'Astronomie et d'Astrophysique, CP-226, Université Libre de Bruxelles, 1050 Brussels (Belgium); Koning, A.J. [Nuclear Data Section, IAEA, Vienna (Austria); Uppsala University, Uppsala (Sweden); Ferroukhi, H. [Reactor Physics and Systems Behavior Laboratory, Paul Scherrer Institute, Villigen (Switzerland)

    2017-01-10

    The radiative neutron capture rates for isotopes of astrophysical interest are commonly calculated on the basis of the statistical Hauser-Feshbach (HF) reaction model, leading to smooth and monotonically varying temperature-dependent Maxwellian-averaged cross sections (MACS). The HF approximation is known to be valid if the number of resonances in the compound system is relatively high. However, such a condition is hardly fulfilled for keV neutrons captured on light or exotic neutron-rich nuclei. For this reason, a different procedure is proposed here, based on the generation of statistical resonances. This novel technique, called the “High Fidelity Resonance” (HFR) method, is shown to provide results similar to those of the HF approach for nuclei with a high level density, but to deviate from, and be more realistic than, HF predictions for light and neutron-rich nuclei or at relatively low sub-keV energies. The MACS derived with the HFR method are systematically compared with the traditional HF calculations for some 3300 neutron-rich nuclei and shown to give rise to significantly larger predictions than the HF approach at energies of astrophysical relevance. For this reason, the HF approach should not be applied to light or neutron-rich nuclei. The Doppler broadening of the generated resonances is also studied and found to have a negligible impact on the calculated MACS.
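    For readers unfamiliar with MACS, the standard definition averages the cross section over a Maxwell-Boltzmann flux: MACS(kT) = (2/√π)(kT)⁻² ∫ σ(E) E e^(−E/kT) dE. A minimal numerical sketch follows, using a toy energy-independent cross section (not real nuclear data) for which the integral reduces to the known constant 2/√π times σ.

```python
# Maxwellian-averaged cross section (MACS):
#   MACS(kT) = (2/sqrt(pi)) / (kT)^2 * integral of sigma(E)*E*exp(-E/kT) dE.
# Sanity check: for an energy-independent sigma the integral is (kT)^2, so
# MACS = (2/sqrt(pi)) * sigma ~= 1.128 * sigma. The sigma(E) used here is a
# toy constant, not a real evaluated cross section.
import math

def macs(sigma, kT, n=100_000, e_max_factor=40.0):
    """Trapezoidal MACS of cross section sigma(E) at temperature kT (MeV)."""
    e_max = e_max_factor * kT  # tail beyond ~40 kT is negligible
    de = e_max / n
    total = 0.0
    for i in range(n + 1):
        e = i * de
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * sigma(e) * e * math.exp(-e / kT)
    return (2.0 / math.sqrt(math.pi)) * total * de / kT**2

flat = macs(lambda e: 1.0, kT=0.03)       # constant 1-barn cross section
print(flat, 2.0 / math.sqrt(math.pi))     # both close to 1.128
```

    With a resonant σ(E) in place of the constant, the same average is what smooths (HF) or fails to smooth (HFR, at low level density) the temperature dependence discussed in the abstract.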

  13. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for interpretation of results both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
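    The link between sample size and level of confidence mentioned above has a classic closed form in discovery (attribute) sampling. The sketch below uses the large-population binomial approximation; the finite-population hypergeometric refinement the abstract alludes to is not reproduced here.

```python
# Discovery-sampling sketch: the smallest sample size n such that, if the
# true non-compliance rate is at least p, a simple random sample detects at
# least one non-compliant item with confidence C. Uses the binomial
# (large-population) approximation: P(zero defects) = (1 - p)^n <= 1 - C.
import math

def sample_size(p, confidence):
    """Smallest n with P(observing at least one defect) >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

# The classic auditing figure: 59 items for a 5% rate at 95% confidence.
print(sample_size(0.05, 0.95))  # 59
print(sample_size(0.10, 0.95))  # 29
```

    For small finite populations (e.g. a fixed roster of security inspectors), the hypergeometric version gives a slightly smaller n; the binomial result is conservative.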

  14. Statistical testing of association between menstruation and migraine.

    Science.gov (United States)

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operator characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
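    A one-sided Fisher's exact test with mid-p correction on a 2 × 2 table can be written with the standard library alone. The sketch below is a generic implementation of that test, not the paper's specific windowing or null-model machinery; the table layout (attacks inside vs. outside the perimenstrual window) is an assumed reading of how the diaries are tabulated.

```python
# One-sided Fisher's exact test with mid-p correction for a 2x2 table
# [[a, b], [c, d]], e.g. migraine attacks inside/outside the perimenstrual
# window (rows) on menstrual vs. other days (columns) -- an illustrative
# layout, not necessarily the paper's exact convention. Stdlib only.
from math import comb

def fisher_midp_greater(a, b, c, d):
    """Mid-p value for the one-sided alternative 'a is large'."""
    M = a + b + c + d          # total observations
    n = a + b                  # row-1 total
    N = a + c                  # column-1 total
    def pmf(k):                # hypergeometric P(X = k) under the null
        return comb(n, k) * comb(M - n, N - k) / comb(M, N)
    tail = sum(pmf(k) for k in range(a, min(n, N) + 1))  # P(X >= a)
    return tail - 0.5 * pmf(a)  # mid-p: half weight on the observed table

print(round(fisher_midp_greater(8, 2, 3, 7), 4))
```

    The mid-p correction makes the test less conservative than the plain exact tail probability while keeping Type I error close to nominal, which matters for the short 3-cycle diaries discussed in the abstract.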

  15. Collimator optimization in myocardial perfusion SPECT using the ideal observer and realistic background variability for lesion detection and joint detection and localization tasks

    Science.gov (United States)

    Ghaly, Michael; Du, Yong; Links, Jonathan M.; Frey, Eric C.

    2016-03-01

    In SPECT imaging, collimators are a major factor limiting image quality and largely determine the noise and resolution of SPECT images. In this paper, we seek the collimator with the optimal tradeoff between image noise and resolution with respect to performance on two tasks related to myocardial perfusion SPECT: perfusion defect detection and joint detection and localization. We used the Ideal Observer (IO) operating on realistic background-known-statistically (BKS) and signal-known-exactly (SKE) data. The areas under the receiver operating characteristic (ROC) and localization ROC (LROC) curves (AUCd, AUCd+l), respectively, were used as the figures of merit for both tasks. We used a previously developed population of 54 phantoms based on the eXtended Cardiac Torso Phantom (XCAT) that included variations in gender, body size, heart size and subcutaneous adipose tissue level. For each phantom, organ uptakes were varied randomly based on distributions observed in patient data. We simulated perfusion defects at six different locations with extents and severities of 10% and 25%, respectively, which represented challenging but clinically relevant defects. The extent and severity are, respectively, the perfusion defect’s fraction of the myocardial volume and reduction of uptake relative to the normal myocardium. Projection data were generated using an analytical projector that modeled attenuation, scatter, and collimator-detector response effects, a 9% energy resolution at 140 keV, and a 4 mm full-width at half maximum (FWHM) intrinsic spatial resolution. We investigated a family of eight parallel-hole collimators that spanned a large range of sensitivity-resolution tradeoffs. For each collimator and defect location, the IO test statistics were computed using a Markov Chain Monte Carlo (MCMC) method for an ensemble of 540 pairs of defect-present and -absent images that included the aforementioned anatomical and uptake variability. Sets of test statistics were
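    The figures of merit above (AUCd, AUCd+l) are computed from ensembles of observer test statistics. For the plain detection task, the area under the ROC curve has a simple nonparametric estimator, the Wilcoxon statistic: the probability that a defect-present score exceeds a defect-absent score. The scores below are synthetic, not from the XCAT/SPECT simulation.

```python
# Nonparametric AUC from observer test statistics (Wilcoxon estimator):
# AUC = P(score_present > score_absent), with ties counted as 1/2.
# The scores below are synthetic examples, not ideal-observer outputs
# from the myocardial perfusion study.

def auc_wilcoxon(present, absent):
    """AUC estimate from defect-present and defect-absent score lists."""
    wins = sum((p > a) + 0.5 * (p == a) for p in present for a in absent)
    return wins / (len(present) * len(absent))

present_scores = [2.1, 1.4, 3.0, 2.6, 0.9]  # hypothetical test statistics
absent_scores = [0.8, 1.5, 1.1, 0.7, 2.2]
print(auc_wilcoxon(present_scores, absent_scores))  # 0.76
```

    The localization-augmented figure AUCd+l additionally requires the reported defect location to be correct, so it is computed from LROC rather than ROC data; the scoring of each image pair is otherwise analogous.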

  16. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  17. StackGAN++: Realistic Image Synthesis with Stacked Generative Adversarial Networks

    OpenAIRE

    Zhang, Han; Xu, Tao; Li, Hongsheng; Zhang, Shaoting; Wang, Xiaogang; Huang, Xiaolei; Metaxas, Dimitris

    2017-01-01

    Although Generative Adversarial Networks (GANs) have shown remarkable success in various tasks, they still face challenges in generating high quality images. In this paper, we propose Stacked Generative Adversarial Networks (StackGAN) aiming at generating high-resolution photo-realistic images. First, we propose a two-stage generative adversarial network architecture, StackGAN-v1, for text-to-image synthesis. The Stage-I GAN sketches the primitive shape and colors of the object based on given...

  18. Rational versus Emotional Reasoning in a Realistic Multi-Objective Environment

    OpenAIRE

    Mayboudi, Seyed Mohammad Hossein

    2011-01-01

    ABSTRACT: Emotional intelligence and its associated models have recently become one of the new active areas of study in the field of artificial intelligence. Several works have been performed on the modelling of emotional behaviours such as love, hate, happiness and sadness. This study presents a comparative evaluation of rational and emotional behaviours and the effects of emotions on the decision-making process of agents in a realistic multi-objective environment. NetLogo simulation environment is u...

  19. Capturing and reproducing realistic acoustic scenes for hearing research

    DEFF Research Database (Denmark)

    Marschall, Marton; Buchholz, Jörg

    Accurate spatial audio recordings are important for a range of applications, from the creation of realistic virtual sound environments to the evaluation of communication devices, such as hearing instruments and mobile phones. Spherical microphone arrays are particularly well-suited for capturing... The properties of MOA microphone layouts and processing were investigated further by considering several order combinations. It was shown that the performance for horizontal vs. elevated sources can be adjusted by varying the order combination, but that a benefit of the higher horizontal orders can only be seen...

  20. On Small Antenna Measurements in a Realistic MIMO Scenario

    DEFF Research Database (Denmark)

    Yanakiev, Boyan; Nielsen, Jesper Ødum; Pedersen, Gert Frølund

    2010-01-01

    This paper deals with the challenges related to evaluating the performance of multiple, small terminal antennas within a natural MIMO environment. The focus is on the antenna measurement accuracy. First, a method is presented for measuring small phone mock-ups with the use of optical fibers. The problem using coaxial cable is explained and a solution suitable for long distance channel sounding is presented. A large scale measurement campaign is then described. Special attention is paid to bring the measurement setup as close as possible to a realistic LTE network of the future, with attention...