WorldWideScience

Sample records for generating statistically realistic

  1. An iterative approach for generating statistically realistic populations of households

    CERN Document Server

    Gargiulo, Floriana; Huet, Sylvie; Deffuant, Guillaume

    2009-01-01

    Background: Many different simulation frameworks, in different topics, need to treat realistic datasets to initialize and calibrate the system. A precise reproduction of initial states is extremely important to obtain reliable forecasts from the model. Methodology/Principal Findings: This paper proposes an algorithm to create an artificial population where individuals are described by their age and are gathered in households respecting a variety of statistical constraints (distribution of household types, sizes, age of household head, difference of age between partners and among parents and children). Such a population is often the initial state of microsimulation or (agent) individual-based models. Obtaining a realistic distribution of households is often very important, because this distribution has an impact on the demographic evolution. Usual microsimulation techniques cross different sources of aggregated data to generate individuals. In our case the number of combinations of different hous...
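
    A minimal sketch of this kind of constrained household sampling is given below; every distribution and constant is an illustrative assumption, not the paper's calibrated data.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative target statistics (assumptions, not the paper's data):
      size_probs = {1: 0.30, 2: 0.34, 3: 0.16, 4: 0.13, 5: 0.07}
      head_age_mean, head_age_sd = 52, 16       # age of household head
      partner_gap_sd = 4                        # age difference between partners
      child_gap_mean, child_gap_sd = 28, 6      # parent-child age difference

      def draw_household():
          size = rng.choice(list(size_probs), p=list(size_probs.values()))
          head = max(18, rng.normal(head_age_mean, head_age_sd))
          ages = [head]
          if size >= 2:                         # partner close in age to the head
              ages.append(max(18, head + rng.normal(0, partner_gap_sd)))
          while len(ages) < size:               # children offset from the head
              ages.append(max(0, head - rng.normal(child_gap_mean, child_gap_sd)))
          return sorted(round(a) for a in ages)

      population = [draw_household() for _ in range(10_000)]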

  2. Simple and Realistic Data Generation

    DEFF Research Database (Denmark)

    Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico

    2006-01-01

    This paper presents a generic, DBMS independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple...
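
    As a toy illustration of graph-directed relational data generation, the Python sketch below walks a two-table schema (customers and orders) and draws a fan-out per parent row; the schema, names and distributions are all assumptions, not part of the tool.

      import random
      import string

      random.seed(1)

      # Toy schema graph: customers --(1:N)--> orders. The generator walks the
      # graph, drawing a fan-out per parent row; all distributions are made up.
      def rand_name(n=8):
          return "".join(random.choices(string.ascii_lowercase, k=n))

      customers = [{"id": i, "name": rand_name()} for i in range(1_000)]

      orders, oid = [], 0
      for c in customers:
          for _ in range(random.randint(0, 5)):          # orders per customer
              orders.append({"id": oid,
                             "customer_id": c["id"],     # referential integrity
                             "amount": round(random.lognormvariate(3, 1), 2)})
              oid += 1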

  3. Survey of Approaches to Generate Realistic Synthetic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
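
    The sketch below mimics one pipeline from this family of approaches: take a stand-in for a real-world graph, assume a fitted heavy-tailed degree model, and generate a synthetic graph with networkx's configuration model. The base graph and the Zipf exponent are assumptions, not actual fits.

      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-in for a sensitive real-world graph we cannot share:
      real = nx.barabasi_albert_graph(2000, 3)

      # Assume a heavy-tailed degree model "fitted" to `real` (the exponent 2.5
      # is an assumption), then generate a graph with a matching degree sequence.
      deg = np.clip(rng.zipf(2.5, size=real.number_of_nodes()), 1, 100)
      if deg.sum() % 2:                      # configuration model needs an even sum
          deg[0] += 1
      synthetic = nx.Graph(nx.configuration_model(deg.tolist(), seed=0))
      synthetic.remove_edges_from(nx.selfloop_edges(synthetic))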

  4. A method for generating realistic correlation matrices

    CERN Document Server

    Garcia, Stephan Ramon

    2011-01-01

    Simulating sample correlation matrices is important in many areas of statistics. Approaches such as generating normal data and finding their sample correlation matrix or generating random uniform $[-1,1]$ deviates as pairwise correlations both have drawbacks. We develop an algorithm for adding noise, in a highly controlled manner, to general correlation matrices. In many instances, our method yields results which are superior to those obtained by simply simulating normal data. Moreover, we demonstrate how our general algorithm can be tailored to a number of different correlation models. Finally, using our results with an existing clustering algorithm, we show that simulating correlation matrices can help assess statistical methodology.
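
    In the same spirit (though not the authors' algorithm), a generic noise-plus-repair sketch: perturb a correlation matrix with symmetric noise, clip negative eigenvalues, and rescale back to unit diagonal.

      import numpy as np

      rng = np.random.default_rng(0)

      def noisy_correlation(R, eps=0.05):
          """Perturb a correlation matrix and repair it (generic sketch,
          not the authors' algorithm)."""
          E = rng.normal(0.0, eps, R.shape)
          A = R + (E + E.T) / 2.0                          # symmetric noise
          w, V = np.linalg.eigh(A)
          A = V @ np.diag(np.clip(w, 1e-6, None)) @ V.T    # restore PSD
          d = np.sqrt(np.diag(A))
          return A / np.outer(d, d)                        # unit diagonal again

      # Baseline mentioned in the abstract: sample correlation of normal data.
      R0 = np.corrcoef(rng.normal(size=(200, 5)), rowvar=False)
      R1 = noisy_correlation(R0)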

  5. Generating realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2011-01-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient (⌊(n-4)/2⌋ choose ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n⁴) preprocessing time. We also present an O(n⁵)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.

  6. Realistic thermodynamic and statistical-mechanical measures for neural synchronization.

    Science.gov (United States)

    Kim, Sang-Yoon; Lim, Woochang

    2014-04-15

    Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with the calculation of both O and Ms. However, it is difficult to measure VG directly in real experiments. To overcome this difficulty, instead of VG, we employ the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, for practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, weak sparse spike synchronization can be characterized more accurately in terms of the realistic IPSR-based statistical-mechanical measure than with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
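
    As a rough illustration of the idea, the sketch below estimates an IPSR from a spike raster by Gaussian kernel smoothing and uses its temporal variance as a synchrony measure; the toy raster, kernel width and exact definitions are assumptions that differ from the paper's measures.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy raster: 100 Poisson spike trains over 2 s (asynchronous by design).
      T, dt = 2.0, 1e-3
      t = np.arange(0.0, T, dt)
      trains = [np.sort(rng.uniform(0.0, T, rng.poisson(20))) for _ in range(100)]

      def ipsr(spike_trains, t, sigma=5e-3):
          """Instantaneous population spike rate via Gaussian kernel smoothing."""
          rate = np.zeros_like(t)
          for train in spike_trains:
              for s in train:
                  rate += np.exp(-0.5 * ((t - s) / sigma) ** 2)
          return rate / (len(spike_trains) * sigma * np.sqrt(2.0 * np.pi))

      r = ipsr(trains, t)
      order_parameter = r.var()   # time-averaged fluctuation of the IPSR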

  7. Generating Realistic Environments for Cyber Operations Development, Testing, and Training

    Science.gov (United States)

    2011-10-01

    [Only front-matter residue of this report is available: a figure-list fragment ("Figure 9. Detailed view of social net behavioral analysis scoring") and the opening of Section 5.0, Conclusions and Recommendations: "Truly realistic traffic generation can arguably only be achieved..."]

  8. Generating Realistic Labelled, Weighted Random Graphs

    CERN Document Server

    Davis, Michael Charles; Liu, Weiru; Miller, Paul; Hunter, Ruth; Kee, Frank

    2015-01-01

    Generative algorithms for random graphs have yielded insights into the structure and evolution of real-world networks. Most networks exhibit a well-known set of properties, such as heavy-tailed degree distributions, clustering and community formation. Usually, random graph models consider only structural information, but many real-world networks also have labelled vertices and weighted edges. In this paper, we present a generative model for random graphs with discrete vertex labels and numeric edge weights. The weights are represented as a set of Beta Mixture Models (BMMs) with an arbitrary number of mixtures, which are learned from real-world networks. We propose a Bayesian Variational Inference (VI) approach, which yields an accurate estimation while keeping computation times tractable. We compare our approach to state-of-the-art random labelled graph generators and an earlier approach based on Gaussian Mixture Models (GMMs). Our results allow us to draw conclusions about the contribution of vertex labels a...

  9. Generating Realistic Labelled, Weighted Random Graphs

    Directory of Open Access Journals (Sweden)

    Michael Charles Davis

    2015-12-01

    Full Text Available Generative algorithms for random graphs have yielded insights into the structure and evolution of real-world networks. Most networks exhibit a well-known set of properties, such as heavy-tailed degree distributions, clustering and community formation. Usually, random graph models consider only structural information, but many real-world networks also have labelled vertices and weighted edges. In this paper, we present a generative model for random graphs with discrete vertex labels and numeric edge weights. The weights are represented as a set of Beta Mixture Models (BMMs) with an arbitrary number of mixtures, which are learned from real-world networks. We propose a Bayesian Variational Inference (VI) approach, which yields an accurate estimation while keeping computation times tractable. We compare our approach to state-of-the-art random labelled graph generators and an earlier approach based on Gaussian Mixture Models (GMMs). Our results allow us to draw conclusions about the contribution of vertex labels and edge weights to graph structure.
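
    A minimal sketch of sampling from such a model, assuming (rather than learning) the label distribution and Beta mixture parameters:

      import numpy as np

      rng = np.random.default_rng(0)

      # Assumed (not learned) parameters; the paper fits these to real
      # networks with variational inference.
      labels, label_probs = ["A", "B", "C"], [0.5, 0.3, 0.2]
      mix_weights = [0.7, 0.3]                  # two-component Beta mixture
      mix_params = [(2.0, 5.0), (8.0, 2.0)]     # (alpha, beta) per component

      def edge_weight():
          k = rng.choice(len(mix_weights), p=mix_weights)
          a, b = mix_params[k]
          return rng.beta(a, b)

      # Decorate an arbitrary base edge list with labels and weights.
      n = 100
      edges = [(i, (i * 7 + 3) % n) for i in range(n)]
      vertex_labels = rng.choice(labels, size=n, p=label_probs)
      edge_weights = {e: edge_weight() for e in edges}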

  10. Realistic facial animation generation based on facial expression mapping

    Science.gov (United States)

    Yu, Hui; Garrod, Oliver; Jack, Rachael; Schyns, Philippe

    2014-01-01

    Facial expressions reflect the internal emotional states of a character or arise in response to social communication. Though much effort has been devoted to generating realistic facial expressions, it remains a challenging topic due to human sensitivity to subtle facial movements. In this paper, we present a method for facial animation generation which reflects true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static AU peak frames based on FACS to the conformed target face. Dynamic parameters derived using a psychophysics method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.

  11. Generation of anatomically realistic numerical phantoms for optoacoustic breast imaging

    Science.gov (United States)

    Lou, Yang; Mitsuhashi, Kenji; Appleton, Catherine M.; Oraevsky, Alexander; Anastasio, Mark A.

    2016-03-01

    Because optoacoustic tomography (OAT) can provide functional information based on hemoglobin contrast, it is a promising imaging modality for breast cancer diagnosis. Developing an effective OAT breast imaging system requires balancing multiple design constraints, which can be expensive and time-consuming. Therefore, computer-simulation studies are often conducted to facilitate this task. However, most existing computer-simulation studies of OAT breast imaging employ simple phantoms such as spheres or cylinders that over-simplify the complex anatomical structures in breasts, thus limiting the value of these studies in guiding real-world system design. In this work, we propose a method to generate realistic numerical breast phantoms for OAT research based on clinical magnetic resonance imaging (MRI) data. The phantoms include a skin layer that defines the breast-air boundary, major vessel branches that affect light absorption in the breast, and fatty tissue and fibroglandular tissue whose acoustical heterogeneity perturbs acoustic wave propagation. By assigning realistic optical and acoustic parameters to different tissue types, we establish both optical and acoustic breast phantoms, which will be exported into standard data formats for cross-platform usage.

  12. Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, B.; Burch, J.; Barker, G.

    2010-08-01

    The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
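
    A toy version of such an event generator for a single event type, with all distributions assumed rather than taken from the tool:

      import random

      random.seed(0)

      # One event type (showers) for one year; the real tool draws many event
      # types with clustering, vacations and seasonality. Numbers are made up.
      def shower_events_for_day(occupants=3):
          events = []
          for _ in range(occupants):
              start = max(0.0, random.gauss(7.0, 1.0)) * 3600   # s after midnight
              duration = random.lognormvariate(6.0, 0.4)        # ~400 s typical
              flow_gpm = max(0.5, random.gauss(2.0, 0.3))       # gal/min
              events.append((start, duration, flow_gpm))
          return sorted(events)

      schedule = [shower_events_for_day() for _ in range(365)]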

  13. Realistic packed bed generation using small numbers of spheres

    Energy Technology Data Exchange (ETDEWEB)

    Pavlidis, D., E-mail: dimitrios.pavlidis04@imperial.ac.uk; Lathouwers, D.

    2013-10-15

    Highlights: • A method for generating 3D, periodic, closely packed beds of small numbers (<50) of spheres is presented. • The method is able to reproduce characteristics for the entirety (including the near-wall area) of a randomly stacked bed. • Results are in good agreement with reference numerical data. -- Abstract: A method for stochastically generating three-dimensional, periodic, closely packed beds of small numbers (less than 50) of spheres is presented. This is an essential and integral part of realistic modelling of fluid flow and heat transfer through packed beds. In order to be able to reproduce the entirety of these complex geometries (in the radial direction) using small numbers of spheres, they are divided into two regions: the near-wall region (up to 4–5 sphere diameters from the solid wall in the wall-normal direction) and the core region. Near-wall stackings are doubly periodic and include a solid wall, while core stackings are triply periodic. A computational method for generating such geometries is presented for each region. Both are based on overlap removal methods. Results are compared against reference numerical data. Diagnostics used to evaluate the models include average packing fractions and coordination numbers, porosity profiles and distributions of the angle between two spheres which touch a common neighbour. Results are in good qualitative and quantitative agreement with the available reference data.
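
    A bare-bones version of the overlap-removal idea for the triply periodic core region (sphere count, diameter and box size are illustrative values, not the paper's):

      import numpy as np

      rng = np.random.default_rng(0)

      # Overlap removal in a triply periodic box: repeatedly push apart any
      # pair of spheres closer than one diameter, until no overlaps remain.
      n, d, box = 20, 1.0, 3.0
      pos = rng.uniform(0.0, box, (n, 3))

      for _ in range(10_000):
          overlapping = False
          for i in range(n):
              for j in range(i + 1, n):
                  rij = pos[i] - pos[j]
                  rij -= box * np.round(rij / box)        # nearest periodic image
                  dist = np.linalg.norm(rij)
                  if dist < d:                            # push the pair apart
                      shift = 0.5 * (d - dist) * rij / (dist + 1e-12)
                      pos[i] = (pos[i] + shift) % box
                      pos[j] = (pos[j] - shift) % box
                      overlapping = True
          if not overlapping:
              break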

  14. Optimizing Wind And Hydropower Generation Within Realistic Reservoir Operating Policy

    Science.gov (United States)

    Magee, T. M.; Clement, M. A.; Zagona, E. A.

    2012-12-01

    Previous studies have evaluated the benefits of utilizing the flexibility of hydropower systems to balance the variability and uncertainty of wind generation. However, previous hydropower and wind coordination studies have simplified non-power constraints on reservoir systems. For example, some studies have only included hydropower constraints on minimum and maximum storage volumes and minimum and maximum plant discharges. The methodology presented here utilizes the pre-emptive linear goal programming optimization solver in RiverWare to model hydropower operations with a set of prioritized policy constraints and objectives based on realistic policies that govern the operation of actual hydropower systems, including licensing constraints, environmental constraints, water management and power objectives. This approach accounts for the fact that not all policy constraints are of equal importance. For example, target environmental flow levels may not be satisfied if meeting them would require violating license minimum or maximum storages (pool elevations), but environmental flow constraints will be satisfied before power generation is optimized. Additionally, this work not only models the economic value of energy from the combined hydropower and wind system, it also captures the economic value of ancillary services provided by the hydropower resources. It is recognized that the increased variability and uncertainty inherent with increased wind penetration levels requires an increase in ancillary services. In regions with liberalized markets for ancillary services, a significant portion of hydropower revenue can result from providing ancillary services. Thus, ancillary services should be accounted for when determining the total value of a hydropower system integrated with wind generation. This research shows that the end value of integrated hydropower and wind generation is dependent on a number of factors that can vary by location. Wind factors include wind penetration level

  15. Towards a realistic event generator for in-medium signals

    Energy Technology Data Exchange (ETDEWEB)

    Seck, Florian [TU Darmstadt (Germany); Collaboration: HADES-Collaboration

    2015-07-01

    The most important task of theoretical heavy-ion physics is to link experimental observables to the bulk properties and the microscopic structure of the different phases of strongly interacting matter. Until now the hadronic cocktails produced with the event generator Pluto for the HADES and CBM experiments only included a contribution from freeze-out ρ mesons modeled by a Breit-Wigner distribution around the pole mass. However, as dileptons are radiated from the fireball during the whole time evolution, medium effects like the broadening of the ρ should also be included in the simulations. Calculations of the in-medium ρ spectral function by R. Rapp and J. Wambach demonstrate that a large part of the in-medium ρ mesons feed into the mass region below the ρ/ω pole mass, down to zero masses. The modular structure of Pluto makes it feasible to customize the event generator and incorporate models of in-medium physics, like the Rapp-Wambach spectral function, as plug-ins. For masses above 1 GeV/c² we include emission due to multi-pion annihilation and due to QGP radiation. In this contribution, first steps towards the implementation of such a plug-in into the event generator Pluto are presented.

  16. A Simplified Model for Generating 3D Realistic Sound in the Multimedia and Virtual Reality Systems

    Institute of Scientific and Technical Information of China (English)

    Zhao Yu; He Zhijun; et al.

    1996-01-01

    Embedding realistic 3D sound effects is a key feature of future multimedia and virtual reality systems. Recent research on acoustics and psychoacoustics reveals the important cues for sound localization and sound perception. One promising approach to generating realistic 3D sound uses two earphones, simulating the sound waveforms from the sound source to the eardrum. This paper summarizes two methods for generating realistic 3D sound and points out their inherent drawbacks. To overcome these drawbacks we propose a simplified model that generates realistic 3D sound at any position in the horizontal plane, based on the results of research on sound perception and localization. Experimental results show that the model is correct and efficient.

  17. Optimal working conditions for thermoelectric generators with realistic thermal coupling

    CERN Document Server

    Apertet, Y; Glavatskaya, O; Goupil, C; Lecoeur, P

    2011-01-01

    We study how maximum output power can be obtained from a thermoelectric generator (TEG) with nonideal heat exchangers. We demonstrate with an analytic approach based on a force-flux formalism that the sole improvement of the intrinsic characteristics of thermoelectric modules, including the enhancement of the figure of merit, is of limited interest: the constraints imposed by the working conditions of the TEG must be considered on the same footing. Introducing an effective thermal conductance, we derive the conditions which permit maximization of both efficiency and power production of the TEG dissipatively coupled to heat reservoirs. Thermal impedance matching must be accounted for, as well as electrical impedance matching, in order to maximize the output power. Our calculations also show that the thermal impedance does not only depend on the thermal conductivity at zero electrical current: it also depends on the TEG figure of merit. Our analysis thus yields both electrical and thermal conditions permitting optima...
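
    The textbook relations behind this trade-off can be sketched as follows (standard results under constant-property assumptions, not the authors' derivation):

      % A TEG with Seebeck coefficient \alpha, internal resistance R, and
      % temperature drop \Delta T_m across the module delivers to a load R_L
      \[
        P \;=\; \frac{\alpha^{2}\,\Delta T_{m}^{2}\,R_{L}}{(R + R_{L})^{2}},
      \]
      % which, for fixed \Delta T_m, peaks at the matched load R_L = R. With
      % non-ideal heat exchangers, \Delta T_m itself depends on the electrical
      % current, so this electrical matching condition must be solved jointly
      % with a thermal matching condition on the effective thermal conductance.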

  18. Quantum versus classical foundation of statistical mechanics under experimentally realistic conditions.

    Science.gov (United States)

    Reimann, Peter; Evstigneev, Mykhaylo

    2013-11-01

    Focusing on isolated macroscopic systems, described in terms of either a quantum mechanical or a classical model, our two key questions are how far an initial ensemble (usually far from equilibrium and largely unknown in detail) evolves towards a stationary long-time behavior (equilibration), and how far this steady state agrees with the microcanonical ensemble as predicted by statistical mechanics (thermalization). A recently developed quantum mechanical treatment of the problem is briefly summarized, putting particular emphasis on the realistic modeling of experimental measurements and nonequilibrium initial conditions. Within this framework, equilibration can be proven under very weak assumptions about those measurements and initial conditions, while thermalization still requires quite strong additional hypotheses. An analogous approach within the framework of classical mechanics is developed and compared with the quantum case. In particular, the assumptions needed to guarantee classical equilibration are now rather strong, while thermalization then follows under relatively weak additional conditions.

  19. Using Microsoft Excel to Generate Usage Statistics

    Science.gov (United States)

    Spellman, Rosemary

    2011-01-01

    At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…

  20. NETMORPH: a framework for the stochastic generation of large scale neuronal networks with realistic neuron morphologies

    NARCIS (Netherlands)

    Koene, R.A.; Tijms, B.; van Hees, P.; Postma, F.; de Ridder, A.; Ramakers, G.J.A.; van Pelt, J.; van Ooyen, A.

    2009-01-01

    We present a simulation framework, called NETMORPH, for the developmental generation of 3D large-scale neuronal networks with realistic neuron morphologies. In NETMORPH, neuronal morphogenesis is simulated from the perspective of the individual growth cone. For each growth cone in a growing axonal o

  1. Statistical analysis of next generation sequencing data

    CERN Document Server

    Nettleton, Dan

    2014-01-01

    Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...

  2. Three Generative, Lexicalised Models for Statistical Parsing

    CERN Document Server

    Collins, M

    1997-01-01

    In this paper we first propose a new statistical parsing model, which is a generative model of lexicalised context-free grammar. We then extend the model to include a probabilistic treatment of both subcategorisation and wh-movement. Results on Wall Street Journal text show that the parser performs at 88.1/87.5% constituent precision/recall, an average improvement of 2.3% over (Collins 96).

  3. Fractals Generated by Statistical Contraction Operators

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In the theory of random fractals, there are two important classes of random sets: one is the class of fractals generated by the paths of stochastic processes, and the other is the class of fractals generated by statistical contraction operators. Here we introduce the probability basis and the fractal properties of fractals in the latter class. The probability basis covers (1) the convergence and measurability of a random recursive set K(ω) as a random element, and (2) the martingale property. The fractal properties include (3) the character of various kinds of similarity, (4) the separability property, (5) the support and zero-one law of the distribution P_K = P∘K⁻¹, and (6) the Hausdorff dimension and Hausdorff exact measure function.

  4. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan;

    2012-01-01

    The wind speed represents the main exogenous signal applied to a Wind Energy Conversion System (WECS) and determines its behavior. The erratic variation of the wind speed, highly dependent on the given site and on the atmospheric conditions, makes the wind speed quite difficult to model. Moreover, wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided in a low-frequency component (describing long term variations)...
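
    A minimal sketch of the two-component idea (the constants and the AR(1) turbulence model below are assumptions, not the paper's method):

      import numpy as np

      rng = np.random.default_rng(0)

      # Low-frequency component: hourly mean speeds (synthetic here, measured in
      # practice) interpolated to 1 Hz. High-frequency component: AR(1) noise.
      hours = 24
      hourly_mean = 8.0 + 2.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))
      t = np.arange(hours * 3600)
      low_freq = np.interp(t / 3600.0, np.arange(hours), hourly_mean)

      phi, sigma = 0.999, 0.05        # assumed AR coefficient / innovation std
      turb = np.zeros(t.size)
      for k in range(1, t.size):
          turb[k] = phi * turb[k - 1] + rng.normal(0.0, sigma)

      wind_speed = np.clip(low_freq + turb, 0.0, None)   # m/s, nonnegative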

  5. Generation of anatomically realistic numerical phantoms for photoacoustic and ultrasonic breast imaging.

    Science.gov (United States)

    Lou, Yang; Zhou, Weimin; Matthews, Thomas P; Appleton, Catherine M; Anastasio, Mark A

    2017-04-01

    Photoacoustic computed tomography (PACT) and ultrasound computed tomography (USCT) are emerging modalities for breast imaging. As in all emerging imaging technologies, computer-simulation studies play a critically important role in developing and optimizing the designs of hardware and image reconstruction methods for PACT and USCT. Using computer-simulations, the parameters of an imaging system can be systematically and comprehensively explored in a way that is generally not possible through experimentation. When conducting such studies, numerical phantoms are employed to represent the physical properties of the patient or object to-be-imaged that influence the measured image data. It is highly desirable to utilize numerical phantoms that are realistic, especially when task-based measures of image quality are to be utilized to guide system design. However, most reported computer-simulation studies of PACT and USCT breast imaging employ simple numerical phantoms that oversimplify the complex anatomical structures in the human female breast. We develop and implement a methodology for generating anatomically realistic numerical breast phantoms from clinical contrast-enhanced magnetic resonance imaging data. The phantoms will depict vascular structures and the volumetric distribution of different tissue types in the breast. By assigning optical and acoustic parameters to different tissue structures, both optical and acoustic breast phantoms will be established for use in PACT and USCT studies.

  6. TextGen: a realistic text data content generation method for modern storage system benchmarks

    Institute of Scientific and Technical Information of China (English)

    Long-xiang WANG; Xiao-she DONG; Xing-jun ZHANG; Yin-feng WANG; Tao JU; Guo-fu FENG

    2016-01-01

    Modern storage systems incorporate data compressors to improve their performance and capacity. As a result, data content can significantly influence the result of a storage system benchmark. Because real-world proprietary datasets are too large to be copied onto a test storage system, and most data cannot be shared due to privacy issues, a benchmark needs to generate data synthetically. To ensure that the result is accurate, it is necessary to generate data content based on the characterization of real-world data properties that influence the storage system performance during the execution of a benchmark. The existing approach, called SDGen, cannot guarantee that the benchmark result is accurate in storage systems that have built-in word-based compressors. The reason is that SDGen characterizes the properties that influence compression performance only at the byte level, and no properties are characterized at the word level. To address this problem, we present TextGen, a realistic text data content generation method for modern storage system benchmarks. TextGen builds the word corpus by segmenting real-world text datasets, and creates a word-frequency distribution by counting each word in the corpus. To improve data generation performance, the word-frequency distribution is fitted to a lognormal distribution by maximum likelihood estimation. The Monte Carlo approach is used to generate synthetic data. The running time of TextGen generation depends only on the expected data size, which means that the time complexity of TextGen is O(n). To evaluate TextGen, four real-world datasets were used to perform an experiment. The experimental results show that, compared with SDGen, the compression performance and compression ratio of the datasets generated by TextGen deviate less from real-world datasets when end-tagged dense code, a representative of word-based compressors, is evaluated.
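
    A toy version of the TextGen recipe follows; the file name is a placeholder and the pipeline is heavily simplified relative to the paper.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(0)

      # "corpus.txt" stands in for any real-world text dataset (assumption).
      with open("corpus.txt") as f:
          words = f.read().split()

      vocab, counts = zip(*Counter(words).most_common())

      # Fit the word-frequency distribution to a lognormal by maximum
      # likelihood: the MLE is simply the mean/std of the log-counts.
      logc = np.log(np.array(counts, dtype=float))
      probs = rng.lognormal(logc.mean(), logc.std(), size=len(vocab))
      probs /= probs.sum()

      # Monte Carlo generation: O(n) in the requested data size.
      def generate(n_words):
          idx = rng.choice(len(vocab), size=n_words, p=probs)
          return " ".join(vocab[i] for i in idx)

      synthetic_text = generate(100_000)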

  7. The 4-D descent trajectory generation techniques under realistic operating conditions

    Science.gov (United States)

    Williams, David H.; Knox, Charles E.

    1990-01-01

    NASA-Langley has been conducting and sponsoring research in airborne energy management for a number of years. During the course of this research, two fundamental techniques for the generation of 4D (fixed time) descent trajectories have emerged as viable candidates for advanced flight management systems. The first technique utilizes speed schedules of constant Mach number transitioning to constant calibrated airspeed chosen empirically to produce minimum fuel usage. The second technique computes cost-optimized speed schedules of variable airspeed developed through application of optimal control theory. Both techniques have been found to produce reasonable and flyable descent trajectories. The formulation of the algorithms for each technique is evaluated and their suitability for operations in realistic conditions is discussed. Operational factors considered include: airplane speed, thrust, and altitude rate constraints; wind, temperature, and pressure variations; Air Traffic Control altitude, speed, and time constraints; and pilot interface and guidance considerations. Time flexibility, fuel usage, and airborne computational requirements were the primary performance measures.

  8. A GIS-based simulation architecture and prototype for realistic spectral scene generation of vegetated areas

    Science.gov (United States)

    Fink, Christopher E.; Moulton, Joseph R., Jr.; Ortalano, Michael; Helmsen, John; Soiguine, Alexander; Kaplan, Raymond; Seng, William; Haren, Raymond E.

    2004-08-01

    Vehicles concealed in highly cluttered, vegetated scene environments pose significant challenges for passive sensor systems and algorithms. System analysts working in hyperspectral exploitation research require an at-aperture simulation capability that allows them to reliably investigate beyond the highly limited scenarios that expensive field data sets afford. To be useful to the analyst, such a simulation should address the following requirements: (1) the ability to easily generate scene representations for arbitrary Earth regions of tactical interest; (2) the ability to represent scene components, like terrain, trees and bushes, to an extremely high spatial resolution for calculation of accurate multiple spectral reflections, occlusions and shadowing; (3) the ability to stimulate the 3D scene with realistic natural spectral irradiances for arbitrary 3D model atmospheres; (4) the ability to appropriately integrate constantly improving, rigorous thermal, spectral signature and atmospheric propagation models; (5) the ability to efficiently render at-aperture hyperspectral data sets in a reasonable run-time. Herein the authors describe their work toward a comprehensive ray-tracer-based simulation architecture and prototype capability that addresses these requirements. They describe their development of a GIS-based toolset for database generation, tools for 3D vegetated terrain-model development, and a prototype ray-tracer-based spectral scene generator.

  9. Generalized Warburg impedance on realistic self-affine fractals: Comparative study of statistically corrugated and isotropic roughness

    Indian Academy of Sciences (India)

    Rajesh Kumar; Rama Kant

    2009-09-01

    We analyse the problem of impedance for a diffusion controlled charge transfer process across an irregular interface. These interfacial irregularities are characterized as two classes of random fractals: (i) statistically isotropic self-affine fractals and (ii) statistically corrugated self-affine fractals. The information about the realistic fractal surface roughness has been introduced through a band-limited power-law power spectrum over limited wave numbers. The details of the power spectrum of such roughness can be characterized in terms of four fractal morphological parameters, viz. the fractal dimension, the lower (ℓ) and upper cut-off length scales of fractality, and the proportionality factor of the power spectrum. Theoretical results are analysed for the impedance of such rough electrodes as well as for the effect of the statistical symmetries of roughness. The impedance response for an irregular interface is simplified through expansion over intermediate frequencies. This intermediate frequency expansion, with a sufficient number of terms, offers a good approximation over all frequency regimes. The Nyquist plots of impedance show a strong dependency mainly on three of these surface morphological parameters, including the fractal dimension and the lower cut-off length ℓ. Our theoretical results also provide an alternative explanation for the exponent in the intermediate-frequency power-law form.

  10. Realistic three-generation models from SO(32) heterotic string theory

    CERN Document Server

    Abe, Hiroyuki; Otsuka, Hajime; Takano, Yasufumi

    2015-01-01

    We search for realistic supersymmetric standard-like models from SO(32) heterotic string theory on factorizable tori with multiple magnetic fluxes. Three chiral generations of quarks and leptons are derived from the adjoint and vector representations of SO(12) gauge groups embedded in the SO(32) adjoint representation. Massless spectra of our models also include Higgs fields, which have the desired Yukawa couplings to quarks and leptons at the tree level.

  11. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    Science.gov (United States)

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  12. Statistically anisotropic curvature perturbation generated during the waterfall

    CERN Document Server

    Lyth, David H

    2012-01-01

    If the waterfall field of hybrid inflation couples to a U(1) gauge field, the waterfall can generate a statistically anisotropic contribution to the curvature perturbation. We investigate this possibility, generalising in several directions the seminal work of Yokoyama and Soda. The statistical anisotropy of the bispectrum could be detectable by PLANCK even if the statistical anisotropy of the spectrum is too small to detect.

  13. Is realistic neuronal modeling realistic?

    Science.gov (United States)

    Almog, Mara; Korngreen, Alon

    2016-11-01

    Scientific models are abstractions that aim to explain natural phenomena. A successful model shows how a complex phenomenon arises from relatively simple principles while preserving major physical or biological rules and predicting novel experiments. A model should not be a facsimile of reality; it is an aid for understanding it. Contrary to this basic premise, with the 21st century has come a surge in computational efforts to model biological processes in great detail. Here we discuss the oxymoronic, realistic modeling of single neurons. This rapidly advancing field is driven by the discovery that some neurons don't merely sum their inputs and fire if the sum exceeds some threshold. Thus researchers have asked what are the computational abilities of single neurons and attempted to give answers using realistic models. We briefly review the state of the art of compartmental modeling, highlighting recent progress and intrinsic flaws. We then attempt to address two fundamental questions. Practically, can we realistically model single neurons? Philosophically, should we realistically model single neurons? We use layer 5 neocortical pyramidal neurons as a test case to examine these issues. We subject three publicly available models of layer 5 pyramidal neurons to three simple computational challenges. Based on their performance and a partial survey of published models, we conclude that current compartmental models are ad hoc, unrealistic models functioning poorly once they are stretched beyond the specific problems for which they were designed. We then attempt to plot possible paths for generating realistic single neuron models. Copyright © 2016 the American Physiological Society.

  14. A Framework for the Generation of Realistic Synthetic Cardiac Ultrasound and Magnetic Resonance Imaging Sequences from the same Virtual Patients.

    Science.gov (United States)

    Zhou, Yitian; Giffard-Roisin, Sophie; De Craene, Mathieu; Camarasu-Pop, Sorina; D'hooge, Jan; Alessandrini, Martino; Friboulet, Denis; Sermesant, Maxime; Bernard, Olivier

    2017-05-25

    The use of synthetic sequences is one of the most promising tools for advanced in silico evaluation of the quantification of cardiac deformation and strain through 3D ultrasound (US) and magnetic resonance (MR) imaging. In this paper, we propose the first simulation framework which allows the generation of realistic 3D synthetic cardiac US and MR (both cine and tagging) image sequences from the same virtual patient. A state-of-the-art electromechanical (E/M) model was exploited for simulating ground-truth cardiac motion fields ranging from healthy to various pathological cases, including both ventricular dyssynchrony and myocardial ischemia. The E/M ground truth, along with template MR/US images and physical simulators, were combined in a unified framework for generating synthetic data. We efficiently merged several warping strategies to keep full control of myocardial deformations while preserving realistic image texture. In total, we generated 18 virtual patients, each with synthetic 3D US, cine MR and tagged MR sequences. The simulated images were evaluated both qualitatively by showing realistic textures and quantitatively by observing myocardial intensity distributions similar to real data. In particular, the US simulation showed a smoother myocardium/background interface than the state-of-the-art. We also assessed the mechanical properties. The pathological subjects were discriminated from the healthy ones by both global indexes (ejection fraction and the global circumferential strain) and regional strain curves. The synthetic database is comprehensive in terms of both pathology and modality, and has a level of realism sufficient for validation purposes. All the 90 sequences are made publicly available to the research community via an open-access database.

  15. Generating Geospatially Realistic Driving Patterns Derived From Clustering Analysis Of Real EV Driving Data

    DEFF Research Database (Denmark)

    Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob

    2014-01-01

    scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where...

  16. Generation of realistic tsunami waves using a bottom-tilting wave maker

    Science.gov (United States)

    Park, Yong Sung; Hwang, Jin Hwan

    2016-11-01

    Tsunamis have caused more than 260,000 human losses and $250 billion in damage worldwide in the last ten years. Observations made during the 2011 Japan Tohoku Tsunami revealed that the waves commonly used to model tsunamis (solitary waves) are at least an order of magnitude shorter than real tsunamis, which calls for a re-evaluation of the current understanding of tsunamis. To prompt the required paradigm shift, a new wave generator, namely the bottom-tilting wave generator, has been developed at the University of Dundee. The wave tank is fitted with an adjustable slope and a bottom flap hinged at the beginning of the slope. By moving the bottom flap up and down, we can generate very long waves. Here we will report characteristics of waves generated by simple bottom motions, either moving the flap upward or downward from an initial displacement, ending with the flap horizontal. Two parameters, namely the initial displacement of the bottom and the speed of the motion, determine the characteristics of the generated waves. Wave amplitudes scale well with the volume flux of the displaced water. On the other hand, due to combined effects of nonlinearity and dispersion, wavelengths show a more complicated relationship with the two bottom motion parameters. We will also demonstrate that by combining simple up and down motions, it is possible to generate waves resembling the one measured during the 2011 tsunami. YSP acknowledges financial support from the Royal Society of Edinburgh through the Royal Society of Edinburgh and Scottish Government Personal Research Fellowship Co-Funded by the Marie-Curie Actions.

  17. A generic approach to generating optimal controlled prescriptive route guidance in realistic traffic networks

    NARCIS (Netherlands)

    Chen, Y.; Zuurbier, F.S.; Zuylen, H.J. van; Hoogendoorn, S.P.

    2006-01-01

    This paper presents a generic methodology to generate optimal controlled dynamic prescriptive route guidance to be disseminated by means of variable message signs (VMS). The methodology is generic in the sense that it can be used on any network topology and network model, with any number of VMSs, for di

  18. Large-scale, realistic laboratory modeling of M2 internal tide generation at the Luzon Strait

    CERN Document Server

    Mercier, Matthieu J; Helfrich, Karl; Sommeria, Joël; Viboud, Samuel; Didelle, Henri; Saidi, Sasan; Dauxois, Thierry; Peacock, Thomas

    2015-01-01

    The complex double-ridge system in the Luzon Strait in the South China Sea (SCS) is one of the strongest sources of internal tides in the oceans, associated with which are some of the largest amplitude internal solitary waves on record. An issue of debate, however, has been the specific nature of their generation mechanism. To provide insight, we present the results of a large-scale laboratory experiment performed at the Coriolis platform. The experiment was carefully designed so that the relevant dimensionless parameters, which include the excursion parameter, criticality, Rossby, and Froude numbers, closely matched the ocean scenario. The results advocate that a broad and coherent weakly nonlinear, three-dimensional, M2 internal tide that is shaped by the overall geometry of the double-ridge system is radiated into the South China Sea and subsequently steepens, as opposed to being generated by a particular feature or localized region within the ridge system.

  19. Remembrance of phases past: An autoregressive method for generating realistic atmospheres in simulations

    Science.gov (United States)

    Srinath, Srikar; Poyneer, Lisa A.; Rudy, Alexander R.; Ammons, S. M.

    2014-08-01

    The advent of expensive, large-aperture telescopes and complex adaptive optics (AO) systems has strengthened the need for detailed simulation of such systems from the top of the atmosphere to control algorithms. The credibility of any simulation is underpinned by the quality of the atmosphere model used for introducing phase variations into the incident photons. Hitherto, simulations which incorporate wind layers have relied upon phase screen generation methods that tax the computation and memory capacities of the platforms on which they run. This places limits on parameters of a simulation, such as exposure time or resolution, thus compromising its utility. As aperture sizes and fields of view increase the problem will only get worse. We present an autoregressive method for evolving atmospheric phase that is efficient in its use of computation resources and allows for variability in the power contained in frozen flow or stochastic components of the atmosphere. Users have the flexibility of generating atmosphere datacubes in advance of runs where memory constraints allow to save on computation time or of computing the phase at each time step for long exposure times. Preliminary tests of model atmospheres generated using this method show power spectral density and rms phase in accordance with established metrics for Kolmogorov models.
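
    One way to realize such an autoregressive update is per Fourier mode, as sketched below; the Kolmogorov spectrum constants and the AR coefficient are assumptions, with alpha controlling the balance between persistent (frozen-flow-like) and stochastic power.

      import numpy as np

      rng = np.random.default_rng(0)

      # Per-mode AR(1) evolution of a Kolmogorov-filtered phase screen.
      N, r0, alpha = 256, 0.15, 0.995
      fx = np.fft.fftfreq(N)
      k = np.hypot(*np.meshgrid(fx, fx))
      k[0, 0] = 1.0                                   # avoid division by zero
      psd = 0.023 * r0 ** (-5.0 / 3.0) * k ** (-11.0 / 3.0)
      psd[0, 0] = 0.0                                 # remove the piston mode

      def fresh_modes():
          w = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
          return w * np.sqrt(psd)

      modes = fresh_modes()
      screens = []
      for _ in range(100):                            # 100 time steps
          modes = alpha * modes + np.sqrt(1.0 - alpha**2) * fresh_modes()
          screens.append(np.fft.ifft2(modes).real * N)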

  1. Intensive statistical complexity measure of pseudorandom number generators

    Science.gov (United States)

    Larrondo, H. A.; González, C. M.; Martín, M. T.; Plastino, A.; Rosso, O. A.

    2005-10-01

    A Statistical Complexity measure has been recently proposed to quantify the performance of chaotic Pseudorandom number generators (PRNG) (Physica A 354 (2005) 281). Here we revisit this quantifier and introduce two important improvements: (i) consideration of an intensive statistical complexity (Physica A 334 (2004) 119), and (ii) following the prescription of Bandt and Pompe (Phys. Rev. Lett. 88 (2002) 174102-1) in evaluating the probability distribution associated with the PRNG. The ensuing new measure is applied to a very well-tested PRNG advanced by Marsaglia.
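
    A sketch of the Bandt-Pompe step applied to a PRNG stream follows; the full intensive statistical complexity additionally multiplies this entropy by a disequilibrium factor, omitted here for brevity.

      import numpy as np
      from itertools import permutations

      rng = np.random.default_rng(12345)

      def ordinal_distribution(x, d=4):
          """Bandt-Pompe distribution of ordinal patterns (embedding dim d)."""
          hist = {p: 0 for p in permutations(range(d))}
          for i in range(len(x) - d + 1):
              hist[tuple(np.argsort(x[i:i + d]))] += 1
          p = np.array(list(hist.values()), dtype=float)
          return p / p.sum()

      def normalized_entropy(p):
          nz = p[p > 0]
          return float(-(nz * np.log(nz)).sum() / np.log(p.size))

      # A good PRNG should yield a nearly flat ordinal distribution (H near 1).
      H = normalized_entropy(ordinal_distribution(rng.random(100_000)))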

  2. Mathematical-statistical models of generated hazardous hospital solid waste.

    Science.gov (United States)

    Awad, A R; Obeidat, M; Al-Shareef, M

    2004-01-01

    This research work was carried out under the assumption that wastes generated from hospitals in Irbid, Jordan were hazardous. The hazardous and non-hazardous wastes generated from the different divisions in the three hospitals under consideration were not separated during the collection process. Three hospitals, Princess Basma hospital (public), Princess Bade'ah hospital (teaching), and Ibn Al-Nafis hospital (private) in Irbid were selected for this study. The research work took into account the amounts of solid waste accumulated from each division and also determined the total amount generated from each hospital. The generation rates (kilograms per patient per day; kilograms per bed per day) were determined for the three hospitals and compared with those of similar hospitals in Europe. The evaluation suggested that the current situation regarding the management of these wastes in the three studied hospitals needs revision, as these hospitals do not follow the waste disposal methods practiced in developed countries that would reduce risk to human health and the environment. Statistical analysis was carried out to develop models for the prediction of the quantity of waste generated at each hospital (public, teaching, private). In these models, the number of patients, the number of beds, and the type of hospital were revealed to be significant factors affecting the quantity of waste generated. Multiple regressions were also used to estimate the quantities of wastes generated from similar divisions in the three hospitals (surgery, internal diseases, and maternity).

  3. Spatial Scan Statistic: Selecting clusters and generating elliptic clusters

    DEFF Research Database (Denmark)

    Christiansen, Lasse Engbo; Andersen, Jens Strodl

    2004-01-01

    The spatial scan statistic is widely used to search for clusters. This paper shows that the usually applied elimination of overlapping clusters to find secondary clusters is sensitive to smooth changes in the shape of the clusters. We present an algorithm for generating a set of confocal elliptic clusters. In addition, we propose a new way to present the information in a given set of clusters based on the significance of the clusters.

  4. Generation of realistic virtual nodules based on three-dimensional spatial resolution in lung computed tomography: A pilot phantom study.

    Science.gov (United States)

    Narita, Akihiro; Ohkubo, Masaki; Murao, Kohei; Matsumoto, Toru; Wada, Shinichi

    2017-08-04

    The aim of this feasibility study using phantoms was to propose a novel method for obtaining computer-generated realistic virtual nodules in lung computed tomography (CT). In the proposed methodology, pulmonary nodule images obtained with a CT scanner are deconvolved with the point spread function (PSF) in the scan plane and slice sensitivity profile (SSP) measured for the scanner; the resultant images are referred to as nodule-like object functions. Next, by convolving the nodule-like object function with the PSF and SSP of another (target) scanner, the virtual nodule can be generated so that it has the characteristics of the spatial resolution of the target scanner. To validate the methodology, the authors applied physical nodules of 5-, 7- and 10-mm-diameter (uniform spheres) included in a commercial CT test phantom. The nodule-like object functions were calculated from the sphere images obtained with two scanners (Scanner A and Scanner B); these functions were referred to as nodule-like object functions A and B, respectively. From these, virtual nodules were generated based on the spatial resolution of another scanner (Scanner C). By investigating the agreement of the virtual nodules generated from the nodule-like object functions A and B, the equivalence of the nodule-like object functions obtained from different scanners could be assessed. In addition, these virtual nodules were compared with the real (true) sphere images obtained with Scanner C. As a practical validation, five types of laboratory-made physical nodules with various complicated shapes and heterogeneous densities, similar to real lesions, were used. The nodule-like object functions were calculated from the images of these laboratory-made nodules obtained with Scanner A. From them, virtual nodules were generated based on the spatial resolution of Scanner C and compared with the real images of laboratory-made nodules obtained with Scanner C. Good agreement of the virtual nodules generated from
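
    A one-dimensional sketch of the proposed pipeline under assumed Gaussian PSFs (the paper works with the measured in-plane PSF and SSP of real scanners, and with real nodule images):

      import numpy as np

      # Deconvolve a nodule profile measured on scanner A with A's PSF
      # (Wiener-regularized), then convolve with target scanner C's PSF.
      def gaussian(x, s):
          g = np.exp(-0.5 * (x / s) ** 2)
          return g / g.sum()

      x = np.linspace(-32, 32, 256)
      psf_a = gaussian(x, 2.0)          # source scanner resolution (assumed)
      psf_c = gaussian(x, 3.0)          # target scanner resolution (assumed)
      nodule_a = gaussian(x, 5.0)       # profile as imaged on scanner A

      eps = 1e-3                        # Wiener regularization constant
      Fa, Pa, Pc = (np.fft.fft(v) for v in (nodule_a, psf_a, psf_c))
      object_fn = Fa * np.conj(Pa) / (np.abs(Pa) ** 2 + eps)  # nodule-like object fn
      virtual_c = np.fft.ifft(object_fn * Pc).real            # re-blur for scanner C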

  5. Travel for the 2004 American Statistical Association Biannual Radiation Meeting: "Radiation in Realistic Environments: Interactions Between Radiation and Other Factors"

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, David J.

    2009-07-21

    The 16th ASA Conference on Radiation and Health, held June 27-30, 2004 in Beaver Creek, CO, offered a unique forum for discussing research related to the effects of radiation exposures on human health in a multidisciplinary setting. The Conference furnishes investigators in health-related disciplines the opportunity to learn about new quantitative approaches to their problems, and furnishes statisticians the opportunity to learn about new applications for their discipline. The Conference was attended by about 60 scientists, including statisticians, epidemiologists, biologists and physicists interested in radiation research. For the first time, ten recipients of Young Investigator Awards participated in the conference. The Conference began with a debate on the question: "Do radiation doses below 1 cGy increase cancer risks?" The keynote speaker was Dr. Martin Lavin, who gave a banquet presentation on the timely topic "How important is ATM?" The focus of the 2004 Conference on Radiation and Health was "Radiation in Realistic Environments: Interactions Between Radiation and Other Risk Modifiers". The sessions of the conference included: Radiation, Smoking, and Lung Cancer; Interactions of Radiation with Genetic Factors: ATM; Radiation, Genetics, and Epigenetics; and Radiotherapeutic Interactions. The Conference on Radiation and Health is held biennially, and participants are looking forward to the 17th conference to be held in 2006.

  6. Quantum Statistical Testing of a Quantum Random Number Generator

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL

    2014-01-01

    The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.

  8. Quantum statistical testing of a quantum random number generator

    Science.gov (United States)

    Humble, Travis S.

    2014-10-01

    The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence effort to certify future quantum technologies.

  9. Support system for generating statistics of existing railway rolling stock

    Science.gov (United States)

    Wróbel, A.; Płaczek, M.; Buchacz, A.

    2016-11-01

    Rail transport is a very important element of the transport system in European countries and has become more and more attractive in both passenger and freight transport. An example of this is the 'lorries on track' campaign promoting intermodal transport. The principal factors affecting the condition of railway traffic safety are: the technical condition of the railway infrastructure, the technical condition of the rolling stock, and the operation of crossings. The main purpose of this article is to present a tool that assists in generating statistics on the operation of railway wagons. As part of the project, a telemetric module allows determining the location of a freight wagon and the state of its load. A computer program supporting the GPS module enables locating a wagon with an accuracy of 1 meter and generating graphs of speed, road grade and freight wagon load.

  10. Generation of dense statistical connectomes from sparse morphological data

    Directory of Open Access Journals (Sweden)

    Robert eEgger

    2014-11-01

    Sensory-evoked signal flow, at cellular and network levels, is primarily determined by the synaptic wiring of the underlying neuronal circuitry. Measurements of synaptic innervation, connection probabilities and subcellular organization of synaptic inputs are thus among the most active fields of research in contemporary neuroscience. Methods to measure these quantities range from electrophysiological recordings over reconstructions of dendrite-axon overlap at light-microscopic levels to dense circuit reconstructions of small volumes at electron-microscopic resolution. However, quantitative and complete measurements at subcellular resolution and mesoscopic scales to obtain all local and long-range synaptic in/outputs for any neuron within an entire brain region are beyond present methodological limits. Here, we present a novel concept, implemented within an interactive software environment called NeuroNet, which allows (i) integration of sparsely sampled (sub)cellular morphological data into an accurate anatomical reference frame of the brain region(s) of interest, (ii) up-scaling to generate an average dense model of the neuronal circuitry within the respective brain region(s) and (iii) statistical measurements of synaptic innervation between all neurons within the model. We illustrate our approach by generating a dense average model of the entire rat vibrissal cortex, providing the required anatomical data, and illustrate how to measure synaptic innervation statistically. Comparing our results with data from paired recordings in vitro and in vivo, as well as with reconstructions of synaptic contact sites at light- and electron-microscopic levels, we find that our in silico measurements are in line with previous results.

  11. SU-E-J-126: Generation of Fluoroscopic 3D Images Using Single X-Ray Projections on Realistic Modified XCAT Phantom Data.

    Science.gov (United States)

    Mishra, P; Li, R; St James, S; Yue, Y; Mak, R; Berbeco, R; Lewis, J

    2012-06-01

    To simulate the process of generating fluoroscopic 3D treatment images from 4DCT and measured 2D x-ray projections using a realistic modified XCAT phantom based on measured patient 3D tumor trajectories. First, the existing XCAT phantom is adapted to incorporate measured patient lung tumor trajectories. Realistic diaphragm and chest wall motion are automatically generated based on input tumor motion and position, producing synchronized, realistic motion in the phantom. Based on 4DCT generated with the XCAT phantom, we derive patient-specific motion models that are used to generate 3D fluoroscopic images. Patient-specific models are created in two steps: first, the displacement vector fields (DVFs) are obtained through deformable image registration of each phase of 4DCT with respect to a reference image (typically peak-exhale). Each phase is registered to the reference image to obtain (n-1) DVFs. Second, the most salient characteristics in the DVFs are captured in a compact representation through principal component analysis (PCA). Since PCA is a linear decomposition method, all the DVFs can be represented as linear combinations of eigenvectors. Fluoroscopic 3D images are obtained using the projection image to determine optimal weights for the eigenvectors. These weights are determined through iterative optimization of a cost function relating the projection image to the 3D image via the PCA lung motion model and a projection operator. Constructing fluoroscopic 3D images is thus reduced to finding optimal weights for the eigenvectors. Fluoroscopic 3D treatment images were generated using the modified XCAT phantom. The average relative error of the reconstructed image over 30 sec is 0.0457 HU and the standard deviation is 0.0063. The XCAT phantom was modified to produce realistic images by incorporating patient tumor trajectories. The modified XCAT phantom can be used to simulate the process of generating fluoroscopic 3D treatment images from 4DCT and 2D x-ray projections.
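
    The PCA motion model described above can be sketched in a few lines (toy data and numpy only; the projection operator and the cost-function optimization of the actual method are omitted):

        import numpy as np

        rng = np.random.default_rng(0)
        n_phases, n_voxels = 9, 5000                # (n-1) DVFs, flattened x/y/z components
        dvfs = rng.normal(size=(n_phases, 3 * n_voxels))

        mean_dvf = dvfs.mean(axis=0)
        # SVD yields the principal components without forming a huge covariance matrix.
        _, _, vt = np.linalg.svd(dvfs - mean_dvf, full_matrices=False)
        eigvecs = vt[:3]                            # dominant motion modes

        # A 3D motion state is parameterized by a few scalar weights; in the method,
        # these weights are optimized so the projected 3D image matches the 2D x-ray.
        w = np.array([1.2, -0.4, 0.1])
        dvf_estimate = mean_dvf + w @ eigvecs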

  12. Anomalous statistics of aftershock sequences generated by supershear ruptures

    Directory of Open Access Journals (Sweden)

    Pathikrit Bhattacharya

    2012-05-01

    Most earthquake ruptures propagate with speeds smaller than the Rayleigh wave velocity of the medium. These are called sub-Rayleigh ruptures. However, under suitable conditions, segments of otherwise sub-Rayleigh seismogenic ruptures can occasionally accelerate to speeds higher than the local shear wave velocity, giving rise to so-called supershear ruptures. The occurrence of supershear ruptures is usually associated with a locally higher value of pre-stress on the fault segment compared to the sub-Rayleigh segments of the same fault. Additionally, shear stress changes generated by the supershear rupture are radiated out unattenuated to distances comparable to the depth of rupture instead of rapidly decaying at much smaller distances from the rupture. This leads to aftershocks being distributed away from the fault on the supershear segment. This study attempts to verify whether these pre- and postseismic stress conditions and the resultant spatial aftershock distributions lead to discernible features in the statistical properties of the aftershock sequences of the earthquakes known to be associated with supershear ruptures. We analyze the Gutenberg-Richter scaling, the modified Omori law and Båth’s law for the aftershock sequences of two supershear mainshocks: the 1979 MW 6.5 Imperial Valley (California) and 2002 MW 7.9 Denali (Alaska) earthquakes. We observe that the b-value is always higher in the supershear zone than in the rest of the sequence. We also observe that there is no systematic trend in the exponent of the modified Omori law when comparing the aftershocks in the supershear zone with the rest of the aftershocks. We argue that the b-value anomaly can be explained in terms of the off-fault distribution of aftershocks around the supershear segment of the rupture.
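
    For readers unfamiliar with the b-value comparison, a common estimator is the Aki maximum-likelihood form; a minimal sketch with made-up magnitudes (the study's catalogs and completeness magnitudes are not reproduced here):

        import math

        def b_value(mags, m_c, dm=0.1):
            # Aki (1965) MLE with binning correction: b = log10(e) / (mean(M) - (Mc - dm/2)).
            m = [x for x in mags if x >= m_c]
            return math.log10(math.e) / (sum(m) / len(m) - (m_c - dm / 2.0))

        supershear_zone = [3.1, 3.4, 2.9, 3.8, 3.0, 3.3, 4.1, 2.8]  # toy magnitudes
        print(round(b_value(supershear_zone, m_c=2.8), 2))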

  13. Computation of a leakage in a steam generator heating tube with realistic initial and boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sarkadi, Peter; Schaffrath, Andreas [TUeV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2009-07-01

    In the frame of a PWR reactor safety analysis, TUeV NORD SysTec GmbH und Co. KG has analyzed the plant behavior in case of a steam generator tube leakage, using the thermal-hydraulic code ATHLET and realistic initial and boundary conditions. The aim of the analysis was to show that triggering of the emergency cooling criteria, including the activation of the safety injection pumps, can be avoided. Activation of the safety injection pumps could jeopardize activity retention.

  14. Physical and statistical models for steam generator clogging diagnosis

    CERN Document Server

    Girard, Sylvain

    2014-01-01

    Clogging of steam generators in nuclear power plants is a highly sensitive issue in terms of performance and safety and this book proposes a completely novel methodology for diagnosing this phenomenon. It demonstrates real-life industrial applications of this approach to French steam generators and applies the approach to operational data gathered from French nuclear power plants. The book presents a detailed review of in situ diagnosis techniques and assesses existing methodologies for clogging diagnosis, whilst examining their limitations. It also addresses numerical modelling of the dynamic

  15. Improving Statistical Language Model Performance with Automatically Generated Word Hierarchies

    CERN Document Server

    McMahon, J; Mahon, John Mc

    1995-01-01

    An automatic word classification system has been designed which processes word unigram and bigram frequency statistics extracted from a corpus of natural language utterances. The system implements a binary top-down form of word clustering which employs an average class mutual information metric. Resulting classifications are hierarchical, allowing variable class granularity. Words are represented as structural tags --- unique $n$-bit numbers the most significant bit-patterns of which incorporate class information. Access to a structural tag immediately provides access to all classification levels for the corresponding word. The classification system has successfully revealed some of the structure of English, from the phonemic to the semantic level. The system has been compared --- directly and indirectly --- with other recent word classification systems. Class based interpolated language models have been constructed to exploit the extra information supplied by the classifications and some experiments have sho...
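
    The clustering criterion named in the abstract, average class mutual information, can be computed from bigram counts under a given word-to-class mapping; a toy sketch (not the authors' implementation):

        import math
        from collections import Counter

        def class_mutual_information(bigrams, word2cls):
            pair = Counter((word2cls[a], word2cls[b]) for a, b in bigrams)
            left = Counter(c1 for c1, _ in pair.elements())
            right = Counter(c2 for _, c2 in pair.elements())
            n = sum(pair.values())
            # I = sum over class bigrams of p(c1,c2) * log2(p(c1,c2) / (p(c1) p(c2))).
            return sum(k / n * math.log2(k * n / (left[c1] * right[c2]))
                       for (c1, c2), k in pair.items())

        bigrams = [("the", "cat"), ("the", "dog"), ("a", "cat"), ("cat", "runs")]
        word2cls = {"the": 0, "a": 0, "cat": 1, "dog": 1, "runs": 2}
        print(class_mutual_information(bigrams, word2cls))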

  16. Quantum statistics of Raman scattering model with Stokes mode generation

    Science.gov (United States)

    Tanatar, Bilal; Shumovsky, Alexander S.

    1994-01-01

    The model describing three coupled quantum oscillators with decay of the Rayleigh mode into the Stokes and vibration (phonon) modes is examined. Due to the Manley-Rowe relations, the problem of exact eigenvalues and eigenstates is reduced to the calculation of new orthogonal polynomials defined both by difference and differential equations. The quantum statistical properties are examined for the case when initially the Stokes mode is in the vacuum state, the Rayleigh mode is in a number state, and the vibration mode is in a number or squeezed state. The collapses and revivals are obtained for different initial conditions, as well as the change in time from sub-Poissonian to super-Poissonian statistics and vice versa.

  17. Statistical evaluation of PACSTAT random number generation capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, G.F.; Toland, M.R.; Harty, H.; Budden, M.J.; Bartley, C.L.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.

  18. Noise, transient dynamics, and the generation of realistic interspike interval variation in square-wave burster neurons

    Science.gov (United States)

    Marin, Bóris; Pinto, Reynaldo Daniel; Elson, Robert C.; Colli, Eduardo

    2014-10-01

    First return maps of interspike intervals for biological neurons that generate repetitive bursts of impulses can display stereotyped structures (neuronal signatures). Such structures have been linked to the possibility of multicoding and multifunctionality in neural networks that produce and control rhythmical motor patterns. In some cases, isolating the neurons from their synaptic network reveals irregular, complex signatures that have been regarded as evidence of intrinsic, chaotic behavior. We show that incorporation of dynamical noise into minimal neuron models of square-wave bursting (either conductance-based or abstract) produces signatures akin to those observed in biological examples, without the need for fine tuning of parameters or ad hoc constructions for inducing chaotic activity. The form of the stochastic term is not strongly constrained and can approximate several possible sources of noise, e.g., random channel gating or synaptic bombardment. The cornerstone of this signature generation mechanism is the rich, transient, but deterministic dynamics inherent in the square-wave (saddle-node and homoclinic) mode of neuronal bursting. We show that noise causes the dynamics to populate a complex transient scaffolding or skeleton in state space, even for models that (without added noise) generate only periodic activity (whether in bursting or tonic spiking mode).

  19. The Comparability of the Statistical Characteristics of Test Items Generated by Computer Algorithms.

    Science.gov (United States)

    Meisner, Richard; And Others

    This paper presents a study on the generation of mathematics test items using algorithmic methods. The history of this approach is briefly reviewed and is followed by a survey of the research to date on the statistical parallelism of algorithmically generated mathematics items. Results are presented for 8 parallel test forms generated using 16…

  20. The need to generate realistic strain signals at an automotive coil spring for durability simulation leading to fatigue life assessment

    Science.gov (United States)

    Putra, T. E.; Abdullah, S.; Schramm, D.; Nuawi, M. Z.; Bruckmann, T.

    2017-09-01

    This study aims to accelerate fatigue tests using simulated strain signals. Strain signals were acquired from an automotive coil spring during car movements. Using a mathematical expression, acceleration signals were derived from the measured strain signals and treated as disturbance inputs for generating the simulated strain signals. The simulated strain signals deviated from the measured testing time by only 1.5%. Wavelet-based data editing was applied to shorten the strain signal duration by up to 36.7%, reducing the testing time by up to 33.9%. In conclusion, the simulated strain signals were able to retain the majority of the fatigue damage while decreasing the testing time.

  1. Higher-Order Moment Characterisation of Rogue Wave Statistics in Supercontinuum Generation

    DEFF Research Database (Denmark)

    Sørensen, Simon Toft; Bang, Ole; Wetzel, Benjamin

    2012-01-01

    The noise characteristics of supercontinuum generation are characterized using higher-order statistical moments. Measures of skew and kurtosis, and the coefficient of variation allow quantitative identification of spectral regions dominated by rogue-wave-like behaviour.
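
    The moments in question are straightforward to compute from an ensemble of shot-to-shot spectral intensities; a toy sketch with a long-tailed surrogate distribution (the actual supercontinuum ensembles are not reproduced here):

        import numpy as np
        from scipy.stats import skew, kurtosis

        # Log-normal samples stand in for the intensity at one wavelength bin.
        intensity = np.random.default_rng(1).lognormal(mean=0.0, sigma=1.0, size=10000)

        cv = intensity.std() / intensity.mean()  # coefficient of variation
        print(f"CV={cv:.2f}, skew={skew(intensity):.2f}, kurtosis={kurtosis(intensity):.2f}")
        # Large positive skew and kurtosis flag long-tailed, rogue-wave-like statistics.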

  2. Theory, Methods and Tools for Statistical Testing of Pseudo and Quantum Random Number Generators

    OpenAIRE

    Jakobsson, Krister Sune

    2014-01-01

    Statistical random number testing is a well studied field focusing on pseudo-random number generators, that is to say algorithms that produce random-looking sequences of numbers. These generators tend to have certain kinds of flaws, which have been exploited through rigorous testing. Such testing has led to advancements, and today pseudo random number generators are both very high-speed and produce seemingly random numbers. Recent advancements in quantum physics have opened up new doors, wher...

  3. Performance of Generating Plant: Managing the Changes. Part 2: Thermal Generating Plant Unavailability Factors and Availability Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Curley, G. Michael [North American Electric Reliability Corporation (United States); Mandula, Jiri [International Atomic Energy Agency (IAEA)]

    2008-05-15

    The WEC Committee on the Performance of Generating Plant (PGP) has been collecting and analysing power plant performance statistics worldwide for more than 30 years and has produced regular reports, which include examples of advanced techniques and methods for improving power plant performance through benchmarking. A series of reports from the various working groups was issued in 2008. This reference presents the results of Working Group 2 (WG2). WG2's main task is to facilitate the collection and input on an annual basis of power plant performance data (unit-by-unit and aggregated data) into the WEC PGP database. The statistics will be collected for steam, nuclear, gas turbine and combined cycle, hydro and pump storage plant. WG2 will also oversee the ongoing development of the availability statistics database, including the contents, the required software, security issues and other important information. The report is divided into two sections: Thermal generating, combined cycle/co-generation, combustion turbine, hydro and pumped storage unavailability factors and availability statistics; and nuclear power generating units.

  4. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. Exploring 'generative mechanisms' of the antiretroviral adherence club intervention using the realist approach: a scoping review of research-based antiretroviral treatment adherence theories.

    Science.gov (United States)

    Mukumbang, Ferdinand C; Van Belle, Sara; Marchal, Bruno; van Wyk, Brian

    2017-05-04

    Poor retention in care and non-adherence to antiretroviral therapy (ART) continue to undermine the success of HIV treatment and care programmes across the world. There is a growing recognition that multifaceted interventions - the application of two or more adherence-enhancing strategies - may be useful to improve ART adherence and retention in care among people living with HIV/AIDS. Empirical evidence shows that multifaceted interventions produce better results than interventions based on a singular perspective. Nevertheless, the bundle of mechanisms by which multifaceted interventions promote ART adherence is poorly understood. In this paper, we reviewed theories on ART adherence to identify candidate/potential mechanisms by which the adherence club intervention works. We searched five electronic databases (PubMed, EBSCOhost, CINAHL, PsycARTICLES and Google Scholar) using Medical Subject Headings (MeSH) terms. A manual search of citations from the reference lists of the studies identified from the electronic databases was also done. Twenty-six articles that adopted a theory-guided inquiry of antiretroviral adherence behaviour were included for the review. Eleven cognitive and behavioural theories underpinning these studies were explored. We examined each theory for possible 'generative causality' using the realist evaluation heuristic (Context-Mechanism-Outcome configuration); then, we selected candidate mechanisms thematically. We identified three major sets of theories: Information-Motivation-Behaviour, Social Action Theory and Health Behaviour Model, which explain ART adherence. Although they show potential in explaining adherence behaviours, they fall short in explaining exactly why and how the various elements they outline combine to explain positive or negative outcomes. Candidate mechanisms identified were motivation, self-efficacy, perceived social support, empowerment, perceived threat, perceived benefits and perceived barriers. Although these candidate

  6. Identification of natural images and computer-generated graphics based on statistical and textural features.

    Science.gov (United States)

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the view of statistics and texture, and 31 dimensions of feature are acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that it can achieve an identification accuracy of 97.89% for computer-generated graphics, and an identification accuracy of 97.75% for natural images. The analyses also demonstrate the proposed method has excellent performance, compared with some existing methods based only on statistical features or other features. The method has a great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
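
    The paper's classifier is LIBSVM on 31-dimensional feature vectors; since scikit-learn's SVC wraps libsvm, a stand-in pipeline (with random placeholder features instead of the paper's statistical/textural features) might look like this:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 31))      # placeholder 31-D feature vectors
        y = rng.integers(0, 2, size=400)    # 0 = natural image, 1 = computer-generated

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print(f"accuracy: {clf.score(X_te, y_te):.3f}")  # ~0.5 on random features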

  7. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  8. Computation of a leakage in a steam generator heating tube with realistic initial and boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sarkadi, Peter; Schaffrath, Andreas [TUEV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2009-02-15

    Within the framework of the safety review of a pressurized water reactor, TUeV NORD SysTec GmbH und Co. KG analyzed plant behavior for the case of a leakage in a steam generator heating tube by means of the ATHLET thermohydraulic code system using realistic initial and boundary conditions. The analysis was performed to show that operation of the safety injection pumps avoids 2 out of 3 emergency cooling criteria being triggered. After coolant transfer from the primary to the secondary side, activity retention is ensured only if the coolant is contained by the components of the secondary system. This requires the pressure in the failed steam generator to remain below the level of 88.3 bar at which the safety valves respond. Startup of the safety injection pumps would jeopardize activity retention because of the zero head of these pumps. Analysis indicated the filling level of the pressurizer to be above 3.6 m during the accident. The minimum margin from the "pressurizer level < 2.28 m" reactor protection limit is around 1.3 m. Consequently, only the first of the 3 emergency cooling criteria (in this case, "coolant pressure < 132 bar") will respond. This avoids unwanted boosting of the coolant pressure due to connection of the safety injection pumps. By the end of the period of observation, approx. 36 Mg of coolant are transferred to the secondary side. Activity retention is ensured by the components of the secondary system. (orig.)

  9. Patch-based generative shape model and MDL model selection for statistical analysis of archipelagos

    DEFF Research Database (Denmark)

    Ganz, Melanie; Nielsen, Mads; Brandt, Sami

    2010-01-01

    We propose a statistical generative shape model for archipelago-like structures. These kinds of structures occur, for instance, in medical images, where our intention is to model the appearance and shapes of calcifications in x-ray radiographs. The generative model is constructed by (1) learning a patch-based dictionary for possible shapes, (2) building up a time-homogeneous Markov model to model the neighbourhood correlations between the patches, and (3) automatic selection of the model complexity by the minimum description length principle. The generative shape model is proposed as a probability distribution of a binary image where the model is intended to facilitate sequential simulation. Our results show that a relatively simple model is able to generate structures visually similar to calcifications. Furthermore, we used the shape model as a shape prior in the statistical segmentation...

  10. Quantum, classical and semiclassical analyses of photon statistics in harmonic generation

    CERN Document Server

    Bajer, J; Bajer, Jiri; Miranowicz, Adam

    2001-01-01

    In this review, we compare different descriptions of photon-number statistics in harmonic generation processes within quantum, classical and semiclassical approaches. First, we study the exact quantum evolution of harmonic generation by applying numerical methods, including those of Hamiltonian diagonalization and global characteristics. We show explicitly that harmonic generation can indeed serve as a source of nonclassical light. Then, we demonstrate that quasi-stationary sub-Poissonian light can be generated in these quantum processes under conditions corresponding to the so-called no-energy-transfer regime known in classical nonlinear optics. By applying the method of classical trajectories, we demonstrate that the analytical predictions of the Fano factors are in good agreement with the quantum results. On comparing second and higher harmonic generation in the no-energy-transfer regime, we show that the highest noise reduction is achieved in third-harmonic generation with the Fano factor of the ...
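
    The Fano factor used throughout the review is simply the variance-to-mean ratio of the photon number, F = Var(n)/<n>, with F < 1 indicating sub-Poissonian (nonclassical) statistics; a quick numerical illustration with toy count distributions:

        import numpy as np

        rng = np.random.default_rng(2)
        poisson = rng.poisson(lam=10.0, size=100_000)           # coherent-light reference
        sub_poisson = rng.binomial(n=20, p=0.5, size=100_000)   # variance < mean

        def fano(n):
            # Variance-to-mean ratio of photon counts.
            return n.var() / n.mean()

        print(f"Poisson F={fano(poisson):.3f}, binomial F={fano(sub_poisson):.3f}")  # ~1.0 vs ~0.5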

  11. Random number datasets generated from statistical analysis of randomly sampled GSM recharge cards.

    Science.gov (United States)

    Okagbue, Hilary I; Opanuga, Abiodun A; Oguntunde, Pelumi E; Ugwoke, Paulinus O

    2017-02-01

    In this article, random number datasets were generated from random samples of used GSM (Global Systems for Mobile Communications) recharge cards. Statistical analyses were performed to refine the raw data into random number datasets arranged in tables. A detailed description of the method and relevant tests of randomness were also discussed.

  12. The Underlying Process Generating Lotka's Law and the Statistics of Exceedances.

    Science.gov (United States)

    Huber, John C.

    1998-01-01

    Demonstrates that the statistics of exceedances generates Lotka's Law--a widely-observed distribution of authors of scholarly papers and patents. The Frequency of production (papers or patents per year) and Lifetime (career duration) are exponentially distributed random variables. Empirical, phenomenological, and mathematical development shows…

  13. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    Energy Technology Data Exchange (ETDEWEB)

    Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-11

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using physical complex climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.

  14. Selection of statistical distributions for prediction of steam generator tube degradation

    Energy Technology Data Exchange (ETDEWEB)

    Stavropoulos, K.D.; Gorman, J.A. [Dominion Engr., Inc., McLean, VA (United States); Staehle, R.W. [Univ. of Minnesota, Minneapolis, MN (United States); Welty, C.S. Jr. [Electric Power Research Institute, Palo Alto, CA (United States)

    1992-12-31

    This paper presents the first part of a project directed at developing methods for characterizing and predicting the progression of degradation of PWR steam generator tubes. This first part covers the evaluation of statistical distributions for use in such analyses. The data used in the evaluation of statistical distributions included data for primary water stress corrosion cracking (PWSCC) at roll transitions and U-bends, and intergranular attack/stress corrosion cracking (IGA/SCC) at tube sheet and tube support plate crevices. Laboratory data for PWSCC of reverse U-bends were also used. The review of statistical distributions indicated that the Weibull distribution provides an easy to use and effective method. Another statistical function, the log-normal, was found to provide essentially equivalent results. Two parameter fits, without an initiation time, were found to provide the most reliable predictions.
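
    A minimal sketch of the recommended two-parameter Weibull fit (synthetic failure times; pinning the location parameter to zero corresponds to assuming no initiation time):

        from scipy.stats import weibull_min

        # Synthetic times-to-detection of degradation, e.g. in effective full-power years.
        times = weibull_min.rvs(c=2.5, scale=12.0, size=200, random_state=3)

        # floc=0 fixes the location parameter, i.e. a two-parameter fit without initiation time.
        shape, loc, scale = weibull_min.fit(times, floc=0)
        print(f"shape={shape:.2f}, scale={scale:.2f}  (true values: 2.5, 12.0)")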

  15. Realistic simulation of the emergence of magnetic field generated in a solar convective dynamo from the convection zone into the corona

    Science.gov (United States)

    Chen, Feng; Rempel, Matthias D.; Fan, Yuhong

    2017-08-01

    We present a comprehensive realistic numerical model of the emergence of magnetic flux generated in a solar convective dynamo from the convection zone to the corona. The magnetic and velocity fields in a horizontal layer near the top boundary of the solar convective dynamo simulation are used as a time-dependent bottom boundary to drive radiation magnetohydrodynamic simulations of the emergence of the flux bundles through the uppermost convection zone to more than 100 Mm above the surface of the Sun. The simulation allows a direct comparison between model-synthesized observables and real observations of flux emergence processes through different layers of the solar atmosphere. Emerging flux bundles bring more than 1e23 Mx of flux to the photosphere in a period of about 50 hours and give rise to several active regions in a horizontal domain of 200 Mm. The mean coronal temperature is about 1 MK for the quiet Sun and is significantly increased after active regions form at the photosphere. The flux emergence process produces many dynamical features, such as coronal bright points, jets, waves and propagating disturbances, as well as flares and mass ejections. The biggest flare reaches M2.5, as indicated by the synthetic GOES-15 soft X-ray flux. The total magnetic energy released during the eruption is about 5e31 ergs. The flare leads to significant coronal heating; the mean temperature in the corona reaches more than 5 MK, and plasma in cusp-shaped post-flare loops is heated to several tens of MK. The flare is accompanied by the ejection of a giant flux rope that carries cool and dense plasma. The flux rope is formed during the eruption by reconnection between a sheared arcade that rises up from the low atmosphere above a bipolar sunspot pair and overlying field lines that are mostly perpendicular to the axis of the sheared arcade.

  16. The effects of realistic synaptic distribution and 3D geometry on signal integration and extracellular field generation of hippocampal pyramidal cells and inhibitory neurons

    Directory of Open Access Journals (Sweden)

    Attila I Gulyas

    2016-11-01

    In vivo and in vitro multichannel field and somatic intracellular recordings are frequently used to study mechanisms of network pattern generation. When interpreting these data, neurons are often implicitly considered as electrotonically compact cylinders with a homogeneous distribution of excitatory and inhibitory inputs. However, the actual distributions of dendritic length, diameter, and the densities of excitatory and inhibitory input are non-uniform and cell type-specific. We first review quantitative data on the dendritic structure and synaptic input and output distribution of pyramidal cells and interneurons in the hippocampal CA1 area. Second, using multicompartmental passive models of four different types of neurons, we quantitatively explore the effect of differences in dendritic structure and synaptic distribution on the errors and biases of voltage clamp measurements of inhibitory and excitatory postsynaptic currents. Finally, using the 3-dimensional distribution of dendrites and synaptic inputs we calculate how different inhibitory and excitatory inputs contribute to the generation of the local field potential in the hippocampus. We analyze these effects at different realistic background activity levels, as synaptic bombardment influences neuronal conductance and thus the propagation of signals in the dendritic tree. We conclude that, since dendrites are electrotonically long and entangled in 3D, somatic intracellular and field potential recordings miss the majority of dendritic events in some cell types, and thus overemphasize the importance of perisomatic inhibitory inputs and understate the importance of complex dendritic processing. Modeling results also suggest that pyramidal cells and inhibitory neurons probably use different input integration strategies. In pyramidal cells, second- and higher-order thin dendrites are relatively well-isolated from each other, which may support branch-specific local processing as suggested by studies

  17. Exophobic quasi-realistic heterotic string vacua

    Energy Technology Data Exchange (ETDEWEB)

    Assel, Benjamin [Centre de Physique Theorique, Ecole Polytechnique, F-91128 Palaiseau (France); Christodoulides, Kyriakos [Dept. of Mathematical Sciences, University of Liverpool, Liverpool L69 7ZL (United Kingdom); Faraggi, Alon E., E-mail: faraggi@amtp.liv.ac.u [Dept. of Mathematical Sciences, University of Liverpool, Liverpool L69 7ZL (United Kingdom); Kounnas, Costas [Lab. Physique Theorique, Ecole Normale Superieure, F-75231 Paris 05 (France); Rizos, John [Department of Physics, University of Ioannina, GR45110 Ioannina (Greece)

    2010-01-25

    We demonstrate the existence of heterotic string vacua that are free of massless exotic fields. The need to break the non-Abelian GUT symmetries in k=1 heterotic string models by Wilson lines, while preserving the GUT embedding of the weak hypercharge and the GUT prediction $\sin^2\theta_w(M_{GUT}) = 3/8$, necessarily implies that the models contain states with fractional electric charge. Such states are severely restricted by observations, and must be confined or sufficiently massive and diluted. We construct the first quasi-realistic heterotic string models in which the exotic states do not appear in the massless spectrum, and only exist, as they must, in the massive spectrum. The SO(10) GUT symmetry is broken to the Pati-Salam subgroup. Our PS heterotic string models contain adequate Higgs representations to break the GUT and electroweak symmetry, as well as colour Higgs triplets that can be used for the missing partner mechanism. By statistically sampling the space of Pati-Salam vacua we demonstrate the abundance of quasi-realistic three generation models that are completely free of massless exotics, rendering it plausible that obtaining realistic Yukawa couplings may be possible in this space of models.

  18. Generation of statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd;

    2007-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits generating statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind generation. The approach is evaluated on the test case of a multi-MW wind farm over a period of more than two years. Its interest for a large range of applications is discussed.
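
    One standard way to realize such interdependent scenarios (hedged: the paper's exact covariance model and predictive margins are not reproduced here) is a Gaussian copula: draw correlated normals across lead times, then map each margin through a predictive distribution:

        import numpy as np
        from scipy.stats import beta, norm

        rng = np.random.default_rng(4)
        H = 24  # lead times (hours ahead)
        # Assumed exponential decay of error correlation with lead-time separation.
        cov = np.exp(-np.abs(np.subtract.outer(np.arange(H), np.arange(H))) / 6.0)

        z = rng.multivariate_normal(np.zeros(H), cov, size=100)  # 100 scenarios
        u = norm.cdf(z)                                          # correlated uniform margins
        scenarios = beta.ppf(u, a=2.0, b=5.0)  # toy predictive margins for power in [0, 1]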

  19. Finite-size scaling of two-point statistics and the turbulent energy cascade generators.

    Science.gov (United States)

    Cleve, Jochen; Dziekan, Thomas; Schmiegel, Jürgen; Barndorff-Nielsen, Ole E; Pearson, Bruce R; Sreenivasan, Katepalli R; Greiner, Martin

    2005-02-01

    Within the framework of random multiplicative energy cascade models of fully developed turbulence, finite-size-scaling expressions for two-point correlators and cumulants are derived, taking into account the observationally unavoidable conversion from an ultrametric to an Euclidean two-point distance. The comparison with two-point statistics of the surrogate energy dissipation, extracted from various wind tunnel and atmospheric boundary layer records, allows an accurate deduction of multiscaling exponents and cumulants, even at moderate Reynolds numbers for which simple power-law fits are not feasible. The extracted exponents serve as input for parametric estimates of the probabilistic cascade generator. Various cascade generators are evaluated.

  20. Experimental Section: On the magnetic field distribution generated by a dipolar current source situated in a realistically shaped compartment model of the head

    NARCIS (Netherlands)

    Meijs, J.W.H.; Bosch, F.G.C.; Peters, M.J.; Lopes da silva, F.H.

    1987-01-01

    The magnetic field distribution around the head is simulated using a realistically shaped compartment model of the head. The model is based on magnetic resonance images. The 3 compartments describe the brain, the skull and the scalp. The source is represented by a current dipole situated in the

  1. How the Mastery Rubric for Statistical Literacy Can Generate Actionable Evidence about Statistical and Quantitative Learning Outcomes

    Directory of Open Access Journals (Sweden)

    Rochelle E. Tractenberg

    2016-12-01

    Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards “big” data: while automated analyses can exploit massive amounts of data, the interpretation—and possibly more importantly, the replication—of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient/inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool—the Mastery Rubric—is integrated with a new, developmental, model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence, and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.

  2. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  3. Factors influencing municipal solid waste generation in China: a multiple statistical analysis study.

    Science.gov (United States)

    Liu, Chen; Wu, Xin-wu

    2011-04-01

    A relationship between waste production and socio-economic factors is essential in waste management. In the present study, the factors influencing municipal solid waste generation in China were investigated by multiple statistical analysis. Twelve items were chosen for investigation: GDP, per capita GDP, urban population, the proportion of urban population, the area of urban construction, the area of paved roads, the area of urban gardens and green areas, the number of large cities, annual per capita disposable income of urban households, annual per capita consumption expenditure of urban households, total energy consumption and annual per capita consumption for households. Two methodologies from multiple statistical analysis were selected: principal component analysis (PCA) and cluster analysis (CA). Three new dimensions were identified by PCA: component 1, economy and urban development; component 2, energy consumption; and component 3, urban scale. The three components together accounted for 99.1% of the initial variance. The results show that economy and urban development are important items influencing MSW generation. The proportion of urban population and urban population had the highest loadings in all factors. The relationship between growth of gross domestic product (GDP) and production of MSW was not as clear-cut as often assumed in China, a situation that is more likely to apply to developed countries. Energy consumption was another factor considered in our study of MSW generation. In addition, the annual MSW quantity variation was investigated by cluster analysis.
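
    A hedged sketch of the PCA step on a standardized indicator matrix (toy data; the paper's 12 socio-economic indicators and regional observations are not reproduced):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(5)
        X = rng.normal(size=(31, 12))  # e.g. 31 regions x 12 indicators

        pca = PCA(n_components=3)
        scores = pca.fit_transform(StandardScaler().fit_transform(X))
        print(pca.explained_variance_ratio_)  # the paper reports 99.1% for three components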

  4. Calculation of Tajima's D and other neutrality test statistics from low depth next-generation sequencing data

    DEFF Research Database (Denmark)

    Korneliussen, Thorfinn Sand; Moltke, Ida; Albrechtsen, Anders

    2013-01-01

    A number of different statistics are used for detecting natural selection using DNA sequencing data, including statistics that are summaries of the frequency spectrum, such as Tajima's D. These statistics are now often being applied in the analysis of Next Generation Sequencing (NGS) data. However, estimates of frequency spectra from NGS data are strongly affected by low sequencing coverage; the inherent technology-dependent variation in sequencing depth causes systematic differences in the value of the statistic among genomic regions.
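
    For reference, the classical full-coverage form of Tajima's D, whose low-depth NGS estimation this work addresses, is computable from the number of sequences n, the number of segregating sites S and the mean pairwise diversity pi:

        import math

        def tajimas_d(n, S, pi):
            a1 = sum(1.0 / i for i in range(1, n))
            a2 = sum(1.0 / i ** 2 for i in range(1, n))
            b1 = (n + 1) / (3.0 * (n - 1))
            b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
            c1 = b1 - 1.0 / a1
            c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
            e1, e2 = c1 / a1, c2 / (a1 ** 2 + a2)
            return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

        print(tajimas_d(n=10, S=16, pi=3.88))  # ~ -1.45; negative D suggests an excess of rare variants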

  5. Generating action descriptions from statistically integrated representations of human motions and sentences.

    Science.gov (United States)

    Takano, Wataru; Kusajima, Ikuo; Nakamura, Yoshihiko

    2016-08-01

    It is desirable for robots to be able to linguistically understand human actions during human-robot interactions. Previous research has developed frameworks for encoding human full-body motion into model parameters and for classifying motion into specific categories. For full understanding, the motion categories need to be connected to natural language such that robots can interpret human motions as linguistic expressions. This paper proposes a novel framework for integrating observation of human motion with that of natural language. This framework consists of two models: the first model statistically learns the relations between motions and their relevant words, and the second statistically learns sentence structures as word n-grams. Integration of these two models allows robots to generate sentences from human motions by searching for words relevant to the motion using the first model and then arranging these words in an appropriate order using the second model. This makes it possible to generate the sentences most likely to be generated from the motion. The proposed framework was tested on human full-body motion measured by an optical motion capture system, where descriptive sentences were manually attached to the motions, and the validity of the system was demonstrated.

  6. Generating a robust statistical causal structure over 13 cardiovascular disease risk factors using genomics data.

    Science.gov (United States)

    Yazdani, Azam; Yazdani, Akram; Samiei, Ahmad; Boerwinkle, Eric

    2016-04-01

    Understanding causal relationships among large numbers of variables is a fundamental goal of the biomedical sciences and can be facilitated by Directed Acyclic Graphs (DAGs), where directed edges between nodes represent the influence of components of the system on each other. In an observational setting, some of the directions are often unidentifiable because of Markov equivalency. Additional exogenous information, such as expert knowledge or genotype data, can help establish directionality among the endogenous variables. In this study, we use principal component analysis to extract information across the genome in order to generate a robust statistical causal network among phenotypes, the variables of primary interest. The method is applied to 590,020 SNP genotypes measured on 1596 individuals to generate the statistical causal network of 13 cardiovascular disease risk factor phenotypes. First, principal component analysis was used to capture information across the genome. The principal components were then used to identify a robust causal network structure, GDAG, among the phenotypes. Analyzing a robust causal network over risk factors reveals the flow of information in direct and alternative paths, as well as determining predictors and good targets for intervention. For example, the analysis identified BMI as influencing multiple other risk factor phenotypes and as a good target for intervention to lower disease risk.

  7. Statistical analysis of regional capital and operating costs for electric power generation

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, L.R.; Myers, M.G.; Herrman, J.A.; Provanizano, A.J.

    1977-10-01

    This report presents the results of a three and one-half-month study conducted for Brookhaven National Lab. to develop capital and operating cost relationships for seven electric power generating technologies: oil-, coal-, gas-, and nuclear-fired steam-electric plants, hydroelectric plants, and gas-turbine plants. The methodology is based primarily on statistical analysis of Federal Power Commission data for plant construction and annual operating costs. The development of cost-output relationships for electric power generation is emphasized, considering the effects of scale, technology, and location on each of the generating processes investigated. The regional effects on cost are measured at the Census Region level to be consistent with the Brookhaven Multi-Regional Energy and Interindustry Regional Model of the United States. Preliminary cost relationships for system-wide costs - transmission, distribution, and general expenses - were also derived. These preliminary results cover the demand for transmission and distribution capacity and operating and maintenance costs in terms of system-service characteristics. 15 references, 6 figures, 23 tables.

  8. A STATISTICAL REVIEW OF DWPF LABORATORY MEASUREMENTS GENERATED DURING THE PROCESSING OF BATCHES 300 THROUGH 356

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T

    2006-08-31

    In this report, the Statistical Consulting Section (SCS) of the Savannah River National Laboratory (SRNL) provides summaries and comparisons of composition measurements for glass samples that were generated during the processing of batches 300 through 356 at the Defense Waste Processing Facility (DWPF). These analyses, which include measurements of samples from the Sludge Receipt and Adjustment Tank (SRAT) and the Slurry Mix Evaporator (SME) as well as samples of glass standards, were provided to SCS by the DWPF Laboratory (DWPF Lab) of Waste Laboratory Services. The comparisons made by SCS were extensive given that these data allowed for contrasts between preparation methods and between the two spectrometers that are currently in use at the DWPF Lab. In addition to general comparisons, specific questions that were posed in the Technical Task Request (TTR) behind this effort were addressed in this report.

  9. Statistical Ring Opening Metathesis Copolymerization of Norbornene and Cyclopentene by Grubbs’ 1st-Generation Catalyst

    Directory of Open Access Journals (Sweden)

    Christiana Nikovia

    2015-08-01

    Statistical copolymers of norbornene (NBE) with cyclopentene (CP) were prepared by ring-opening metathesis polymerization, employing the 1st-generation Grubbs’ catalyst, in the presence or absence of triphenylphosphine, PPh3. The reactivity ratios were estimated using the Fineman-Ross, inverted Fineman-Ross, and Kelen-Tüdos graphical methods, along with the computer program COPOINT, which evaluates the parameters of binary copolymerizations from comonomer/copolymer composition data by integrating a given copolymerization equation in its differential form. Structural parameters of the copolymers were obtained by calculating the dyad sequence fractions and the mean sequence length, which were derived using the monomer reactivity ratios. The kinetics of thermal decomposition of the copolymers, along with that of the respective homopolymers, was studied by thermogravimetric analysis within the framework of the Ozawa-Flynn-Wall and Kissinger methodologies. Finally, the effect of triphenylphosphine on the kinetics of copolymerization, the reactivity ratios, and the kinetics of thermal decomposition was examined.
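
    A minimal sketch of the Fineman-Ross linearization (with made-up composition data): for feed ratio x = [M1]/[M2] and copolymer ratio y = d[M1]/d[M2], the quantities G = x(y-1)/y and F = x^2/y satisfy G = r1*F - r2, so a straight-line fit yields the reactivity ratios:

        import numpy as np

        x = np.array([0.25, 0.50, 1.00, 2.00, 4.00])  # NBE/CP in the feed (assumed data)
        y = np.array([0.61, 1.15, 2.20, 4.30, 8.50])  # NBE/CP in the copolymer (assumed data)

        G = x * (y - 1.0) / y
        F = x ** 2 / y
        r1, neg_r2 = np.polyfit(F, G, 1)  # slope = r1, intercept = -r2
        print(f"r1 = {r1:.2f}, r2 = {-neg_r2:.2f}")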

  10. Statistical framework for detection of genetically modified organisms based on Next Generation Sequencing.

    Science.gov (United States)

    Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy

    2016-02-01

    Because the number and diversity of genetically modified (GM) crops has significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops.

  11. CDFTBL: A statistical program for generating cumulative distribution functions from data

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, P.W. (Pacific Northwest Lab., Richland, WA (United States))

    1991-06-01

    This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs.
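
    The core idea of such a tool, tabulating an empirical CDF for later inverse-transform sampling in a Monte Carlo driver, can be sketched briefly (a stand-in illustration, not the CDFTBL source):

        import numpy as np

        def cdf_table(data, n_rows=11):
            # (probability, value) pairs of the empirical CDF at evenly spaced quantiles.
            probs = np.linspace(0.0, 1.0, n_rows)
            return list(zip(probs, np.quantile(np.asarray(data, dtype=float), probs)))

        field_data = [3.2, 4.1, 2.8, 5.6, 4.4, 3.9, 6.1, 2.5, 4.8, 5.0]
        for p, v in cdf_table(field_data):
            print(f"{p:4.2f}  {v:6.3f}")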

  12. Statistical downscaling and future scenario generation of temperatures for Pakistan Region

    Science.gov (United States)

    Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas

    2015-04-01

    Impact studies require climate change information at a finer spatial scale than that presently provided by global or regional climate models. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, has been adopted. The Statistical DownScaling Model (SDSM) was employed to downscale daily minimum and maximum temperature data from 44 national stations for the base period (1961-1990), and future scenarios were then generated up to 2099. Observed data as well as predictors (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested on an individual/multiple basis through linear regression. Future scenarios were generated from HadCM3 daily data for the A2 and B2 storylines. The downscaled data were tested and showed a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but this study projects that, in the future, the northern belt in particular faces a possible threat of an increasing tendency in air temperature. In particular, the northern areas (hosting the third-largest ice reserves after the polar regions), an important feeding source for the Indus River, are projected to be vulnerable in terms of increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in the future.
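    The transfer-function step at the heart of regression-based downscaling is easy to illustrate. The sketch below uses hypothetical arrays (SDSM itself adds predictor screening, stochastic weather-generator components and variance-inflation options): it calibrates an ordinary least-squares relation between station temperature and large-scale predictors, then applies it to GCM predictors to produce a scenario series.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Placeholder calibration data: daily predictor fields and observed Tmax.
    X_ncep = rng.normal(size=(10957, 5))   # ~30 years of daily predictors
    t_obs = 25 + X_ncep @ np.array([2.0, -1.0, 0.5, 0.3, 0.1]) \
            + rng.normal(0, 1.5, 10957)

    # Calibration: OLS with an intercept column.
    A = np.column_stack([np.ones(len(X_ncep)), X_ncep])
    coef, *_ = np.linalg.lstsq(A, t_obs, rcond=None)
    resid_sd = np.std(t_obs - A @ coef)

    # Scenario generation: apply the transfer function to (placeholder)
    # GCM predictors and re-add regression noise so variance is preserved.
    X_gcm = rng.normal(size=(365, 5))
    t_scen = np.column_stack([np.ones(len(X_gcm)), X_gcm]) @ coef \
             + rng.normal(0, resid_sd, len(X_gcm))
    ```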

  13. A generative statistical approach to automatic 3D building roof reconstruction from laser scanning data

    Science.gov (United States)

    Huang, Hai; Brenner, Claus; Sester, Monika

    2013-05-01

    This paper presents a generative statistical approach to automatic 3D building roof reconstruction from airborne laser scanning point clouds. In previous works, bottom-up methods, e.g., point clustering, plane detection, and contour extraction, are widely used. Due to the data artefacts caused by tree clutter, reflection from windows, water features, etc., bottom-up reconstruction in urban areas may suffer from a number of incomplete or irregular roof parts. Manually given geometric constraints are usually needed to ensure plausible results. In this work we propose an automatic process with emphasis on top-down approaches. The input point cloud is first pre-segmented into subzones containing a limited number of buildings to reduce the computational complexity for large urban scenes. For the building extraction and reconstruction in the subzones we propose a purely top-down statistical scheme, in which bottom-up efforts or additional data like building footprints are no longer required. Based on a predefined primitive library we conduct generative modeling to reconstruct roof models that fit the data. Primitives are assembled into an entire roof with given rules of combination and merging. Overlaps of primitives are allowed in the assembly. The selection of roof primitives, as well as the sampling of their parameters, is driven by a variant of the Markov chain Monte Carlo technique with a specified jump mechanism. Experiments are performed on data-sets of different building types (from simple houses and high-rise buildings to combined building groups) and resolutions. The results show robustness despite the data artefacts mentioned above, and plausibility in reconstruction.

  14. Shapes and Statistics of the Rogue Waves Generated by Chaotic Ocean Current

    CERN Document Server

    Bayindir, Cihan

    2015-01-01

    In this study we discuss the shapes and statistics of the rogue (freak) waves emerging due to wave-current interactions. For this purpose, we use a simple governing equation, a nonlinear Schrodinger equation (NLSE) extended by R. Smith (1976). This extended NLSE accounts for the effects of the current gradient on the nonlinear dynamics of the ocean surface near the blocking point. Using a split-step scheme we show that the extended NLSE of Smith is unstable against random chaotic perturbations in the current profile. The monochromatic wave field with unit amplitude therefore turns into a chaotic sea state with many peaks. By comparing the numerical and analytical results, we show that rogue waves due to perturbations in the current profile are in the form of rational rogue wave solutions of the NLSE. We also discuss the effects of the magnitude of the chaotic current profile perturbations on the statistics of rogue wave generation at the ocean surface. The extension term in Smith's extended NLSE causes phase ...
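    The split-step scheme mentioned above is standard and worth a sketch. The following integrates the plain focusing NLSE, i u_t + u_xx/2 + |u|^2 u = 0, for a weakly perturbed monochromatic field; it is a simplified stand-in, since Smith's extension adds a current-gradient term that is omitted here.

    ```python
    import numpy as np

    N, L = 1024, 100.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
    dt, steps = 1e-3, 5000

    u = np.ones(N, dtype=complex)                        # unit-amplitude carrier
    u += 1e-3 * np.random.default_rng(2).normal(size=N)  # chaotic perturbation

    half_linear = np.exp(-0.25j * k**2 * dt)             # dispersion half step
    for _ in range(steps):
        u = np.fft.ifft(half_linear * np.fft.fft(u))     # linear half step
        u *= np.exp(1j * np.abs(u)**2 * dt)              # nonlinear full step
        u = np.fft.ifft(half_linear * np.fft.fft(u))     # linear half step

    # Amplitudes well above the unit background flag rogue-wave-like peaks.
    print("max |u| =", np.abs(u).max())
    ```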

  15. Statistical mechanics of logarithmic REM: duality, freezing and extreme value statistics of 1/f noises generated by Gaussian free fields

    Science.gov (United States)

    Fyodorov, Yan V.; Le Doussal, Pierre; Rosso, Alberto

    2009-10-01

    We compute the distribution of the partition functions for a class of one-dimensional random energy models with logarithmically correlated random potential, above and at the glass transition temperature. The random potential sequences represent various versions of the 1/f noise generated by sampling the two-dimensional Gaussian free field (2D GFF) along various planar curves. Our method extends the recent analysis of Fyodorov and Bouchaud (2008 J. Phys. A: Math. Theor. 41 372001) from the circular case to an interval and is based on an analytical continuation of the Selberg integral. In particular, we unveil a duality relation satisfied by the suitable generating function of free energy cumulants in the high temperature phase. It reinforces the freezing scenario hypothesis for that generating function, from which we derive the distribution of extrema for the 2D GFF on the [0,1] interval. We provide numerical checks of the circular case and the interval case and discuss universality and various extensions. The relevance to the distribution of the length of a segment in Liouville quantum gravity is noted.

  16. A Visual Basic Program to Generate Sediment Grain-Size Statistics and Extrapolate Particle Distributions

    Science.gov (United States)

    Poppe, L. J.; Eliason, A. E.; Hastings, M. E.

    2004-05-01

    Methods that describe and summarize grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Therefore, to facilitate the reduction of sedimentologic data, we have written a computer program (GSSTAT) to generate grain-size statistics and extrapolate particle distributions. Our program is written in Microsoft Visual Basic 6.0, runs on Windows 95/98/ME/NT/2000/XP computers, provides a window to facilitate execution, and allows users to select options with mouse-click events or through interactive dialogue boxes. The program permits users to select output in either inclusive graphic or moment statistics, to extrapolate distributions to the colloidal-clay boundary by three methods, and to convert between frequency and cumulative frequency percentages. Detailed documentation is available within the program. Input files to the program must be comma-delimited ASCII text and have 20 fields that include: sample identifier, latitude, longitude, and the frequency or cumulative frequency percentages of the whole-phi fractions from 11 phi through -5 phi. Individual fields may be left blank, but the sum of the phi fractions must total 100% (+/- 0.2%). The program expects the first line of the input file to be a header showing attribute names; no embedded commas are allowed in any of the fields. Error messages warn the user of potential problems. The program generates an output file in the requested destination directory and allows the user to view results in a display window to determine the occurrence of errors. The output file has a header for its first line, but now has 34 fields: the original descriptor fields plus percentages of gravel, sand, silt and clay, statistics, classification, verbal descriptions, frequency or cumulative frequency percentages of the whole-phi fractions from 13 phi through -5 phi, and a field for error messages. If the user has selected extrapolation, the two additional phi
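    The "inclusive graphic" statistics such a program reports follow Folk and Ward's percentile formulas, which are compact enough to sketch (in Python rather than Visual Basic; the cumulative curve below is hypothetical, and GSSTAT's file handling and extrapolation options are not reproduced).

    ```python
    import numpy as np

    def folk_ward(phi, cum_pct):
        """Inclusive graphic mean, sorting and skewness from a cumulative
        grain-size curve (phi ascending, cumulative percent ascending)."""
        p = lambda pct: np.interp(pct, cum_pct, phi)
        p5, p16, p50, p84, p95 = (p(v) for v in (5, 16, 50, 84, 95))
        mean = (p16 + p50 + p84) / 3.0
        sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
        skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
        return mean, sorting, skew

    phi = np.arange(-5, 12)                                # -5 phi .. 11 phi
    cum = np.clip(np.linspace(0, 110, len(phi)), 0, 100)   # hypothetical curve
    print(folk_ward(phi, cum))
    ```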

  17. MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.

    Science.gov (United States)

    Alic, Andy S; Blanquer, Ignacio

    2016-09-01

    Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 and capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run in any software or hardware environment, from the command line or graphically, and in the browser or standalone. It presents information such as average length, base distribution, quality-score distribution, k-mer histogram, and homopolymer analysis. MuffinInfo improves upon existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable parameters involved in the processing, all in a single piece of software. At the moment, the extractor works with all base-space technologies such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for mildly intensive tasks encountered in bioinformatics.
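    The per-read bookkeeping behind such statistics is straightforward; a minimal sketch (Python rather than HTML5/JavaScript, assuming Phred+33 quality encoding and ignoring MuffinInfo's charting, save/reload and plug-in features) might look like this:

    ```python
    from collections import Counter
    import gzip

    def fastq_stats(path):
        """Average read length, base composition and mean Phred quality."""
        opener = gzip.open if path.endswith(".gz") else open
        n_reads = total_len = qual_sum = 0
        bases = Counter()
        with opener(path, "rt") as fh:
            for i, line in enumerate(fh):
                if i % 4 == 1:                       # sequence line
                    seq = line.strip()
                    n_reads += 1
                    total_len += len(seq)
                    bases.update(seq)
                elif i % 4 == 3:                     # quality line
                    qual_sum += sum(ord(c) - 33 for c in line.strip())
        return {"reads": n_reads,
                "avg_length": total_len / n_reads,
                "base_counts": dict(bases),
                "mean_quality": qual_sum / total_len}
    ```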

  18. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes' schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems

  19. Substorm associated radar auroral surges: a statistical study and possible generation model

    Directory of Open Access Journals (Sweden)

    B. A. Shand

    Full Text Available Substorm-associated radar auroral surges (SARAS) are a short-lived (15–90 minutes) and spatially localised (~5° of latitude) perturbation of the plasma convection pattern observed within the auroral E-region. The understanding of such phenomena has important ramifications for the investigation of the larger-scale plasma convection and ultimately the coupling of the solar wind, magnetosphere and ionosphere system. A statistical investigation of SARAS, observed by the Sweden And Britain Radar Experiment (SABRE), is undertaken in order to provide a more extensive examination of the local-time occurrence and propagation characteristics of the events. The statistical analysis determined a local-time occurrence of observations between 1420 MLT and 2200 MLT, with a maximum occurrence centred around 1700 MLT. The propagation velocity of the SARAS feature through the SABRE field of view was found to be predominantly L-shell aligned, with a velocity centred around 1750 m s–1 and within the range 500 m s–1 to 3500 m s–1. This comprehensive examination of SARAS provides the opportunity to discuss, qualitatively, a possible generation mechanism for SARAS based on a proposed model for the production of a similar phenomenon referred to as sub-auroral ion drifts (SAIDs). The results of the comparison suggest that SARAS may result from a similar geophysical mechanism to that which produces SAID events, but probably occurs at a different time in the evolution of the event.

    Key words. Substorms · Auroral surges · Plasma convection · Sub-auroral ion drifts

  20. THE APPLICATION OF STATISTICAL PARAMETERS OF PHASE RESOLVED PD DISTRIBUTION TO AGING EXTENT ASSESSMENT OF LARGE GENERATOR INSULATION

    Institute of Scientific and Technical Information of China (English)

    谢恒堃; 乐波; 孙翔; 宋建成

    2003-01-01

    Objective To investigate the characteristic parameters employed to describe the aging extent of the stator insulation of large generators and to study the aging laws. Methods Multi-stress aging tests of model generator stator bar specimens were performed, and PD measurements were conducted using a digital PD detector with a frequency range from 40 kHz to 400 kHz at different aging stages. Results From the test results of the model specimens it was found that the skewness of the phase resolved PD distribution might be taken as the characterization parameter for aging extent assessment of generator insulation. Furthermore, the measurement results of actual generator stator bars showed that the method based on statistical parameters of PD distributions is promising for aging extent assessment and residual lifetime estimation of large generator insulation. Conclusion Statistical parameters of the phase resolved PD distribution were proposed for aging extent assessment of large generator insulation.

  1. Chapter 3 – Phenomenology of Tsunamis: Statistical Properties from Generation to Runup

    Science.gov (United States)

    Geist, Eric L.

    2015-01-01

    Observations related to tsunami generation, propagation, and runup are reviewed and described in a phenomenological framework. In the three coastal regimes considered (near-field broadside, near-field oblique, and far field), the observed maximum wave amplitude is associated with different parts of the tsunami wavefield. The maximum amplitude in the near-field broadside regime is most often associated with the direct arrival from the source, whereas in the near-field oblique regime, the maximum amplitude is most often associated with the propagation of edge waves. In the far field, the maximum amplitude is most often caused by the interaction of the tsunami coda that develops during basin-wide propagation and the nearshore response, including the excitation of edge waves, shelf modes, and resonance. Statistical distributions that describe tsunami observations are also reviewed, both in terms of spatial distributions, such as coseismic slip on the fault plane and near-field runup, and temporal distributions, such as wave amplitudes in the far field. In each case, fundamental theories of tsunami physics are heuristically used to explain the observations.

  2. Statistical analysis of data from limiting dilution cloning to assess monoclonality in generating manufacturing cell lines.

    Science.gov (United States)

    Quiroz, Jorge; Tsao, Yung-Shyeng

    2016-07-08

    Assurance of monoclonality of recombinant cell lines is a critical issue in gaining regulatory approval in a biological license application (BLA). Among the requirements of regulatory agencies are the use of proper documentation and appropriate statistical analysis to demonstrate monoclonality. In some cases, one round may be sufficient to demonstrate monoclonality. In this article, we propose the use of confidence intervals for assessing monoclonality for limiting dilution cloning in the generation of recombinant manufacturing cell lines based on a single round. The use of confidence intervals instead of point estimates allows practitioners to account for the uncertainty present in the data when assessing whether an estimated level of monoclonality is consistent with regulatory requirements. In other cases, one round may not be sufficient and two consecutive rounds are required to assess monoclonality. When two consecutive subclonings are required, we improved the present methodology by reducing the infinite series proposed by Coller and Coller (Hybridoma 1983;2:91-96) to a simpler series. The proposed simpler series provides more accurate and reliable results. It also reduces the level of computation and can be easily implemented in any spreadsheet program, such as Microsoft Excel. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1061-1068, 2016.
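    The single-round calculation rests on a Poisson model of limiting dilution, which is easy to sketch. Assuming cells per well are Poisson distributed, the probability that a growing well is monoclonal is lambda*exp(-lambda)/(1 - exp(-lambda)); a confidence interval for the fraction of positive wells (a Wilson interval here, one reasonable choice rather than necessarily the authors') then propagates to a confidence bound on monoclonality. The plate counts below are hypothetical.

    ```python
    import math

    def wilson_ci(k, n, z=1.96):
        """Wilson score interval for a binomial proportion."""
        p, denom = k / n, 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    def p_monoclonal_given_growth(lam):
        """Poisson model: P(exactly one founder cell | well shows growth)."""
        return lam * math.exp(-lam) / (1 - math.exp(-lam))

    k, n = 29, 384                     # hypothetical: 29 of 384 wells grow
    f_lo, f_hi = wilson_ci(k, n)
    lam_lo, lam_hi = -math.log(1 - f_lo), -math.log(1 - f_hi)
    # Monoclonality falls as lambda rises, so the bounds swap order:
    print(p_monoclonal_given_growth(lam_hi), p_monoclonal_given_growth(lam_lo))
    ```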

  3. Evaluation of local electric fields generated by transcranial direct current stimulation with an extracephalic reference electrode based on realistic 3D body modeling

    Science.gov (United States)

    Im, Chang-Hwan; Park, Ji-Hye; Shim, Miseon; Chang, Won Hyuk; Kim, Yun-Hee

    2012-04-01

    In this study, local electric field distributions generated by transcranial direct current stimulation (tDCS) with an extracephalic reference electrode were evaluated to address extracephalic tDCS safety issues. To this aim, we generated a numerical model of an adult male human upper body and applied the 3D finite element method to electric current conduction analysis. In our simulations, the active electrode was placed over the left primary motor cortex (M1) and the reference electrode was placed at six different locations: over the right temporal lobe, on the right supraorbital region, on the right deltoid, on the left deltoid, under the chin, and on the right buccinator muscle. The maximum current density and electric field intensity values in the brainstem generated by the extracephalic reference electrodes were comparable to, or even less than, those generated by the cephalic reference electrodes. These results suggest that extracephalic reference electrodes do not lead to unwanted modulation of the brainstem cardio-respiratory and autonomic centers, as indicated by recent experimental studies. The volume energy density was concentrated at the neck area by the use of deltoid reference electrodes, but was still smaller than that around the active electrode locations. In addition, the distributions of elicited cortical electric fields demonstrated that the use of extracephalic reference electrodes might allow for the robust prediction of cortical modulations with little dependence on the reference electrode locations.

  4. A Review of Study Designs and Statistical Methods for Genomic Epidemiology Studies using Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Qian eWang

    2015-04-01

    Full Text Available Results from numerous linkage and association studies have greatly deepened scientists’ understanding of the genetic basis of many human diseases, yet some important questions remain unanswered. For example, although a large number of disease-associated loci have been identified from genome-wide association studies (GWAS in the past 10 years, it is challenging to interpret these results as most disease-associated markers have no clear functional roles in disease etiology, and all the identified genomic factors only explain a small portion of disease heritability. With the help of next-generation sequencing (NGS, diverse types of genomic and epigenetic variations can be detected with high accuracy. More importantly, instead of using linkage disequilibrium to detect association signals based on a set of pre-set probes, NGS allows researchers to directly study all the variants in each individual, therefore promises opportunities for identifying functional variants and a more comprehensive dissection of disease heritability. Although the current scale of NGS studies is still limited due to the high cost, the success of several recent studies suggests the great potential for applying NGS in genomic epidemiology, especially as the cost of sequencing continues to drop. In this review, we discuss several pioneer applications of NGS, summarize scientific discoveries for rare and complex diseases, and compare various study designs including targeted sequencing and whole-genome sequencing using population-based and family-based cohorts. Finally, we highlight recent advancements in statistical methods proposed for sequencing analysis, including group-based association tests, meta-analysis techniques, and annotation tools for variant prioritization.

  5. On the Realistic Demands of the New Generation of Migrant Workers in China

    Institute of Scientific and Technical Information of China (English)

    宋平

    2015-01-01

    This article summarizes the real-life demands of China's new generation of migrant workers in six aspects: the desire for humanistic care, cultural and educational needs, personal development, integration into the city, protection of rights and interests, and political participation. It then explores the main causes of these demands along six dimensions and proposes the principal ways of resolving the reasonable ones, in the hope of drawing the attention of government departments and society, helping the new generation of migrant workers out of their predicament over interest claims, and promoting social harmony and stability.

  6. Kuhn: Realist or Antirealist?

    Directory of Open Access Journals (Sweden)

    Michel Ghins

    1998-06-01

    Full Text Available Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn, and what is its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.

  7. Testing for Multivariate Normality in Mass Spectrometry Imaging Data: A Robust Statistical Approach for Clustering Evaluation and the Generation of Synthetic Mass Spectrometry Imaging Data Sets.

    Science.gov (United States)

    Dexter, Alex; Race, Alan M; Styles, Iain B; Bunch, Josephine

    2016-11-15

    Spatial clustering is a powerful tool in mass spectrometry imaging (MSI) and has been demonstrated to be capable of differentiating tumor types, visualizing intratumor heterogeneity, and segmenting anatomical structures. Several clustering methods have been applied to mass spectrometry imaging data, but a principled comparison and evaluation of different clustering techniques presents a significant challenge. We propose that testing whether the data have a multivariate normal distribution within clusters can be used to evaluate the performance of algorithms that assume normality in the data, such as k-means clustering. In cases where clustering has been performed using the cosine distance, conversion of the data to polar coordinates prior to normality testing should be performed to ensure normality is tested in the correct coordinate system. In addition to these evaluations of internal consistency, we demonstrate that the multivariate normal distribution can then be used as a basis for statistical modeling of MSI data. This allows the generation of synthetic MSI data sets with known ground truth, providing a means of external clustering evaluation. To demonstrate this, reference data from seven anatomical regions of an MSI image of a coronal section of mouse brain were modeled. From this, a set of synthetic data based on this model was generated. Results of r² fitting of the chi-squared quantile-quantile plots on the seven anatomical regions confirmed that the data acquired from each spatial region were closer to normally distributed in polar space than in Euclidean space. Finally, principal component analysis was applied to a single data set that included synthetic and real data. No significant differences were found between the two data types, indicating the suitability of these methods for generating realistic synthetic data.
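    The chi-squared quantile-quantile check is easy to reproduce in outline. A minimal sketch follows (squared Mahalanobis distances against chi-squared quantiles, with the r² of the fit as the normality score; the paper's polar-coordinate conversion for cosine-clustered data is not shown):

    ```python
    import numpy as np
    from scipy import stats

    def chi2_qq_r2(X):
        """r^2 of the chi-squared QQ fit for squared Mahalanobis distances;
        values near 1 support multivariate normality within a cluster."""
        n, d = X.shape
        diff = X - X.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.sort(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))
        theo = stats.chi2.ppf((np.arange(1, n + 1) - 0.5) / n, df=d)
        r, _ = stats.pearsonr(theo, d2)
        return r**2

    X = np.random.default_rng(3).multivariate_normal([0, 0, 0], np.eye(3), 500)
    print(chi2_qq_r2(X))   # close to 1 for normal data
    ```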

  8. Plane polymer configurations enclosing a fixed area in an electric field: generating functional and statistical mechanical properties

    NARCIS (Netherlands)

    Khandekar, D.C.; Wiegel, F.W.

    1993-01-01

    The statistical mechanical properties of plane polymer configurations which enclose a fixed area and are subject to an external electric field are investigated. For this purpose an exact expression for the generating functional is obtained and subsequently used to derive: (a) the distribution functi

  9. Realist trials and the testing of context-mechanism-outcome configurations: a response to Van Belle et al.

    Science.gov (United States)

    Bonell, Chris; Warren, Emily; Fletcher, Adam; Viner, Russell

    2016-10-01

    Van Belle et al. argue that our attempt to pursue realist evaluation via a randomised trial will be fruitless because we misunderstand realist ontology (confusing intervention mechanisms with intervention activities and with statistical mediation analyses) and because RCTs cannot comprehensively examine how and why outcome patterns are caused by mechanisms triggered in specific contexts. Through further consideration of our trial methods, we explain more fully how we believe complex social interventions work and what realist evaluation should aim to do within a trial. Like other realists, those undertaking realist trials assume that: social interventions provide resources which local actors may draw on in actions that can trigger mechanisms; these mechanisms may interact with contextual factors to generate outcomes; and data in the 'empirical' realm can be used to test hypotheses about mechanisms in the 'real' realm. Whether or not there is sufficient contextual diversity to test such hypotheses is a contingent not a necessary feature of trials. Previous exemplars of realist evaluation have compared empirical data from intervention and control groups to test hypotheses about real mechanisms. There is no inevitable reason why randomised trials should not also be able to do so. Random allocation merely ensures the comparability of such groups without necessarily causing evaluation to lapse from a realist into a 'positivist' or 'post-positivist' paradigm. Realist trials are ontologically and epistemologically plausible. Further work is required to assess whether they are feasible and useful but such work should not be halted on spurious philosophical grounds.

  10. Special section: Statistical methods for next-generation gene sequencing data

    OpenAIRE

    Kafadar, Karen

    2012-01-01

    This issue includes six articles that develop and apply statistical methods for the analysis of gene sequencing data of different types. The methods are tailored to the different data types and, in each case, lead to biological insights not readily identified without the use of statistical methods. A common feature in all articles is the development of methods for analyzing simultaneously data of different types (e.g., genotype, phenotype, pedigree, etc.); that is, using data of one type to i...

  11. A Statistical Model for Hourly Large-Scale Wind and Photovoltaic Generation in New Locations

    DEFF Research Database (Denmark)

    Ekstrom, Jussi; Koivisto, Matti Juhani; Mellin, Ilkka

    2017-01-01

    The analysis of large-scale wind and photovoltaic (PV) energy generation is of vital importance in power systems where their penetration is high. This paper presents a modular methodology to assess the power generation and volatility of a system consisting of both PV plants (PVPs) and wind power ...

  12. Association testing for next-generation sequencing data using score statistics

    DEFF Research Database (Denmark)

    Skotte, Line; Korneliussen, Thorfinn Sand; Albrechtsen, Anders

    2012-01-01

    of genotype calls into account have been proposed; most require numerical optimization which for large-scale data is not always computationally feasible. We show that using a score statistic for the joint likelihood of observed phenotypes and observed sequencing data provides an attractive approach...... computationally feasible due to the use of score statistics. As part of the joint likelihood, we model the distribution of the phenotypes using a generalized linear model framework, which works for both quantitative and discrete phenotypes. Thus, the method presented here is applicable to case-control studies...
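    The appeal of a score statistic is that it needs only quantities computed under the null model. As a much-simplified sketch of the idea (a plain score test for a quantitative trait using expected genotype dosages, without covariates and without the paper's full joint likelihood over sequencing reads):

    ```python
    import numpy as np
    from scipy import stats

    def score_test_dosage(y, dosage):
        """Score test of association between phenotype y and E[G|reads]."""
        y = np.asarray(y, float)
        g = np.asarray(dosage, float)
        r = y - y.mean()                    # residuals under the null model
        U = np.sum(g * r)                   # score for the genetic effect
        V = r.var(ddof=1) * np.sum((g - g.mean())**2)
        T = U**2 / V
        return T, stats.chi2.sf(T, df=1)    # statistic, p-value

    rng = np.random.default_rng(4)
    dos = rng.uniform(0, 2, 1000)           # posterior mean genotypes
    pheno = 0.1 * dos + rng.normal(size=1000)
    print(score_test_dosage(pheno, dos))
    ```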

  13. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools

    NARCIS (Netherlands)

    Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke

    2007-01-01

    Ultra pure water supplied inside the fab is used in different tools at different stages of processing. Data on the particles measured in ultra pure water were compared with the defect density on wafers processed in these tools, and a statistical relation was found. Keywords: yield, defect density,

  14. A note on “Generalized superposition of two squeezed states: generation and statistical properties”

    OpenAIRE

    Avelar, A. T.; Malbouisson, J.M.C.; Baseia, B.

    2004-01-01

    A previous scheme [Physica A 280 (2003) 346] showed how to create a generalized superposition of two squeezed states for stationary fields and studied its statistical properties. Here we show how to extend this result to travelling fields.

  15. QQ-plots for assessing distributions of biomarker measurements and generating defensible summary statistics

    Science.gov (United States)

    One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...

  16. Statistical data generated through CFD to aid in the scale-up of shear sensitive processes

    Science.gov (United States)

    Khan, Irfan; Das, Shankhadeep; Cloeter, Mike; Gillis, Paul; Poindexter, Michael

    2016-11-01

    A number of industrial processes are considered shear-sensitive, where the product quality depends on achieving the right balance between mixing energy input and the resulting strain rate distribution in the process. Examples of such industrial processes are crystallization, flocculation and suspension polymerization. Scale-up of such processes are prone to a number of challenges including the optimization of mixing and shear rate distribution in the process. Computational Fluid Dynamics (CFD) can be a valuable tool to aid in the process scale-up; however for modeling purpose, the process will often need to be simplified appropriately to reduce the computational complexity. Commercial CFD tools with appropriate Lagrangian particle tracking models can be used to gather statistical data such as maximum strain rate distribution and maximum number of passes through a specific strain rate. This presentation will discuss such statistical tools and their application to a model scale-up problem.

  17. Generation of high frequency photons with sub-Poissonian statistics at consecutive interactions

    CERN Document Server

    Chirkin, A S

    2003-01-01

    The process of parametric amplification under high-frequency pumping, accompanied by optical frequency mixing in the same nonlinear crystal (NC), is considered. It is shown that if a signal wave is in a coherent state at the input of the NC, then the radiation at the signal and sum frequencies can have sub-Poissonian photon statistics at the output of the NC in the deamplification regime. The Fano factors are studied as functions of the parameters of the problem.

  18. A review and critique of the statistical methods used to generate reference values in pediatric echocardiography.

    Science.gov (United States)

    Mawad, Wadi; Drolet, Christian; Dahdah, Nagib; Dallaire, Frederic

    2013-01-01

    Several articles have proposed echocardiographic reference values in normal pediatric subjects, but adequate validation is often lacking and has not been reviewed. The aim of this study was to review published reference values in pediatric two-dimensional and M-mode echocardiography with a specific focus on the adequacy of the statistical and mathematical methods used to normalize echocardiographic measurements. All articles proposing reference values for transthoracic pediatric echocardiography were reviewed. The types of measurements, the methods of normalization, the regression models used, and the methods used to detect potential bias in proposed reference values were abstracted. The detection of residual associations, residual heteroscedasticity, and departures from the normal distribution theory predictions were specifically analyzed. Fifty-two studies met the inclusion criteria. Most authors (87%) used parametric normalization to account for body size, but their approaches were very heterogeneous. Linear regression and indexing were the most common models. Heteroscedasticity was often present but was mentioned in only 27% of studies. The absence of residual heteroscedasticity and residual associations between the normalized measurements and the independent variables were mentioned in only 9% and 22% of the studies, respectively. Only 14% of studies documented that the distribution of the residual values was appropriate for Z score calculation or that the proportion of subjects falling outside the reference range was appropriate. Statistical suitability of the proposed reference ranges was often incompletely documented. This review underlines the great need for better standardization in echocardiographic measurement normalization.

  19. A Statistical Evaluation of Multiplicative Congruential Generators with Modulus 2^31-1.

    Science.gov (United States)

    1978-12-01

    ... (Coveyou and MacPherson, 1967) and the lattice test (Marsaglia, 1971). It is well known (Marsaglia, 1968) that all linear congruential generators are... criteria suggested in Knuth (1969) and Marsaglia (1971) for a multiplier to "pass" the spectral and lattice tests, respectively, may be considerably... [table fragment omitted] Source: for RANDU, Marsaglia (1971); for Multipliers I through XVI, Hoaglin (1976).
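    The generators under evaluation are one-liners, and the lattice structure that the spectral and lattice tests quantify (successive k-tuples of outputs falling on relatively few hyperplanes) is a property of exactly this recurrence. A minimal sketch, using the classic multiplier 16807 as one example of the multipliers such a study compares:

    ```python
    # Multiplicative congruential generator x_{n+1} = a * x_n mod (2**31 - 1).
    M = 2**31 - 1

    def mcg(seed, a=16807):
        """Yield uniforms on (0, 1) from a Lehmer-style MCG."""
        x = seed
        while True:
            x = (a * x) % M
            yield x / M

    g = mcg(seed=12345)
    print([round(next(g), 6) for _ in range(5)])
    ```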

  20. Generation of Synthetic Transcriptome Data with Defined Statistical Properties for the Development and Testing of New Analysis Methods

    Institute of Scientific and Technical Information of China (English)

    Guillaume Brysbaert; Sebastian Noth; Arndt Benecke

    2007-01-01

    We have previously developed a combined signal/variance distribution model that accounts for the particular statistical properties of datasets generated on the Applied Biosystems AB1700 transcriptome system. Here we show that this model can be efficiently used to generate synthetic datasets with statistical properties virtually identical to those of the actual data by aid of the JAVA application ace.map creator 1.0 that we have developed. The fundamentally different structure of AB1700 transcriptome profiles requires re-evaluation, adaptation, or even redevelopment of many of the standard microarray analysis methods in order to avoid misinterpretation of the data on the one hand, and to draw full benefit from their increased specificity and sensitivity on the other hand. Our composite data model and the ace.map creator 1.0 application thereby not only present proof of the correctness of our parameter estimation, but also provide a tool for the generation of synthetic test data that will be useful for further development and testing of analysis methods.

  1. The Statistical Evolution of Multiple Generations of Oxidation Products in the Photochemical Aging of Chemically Reduced Organic Aerosol

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Kevin R.; Smith, Jared D.; Kessler, Sean; Kroll, Jesse H.

    2011-10-03

    The heterogeneous reactions of hydroxyl radicals (OH) with squalane and bis(2-ethylhexyl) sebacate (BES) particles are used as model systems to examine how distributions of reaction products evolve during the oxidation of chemically reduced organic aerosol. A kinetic model of multigenerational chemistry, which is compared to previously measured (squalane) and new (BES) experimental data, reveals that it is the statistical mixtures of different generations of oxidation products that control the average particle mass and elemental composition during the reaction. The model suggests that more highly oxidized reaction products, although initially formed with low probability, play a large role in the production of gas phase reaction products. In general, these results highlight the importance of considering atmospheric oxidation as a statistical process, further suggesting that the underlying distribution of molecules could play important roles in aerosol formation as well as in the evolution of key physicochemical properties such as volatility and hygroscopicity.
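    The statistical picture has a compact form: if each OH reaction advances a molecule one generation at rate k[OH], the generation populations follow a Poisson distribution in the accumulated OH exposure. The sketch below illustrates that mixing-of-generations effect only; it is not the paper's full kinetic model, which also tracks per-generation mass and elemental composition.

    ```python
    import numpy as np

    def generation_fractions(tau, n_max=10):
        """Fraction of molecules in oxidation generation n after an OH
        exposure tau = k*[OH]*t, under sequential first-order steps
        (a Poisson distribution over generations)."""
        n = np.arange(n_max + 1)
        log_fact = np.cumsum(np.log(np.maximum(n, 1)))   # log(n!)
        return np.exp(n * np.log(tau) - tau - log_fact)

    for tau in (0.5, 2.0, 5.0):
        print(tau, np.round(generation_fractions(tau)[:5], 3))
    ```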

  2. Statistical mechanics of sum frequency generation spectroscopy for the liquid-vapor interface of dilute aqueous salt solutions

    Energy Technology Data Exchange (ETDEWEB)

    Noah-Vanhoucke, Joyce; Smith, Jared D.; Geissler, Phillip L.

    2009-01-02

    We demonstrate a theoretical description of vibrational sum frequency generation (SFG) at the boundary of aqueous electrolyte solutions. This approach identifies and exploits a simple relationship between SFG lineshapes and the statistics of molecular orientation and electric field. Our computer simulations indicate that orientational averages governing SFG susceptibility do not manifest ion-specific shifts in local electric field, but instead, ion-induced polarization of subsurface layers. Counterbalancing effects are obtained for monovalent anions and cations at the same depth. Ions held at different depths induce an imbalanced polarization, suggesting that ion-specific effects can arise from weak, long ranged influence on solvent organization.

  3. Statistical mechanics of sum frequency generation spectroscopy for the liquid-vapor interface of dilute aqueous salt solutions

    Science.gov (United States)

    Noah-Vanhoucke, Joyce; Smith, Jared D.; Geissler, Phillip L.

    2009-02-01

    We demonstrate a theoretical description of vibrational sum frequency generation (SFG) at the boundary of aqueous electrolyte solutions. This approach identifies and exploits a simple relationship between SFG lineshapes and the statistics of molecular orientation and electric field. Our computer simulations indicate that orientational averages governing SFG susceptibility do not manifest ion-specific shifts in local electric field, but instead, ion-induced polarization of subsurface layers. Counterbalancing effects are obtained for monovalent anions and cations at the same depth. Ions held at different depths induce an imbalanced polarization, suggesting that ion-specific effects can arise from weak, long-ranged influence on solvent organization.

  4. Statistical characteristic in time-domain of direct current corona-generated audible noise from conductor in corona cage

    Science.gov (United States)

    Li, Xuebao; Cui, Xiang; Lu, Tiebing; Ma, Wenzuo; Bian, Xingming; Wang, Donglai; Hiziroglu, Huseyin

    2016-03-01

    The corona-generated audible noise (AN) has become one of the decisive factors in the design of high voltage direct current (HVDC) transmission lines. The AN from transmission lines can be attributed to sound pressure pulses generated by the multiple corona sources formed on the conductors, i.e., the transmission lines. In this paper, the detailed time-domain characteristics of the sound pressure pulses generated by DC corona discharges formed over the surfaces of stranded conductors are investigated systematically in a laboratory setting using a corona cage structure. The amplitudes of the sound pressure pulses and their time intervals are extracted by observing a direct correlation between corona current pulses and corona-generated sound pressure pulses. Based on the statistical characteristics, a stochastic model is presented for simulating the sound pressure pulses due to DC corona discharges occurring on conductors. The proposed stochastic model is validated by comparing the calculated and measured A-weighted sound pressure level (SPL). The proposed model is then used to analyze the influence of the pulse amplitudes and pulse rate on the SPL. Furthermore, a mathematical relationship is found between the SPL and the conductor diameter, electric field, and radial distance.

  5. Statistical characteristic in time-domain of direct current corona-generated audible noise from conductor in corona cage

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xuebao, E-mail: lxb08357x@ncepu.edu.cn; Cui, Xiang, E-mail: x.cui@ncepu.edu.cn; Ma, Wenzuo; Bian, Xingming; Wang, Donglai [State Key Laboratory of Alternate Electrical Power System with Renewable Energy Sources, North China Electric Power University, Beijing 102206 (China); Lu, Tiebing, E-mail: tiebinglu@ncepu.edu.cn [Beijing Key Laboratory of High Voltage and EMC, North China Electric Power University, Beijing 102206 (China); Hiziroglu, Huseyin [Department of Electrical and Computer Engineering, Kettering University, Flint, Michigan 48504 (United States)

    2016-03-15

    The corona-generated audible noise (AN) has become one of the decisive factors in the design of high voltage direct current (HVDC) transmission lines. The AN from transmission lines can be attributed to sound pressure pulses generated by the multiple corona sources formed on the conductors, i.e., the transmission lines. In this paper, the detailed time-domain characteristics of the sound pressure pulses generated by DC corona discharges formed over the surfaces of stranded conductors are investigated systematically in a laboratory setting using a corona cage structure. The amplitudes of the sound pressure pulses and their time intervals are extracted by observing a direct correlation between corona current pulses and corona-generated sound pressure pulses. Based on the statistical characteristics, a stochastic model is presented for simulating the sound pressure pulses due to DC corona discharges occurring on conductors. The proposed stochastic model is validated by comparing the calculated and measured A-weighted sound pressure level (SPL). The proposed model is then used to analyze the influence of the pulse amplitudes and pulse rate on the SPL. Furthermore, a mathematical relationship is found between the SPL and the conductor diameter, electric field, and radial distance.

  6. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    Science.gov (United States)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with a better spatial resolution that matches the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex data of GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process and helps scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to process their statistical bias corrections relative to reference observation data. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia Third National Communication (TNC) project activities.
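    One widely used form of statistical bias correction is empirical quantile mapping, sketched below as an illustration of the class of correction such a tool performs (SiBiaS's exact algorithm and options may differ; the gamma-distributed series are placeholders).

    ```python
    import numpy as np

    def quantile_map(gcm_hist, obs, gcm_future):
        """Map future GCM values through the historical GCM CDF and the
        inverse observed CDF (empirical quantile mapping)."""
        q = np.linspace(0.01, 0.99, 99)
        gcm_q, obs_q = np.quantile(gcm_hist, q), np.quantile(obs, q)
        cdf_vals = np.interp(gcm_future, gcm_q, q)
        return np.interp(cdf_vals, q, obs_q)

    rng = np.random.default_rng(5)
    obs = rng.gamma(2.0, 4.0, 10950)        # observed daily series
    gcm_hist = rng.gamma(2.5, 3.0, 10950)   # biased GCM baseline
    gcm_fut = rng.gamma(2.5, 3.3, 10950)    # GCM projection
    corrected = quantile_map(gcm_hist, obs, gcm_fut)
    ```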

  7. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  8. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  9. Statistical analysis plan for the WOMAN-ETAPlaT study: Effect of tranexamic acid on platelet function and thrombin generation.

    Science.gov (United States)

    Dallaku, Kastriot; Shakur, Haleema; Edwards, Phil; Beaumont, Danielle; Roberts, Ian; Huque, Sumaya; Delius, Maria; Mansmann, Ulrich

    2016-12-15

    Background. Postpartum haemorrhage (PPH) is a potentially life-threatening complication for women, and the leading cause of maternal mortality. Tranexamic acid (TXA) is an antifibrinolytic used worldwide to treat uterine haemorrhage and to reduce blood loss in general surgery. TXA may have effects on thrombin generation, platelet function and coagulation factors as a result of its inhibition of plasmin. Methods. WOMAN ETAPlaT is a sub-study of the World Maternal Antifibrinolytic trial (WOMAN trial). All adult women clinically diagnosed with PPH after a vaginal delivery or caesarean section are eligible for inclusion in the study. Blood samples will be collected at baseline and 30 minutes after the first dose of study treatment is given. Platelet function will be evaluated in whole blood immediately after sampling with Multiplate® tests (ADPtest and TRAPtest). Thrombin generation, fibrinogen, D-dimer, and coagulation factors vW, V and VIII will be analysed using platelet poor plasma. Results. Recruitment to WOMAN ETAPlaT started on 04 November 2013 and closed on 13 January 2015; during this time 188 patients were recruited. The final participant follow-up was completed on 04 March 2015. This article introduces the statistical analysis plan for the study, without reference to unblinded data. Conclusion. The data from this study will provide evidence for the effect of TXA on thrombin generation, platelet function and coagulation factors in women with PPH. Trial registration: ClinicalTrials.gov Identifier: NCT00872469; ISRCTN76912190.

  10. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
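    The pseudo-predator construction itself is a short bootstrap routine. A minimal sketch under simplifying assumptions follows (Dirichlet-distributed placeholder prey signatures, no calibration coefficients, and a fixed rather than objectively chosen bootstrap size, which is precisely the quantity the paper's algorithm sets):

    ```python
    import numpy as np

    def pseudo_predator(prey_sigs, diet, n_boot, rng=np.random.default_rng(6)):
        """Bootstrap prey signatures in proportion to a known diet and
        average them into one pseudo-predator signature.
        prey_sigs: dict prey -> (n_animals, n_fatty_acids) array
        diet:      dict prey -> diet proportion (sums to 1)"""
        parts = []
        for prey, prop in diet.items():
            sigs = prey_sigs[prey]
            k = int(round(prop * n_boot))
            parts.append(sigs[rng.integers(0, len(sigs), size=k)])
        pooled = np.vstack(parts).mean(axis=0)
        return pooled / pooled.sum()            # renormalize to proportions

    rng = np.random.default_rng(7)
    prey = {"seal": rng.dirichlet(np.ones(10), 40),
            "cod": rng.dirichlet(np.ones(10), 30)}
    print(pseudo_predator(prey, {"seal": 0.7, "cod": 0.3}, n_boot=100))
    ```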

  11. Generation of a statistical shape model with probabilistic point correspondences and the expectation maximization- iterative closest point algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Heike [Institut National de Recherche en Informatique et en Automatique (INRIA), Asclepios Project, Sophia Antipolis (France); University Medical Center Hamburg-Eppendorf, Department of Medical Informatics, Hamburg (Germany); Pennec, Xavier; Ayache, Nicholas [Institut National de Recherche en Informatique et en Automatique (INRIA), Asclepios Project, Sophia Antipolis (France); Ehrhardt, Jan; Handels, Heinz [University Medical Center Hamburg-Eppendorf, Department of Medical Informatics, Hamburg (Germany)

    2008-03-15

    ;specificity', the estimates were very satisfactory. The novel algorithm for building a generative statistical shape model (gSSM) does not need one-to-one point correspondences but relies solely on point correspondence probabilities for the computation of mean shape and eigenmodes. It is well-suited for shape analysis on unstructured point sets. (orig.)

  12. RAMESES publication standards: realist syntheses

    Directory of Open Access Journals (Sweden)

    Wong Geoff

    2013-01-01

    Full Text Available Abstract Background There is growing interest in realist synthesis as an alternative systematic review method. This approach offers the potential to expand the knowledge base in policy-relevant areas - for example, by explaining the success, failure or mixed fortunes of complex interventions. No previous publication standards exist for reporting realist syntheses. This standard was developed as part of the RAMESES (Realist And MEta-narrative Evidence Syntheses: Evolving Standards project. The project's aim is to produce preliminary publication standards for realist systematic reviews. Methods We (a collated and summarized existing literature on the principles of good practice in realist syntheses; (b considered the extent to which these principles had been followed by published syntheses, thereby identifying how rigor may be lost and how existing methods could be improved; (c used a three-round online Delphi method with an interdisciplinary panel of national and international experts in evidence synthesis, realist research, policy and/or publishing to produce and iteratively refine a draft set of methodological steps and publication standards; (d provided real-time support to ongoing realist syntheses and the open-access RAMESES online discussion list so as to capture problems and questions as they arose; and (e synthesized expert input, evidence syntheses and real-time problem analysis into a definitive set of standards. Results We identified 35 published realist syntheses, provided real-time support to 9 on-going syntheses and captured questions raised in the RAMESES discussion list. Through analysis and discussion within the project team, we summarized the published literature and common questions and challenges into briefing materials for the Delphi panel, comprising 37 members. Within three rounds this panel had reached consensus on 19 key publication standards, with an overall response rate of 91%. Conclusion This project used multiple

  13. A Self-Organizing Map-Based Approach to Generating Reduced-Size, Statistically Similar Climate Datasets

    Science.gov (United States)

    Cabell, R.; Delle Monache, L.; Alessandrini, S.; Rodriguez, L.

    2015-12-01

    Climate-based studies require large amounts of data in order to produce accurate and reliable results. Many of these studies have used 30-plus year data sets in order to produce stable and high-quality results, and as a result, many such data sets are available, generally in the form of global reanalyses. While the analysis of these data lead to high-fidelity results, its processing can be very computationally expensive. This computational burden prevents the utilization of these data sets for certain applications, e.g., when rapid response is needed in crisis management and disaster planning scenarios resulting from release of toxic material in the atmosphere. We have developed a methodology to reduce large climate datasets to more manageable sizes while retaining statistically similar results when used to produce ensembles of possible outcomes. We do this by employing a Self-Organizing Map (SOM) algorithm to analyze general patterns of meteorological fields over a regional domain of interest to produce a small set of "typical days" with which to generate the model ensemble. The SOM algorithm takes as input a set of vectors and generates a 2D map of representative vectors deemed most similar to the input set and to each other. Input predictors are selected that are correlated with the model output, which in our case is an Atmospheric Transport and Dispersion (T&D) model that is highly dependent on surface winds and boundary layer depth. To choose a subset of "typical days," each input day is assigned to its closest SOM map node vector and then ranked by distance. Each node vector is treated as a distribution and days are sampled from them by percentile. Using a 30-node SOM, with sampling every 20th percentile, we have been able to reduce 30 years of the Climate Forecast System Reanalysis (CFSR) data for the month of October to 150 "typical days." To estimate the skill of this approach, the "Measure of Effectiveness" (MOE) metric is used to compare area and overlap
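    The node-and-percentile sampling at the core of the method can be sketched compactly. The fragment below trains a tiny self-organizing map from scratch and then draws days by percentile of distance within each node; it is an illustrative reimplementation with placeholder data, not the study's code or configuration.

    ```python
    import numpy as np

    def train_som(data, rows=5, cols=6, iters=5000, rng=np.random.default_rng(9)):
        """Small SOM: rectangular grid, Gaussian neighbourhood, decaying rates."""
        n, d = data.shape
        grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
        W = data[rng.integers(0, n, rows * cols)].copy()   # init from random days
        for t in range(iters):
            lr = 0.5 * (1 - t / iters)
            sigma = max(0.5, 3.0 * (1 - t / iters))
            x = data[rng.integers(0, n)]
            bmu = np.argmin(((W - x)**2).sum(axis=1))      # best-matching unit
            h = np.exp(-((grid - grid[bmu])**2).sum(axis=1) / (2 * sigma**2))
            W += lr * h[:, None] * (x - W)
        return W

    def typical_days(data, W, percentiles=(10, 30, 50, 70, 90)):
        """Rank each node's members by distance and sample by percentile."""
        nodes = np.argmin(((data[:, None, :] - W[None])**2).sum(-1), axis=1)
        picks = []
        for j in range(len(W)):
            members = np.where(nodes == j)[0]
            if len(members) == 0:
                continue
            order = members[np.argsort(((data[members] - W[j])**2).sum(1))]
            idx = (np.array(percentiles) / 100 * (len(order) - 1)).astype(int)
            picks.extend(order[idx])
        return sorted(set(picks))

    days = np.random.default_rng(10).normal(size=(900, 8))  # placeholder fields
    print(len(typical_days(days, train_som(days))))
    ```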

  14. Realistic face modeling based on multiple deformations

    Institute of Scientific and Technical Information of China (English)

    GONG Xun; WANG Guo-yin

    2007-01-01

    On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared to the conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Subsequently, the overfitting problem can be alleviated based on the proposed multiple deformation model. Finally, by keeping the main features, the texture generated is realistic. The comparison results show that this novel method outperforms the existing methods by using ground truth data and that realistic 3D faces can be recovered efficiently from a single photograph.

  15. Probabilistic Generative Models for the Statistical Inference of Unobserved Paleoceanographic Events: Application to Stratigraphic Alignment for Inference of Ages

    Science.gov (United States)

    Lawrence, C.; Lin, L.; Lisiecki, L. E.; Khider, D.

    2014-12-01

    The broad goal of this presentation is to demonstrate the utility of probabilistic generative models to capture investigators' knowledge of geological processes and proxy data to draw statistical inferences about unobserved paleoclimatological events. We illustrate how this approach forces investigators to be explicit about their assumptions, and about how probability theory yields results that are a mathematical consequence of these assumptions and the data. We illustrate these ideas with the HMM-Match model that infers common times of sediment deposition in two records and the uncertainty in these inferences in the form of confidence bands. HMM-Match models the sedimentation processes that led to proxy data measured in marine sediment cores. This Bayesian model has three components: 1) a generative probabilistic model that proceeds from the underlying geophysical and geochemical events, specifically the sedimentation events, to the generation of the proxy data (Sedimentation ---> Proxy Data); 2) a recursive algorithm that reverses the logic of the model to yield inference about the unobserved sedimentation events and the associated alignment of the records based on proxy data (Proxy Data ---> Sedimentation (Alignment)); 3) an expectation maximization algorithm for estimating two unknown parameters. We applied HMM-Match to align 35 Late Pleistocene records to a global benthic d18O stack and found that the mean width of 95% confidence intervals varies between 3-23 kyr depending on the resolution and noisiness of the core's d18O signal. Confidence bands within individual cores also vary greatly, ranging from ~0 to >40 kyr. Results from this algorithm will allow researchers to examine the robustness of their conclusions with respect to alignment uncertainty. Figure 1 shows the confidence bands for one low-resolution record.
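
    The generative direction of such a model (component 1 above, Sedimentation ---> Proxy Data) can be illustrated with a toy forward simulator. This sketch is only a schematic stand-in: the sinusoidal "signal", the exponential/gamma choices, and all parameter names are our assumptions, not the HMM-Match specification.

```python
import numpy as np

def simulate_core(n_events, mean_sed_rate, noise_sd, rng=None):
    """Forward (generative) sketch: sample deposition events, accumulate
    depth-age pairs, and emit noisy proxy values tied to a known age signal."""
    rng = np.random.default_rng(rng)
    ages = np.cumsum(rng.exponential(1.0, n_events))           # kyr between events
    depths = np.cumsum(rng.gamma(2.0, mean_sed_rate / 2.0, n_events))  # cm per event
    signal = np.sin(2 * np.pi * ages / 100.0)                  # stand-in for the d18O target
    proxy = signal + rng.normal(0, noise_sd, n_events)         # measurement noise
    return depths, ages, proxy
```

    Inference then runs the other way: given `depths` and `proxy` alone, recover the unobserved `ages` and their uncertainty, which is what the recursive algorithm and EM step of the actual model provide.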

  16. Cellular automaton for realistic modelling of landslides

    CERN Document Server

    Segrè, E; Enrico Segre; Chiara Deangeli

    1994-01-01

    We develop a numerical model for the simulation of debris flow in landslides over a complex three-dimensional topography. The model is based on a lattice, in which debris can be transferred among nearest neighbours according to established empirical relationships for granular flows. The model is validated by comparing its results with reported field data. Our model is in fact a realistic elaboration of simpler "sandpile automata", which have in recent years been studied as supposedly paradigmatic of "self-organized criticality". Statistics and scaling properties of the simulation are examined, showing that the model has an intermittent behaviour.
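
    A minimal lattice relaxation rule in the spirit of such "sandpile-like" automata might look like the sketch below. The critical-slope rule, the transfer fraction, and the periodic boundaries are simplifying assumptions for illustration, not the paper's empirical granular-flow relations.

```python
import numpy as np

def debris_flow_step(h, z, crit_slope=0.5, frac=0.25):
    """One relaxation step on a lattice: where the local surface slope towards a
    neighbour exceeds a critical value, transfer a fraction of the excess debris.
    h: debris depth per cell; z: fixed bedrock topography (same shape)."""
    surf = z + h                                   # topography + debris depth
    out = np.zeros_like(h)
    inc = np.zeros_like(h)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # nearest neighbours
        drop = surf - np.roll(surf, (-dr, -dc), axis=(0, 1))  # height above neighbour
        move = frac * np.clip(drop - crit_slope, 0, None)
        move = np.minimum(move, h - out)           # cannot move more debris than present
        out += move
        inc += np.roll(move, (dr, dc), axis=(0, 1))  # debris arrives at the neighbour
    return h - out + inc                           # note: np.roll wraps (periodic edges)
```

    Iterating this step after loading debris onto the grid produces the avalanche-like, intermittent relaxation behaviour that the abstract alludes to.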

  17. Replicate This! Creating Individual-Level Data from Summary Statistics Using R

    Science.gov (United States)

    Morse, Brendan J.

    2013-01-01

    Incorporating realistic data and research examples into quantitative (e.g., statistics and research methods) courses has been widely recommended for enhancing student engagement and comprehension. One way to achieve these ends is to use a data generator to emulate the data in published research articles. "MorseGen" is a free data generator that…
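
    The general idea of emulating published data from summary statistics can be sketched in a few lines. This is not MorseGen itself, just the standard Cholesky construction; the generated sample's moments match the targets only in expectation, differing by ordinary sampling error.

```python
import numpy as np

def data_from_summary(means, sds, corr, n, rng=None):
    """Generate n individual-level records with target means, SDs, and a
    correlation matrix, via a Cholesky transform of standard normal draws."""
    rng = np.random.default_rng(rng)
    z = rng.standard_normal((n, len(means)))
    L = np.linalg.cholesky(np.asarray(corr, dtype=float))
    return np.asarray(means) + (z @ L.T) * np.asarray(sds)

# e.g., two variables with r = .45, as might be reported in an article's Table 1
X = data_from_summary(means=[3.2, 5.1], sds=[0.8, 1.1],
                      corr=[[1.0, 0.45], [0.45, 1.0]], n=250)
```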

  18. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    Realists about are arrived at by any inferential route which eschews causes (§3), and nor is there any direct pressure for Scientific Realists to change their inferential methods (§4). We suggest that in order to maintain inferential parity with Scientific Realism, proponents of EIA need to give......Enhanced Indispensability Arguments (EIA) claim that Scientific Realists are committed to the existence of mathematical entities due to their reliance on Inference to the Best Explanation (IBE). Our central question concerns this purported parity of reasoning: do people who defend the EIA make...... an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  19. Realistic Visualization of Virtual Views

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Computer Graphics allows us today to visualize in real-time innumerable and amazing scenarios with no limits on viewpoint and viewing direction. However, to design accurate object models and to simulate all the physical phenomena occurring in analogous real situations often represents a job...... that can be impractical and sometimes impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently...... developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical...

  20. Photo-Realistic Image Synthesis and Virtual Cinematography

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is strictly related to the growing popularity of virtual reality and the spread of its applications, among which are virtual photography and cinematography. The use of computer generated......: the augmented actor. Fundamental concepts and examples of methods proposed for realistic view synthesis based on the transfer of photo-realism from reference photographs to novel views will be presented. The application of methods for realistic image synthesis to virtual cinematography will also be illustrated...

  2. Predicting future wind power generation and power demand in France using statistical downscaling methods developed for hydropower applications

    Science.gov (United States)

    Najac, Julien

    2014-05-01

    For many applications in the energy sector, it is crucial to have downscaling methods available that preserve space-time dependences, at very fine spatial and temporal scales, between variables affecting electricity production and consumption. For climate change impact studies, this is an extremely difficult task, particularly as reliable climate information is usually found at regional and monthly scales at best, although many industry oriented applications need further refined information (hydropower production model, wind energy production model, power demand model, power balance model…). Here we thus propose to investigate the question of how to predict and quantify the influence of climate change on climate-related energies and the energy demand. To do so, statistical downscaling methods originally developed for studying climate change impacts on hydrological cycles in France (and which have been used to compute hydropower production in France) have been applied for predicting wind power generation in France and an air temperature indicator commonly used for predicting power demand in France. We show that those methods provide satisfactory results over the recent past and apply this methodology to several climate model runs from the ENSEMBLES project.

  3. Quantum coding theory with realistic physical constraints

    CERN Document Server

    Yoshida, Beni

    2010-01-01

    The following open problems, which concern a fundamental limit on coding properties of quantum codes with realistic physical constraints, are analyzed and partially answered here: (a) the upper bound on code distances of quantum error-correcting codes with geometrically local generators, (b) the feasibility of a self-correcting quantum memory. To investigate these problems, we study stabilizer codes supported by local interaction terms with translation and scale symmetries on a $D$-dimensional lattice. Our analysis uses the notion of topology emerging in geometric shapes of logical operators, which sheds a surprising new light on theory of quantum codes with physical constraints.

  4. Time management: a realistic approach.

    Science.gov (United States)

    Jackson, Valerie P

    2009-06-01

    Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.

  5. Assessment and realistic mathematics education

    NARCIS (Netherlands)

    Heuvel-Panhuizen, M.H.A.M. van den

    1996-01-01

    This book describes the consequences of Realistic Mathematics Education (RME) for assessing students’ understanding of mathematics in primary school. RME is the Dutch answer to the worldwide need to reform mathematics education. Changed ideas about mathematics as a school subject, its goals, and ideas about teaching and learning mathematics require new forms of assessment.

  6. Probabilistic Bisimulation for Realistic Schedulers

    DEFF Research Database (Denmark)

    Eisentraut, Christian; Godskesen, Jens Christian; Hermanns, Holger

    2015-01-01

    This holds in the classical context of arbitrary schedulers, but it has been argued that this class of schedulers is unrealistically powerful. This paper studies a strictly coarser notion of bisimilarity, which still enjoys these properties in the context of realistic subclasses of schedulers: Trace

  8. True-to-Life? Realistic Fiction.

    Science.gov (United States)

    Jordan, Anne Devereaux

    1995-01-01

    Suggests that modern realistic fiction for young readers is intensely moralistic and directive at the spoken and unspoken behest of the adults who write, select, and buy that literature. Discusses moral tales, early realistic fiction, modern realistic fiction, and choosing realistic fiction. (RS)

  10. Realist Criminology and its Discontents

    Directory of Open Access Journals (Sweden)

    Simon Winlow

    2016-09-01

    Full Text Available Critical criminology must move beyond twentieth-century empiricist and idealist paradigms because the concepts and research programmes influenced by these paradigms are falling into obsolescence. Roger Matthews’ recent work firmly advocates this position and helps to set the ball rolling. Here we argue that Matthews’ attempt to use critical realist thought to move Left Realism towards an advanced position can help to put criminology on a sound new footing. However, before this becomes possible numerous philosophical and theoretical issues must be ironed out. Most importantly, critical criminology must avoid political pragmatism and adopt a more critical stance towards consumer culture’s spectacle. A searching analysis of these issues suggests that, ultimately, criminology is weighed down with obsolete thinking to such an extent that to remain intellectually relevant it must move beyond both Left Realism and Critical Realism to construct a new ultra-realist position.

  11. Assessment and realistic mathematics education

    OpenAIRE

    Heuvel-Panhuizen, M.H.A.M. van den

    1996-01-01

    This book describes the consequences of Realistic Mathematics Education (RME) for assessing students’ understanding of mathematics in primary school. RME is the Dutch answer to the worldwide need to reform mathematics education. Changed ideas about mathematics as a school subject, its goals, ideas about teaching and learning mathematics, require new forms of assessment. Within RME this means a preference for observation and individual interviews. However, written tests have not been abandoned...

  12. A Revelation: Quantum-Statistics and Classical-Statistics are Analytic-Geometry Conic-Sections and Numbers/Functions: Euler, Riemann, Bernoulli Generating-Functions: Conics to Numbers/Functions Deep Subtle Connections

    Science.gov (United States)

    Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Quantum-statistics dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS) statistics, respectively with contact-repulsion/non-condensation (FDCR) versus attraction/condensation (BEC), are manifestly demonstrated by Taylor expansion of only their denominator exponential, identifying both as Descartes analytic-geometry conic sections: FDQS as ellipse (homotopic to the rectangular FDQS distribution function), via a Maxwell-Boltzmann classical-statistics (MBCS) to parabola morphism, versus BEQS to hyperbola (Archimedes' hyperbolicity inevitability); and likewise as generating functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler numbers/functions (via the Riemann zeta-function; domination of quantum statistics: [Pathria, Statistical Mechanics; Huang, Statistical Mechanics]) versus Bernoulli numbers/functions. Much can be learned about statistical physics from Euler numbers/functions via the Riemann zeta-function(s) versus Bernoulli numbers/functions [Conway-Guy, Book of Numbers], and about Euler numbers/functions versus Bernoulli numbers/functions via the Riemann zeta-function(s) morphism, and vice versa. Example: a physics proof of the Riemann hypothesis, partly as BEQS BEC/BEA.

  13. Realistic model for radiation-matter interaction

    CERN Document Server

    Pakula, R A

    2004-01-01

    This paper presents a realistic model that describes radiation-matter interactions. This is achieved by a generalization of first quantization, where the Maxwell equations are interpreted as the electromagnetic component of the Schrodinger equation. This picture is complemented by the consideration of electrons and photons as real particles in three-dimensional space, following guiding conditions derived from the particle-wave-functions to which they are associated. The guiding condition for the electron is taken from Bohmian mechanics, while the photon velocity is defined as the ratio between the Poynting vector and the electromagnetic energy density. The case of many particles is considered, taking into account their statistical properties. The formalism is applied to a two level system, providing an intuitive description for spontaneous emission, Lamb shift, scattering, absorption, dispersion, resonance fluorescence and vacuum fields. This model describes quantum jumps by the entanglement between the photo...

  14. Statistical properties of superimposed stationary spike trains.

    Science.gov (United States)

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
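
    A sketch of such a generator (a Poisson process with dead-time, plus pooling of independent trains) is below. Parameter names are ours, and the rate inversion assumes rate * dead_time < 1; the paper's algorithms for generating superpositions directly are more efficient than this naive per-train construction.

```python
import numpy as np

def ppd_spike_train(rate, dead_time, duration, rng=None):
    """PPD train: each inter-spike interval is dead_time + an exponential wait.
    The exponential rate is chosen so the mean output rate equals `rate`."""
    rng = np.random.default_rng(rng)
    lam = rate / (1.0 - rate * dead_time)   # requires rate * dead_time < 1
    t, spikes = 0.0, []
    while True:
        t += dead_time + rng.exponential(1.0 / lam)
        if t >= duration:
            return np.array(spikes)
        spikes.append(t)

def superposition(n_trains, rate, dead_time, duration, rng=None):
    """Pool n independent PPD trains into one sorted spike train."""
    rng = np.random.default_rng(rng)
    pooled = np.concatenate([ppd_spike_train(rate, dead_time, duration, rng)
                             for _ in range(n_trains)])
    return np.sort(pooled)
```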

  15. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    Science.gov (United States)

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze their collective frequency differences between cases and controls shift the current variant-by-variant analysis paradigm for GWAS of common variants to the collective test of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T(2), collapsing method, combined multivariate and collapsing (CMC) method, individual χ(2) test, weighted-sum statistic, and variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
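
    For orientation, one of the comparator methods above, the collapsing (burden) test, is simple enough to sketch. The genotype matrices are assumed (our assumption) to be coded as rare-allele counts per individual x variant; this is a baseline comparator, not the paper's proposed statistic.

```python
import numpy as np
from scipy import stats

def collapsing_test(geno_cases, geno_ctrls):
    """Collapsing test: code each individual as carrier (any rare allele across
    the region) vs non-carrier, then compare carrier proportions in a 2x2 table."""
    carriers_case = int((geno_cases.sum(axis=1) > 0).sum())
    carriers_ctrl = int((geno_ctrls.sum(axis=1) > 0).sum())
    table = [[carriers_case, len(geno_cases) - carriers_case],
             [carriers_ctrl, len(geno_ctrls) - carriers_ctrl]]
    chi2, p, _, _ = stats.chi2_contingency(table)
    return chi2, p
```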

  16. Analysis/plot generation code with significance levels computed using Kolmogorov-Smirnov statistics valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistics is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of the TERPED code described herein contains algorithms to allow computation of the KS statistic and significance level for data sets of, if the user wishes, as few as three points. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
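
    With modern tools, an exact KS significance level for tiny samples is available off the shelf; e.g., in SciPy (the `method` keyword requires a recent release, roughly 1.8+). Note that, as in the classical KS setting, the distribution must be fully specified a priori; estimating its parameters from the same data would call for a Lilliefors-type correction, which this sketch does not apply.

```python
import numpy as np
from scipy import stats

data = np.array([0.42, -1.31, 0.88])              # as few as three points
res = stats.kstest(data, "norm", method="exact")  # exact, not asymptotic, p-value
print(res.statistic, res.pvalue)
```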

  17. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, as reflected in the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposed a new, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems. PMID:25268480

  19. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

    Full Text Available Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, as reflected in the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor rendering AR systems require sophisticated effects such as: shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposed a new, unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour was generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus solving the problem of realistic AR systems.

  20. Simulating realistic enough patient records.

    Science.gov (United States)

    Cunningham, James; Ainsworth, John

    2015-01-01

    Information systems for storing, managing and manipulating electronic medical records must place an emphasis on maintaining the privacy and security of those records. Though the design, development and testing of such systems also requires the use of data, the developers of these systems, who are rarely also their final end users, are unlikely to have ethical or governance approval to use real data. Alternative test data is commonly either randomly produced or taken from carefully anonymised subsets of records. In both cases there are potential shortcomings that can impact on the quality of the product being developed. We have addressed these shortcomings with a tool and methodology for efficiently simulating large amounts of realistic enough electronic patient records which can underpin the development of data-centric electronic healthcare systems.
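
    The flavour of "realistic enough" synthetic records can be conveyed with a toy generator. This is only an illustration of the general idea, not the authors' tool: the field names, value pools, and distributions below are invented for the example, and real realism would require plausible correlations between fields.

```python
import random
import datetime

FIRST = ["Alice", "Omar", "Wei", "Priya", "John"]   # illustrative name pools
LAST = ["Smith", "Khan", "Li", "Patel", "Jones"]
ICD10 = ["E11.9", "I10", "J45.909", "K21.9"]        # a few common ICD-10 codes

def fake_record(rng=None):
    """One synthetic patient record for development and testing purposes."""
    rng = rng or random.Random()
    dob = datetime.date(1930, 1, 1) + datetime.timedelta(days=rng.randrange(30000))
    return {
        "patient_id": f"P{rng.randrange(10**8):08d}",
        "name": f"{rng.choice(FIRST)} {rng.choice(LAST)}",
        "dob": dob.isoformat(),
        "diagnoses": rng.sample(ICD10, rng.randint(1, 3)),
        "systolic_bp": round(rng.gauss(125, 15)),
    }

records = [fake_record(random.Random(i)) for i in range(1000)]  # bulk, reproducible
```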

  1. Realist RCTs of complex interventions - an oxymoron.

    Science.gov (United States)

    Marchal, Bruno; Westhorp, Gill; Wong, Geoff; Van Belle, Sara; Greenhalgh, Trisha; Kegels, Guy; Pawson, Ray

    2013-10-01

    Bonell et al. discuss the challenges of carrying out randomised controlled trials (RCTs) to evaluate complex interventions in public health, and consider the role of realist evaluation in enhancing this design (Bonell, Fletcher, Morton, Lorenc, & Moore, 2012). They argue for a "synergistic, rather than oppositional relationship between realist and randomised evaluation" and that "it is possible to benefit from the insights provided by realist evaluation without relinquishing the RCT as the best means of examining intervention causality." We present counter-arguments to their analysis of realist evaluation and their recommendations for realist RCTs. Bonell et al. are right to question whether and how (quasi-)experimental designs can be improved to better evaluate complex public health interventions. However, the paper does not explain how a research design that is fundamentally built upon a positivist ontological and epistemological position can be meaningfully adapted to allow it to be used from within a realist paradigm. The recommendations for "realist RCTs" do not sufficiently take into account important elements of complexity that pose major challenges for the RCT design. They also ignore key tenets of the realist evaluation approach. We propose that the adjective 'realist' should continue to be used only for studies based on a realist philosophy and whose analytic approach follows the established principles of realist analysis. It seems more correct to call the approach proposed by Bonell and colleagues 'theory informed RCT', which indeed can help in enhancing RCTs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Statistical method evaluation for differentially methylated CpGs in base resolution next-generation DNA sequencing data.

    Science.gov (United States)

    Zhang, Yun; Baheti, Saurabh; Sun, Zhifu

    2016-12-31

    High-throughput bisulfite methylation sequencing such as reduced representation bisulfite sequencing (RRBS), Agilent SureSelect Human Methyl-Seq (Methyl-seq) or whole-genome bisulfite sequencing is commonly used for base resolution methylome research. These data are represented either by the ratio of methylated cytosine versus total coverage at a CpG site or by the numbers of methylated and unmethylated cytosines. Multiple statistical methods can be used to detect differentially methylated CpGs (DMCs) between conditions, and these methods are often the basis for the next step of differentially methylated region identification. The ratio data have the flexibility of fitting many linear models, but the raw count data take coverage information into consideration. There is an array of options in each datatype for DMC detection; however, it is not clear which statistical method is optimal. In this study, we systematically evaluated four statistical methods on methylation ratio data and four methods on count-based data and compared their performances with regard to type I error control, sensitivity and specificity of DMC detection and computational resource demands using real RRBS data along with simulation. Our results show that the ratio-based tests are generally more conservative (less sensitive) than the count-based tests. However, some count-based methods have high false-positive rates and should be avoided. The beta-binomial model gives a good balance between sensitivity and specificity and is the preferred method. Selection of methods in different settings, signal versus noise and sample size estimation are also discussed. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
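
    A minimal version of the preferred beta-binomial approach can be sketched as a likelihood-ratio test. This is our own simplified formulation with a chi-square reference distribution; the pipelines compared in the paper use more careful parameterisations, dispersion handling, and multiple-testing control.

```python
import numpy as np
from scipy import stats, optimize

def betabin_negll(params, k, n):
    a, b = np.exp(params)                      # log-parameterized for positivity
    return -np.sum(stats.betabinom.logpmf(k, n, a, b))

def fit_negll(k, n):
    res = optimize.minimize(betabin_negll, x0=[0.0, 0.0], args=(k, n),
                            method="Nelder-Mead")
    return res.fun                             # minimized negative log-likelihood

def dmc_lrt(k_case, n_case, k_ctrl, n_ctrl):
    """LRT: separate beta-binomials for cases/controls vs one pooled fit."""
    ll_alt = -(fit_negll(k_case, n_case) + fit_negll(k_ctrl, n_ctrl))
    ll_null = -fit_negll(np.concatenate([k_case, k_ctrl]),
                         np.concatenate([n_case, n_ctrl]))
    lr = 2.0 * (ll_alt - ll_null)
    return stats.chi2.sf(lr, df=2)             # alternative has 2 extra parameters

# methylated counts k and total coverages n per sample, at one CpG site
p = dmc_lrt(np.array([15, 18, 14]), np.array([20, 25, 22]),
            np.array([5, 7, 4]), np.array([19, 24, 20]))
```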

  3. Helping With All Your Heart: Realistic Heart Stimulus and Compliance With an Organ Donation Request.

    Science.gov (United States)

    Jacob, Céline; Guéguen, Nicolas

    2015-01-01

    Pictures and images are important aspects of fundraising advertising and could generate more donations. In two experimental studies, we examined the effect of various pictures of hearts on compliance with a request for organ donations. The solicitor wore a white tee shirt on which various forms of hearts were printed: symbolic versus realistic (first experiment), none versus symbolic versus realistic (second experiment). Results showed that more compliance was found in the realistic heart experimental condition, whereas the symbolic heart form had no significant effect.

  4. Performance Evaluation of Realistic Vanet Using Traffic Light Scenario

    CERN Document Server

    Nidhi,

    2012-01-01

    Vehicular Ad-hoc Networks (VANETs) are attracting considerable attention from the research community and the automotive industry to improve the services of Intelligent Transportation System (ITS). As today's transportation system faces serious challenges in terms of road safety, efficiency, and environmental friendliness, the idea of so-called "ITS" has emerged. Due to the high cost of deployment and the complexity of implementing such a system in the real world, research in VANET relies on simulation. This paper attempts to evaluate the performance of VANET in a realistic environment. The paper contributes by generating a real-world road map of JNU using existing Google Earth and GIS tools. Traffic data from a limited region of the road map are collected to capture realistic mobility. In this work, the entire region has been divided into various smaller routes. The realistic mobility model used here considers the driver's route choice at run time. It also studies the clustering effect caused by traffic lights...

  5. Realistic modeling of neurons and networks: towards brain simulation.

    Science.gov (United States)

    D'Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    2013-01-01

    Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field.

  6. Statistical analysis of entropy generation in longitudinally finned tube heat exchanger with shell side nanofluid by a single phase approach

    Science.gov (United States)

    Konchada, Pavan Kumar; Pv, Vinay; Bhemuni, Varaprasad

    2016-06-01

    The presence of nanoparticles in heat exchangers has been found to increase heat transfer. The present work focuses on heat transfer in a longitudinal finned tube heat exchanger. Experimentation is done on a longitudinal finned tube heat exchanger with pure water as the working fluid, and the outcome is compared numerically using a computational fluid dynamics (CFD) package based on the finite volume method for different flow rates. Further, a 0.8% volume fraction of aluminum oxide (Al2O3) nanofluid is considered on the shell side. The simulated nanofluid analysis has been carried out using a single-phase approach in CFD by updating the user-defined functions and expressions with thermophysical properties of the selected nanofluid. These results are thereafter compared against the results obtained for pure water as the shell-side fluid. Entropy generated due to heat transfer and fluid flow is calculated for the nanofluid. Analysis of entropy generation is carried out using the Taguchi technique. Analysis of variance (ANOVA) results show that the inlet temperature on the shell side has a more pronounced effect on entropy generation.

  7. Statistical analysis of entropy generation in longitudinally finned tube heat exchanger with shell side nanofluid by a single phase approach

    Directory of Open Access Journals (Sweden)

    Konchada Pavan Kumar

    2016-06-01

    Full Text Available The presence of nanoparticles in heat exchangers has been found to increase heat transfer. The present work focuses on heat transfer in a longitudinal finned tube heat exchanger. Experimentation is done on a longitudinal finned tube heat exchanger with pure water as the working fluid, and the outcome is compared numerically using a computational fluid dynamics (CFD) package based on the finite volume method for different flow rates. Further, a 0.8% volume fraction of aluminum oxide (Al2O3) nanofluid is considered on the shell side. The simulated nanofluid analysis has been carried out using a single-phase approach in CFD by updating the user-defined functions and expressions with thermophysical properties of the selected nanofluid. These results are thereafter compared against the results obtained for pure water as the shell-side fluid. Entropy generated due to heat transfer and fluid flow is calculated for the nanofluid. Analysis of entropy generation is carried out using the Taguchi technique. Analysis of variance (ANOVA) results show that the inlet temperature on the shell side has a more pronounced effect on entropy generation.
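
    The quantity being analysed above, the volumetric entropy generation rate, has a standard local form (Bejan's formula). A one-dimensional sketch is below; the property values, the 1-D restriction, and the field names are our simplifying assumptions, not the paper's full CFD post-processing.

```python
import numpy as np

def local_entropy_generation(T, u, dx, k=0.6, mu=1e-3):
    """Volumetric entropy generation rate [W/(m^3 K)] for 1-D conduction plus
    viscous dissipation: S''' = k (dT/dx)^2 / T^2 + (mu / T) (du/dx)^2.
    T: temperature profile [K]; u: velocity profile [m/s]; dx: grid spacing [m]."""
    dTdx = np.gradient(T, dx)   # thermal contribution dominates at large gradients
    dudx = np.gradient(u, dx)   # frictional contribution
    return k * dTdx**2 / T**2 + (mu / T) * dudx**2
```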

  8. Generations.

    Science.gov (United States)

    Chambers, David W

    2005-01-01

    Groups naturally promote their strengths and prefer values and rules that give them an identity and an advantage. This shows up as generational tensions across cohorts who share common experiences, including common elders. Dramatic cultural events in America since 1925 can help create an understanding of the differing value structures of the Silents, the Boomers, Gen Xers, and the Millennials. Differences in how these generations see motivation and values, fundamental reality, relations with others, and work are presented, as are some applications of these differences to the dental profession.

  9. An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model

    Directory of Open Access Journals (Sweden)

    Cesare Frepoli

    2008-01-01

    Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal/hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the W methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.
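
    The "nonparametric statistics" referred to above are commonly Wilks-type order statistics. The familiar first-order 95/95 calculation is sketched below as a generic illustration of the idea; it is not necessarily the specific procedure used in the Westinghouse evaluation model.

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n independent runs bounds the
    `coverage` quantile with probability `confidence` (first-order Wilks):
    1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage**n < confidence:
        n += 1
    return n

print(wilks_sample_size())  # 59 -> the familiar 59-run 95/95 criterion
```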

  10. Robust alignment of chromatograms by statistically analyzing the shifts matrix generated by moving window fast Fourier transform cross-correlation.

    Science.gov (United States)

    Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian

    2015-03-01

    Retention time shift is one of the most challenging problems during the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shift robustly if a proper window size has been selected. The window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2.
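
    The core of the described procedure, estimating shifts window-by-window via FFT cross-correlation and then taking the per-scan mode, might be sketched as follows. This is a simplification (integer shifts, a single pass, our own window/step defaults), not the released MWFFT2 package.

```python
import numpy as np

def window_shift(ref, sig):
    """Integer lag maximising the circular cross-correlation of two windows.
    Positive lag means `sig` is delayed relative to `ref`."""
    n = len(ref)
    corr = np.fft.irfft(np.conj(np.fft.rfft(ref)) * np.fft.rfft(sig), n)
    lag = int(np.argmax(corr))
    return lag if lag <= n // 2 else lag - n   # map wrap-around to negative lags

def refined_shifts(ref, sig, win=256, step=64):
    """Shifts-matrix idea: every window votes a shift for the scan points it
    covers; the refined shift per scan point is the mode of its votes."""
    votes = [[] for _ in range(len(ref))]
    for start in range(0, len(ref) - win + 1, step):
        s = window_shift(ref[start:start + win], sig[start:start + win])
        for i in range(start, start + win):
            votes[i].append(s)
    shifts = np.zeros(len(ref))
    for i, v in enumerate(votes):
        if v:
            vals, counts = np.unique(v, return_counts=True)
            shifts[i] = vals[np.argmax(counts)]   # mode across covering windows
    return shifts
```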

  11. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
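
    One common, simple way to impose a speckle-like texture on an echogenicity map is multiplicative gamma-distributed noise (the fully developed speckle model). The paper's phantom uses richer, tissue-specific models, so the sketch below is only indicative.

```python
import numpy as np

def add_speckle(image, looks=4, rng=None):
    """Multiply an echogenicity map by unit-mean gamma noise; smaller `looks`
    gives grainier (noisier) texture, as in multi-look speckle models."""
    rng = np.random.default_rng(rng)
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=image.shape)
    return image * noise
```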

  12. Leadership research in healthcare: A realist review.

    Science.gov (United States)

    Lega, Federico; Prenestini, Anna; Rosso, Matilde

    2017-05-01

    Being largely considered a human right, healthcare needs leaders who are able to make choices and to set directions. Following the recommendations expressed by Gilmartin and D'Aunno's review and roadmap compiled in 2008, today, it is important to acknowledge researchers' contributions to outline this landscape. The realist review of 77 publications answered questions such as "what works, for whom, and in which circumstances" highlighting: the effectiveness and acceptance of transformational and collaborative approaches; professionalism, expertise, and good task delegation within operational teams; distributed leadership, relationships, and social responsibility at a systemic level. The relevancy and need of leadership development programs, framed within a wider strategy, emerged. Nonetheless, gaps still exist and require further investigation: particular needs in public vs. private contexts; professionals' and women's differentiating characters; generational gaps; associations between leadership and recruitment HR practices research; how (and if) leaders (should) influence the organizational culture and values; and developing countries specific challenges. Also, a greater proportion of relevant findings should be drawn by empirical and more rigorous studies. Finally, a major attention could be paid to interactions happening at the team, organizational, and systemic level among different leaders and among leaders, followers and external actors.

  13. Practical business statistics

    CERN Document Server

    Siegel, Andrew

    2011-01-01

    Practical Business Statistics, Sixth Edition, is a conceptual, realistic, and matter-of-fact approach to managerial statistics that carefully maintains-but does not overemphasize-mathematical correctness. The book offers a deep understanding of how to learn from data and how to deal with uncertainty while promoting the use of practical computer applications. This teaches present and future managers how to use and understand statistics without an overdose of technical detail, enabling them to better understand the concepts at hand and to interpret results. The text uses excellent examples with

  14. Stability of realistic strange stars (RSS)

    CERN Document Server

    Bhowmick, S; Dey, M; Ray, S; Ray, R; Bhowmick, Siddhartha; Dey, Jishnu; Dey, Mira; Ray, Subharthi; Ray, Ranjan

    2001-01-01

    Strange stars (SS) calculated from a realistic equation of state (EOS) are very stable, for example under fast rotation, but have a soft surface on which ripples may occur when radiation is emitted close to it. We suggest this as a natural explanation of the fluctuations observed in the intensity profile of X-ray pulsars. In contrast, SS based on EOS derived from the bag models (Bag SS) are less stable against fast rotation, do not have a hard surface, and cannot explain these ripples. There are other important differences between Bag SS and the SS based on a realistic EOS, which we call realistic strange stars (RSS).

  15. Simulating secondary organic aerosol in a regional air quality model using the statistical oxidation model – Part 1: Assessing the influence of constrained multi-generational ageing

    Directory of Open Access Journals (Sweden)

    S. H. Jathar

    2015-09-01

    Full Text Available Multi-generational oxidation of volatile organic compound (VOC) oxidation products can significantly alter the mass, chemical composition and properties of secondary organic aerosol (SOA) compared to calculations that consider only the first few generations of oxidation reactions. However, the most commonly used state-of-the-science schemes in 3-D regional or global models that account for multi-generational oxidation (1) consider only functionalization reactions but do not consider fragmentation reactions, (2) have not been constrained to experimental data; and (3) are added on top of existing parameterizations. The incomplete description of multi-generational oxidation in these models has the potential to bias source apportionment and control calculations for SOA. In this work, we used the Statistical Oxidation Model (SOM) of Cappa and Wilson (2012), constrained by experimental laboratory chamber data, to evaluate the regional implications of multi-generational oxidation considering both functionalization and fragmentation reactions. SOM was implemented into the regional UCD/CIT air quality model and applied to air quality episodes in California and the eastern US. The mass, composition and properties of SOA predicted using SOM are compared to SOA predictions generated by a traditional "two-product" model to fully investigate the impact of explicit and self-consistent accounting of multi-generational oxidation. Results show that SOA mass concentrations predicted by the UCD/CIT-SOM model are very similar to those predicted by a two-product model when both models use parameters that are derived from the same chamber data. Since the two-product model does not explicitly resolve multi-generational oxidation reactions, this finding suggests that the chamber data used to parameterize the models captures the majority of the SOA mass formation from multi-generational oxidation under the conditions tested. Consequently, the use of low and high NOx yields

  16. Sotsialistlik realist Keskküla

    Index Scriptorium Estoniae

    1998-01-01

    The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, includes, among the Estonian artists, Enn Põldroos, Nikolai Kormashov and Ando Keskküla, with reproductions of paintings by Kormashov and Keskküla.

  18. Realistic Approach for Phasor Measurement Unit Placement

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU

  19. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing impacts of potential future climate change scenarios in precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested in four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution-derived transformation, and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic
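
    Two of the techniques named above are easy to sketch: quantile mapping using empirical quantiles, and an additive (mean) delta change. The array names and the 99-quantile grid are our own choices for the sketch; note that `np.interp` clamps values outside the model's historical range to the end quantiles.

```python
import numpy as np

def empirical_quantile_mapping(obs_hist, mod_hist, mod_fut):
    """Map future model values through the empirical quantile transform that
    relates the model's historical distribution to the observed one."""
    q = np.linspace(0.01, 0.99, 99)
    mod_q = np.quantile(mod_hist, q)   # model's historical quantiles
    obs_q = np.quantile(obs_hist, q)   # observed historical quantiles
    # rank each future value in the model climate, read off the observed value
    return np.interp(mod_fut, mod_q, obs_q)

def delta_change(obs_hist, mod_hist, mod_fut):
    """Additive delta change: shift observations by the model's mean signal."""
    return obs_hist + (mod_fut.mean() - mod_hist.mean())
```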

  20. Statistical pulses generator. Application to nuclear instrumentation (1962); Generateur d'impulsions aleatoires. Application a l'instrumentation nucleaire (1962)

    Energy Technology Data Exchange (ETDEWEB)

    Beranger, R. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1962-07-01

    This report describes a random pulse generator adapted to nuclear instrumentation. After a short survey of the statistical nature of electronic signals, the different ways of generating pulses with a Poisson time distribution are studied. The final generator, built from a gaseous thyratron in a magnetic field, is then described. Several tests are indicated: counting-rate stability, Pearson's criterion, distribution of time intervals. Applications of the generator to the 'whole testing' of nuclear instrumentation are then indicated for scalers, dead-time measurements, and time analyzers. In this application, pulse-height spectra have been made by Poissonian sampling of a recurrent or random low-frequency signal. (author) This study describes a random pulse generator and its applications to nuclear instrumentation. After a brief review of the statistical nature of signals in nuclear electronics, the main means of obtaining pulses distributed in time according to a Poisson law are reviewed. The generator, using a gas thyratron in a magnetic field, is then described; various test methods are applied (counting-rate stability, Pearson's criterion, spectrum of time intervals). Applications of the generator to nuclear electronics in the field of 'whole testing' are indicated: testing of counting scalers and dead-time measurement, testing of time analyzers after division of the counting rate by a power of two, and testing of multichannel amplitude analyzers. For this last application, amplitude spectra following a known law were produced by Poissonian sampling of a recurrent or random low-frequency signal. (author)
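
    The defining property, exponentially distributed intervals between pulses, makes a software analogue of such a generator a few lines long. This is a modern numerical sketch of the same statistical behaviour, obviously not the 1962 thyratron hardware.

```python
import numpy as np

def poisson_pulse_times(rate_hz, duration_s, rng=None):
    """Pulse arrival times with exponential inter-arrival intervals,
    i.e. a homogeneous Poisson process of the given mean rate."""
    rng = np.random.default_rng(rng)
    times = []
    t = rng.exponential(1.0 / rate_hz)
    while t < duration_s:
        times.append(t)
        t += rng.exponential(1.0 / rate_hz)
    return np.array(times)
```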

  1. A Poisson Cluster Stochastic Rainfall Generator That Accounts for the Interannual Variability of Rainfall Statistics: Validation at Various Geographic Locations across the United States

    Directory of Open Access Journals (Sweden)

    Dongkyun Kim

    2014-01-01

    Full Text Available A novel approach for a Poisson cluster stochastic rainfall generator was validated in its ability to reproduce important rainfall and watershed response characteristics at 104 locations in the United States. The suggested novel approach, The Hybrid Model (THM), as compared to the traditional Poisson cluster rainfall modeling approaches, has an additional capability to account for the interannual variability of rainfall statistics. THM and a traditional approach of Poisson cluster rainfall model (the modified Bartlett-Lewis rectangular pulse model) were compared in their ability to reproduce the characteristics of extreme rainfall and watershed response variables such as runoff and peak flow. The results of the comparison indicate that THM generally outperforms the traditional approach in reproducing the distributions of peak rainfall, peak flow, and runoff volume. In addition, THM significantly outperformed the traditional approach in reproducing extreme rainfall by 2.3% to 66% and extreme flow values by 32% to 71%.
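
    A generic Poisson cluster rectangular-pulse generator can be sketched as below (a Neyman-Scott flavour; THM itself additionally conditions parameters on the year to capture interannual variability, which this sketch does not). Units are assumed to be hours for time and mm/h for cell intensity.

```python
import numpy as np

def poisson_cluster_rainfall(duration, storm_rate, mean_cells,
                             beta, eta, mean_intensity, rng=None):
    """Storms arrive as a Poisson process; each spawns a Poisson number of rain
    cells whose start offsets (rate beta), durations (rate eta), and intensities
    (mean mean_intensity) are exponentially distributed rectangular pulses."""
    rng = np.random.default_rng(rng)
    n_storms = rng.poisson(storm_rate * duration)
    pulses = []
    for t0 in rng.uniform(0, duration, n_storms):
        for _ in range(rng.poisson(mean_cells)):
            start = t0 + rng.exponential(1.0 / beta)   # cell displacement
            length = rng.exponential(1.0 / eta)        # cell duration
            depth = rng.exponential(mean_intensity)    # cell intensity
            pulses.append((start, start + length, depth))
    hours = np.arange(0, int(duration))                # aggregate on an hourly grid
    rain = np.zeros(len(hours))
    for s, e, d in pulses:
        rain += d * np.clip(np.minimum(e, hours + 1) - np.maximum(s, hours), 0, 1)
    return rain
```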

  2. Towards an agential realist thinking of learning

    DEFF Research Database (Denmark)

    Plauborg, Helle

    2014-01-01

    This paper explores what can be understood by learning based on agential realist thinking. An agential realist thinking about learning is sensitive to the complexity that characterizes learning as a phenomenon. Thus, learning, from an agential realist perspective, is a dynamic and emergent phenomenon ... human, which means that it is not only the human and the discursive possessing empowerment in relation to learning; the non-human also is woven into - and makes a constitutive difference to - human learning processes. An excerpt from a field note will be used to illustrate these mutual shaping processes ... or a radical change in the sense of 'the penny dropped'. For although learning processes that are not recognized as 'aha moments' do not call much attention to themselves, this is how they occur most often.

  3. Realist synthesis: illustrating the method for implementation research

    Directory of Open Access Journals (Sweden)

    Rycroft-Malone Jo

    2012-04-01

    Full Text Available Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as

  4. Evaluation of stochastic weather generators for capturing the statistics of extreme precipitation events in the Catskill Mountain watersheds, New York State

    Science.gov (United States)

    Acharya, N.; Frei, A.; Owens, E. M.; Chen, J.

    2015-12-01

    Watersheds located in the Catskill Mountains area, part of the eastern plateau climate region of New York, contribute about 90% of New York City's municipal water supply, serving 9 million New Yorkers with about 1.2 billion gallons of clean drinking water each day. The New York City Department of Environmental Protection has an ongoing series of studies to assess the potential impacts of climate change on the availability of high-quality water in this water supply system. Recent studies identify increasing trends in total precipitation and in the frequency of extreme precipitation events in this region. The objectives of the present study are: to analyze the probabilistic structure of extreme precipitation based on historical observations; and to evaluate the abilities of stochastic weather generators (WGs), statistical models that produce synthetic weather time series based on observed statistical properties at a particular location, to simulate the statistical properties of extreme precipitation events over this region. The generalized extreme value (GEV) distribution has been applied to the annual block maxima of precipitation from 60 years (1950 to 2009) of observed data in order to estimate the events with return periods of 50, 75, and 100 years. These results were then used to evaluate a total of 13 WGs: 12 parametric WGs comprising all combinations of three different orders of Markov chain (MC) models (1st, 2nd, and 3rd) and four different probability distributions (exponential, gamma, skewed normal, and mixed exponential); and one semi-parametric WG based on k-nearest-neighbor bootstrapping. Preliminary results suggest that the three-parameter (skewed normal and mixed exponential) and semi-parametric (k-nearest-neighbor bootstrapping) WGs are more consistent with observations. It is also found that first-order MC models perform as well as second- or third-order MC models.
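
    The block-maxima step described here is easy to reproduce. A minimal sketch using scipy (with synthetic annual maxima standing in for the 1950-2009 observations) fits the GEV and reads off the T-year return level as the (1 - 1/T) quantile:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Stand-in for 60 years of annual maximum daily precipitation (mm);
# in the study these come from the 1950-2009 observations.
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=60, scale=15,
                                     size=60, random_state=rng)

# Fit the GEV and estimate the T-year return levels
c, loc, scale = stats.genextreme.fit(annual_maxima)
for T in (50, 75, 100):
    level = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f} mm")
```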

  5. Statistical threshold determination method through noise map generation for two dimensional amplitude and time-of-flight mapping of guided waves

    Science.gov (United States)

    Yenn Chong, See; Lee, Jung-Ryul; Yik Park, Chan

    2013-03-01

    The conventional threshold-crossing technique generally encounters difficulty in setting a common threshold level for extracting the respective time-of-flights (ToFs) and amplitudes from guided waves obtained at many different points by spatial scanning. We therefore propose a statistical threshold determination method based on noise map generation to automatically process numerous guided waves having different propagation distances. First, a two-dimensional (2-D) noise map is generated using the one-dimensional (1-D) wavelet transform (WT) magnitudes at time zero of the acquired waves. Then, the probability density functions (PDFs) of the Gamma, Weibull, and exponential distributions are used to model the measured 2-D noise map. Graphical goodness-of-fit measurements are used to find the best fit among the three theoretical distributions. The threshold level is then automatically determined by selecting the desired confidence level of noise rejection in the cumulative distribution function of the best-fit PDF. Based on this threshold level, the amplitudes and ToFs are extracted and mapped into a 2-D matrix array form. The threshold level determined by the noise statistics may cross the noise signal after time zero. These crossings appear as salt-and-pepper noise in the ToF and amplitude maps but are finally removed by a 1-D median filter. The proposed method was verified on a thick stainless steel hollow cylinder in which guided waves were acquired over an area of 180 mm×126 mm using a laser ultrasonic scanning system and an ultrasonic sensor. The Gamma distribution was estimated by the proposed algorithm as the best fit to the verification experimental data. The statistical parameters of the Gamma distribution were used to determine the threshold level appropriate for most of the guided waves. The ToFs and amplitudes of the first arrival mode were mapped into a 2-D matrix array form. Each map included 447 noisy points out of 90
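
    The distribution-fitting and threshold-selection steps can be sketched in a few lines. In this illustration the noise map is synthetic, the goodness-of-fit comparison uses log-likelihood rather than the paper's graphical measures, and the 0.999 confidence level is an arbitrary choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in for the 2-D noise map: WT magnitudes at time zero at each scan
# point (in the paper these come from the laser ultrasonic scan).
noise = rng.gamma(shape=2.0, scale=0.05, size=(180, 126)).ravel()

# Fit the three candidate PDFs (location fixed at zero) and pick the best
# by log-likelihood; the paper uses graphical goodness-of-fit measures.
candidates = {"gamma": stats.gamma, "weibull": stats.weibull_min,
              "exponential": stats.expon}
fits = {name: dist.fit(noise, floc=0) for name, dist in candidates.items()}
best = max(fits, key=lambda n: np.sum(candidates[n].logpdf(noise, *fits[n])))

# The threshold is the inverse CDF of the best fit at the desired
# noise-rejection confidence level (0.999 is an arbitrary choice here).
threshold = candidates[best].ppf(0.999, *fits[best])
print(best, "threshold:", threshold)
```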

  6. Realistic costs of carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Al Juaied, Mohammed (Harvard Univ., Cambridge, MA (US). Belfer Center for Science and International Affairs); Whitmore, Adam (Hydrogen Energy International Ltd., Weybridge (GB))

    2009-07-01

    There is a growing interest in carbon capture and storage (CCS) as a means of reducing carbon dioxide (CO2) emissions. However, there are substantial uncertainties about the costs of CCS. Costs for pre-combustion capture with compression (i.e. excluding costs of transport and storage and any revenue from EOR associated with storage) are examined in this discussion paper for First-of-a-Kind (FOAK) plants and for more mature technologies, or Nth-of-a-Kind (NOAK) plants. For FOAK plants using solid fuels, the levelised cost of electricity on a 2008 basis is approximately 10 cents/kWh higher with capture than for conventional plants (with a range of 8-12 cents/kWh). Costs of abatement are found typically to be approximately US$150/tCO2 avoided (with a range of US$120-180/tCO2 avoided). For NOAK plants the additional cost of electricity with capture is approximately 2-5 cents/kWh, with costs in the range of US$35-70/tCO2 avoided. Costs of abatement with carbon capture for other fuels and technologies are also estimated for NOAK plants. The costs of abatement are calculated with reference to a conventional SCPC plant for both emissions and costs of electricity. Estimates for both FOAK and NOAK plants are mainly based on cost data from 2008, which came at the end of a period of sustained escalation in the costs of power generation plant and other large capital projects. There are now indications of costs falling from these levels. This may reduce the costs of abatement, and the costs presented here may be 'peak of the market' estimates. If general cost levels return, for example, to those prevailing in 2005 to 2006 (by which time significant cost escalation had already occurred from previous levels), then costs of capture and compression for FOAK plants are expected to be US$110/tCO2 avoided (with a range of US$90-135/tCO2 avoided). For NOAK plants costs are expected to be US$25-50/tCO2. Based on these considerations a likely representative range of costs of abatement from CCS
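
    The cost-of-abatement figures quoted here follow from simple arithmetic: the extra levelised cost of electricity divided by the emissions avoided per kWh. A worked example with illustrative emission intensities (round numbers, not values from the paper):

```python
# Worked example of the abatement-cost arithmetic implied by the abstract.
# Emission intensities are illustrative round numbers, not paper values.
lcoe_increase = 0.10     # $/kWh extra for a FOAK capture plant (from text)
em_ref = 0.80e-3         # tCO2/kWh for the conventional SCPC reference
em_ccs = 0.10e-3         # tCO2/kWh with roughly 90% capture

cost_per_tonne = lcoe_increase / (em_ref - em_ccs)
print(f"cost of abatement: ${cost_per_tonne:.0f}/tCO2 avoided")
# ~$143/tCO2, consistent with the quoted ~US$150/tCO2 central estimate
```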

  7. Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models

    Science.gov (United States)

    Koppert, M. M. J.; Kalitzin, S.; Lopes da Silva, F. H.; Viergever, M. A.

    2011-08-01

    In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models eliciting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations. This scenario predicts exponential distributions of the duration of the seizures and of the inter-ictal intervals. These predictions were validated in rat models of absence epilepsy, as well as in a few human cases. Nonetheless, deviations from the predictions with respect to seizure duration distributions remained unexplained. The objective of the present work is to implement a simple but realistic computational model of a neuronal network including synaptic plasticity and ionic current dynamics and to explore the dynamics of the model with special emphasis on the distributions of seizure and inter-ictal period durations. We use as a basis our lumped model of cortical neuronal circuits. Here we introduce 'activity dependent' parameters, namely post-synaptic voltage-dependent plasticity, as well as a voltage-dependent hyperpolarization-activated current driven by slow and fast activation conductances. We examine the distributions of the durations of the seizure-like model activity and the normal activity, described respectively by the limit cycle and the steady state in the dynamics. We use a parametric γ-distribution fit as a quantifier. Our results show that autonomous, activity-dependent membrane processes can account for experimentally obtained statistical distributions of seizure durations, which were not explainable using the previous model. The activity-dependent membrane processes that display the strongest effect in accounting for these distributions are the hyperpolarization-dependent cationic (Ih) current and the GABAa plastic dynamics. Plastic synapses (NMDA-type) in the interneuron population show only a minor effect. The inter-ictal statistics retain their
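
    The baseline prediction mentioned here, that constant-hazard transitions between bi-stable states yield exponential duration distributions, can be illustrated with a toy simulation; the rates below are arbitrary, and the gamma fit plays the role of the paper's parametric quantifier:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Minimal bi-stable caricature: transitions between the 'normal' steady
# state and the 'ictal' limit cycle occur at constant hazard rates driven
# by random fluctuations, which predicts exponential duration
# distributions (the baseline that activity-dependent mechanisms modify).
r_onset, r_offset = 1 / 300.0, 1 / 20.0      # transitions per second
interictal = rng.exponential(1 / r_onset, 5000)
seizures = rng.exponential(1 / r_offset, 5000)

# Quantify with a parametric gamma fit, as in the paper; a gamma shape
# parameter of 1 corresponds to a pure exponential distribution.
shape, _, scale = stats.gamma.fit(seizures, floc=0)
print("fitted gamma shape:", round(shape, 2), "mean:", round(shape * scale, 1))
```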

  8. Realistic roofs over a rectilinear polygon

    KAUST Repository

    Ahn, Heekap

    2013-11-01

    Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is $\binom{(n-4)/2}{\lfloor (n-4)/4 \rfloor}$ when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.

  9. Development of a realistic human airway model.

    Science.gov (United States)

    Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav

    2012-03-01

    Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper-airway geometry, lacking only an oral cavity, was created and proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.

  10. Using a weather generator to downscale spatio-temporal precipitation at urban scale

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Christensen, Ole Bøssing; Arnbjerg-Nielsen, Karsten

    /ECHAM and HIRHAM/ECHAM, A1B scenario and 25 km spatial scale) and two models run just for southern Scandinavia (both HIRHAM/EC-EARTH, RCP 4.5 and RCP 8.5 scenarios and 8 km spatial scale). All datasets are at one-hour time resolution. All models result in markedly different perturbation schemes for the weather...... realistic extreme statistics. The model satisfactorily reproduces extreme statistics down to the one-hour scale and further produces realistic spatial correlation patterns at the rain-event level. This is also the case for the extreme events. Furthermore, the weather generator is able to reproduce......

  11. Injury Statistics

    Science.gov (United States)


  12. Role of dissipation in realistic Majorana nanowires

    Science.gov (United States)

    Liu, Chun-Xiao; Sau, Jay D.; Das Sarma, S.

    2017-02-01

    We carry out a realistic simulation of Majorana nanowires in order to understand the latest high-quality experimental data [H. Zhang et al., arXiv:1603.04069 (2016)] and, in the process, develop a comprehensive picture for what physical mechanisms may be operational in realistic nanowires leading to discrepancies between minimal theory and experimental observations (e.g., weakness and broadening of the zero-bias peak and breaking of particle-hole symmetry). Our focus is on understanding specific intriguing features in the data, and our goal is to establish matters of principle controlling the physics of the best possible nanowires available in current experiments. We identify dissipation, finite temperature, multi-sub-band effects, and the finite tunnel barrier as the four most important physical mechanisms controlling the zero-bias conductance peak. Our theoretical results including these realistic effects agree well with the best available experimental data in ballistic nanowires.

  13. The influence of control group reproduction on the statistical power of the Environmental Protection Agency's Medaka Extended One Generation Reproduction Test (MEOGRT).

    Science.gov (United States)

    Flynn, Kevin; Swintek, Joe; Johnson, Rodney

    2017-02-01

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is the fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters on statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation of the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group.
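
    A Monte Carlo power analysis of the kind MRPAT performs can be sketched as follows. This is not MRPAT itself: egg counts are modeled as negative binomial draws matching the quoted mean and variance, a 21-day observation window is assumed, and a one-sided t-test on per-pair means stands in for the actual analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def power_fecundity(mean_c, var_c, effect, n_ctrl=24, n_trt=12,
                    n_sim=2000, alpha=0.05, days=21):
    """Monte Carlo power to detect a proportional drop in fecundity.
    Daily egg counts are negative binomial (variance > mean) and are
    collapsed to one mean per breeding pair before testing."""
    p = mean_c / var_c                       # NB parameters from the
    r = mean_c * p / (1 - p)                 # control mean and variance
    p_t = mean_c * (1 - effect) / var_c      # treatment: reduced mean,
    r_t = mean_c * (1 - effect) * p_t / (1 - p_t)  # same variance assumed
    hits = 0
    for _ in range(n_sim):
        ctrl = rng.negative_binomial(r, p, (n_ctrl, days)).mean(axis=1)
        trt = rng.negative_binomial(r_t, p_t, (n_trt, days)).mean(axis=1)
        if stats.ttest_ind(ctrl, trt, alternative="greater").pvalue < alpha:
            hits += 1
    return hits / n_sim

# 40% decrease with control mean 21 eggs/pair/day and variance 49
print(power_fecundity(mean_c=21, var_c=49, effect=0.40))
```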

  14. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  15. Neo-realistic Features in Snow Child

    Institute of Scientific and Technical Information of China (English)

    夏静林

    2013-01-01

    This essay illustrates neo-realistic features in Snow Child. This short story inherits and develops "the typical character in the typical environment" and the traditional linear time order of realistic works. At the same time, it makes some changes and transcends that tradition. It uses real brands, events, and goods to create a true-to-life picture of the contemporary world. It explores the character's psychology more deeply and creates a circular narration with three flashbacks.

  16. Keeping It Real: How Realistic Does Realistic Fiction for Children Need to Be?

    Science.gov (United States)

    O'Connor, Barbara

    2010-01-01

    O'Connor, an author of realistic fiction for children, shares her attempts to strike a balance between carefree, uncensored, authentic, realistic writing and age-appropriate writing. Of course, complicating that balancing act is the fact that what seems age-appropriate to her might not seem so to everyone. O'Connor suggests that while it may be…

  18. Blend Shape Interpolation and FACS for Realistic Avatar

    Science.gov (United States)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

    The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural way of human interaction, facial animation systems have become attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors is still a challenging issue. A facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can immediately be identified via very subtle changes in facial expressions. Facial expressions, being a very complex and important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face while the FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The results in perceiving realistic facial expressions for virtual human emotions based on facial skin color and texture may contribute towards the development of virtual reality and game environments for computer-aided graphics animation systems.
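
    Blend-shape interpolation itself is compact enough to show directly. The sketch below is generic, not the authors' BSI/FACS system: each expression is an offset from a neutral mesh, and a face is a weighted sum of offsets; meshes and weights are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# A face mesh as a (V, 3) vertex array; each expression target is stored
# as an offset from the neutral face. Random arrays stand in for real
# morph targets here.
neutral = rng.normal(size=(1000, 3))
targets = {e: rng.normal(scale=0.05, size=(1000, 3))
           for e in ("angry", "happy", "sad", "fear")}

def blend(weights):
    """Linear blend-shape interpolation: neutral + sum_i w_i * delta_i,
    with per-expression weights (FACS-style muscle control not shown)."""
    face = neutral.copy()
    for emotion, w in weights.items():
        face += w * targets[emotion]
    return face

# Mostly happy with a touch of fear
face = blend({"happy": 0.7, "fear": 0.1})
print("max vertex displacement:", np.abs(face - neutral).max())
```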

  19. A Look at Young Children's Realistic Fiction.

    Science.gov (United States)

    Madsen, Jane M.; Wickersham, Elaine B.

    1980-01-01

    Analyzes recent realistic fiction for children produced in the United States in terms of ethnicity, stereotyped behavior, and themes. Concludes that the sample did not reflect equivalent treatment of males and females nor the culturally pluralistic makeup of U.S. society. Provides an annotated bibliography of the books analyzed. (Author/FL)

  20. The romantic paradigm of realist Dostoyevsky

    OpenAIRE

    LIPICH V.V.; LIPICH T.I.

    2015-01-01

    The article is devoted to the inheritance of the traditions of romanticism in the creative activity of the young F. M. Dostoyevsky. It dwells upon certain points of integration and the typological interrelation between the artistic views of F. M. Dostoyevsky as a realist writer and the aesthetic system and poetics of romanticism.

  1. Satellite Maps Deliver More Realistic Gaming

    Science.gov (United States)

    2013-01-01

    When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.

  3. Improving Intuition Skills with Realistic Mathematics Education

    Science.gov (United States)

    Hirza, Bonita; Kusumah, Yaya S.; Darhim; Zulkardi

    2014-01-01

    The intention of the present study was to see the improvement of students' intuitive skills. This improvement was seen by comparing the Realistic Mathematics Education (RME)-based instruction with the conventional mathematics instruction. The subject of this study was 164 fifth graders of elementary school in Palembang. The design of this study…

  4. Culture from the Perspective of Realistic Philosophy

    Directory of Open Access Journals (Sweden)

    Wojciech Daszkiewicz

    2015-12-01

    Full Text Available The article underlines the moments that define the metaphysical understanding of culture. According to this conception, culture in its most basic meaning is the rationalization (intellectualization) of nature. The article focuses on the following areas: a genetic-exemplarist analysis of cultural works and a definition of culture from the perspective of realistic philosophy.

  5. From grid cells to place cells with realistic field sizes.

    Science.gov (United States)

    Neher, Torsten; Azizi, Amir Hossein; Cheng, Sen

    2017-01-01

    While grid cells in the medial entorhinal cortex (MEC) of rodents have multiple, regularly arranged firing fields, place cells in the cornu ammonis (CA) regions of the hippocampus mostly have single spatial firing fields. Since there are extensive projections from MEC to the CA regions, many models have suggested that a feedforward network can transform grid cell firing into robust place cell firing. However, these models generate place fields that are consistently too small compared to those recorded in experiments. Here, we argue that it is implausible that grid cell activity alone can be transformed into place cells with robust place fields of realistic size in a feedforward network. We propose two solutions to this problem. Firstly, weakly spatially modulated cells, which are abundant throughout EC, provide input to downstream place cells along with grid cells. This simple model reproduces many place cell characteristics as well as results from lesion studies. Secondly, the recurrent connections between place cells in the CA3 network generate robust and realistic place fields. Both mechanisms could work in parallel in the hippocampal formation and this redundancy might account for the robustness of place cell responses to a range of disruptions of the hippocampal circuitry.

  6. Spectral tunability of realistic plasmonic nanoantennas

    Energy Technology Data Exchange (ETDEWEB)

    Portela, Alejandro; Matsui, Hiroaki; Tabata, Hitoshi, E-mail: tabata@bioeng.t.u-tokyo.ac.jp [Department of Bioengineering, School of Engineering, The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan); Yano, Takaaki; Hayashi, Tomohiro; Hara, Masahiko [Department of Electronic Chemistry, Tokyo Institute of Technology, Midori-ku, Yokohama, Kanagawa 226-8502 (Japan); Santschi, Christian; Martin, Olivier J. F. [Nanophotonics and Metrology Laboratory, Swiss Federal Institute of Technology Lausanne, Lausanne CH-1015 (Switzerland)

    2014-09-01

    Single nanoantenna spectroscopy was carried out on realistic dipole nanoantennas with various arm lengths and gap sizes fabricated by electron-beam lithography. A significant difference in resonance wavelength between realistic and ideal nanoantennas was found by comparing their spectral response. Consequently, the spectral tunability (96 nm) of the structures was significantly lower than that of simulated ideal nanoantennas. These observations, attributed to the nanofabrication process, are related to imperfections in the geometry, added metal adhesion layer, and shape modifications, which are analyzed in this work. Our results provide important information for the design of dipole nanoantennas clarifying the role of the structural modifications on the resonance spectra, as supported by calculations.

  7. Presupernova neutrinos: realistic emissivities from stellar evolution

    CERN Document Server

    Patton, Kelly M

    2015-01-01

    We present a new calculation of neutrino emissivities and energy spectra from a presupernova, a massive star going through the advanced stages of nuclear burning in the months before becoming a supernova. The contributions from beta decay and electron capture, pair annihilation, plasmon decay, and the photoneutrino process are modeled in detail, using updated tabulated nuclear rates. We also use realistic conditions of temperature, density, electron fraction and nuclear isotopic composition of the star from the state-of-the-art stellar evolution code MESA. It is found that beta processes contribute substantially to the neutrino flux above realistic detection thresholds of a few MeV, at selected positions and times in the evolution of the star.

  8. Realistic molecular model of kerogen's nanostructure

    Science.gov (United States)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E.; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J.-M.; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp2/sp3 hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  10. Realistic Simulation for Body Area and Body-To-Body Networks.

    Science.gov (United States)

    Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D'Errico, Raffaele

    2016-04-20

    In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices.
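
    The LOS/NLOS path-loss distinction mentioned above is commonly captured with a log-distance model plus log-normal shadowing. The sketch below is that generic form, not the paper's fitted IEEE 802.15.6 model; every constant is illustrative.

```python
import numpy as np

def path_loss_db(d, d0=0.1, pl0=35.0, n=3.5, sigma_shadow=4.0, rng=None):
    """Log-distance path loss with log-normal shadowing, the generic form
    behind on-body link models. The exponent n and the other constants are
    illustrative; the paper derives LOS/NLOS factors from biomechanical
    modeling and measurements."""
    rng = rng or np.random.default_rng()
    shadow = rng.normal(0.0, sigma_shadow, np.shape(d))
    return pl0 + 10 * n * np.log10(np.asarray(d) / d0) + shadow

# LOS vs. NLOS links differ mainly in the path-loss exponent
d = np.linspace(0.1, 1.5, 5)         # time-varying on-body distances (m)
print(path_loss_db(d, n=3.0))        # LOS-like
print(path_loss_db(d, n=4.5))        # NLOS-like
```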

  11. Realistic level density calculation for heavy nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Cerf, N. [Institut de Physique Nucleaire, Orsay (France); Pichon, B. [Observatoire de Paris, Meudon (France); Rayet, M.; Arnould, M. [Institut d`Astronomie et d`Astrophysique, Bruxelles (Belgium)

    1994-12-31

    A microscopic calculation of the level density is performed, based on a combinatorial evaluation using a realistic single-particle level scheme. This calculation relies on a fast Monte Carlo algorithm, making it possible to consider heavy nuclei (i.e., large shell-model spaces) that could not previously be treated in combinatorial approaches. An exhaustive comparison of the predicted neutron s-wave resonance spacings with experimental data for a wide range of nuclei is presented.

  12. Humanoid robot simulator: a realistic dynamics approach

    OpenAIRE

    Lima, José; Gonçalves, José; Costa, Paulo; Moreira, António

    2008-01-01

    This paper describes a humanoid robot simulator with realistic dynamics. As simulation is a powerful tool for speeding up the development of control software, the suggested accurate simulator makes it possible to accomplish this goal. The simulator, based on the Open Dynamics Engine and the GLScene graphics library, provides instant visual feedback and allows the user to test any control strategy without damaging the real robot in the early stages of development. The proposed simulator also captures some c...

  13. Realistic Behaviour Simulation of a Humanoid Robot

    OpenAIRE

    2008-01-01

    This paper describes a humanoid robot simulator with realistic dynamics. As simulation is a powerful tool for speeding up the development of control software, the proposed accurate simulator makes it possible to fulfil this goal. The simulator is based on the Open Dynamics Engine and the GLScene graphics library, providing instant visual feedback. The user is able to test any control strategy without damaging the real robot in the early stages of development. The proposed simulator also captures some...

  14. Algorithm for Realistic Modeling of Graphitic Systems

    Directory of Open Access Journals (Sweden)

    A.V. Khomenko

    2011-01-01

    Full Text Available An algorithm for molecular dynamics simulations of graphitic systems using realistic semiempirical interaction potentials of carbon atoms taking into account both short-range and long-range contributions is proposed. Results of the use of the algorithm for a graphite sample are presented. The scalability of the algorithm depending on the system size and the number of processor cores involved in the calculations is analyzed.

  15. Dynamical Symmetries Reflected in Realistic Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Sviratcheva, K.D.; Draayer, J.P.; /Louisiana State U.; Vary, J.P.; /Iowa State U. /LLNL, Livermore /SLAC

    2007-04-06

    Realistic nucleon-nucleon (NN) interactions, derived within the framework of meson theory or more recently in terms of chiral effective field theory, yield new possibilities for achieving a unified microscopic description of atomic nuclei. Based on spectral distribution methods, a comparison of these interactions to a most general Sp(4) dynamically symmetric interaction, which previously we found to reproduce well that part of the interaction that is responsible for shaping pairing-governed isobaric analog $0^+$ states, can determine the extent to which this significantly simpler model Hamiltonian can be used to obtain an approximate, yet very good description of low-lying nuclear structure. And furthermore, one can apply this model in situations that would otherwise be prohibitive because of the size of the model space. In addition, we introduce a Sp(4) symmetry breaking term by including the quadrupole-quadrupole interaction in the analysis and examining the capacity of this extended model interaction to imitate realistic interactions. This provides a further step towards gaining a better understanding of the underlying foundation of realistic interactions and their ability to reproduce striking features of nuclei such as strong pairing correlations or collective rotational motion.

  16. Realistically Rendering SoC Traffic Patterns with Interrupt Awareness

    DEFF Research Database (Denmark)

    Angiolini, Frederico; Mahadevan, Sharkar; Madsen, Jan;

    2005-01-01

    In Multi-Processor System-on-Chip (MPSoC) design stages, accurate modeling of IP behaviour is crucial to analyze interconnect effectiveness. However, parallel development of components may cause IP core models to be still unavailable when tuning communication performance. Traditionally, synthetic traffic generators have been used to overcome such an issue. However, target applications increasingly present non-trivial execution flows and synchronization patterns, especially in presence of underlying operating systems and when exploiting interrupt facilities. This property makes it very difficult to generate realistic test traffic. This paper presents a selection of applications using interrupt-based synchronization; a reference methodology to split such applications in execution subflows and to adjust the overall execution stream based upon hardware events; a reactive simulation device capable...

  17. Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books

    Science.gov (United States)

    Kelley, Jane E.; Darragh, Janine J.

    2011-01-01

    Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…

  18. Dynamic Simulations of Realistic Upper-Ocean Flow Processes to Support Measurement and Data Analysis

    Science.gov (United States)

    2015-09-30

    ... shear turbulence. The more uniform mean velocity profile is related to the strong mixing effect of the Langmuir circulations. This mixing effect ... and evolution of Langmuir cells in simulations; quantify the statistics of Langmuir turbulence. Investigate the onset of wave breaking; quantify ...

  19. A Patent Analysis on Realistic Media for R&D Projects

    Science.gov (United States)

    Hwang, Sung-Hyun; Yeon, Seung-Jun

    In this paper, we use patent statistics to assess the recent status of R&D in international patents. The number of patents was used to compare different countries' shares of the overall number of technology patents. Using the degree of patent citation, the influence and technological prowess of the patents were examined. Implications were also drawn from an analysis of the content of patents on realistic media.

  20. Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books

    Science.gov (United States)

    Kelley, Jane E.; Darragh, Janine J.

    2011-01-01

    Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…

  1. Electron percolation in realistic models of carbon nanotube networks

    Science.gov (United States)

    Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain

    2015-09-01

    The influence of penetrable and curved carbon nanotubes (CNTs) on charge percolation in three-dimensional disordered CNT networks has been studied with Monte Carlo simulations. By considering carbon nanotubes as solid objects whose electron-cloud overlap can be controlled, we observed that the structural characteristics of networks containing lower-aspect-ratio CNTs are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy of displacing CNTs to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNTs in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNTs decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of the thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
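
    Percolation studies of this kind reduce to two ingredients: a geometric intersection test and a connectivity search. A heavily simplified 2-D analogue (straight, widthless sticks instead of the paper's 3-D penetrable, curved CNTs) using union-find:

```python
import numpy as np

rng = np.random.default_rng(2)

def segments_cross(p1, p2, p3, p4):
    """2-D segment intersection via orientation (cross-product) tests."""
    d = lambda a, b, c: (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])
    return (d(p1, p2, p3) * d(p1, p2, p4) < 0 and
            d(p3, p4, p1) * d(p3, p4, p2) < 0)

def percolates(n_sticks, L=1.0, stick_len=0.15):
    """Drop random sticks in an L x L box and test whether a connected
    cluster bridges the left and right edges (virtual electrodes)."""
    centers = rng.uniform(0, L, (n_sticks, 2))
    angles = rng.uniform(0, np.pi, n_sticks)
    half = 0.5 * stick_len * np.c_[np.cos(angles), np.sin(angles)]
    a, b = centers - half, centers + half
    parent = list(range(n_sticks + 2))       # last two ids: electrodes
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    LEFT, RIGHT = n_sticks, n_sticks + 1
    for i in range(n_sticks):
        if min(a[i, 0], b[i, 0]) <= 0: union(i, LEFT)
        if max(a[i, 0], b[i, 0]) >= L: union(i, RIGHT)
        for j in range(i):
            if segments_cross(a[i], b[i], a[j], b[j]):
                union(i, j)
    return find(LEFT) == find(RIGHT)

print(np.mean([percolates(400) for _ in range(20)]))   # percolation prob.
```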

  2. Prediction of permeability for porous media reconstructed using multiple-point statistics.

    Science.gov (United States)

    Okabe, Hiroshi; Blunt, Martin J

    2004-12-01

    To predict multiphase flow through geologically realistic porous media, it is necessary to have a three-dimensional (3D) representation of the pore space. We use multiple-point statistics based on two-dimensional (2D) thin sections as training images to generate geologically realistic 3D pore-space representations. Thin-section images can provide multiple-point statistics, which describe the statistical relation between multiple spatial locations and use the probability of occurrence of particular patterns. Assuming that the medium is isotropic, a 3D image can be generated that preserves typical patterns of the void space seen in the thin sections. The method is tested on Berea sandstone for which a 3D image from micro-CT (Computerized Tomography) scanning is available and shows that the use of multiple-point statistics allows the long-range connectivity of the structure to be preserved, in contrast to two-point statistics methods that tend to underestimate the connectivity. Furthermore, a high-resolution 2D thin-section image of a carbonate reservoir rock is used to reconstruct 3D structures by the proposed method. The permeabilities of the statistical images are computed using the lattice-Boltzmann method (LBM). The results are similar to the measured values, to the permeability directly computed on the micro-CT image for Berea and to predictions using analysis of the 2D images and the effective medium approximation.
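
    The "multiple-point statistics" ingredient is, at its core, a table of pattern frequencies scanned from the training image. A toy version of that extraction step (with a random binary image standing in for a segmented thin section; the subsequent sequential simulation of the 3-D pore space is not shown):

```python
import numpy as np
from collections import Counter

# Scan a binary 2-D training image with a small template and record the
# frequency of every pattern; these probabilities then drive the
# sequential simulation of the pore space.
rng = np.random.default_rng(4)
training_image = (rng.random((128, 128)) < 0.3).astype(np.uint8)  # stand-in

template = 3                                  # 3x3 multiple-point template
patterns = Counter(
    tuple(training_image[i:i+template, j:j+template].ravel())
    for i in range(128 - template + 1)
    for j in range(128 - template + 1)
)
total = sum(patterns.values())
pattern, count = patterns.most_common(1)[0]
print(f"most frequent 3x3 pattern occurs with p = {count / total:.3f}")
```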

  3. Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures

    Science.gov (United States)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  4. Algebraic Statistics

    OpenAIRE

    Norén, Patrik

    2013-01-01

    Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges, and therefore new algebraic results often need to be developed. This interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...

  5. Adiabatic hyperspherical analysis of realistic nuclear potentials

    CERN Document Server

    Daily, K M; Greene, Chris H

    2015-01-01

    Using the hyperspherical adiabatic method with the realistic nuclear potentials Argonne V14, Argonne V18, and Argonne V18 with the Urbana IX three-body potential, we calculate the adiabatic potentials and the triton bound state energies. We find that a discrete variable representation with the slow variable discretization method along the hyperradial degree of freedom results in energies consistent with the literature. However, using a Laguerre basis results in missing energy, even when extrapolated to an infinite number of basis functions and channels. We do not include the isospin $T=3/2$ contribution in our analysis.

  6. Realistic searches on stretched exponential networks

    Indian Academy of Sciences (India)

    Parongama Sen

    2008-08-01

    We consider navigation or search schemes on networks which have a degree distribution of the form P(k) ∝ exp(−k^γ). In addition, the linking probability is taken to be dependent on social distances and is governed by a parameter λ. The searches are realistic in the sense that not all search chains can be completed. An estimate of μ = ρ/ℓ_d, where ρ is the success rate and ℓ_d the dynamic path length, shows that for a network of N nodes, μ ∝ N^(−δ) in general. The dynamic small-world effect, i.e., δ ≃ 0, is shown to exist in a restricted region of the λ-γ plane.

  7. An Introduction to a Realistic Quantum Physics

    CERN Document Server

    Preparata, Giuliano

    2003-01-01

    This book is a remarkable synthesis, a clear and simple introduction to Quantum Physics with a sort of Galilean dialogue on the supreme systems of contemporary Physics. The author, whose research interests and work extended from quarks to liquid systems and from crystals to stars, introduces the common conceptual and mathematical framework of all quantum theories, realistic enough to successfully confront Nature: Quantum Field Theory applied to the study of both dilute and condensed matter. In the dilute limit, quantum mechanics is shown to be a good approximation to Quantum Field Theory. Howe

  8. Multifractal network generator.

    Science.gov (United States)

    Palla, Gergely; Lovász, László; Vicsek, Tamás

    2010-04-27

    We introduce a new approach to constructing networks with realistic features. Our method, in spite of its conceptual simplicity (it has only two parameters), is capable of generating a wide variety of network types with prescribed statistical properties, e.g., with degree or clustering coefficient distributions of various, very different forms. In turn, these graphs can be used to test hypotheses or as models of actual data. The method is based on a mapping between suitably chosen singular measures defined on the unit square and sparse infinite networks. Such a mapping has the great potential of allowing for graph theoretical results for a variety of network topologies. The main idea of our approach is to go to the infinite limit of the singular measure and the size of the corresponding graph simultaneously. A very unique feature of this construction is that with increasing system size the generated graphs become topologically more structured. We present analytic expressions, derived from the parameters of the initial generating measure (which is to be iterated), for such major characteristics of graphs as their degree, clustering coefficient, and assortativity coefficient distributions. The optimal parameters of the generating measure are determined from a simple simulated annealing process. Thus, the present work provides a tool for researchers from a variety of fields (such as biology, computer science, or complex systems), enabling them to create a versatile model of their network data.
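
    A finite-size version of the generator is short enough to sketch. Under the simplifying assumption of an m×m symmetric generating measure iterated k times (here m = 2 and k = 3, with arbitrary probabilities rather than annealed ones), node linking reduces to a lookup in the Kronecker power of the measure:

```python
import numpy as np

rng = np.random.default_rng(8)

# Iterate a small symmetric generating measure on the unit square, then
# link nodes with the probabilities of the boxes their coordinates fall
# into. The 2x2 measure and iteration depth are illustrative choices.
p = np.array([[0.9, 0.3],
              [0.3, 0.1]])           # symmetric generating measure
lengths = np.array([0.4, 0.6])       # division of the unit interval

k = 3                                # number of iterations
P, L = p.copy(), lengths.copy()
for _ in range(k - 1):
    P = np.kron(P, p)                # iterated link-probability boxes
    L = np.outer(L, lengths).ravel() # matching box widths

bounds = np.cumsum(L)
n = 500
coords = rng.uniform(0, 1, n)
boxes = np.searchsorted(bounds, coords)   # box index of each node

# Link each pair with the probability of its box in the iterated measure
adj = rng.random((n, n)) < P[boxes[:, None], boxes[None, :]]
adj = np.triu(adj, 1)
adj = adj | adj.T
print("mean degree:", adj.sum() / n)
```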

  9. Modeling of Transmembrane Potential in Realistic Multicellular Structures before Electroporation.

    Science.gov (United States)

    Murovec, Tomo; Sweeney, Daniel C; Latouche, Eduardo; Davalos, Rafael V; Brosseau, Christian

    2016-11-15

    Many approaches for studying the transmembrane potential (TMP) induced during the treatment of biological cells with pulsed electric fields have been reported. From the simple analytical models to more complex numerical models requiring significant computational resources, a gamut of methods have been used to recapitulate multicellular environments in silico. Cells have been modeled as simple shapes in two dimensions as well as more complex geometries attempting to replicate realistic cell shapes. In this study, we describe a method for extracting realistic cell morphologies from fluorescence microscopy images to generate the piecewise continuous mesh used to develop a finite element model in two dimensions. The preelectroporation TMP induced in tightly packed cells is analyzed for two sets of pulse parameters inspired by clinical irreversible electroporation treatments. We show that high-frequency bipolar pulse trains are better, and more homogeneously raise the TMP of tightly packed cells to a simulated electroporation threshold than conventional irreversible electroporation pulse trains, at the expense of larger applied potentials. Our results demonstrate the viability of our method and emphasize the importance of considering multicellular effects in the numerical models used for studying the response of biological tissues exposed to electric fields.

  10. Towards Modeling Realistic Mobility for Performance Evaluations in MANET

    Science.gov (United States)

    Aravind, Alex; Tahir, Hassan

    Simulation modeling plays a crucial role in conducting research on complex dynamic systems like mobile ad hoc networks, and is often the only way. Simulation has been successfully applied in MANET research for more than two decades. In several recent studies, it has been observed that the credibility of simulation results in the field has decreased while the use of simulation has steadily increased. Part of this credibility crisis has been attributed to the simulation of the mobility of the nodes in the system. Mobility has a fundamental influence on the behavior and performance of mobile ad hoc networks. Accurate modeling and knowledge of the mobility of the nodes in the system is not only helpful but essential for the understanding and interpretation of the performance of the system under study. Several ideas, mostly in isolation, have been proposed in the literature to infuse realism into the mobility of nodes. In this paper, we attempt a holistic analysis of creating realistic mobility models and then demonstrate the creation and analysis of realistic mobility models using a software tool we have developed. Using our software tool, the desired mobility of the nodes in the system can be specified, generated, and analyzed, and the trace can then be exported to be used in performance studies of proposed algorithms or systems.
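
    As a baseline for what such tools enrich, the classic random-waypoint model (widely criticized as too simplistic, which is part of this paper's motivation) can be generated in a few lines; the area, speed range, and durations below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(6)

def random_waypoint(n_nodes, steps, area=1000.0, v_max=20.0, dt=1.0):
    """Classic random-waypoint mobility trace: each node repeatedly picks
    a random destination and speed, then travels toward it."""
    pos = rng.uniform(0, area, (n_nodes, 2))
    dest = rng.uniform(0, area, (n_nodes, 2))
    speed = rng.uniform(1.0, v_max, n_nodes)
    trace = np.empty((steps, n_nodes, 2))
    for t in range(steps):
        vec = dest - pos
        dist = np.maximum(np.linalg.norm(vec, axis=1), 1e-9)
        step = np.minimum(speed * dt, dist)      # do not overshoot waypoint
        pos += vec / dist[:, None] * step[:, None]
        trace[t] = pos
        arrived = dist <= speed * dt             # pick a new waypoint
        dest[arrived] = rng.uniform(0, area, (arrived.sum(), 2))
        speed[arrived] = rng.uniform(1.0, v_max, arrived.sum())
    return trace

trace = random_waypoint(n_nodes=50, steps=600)   # 10 min at 1 s resolution
```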

  11. Toward the classification of the realistic free fermionic models

    Energy Technology Data Exchange (ETDEWEB)

    Faraggi, A.E.

    1997-08-01

    The realistic free fermionic models have had remarkable success in providing plausible explanations for various properties of the Standard Model, which include the natural appearance of three generations, the explanation of the heavy top quark mass and the qualitative structure of the fermion mass spectrum in general, the stability of the proton, and more. These intriguing achievements make evident the need to understand the general space of these models. While the number of possibilities is large, general patterns can be extracted. In this paper the author presents a detailed discussion of the construction of the realistic free fermionic models with the aim of providing some insight into the basic structures and building blocks that enter the construction. The role of free phases in the determination of the phenomenology of the models is discussed in detail. The author discusses the connection between the free phases and mirror symmetry in (2,2) models and the corresponding symmetries in the case of (2,0) models. The importance of the free phases in determining the effective low-energy phenomenology is illustrated in several examples. The classification of the models in terms of boundary condition selection rules, real world-sheet fermion pairings, exotic matter states and the hidden sector is discussed.

  12. UAV based distributed ATR under realistic simulated environmental effects

    Science.gov (United States)

    Chen, Xiaohan; Gong, Shanshan; Schmid, Natalia A.; Valenti, Matthew C.

    2007-04-01

    Over the past several years, the military has grown increasingly reliant upon the use of unattended aerial vehicles (UAVs) for surveillance missions. There is an increasing trend towards fielding swarms of UAVs operating as large-scale sensor networks in the air. Such systems tend to be used primarily for the purpose of acquiring sensory data with the goal of automatic detection, identification, and tracking objects of interest. These trends have been paralleled by advances in both distributed detection, image/signal processing and data fusion techniques. Furthermore, swarmed UAV systems must operate under severe constraints on environmental conditions and sensor limitations. In this work, we investigate the effects of environmental conditions on target detection and recognition performance in a UAV network. We assume that each UAV is equipped with an optical camera, and use a realistic computer simulation to generate synthetic images. The detection algorithm relies on Haar-based features while the automatic target recognition (ATR) algorithm relies on Bessel K features. The performance of both algorithms is evaluated using simulated images that closely mimic data acquired in a UAV network under realistic environmental conditions. We design several fusion techniques and analyze both the case of a single observation and the case of multiple observations of the same target.

  13. Shadow obstacle model for realistic corner-turning behavior in crowd simulation

    Institute of Scientific and Technical Information of China (English)

    Gao-qi HE; Yi JIN; Qi CHEN; Zhen LIU; Wen-hui YUE; Xing-jian LU

    2016-01-01

    This paper describes a novel model, known as the shadow obstacle model, for generating realistic corner-turning behavior in crowd simulation. The motivation for this model comes from the observation that people tend to choose a safer route rather than a shorter one when turning a corner. To calculate a safer route, an optimization method is proposed that generates a corner-turning rule maximizing the viewing range for the agents. By combining psychological and physical forces, a full crowd simulation framework is established to provide a more realistic crowd simulation. We demonstrate that our model produces more realistic corner-turning behavior by comparison with real data obtained from experiments. Finally, we perform a parameter analysis to show the believability of our model through a series of experiments.

  14. Realistic affective forecasting: The role of personality.

    Science.gov (United States)

    Hoerger, Michael; Chapman, Ben; Duberstein, Paul

    2016-11-01

    Affective forecasting often drives decision-making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the "realistic paradigm" in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesised that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine's Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesised, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts.

  15. Realistic Radio Communications in Pilot Simulator Training

    Science.gov (United States)

    Burki-Cohen, Judith; Kendra, Andrew J.; Kanki, Barbara G.; Lee, Alfred T.

    2000-01-01

    Simulators used for total training and evaluation of airline pilots must satisfy stringent criteria in order to assure their adequacy for training and checking maneuvers. Air traffic control and company radio communications simulation, however, may still be left to role-play by the already taxed instructor/evaluators in spite of their central importance in every aspect of the flight environment. The underlying premise of this research is that providing a realistic radio communications environment would increase safety by enhancing pilot training and evaluation. This report summarizes the first-year efforts of assessing the requirement and feasibility of simulating radio communications automatically. A review of the training and crew resource/task management literature showed both practical and theoretical support for the need for realistic radio communications simulation. A survey of 29 instructor/evaluators from 14 airlines revealed that radio communications are mainly role-played by the instructor/evaluators. This increases instructor/evaluators' own workload while unrealistically lowering pilot communications load compared to actual operations, with a concomitant loss in training/evaluation effectiveness. A technology review searching for an automated means of providing radio communications to and from aircraft with minimal human effort showed that while promising, the technology is still immature. Further research and the need for establishing a proof-of-concept are also discussed.

  16. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.

  17. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  18. Dynamics of entanglement in realistic chains of superconducting qubits

    CERN Document Server

    Tsomokos, D I; Huelga, S F; Plenio, M B

    2006-01-01

    The quantum dynamics of chains of superconducting qubits is analyzed under realistic experimental conditions. Electromagnetic fluctuations due to the background circuitry, finite temperature in the external environment, and disorder in the initial preparation and the control parameters are taken into account. It is shown that the amount of disorder that is typically present in current experiments does not affect the entanglement dynamics significantly. However, the effect of the environmental noise can modify entanglement generation and propagation across the chain. We study the persistence of coherent effects in the presence of noise and possible ways to efficiently detect the presence of quantum entanglement. We also discuss under which circumstances the system exhibits steady state entanglement for both short and long chains and show that there are parameter regimes where the steady state entanglement is strictly non-monotonic as a function of the noise strength. We present optimized schemes for entanglement ...

  19. Unsteady velocity measurements in a realistic intracranial aneurysm model

    Science.gov (United States)

    Ugron, Ádám; Farinas, Marie-Isabelle; Kiss, László; Paál, György

    2012-01-01

    The initiation, growth and rupture of intracranial aneurysms are intensively studied by computational fluid dynamics. To gain confidence in the results of numerical simulations, validation of the results is necessary. To this end, the unsteady flow was measured in a silicone phantom of a realistic intracranial aneurysm. A flow circuit was built with a novel unsteady flow rate generating method, used to model the idealised shape of the heartbeat. This allowed the measurement of the complex three-dimensional velocity distribution by means of laser-optical methods such as laser Doppler anemometry (LDA) and particle image velocimetry (PIV). The PIV measurements, available with high temporal and spatial resolution, were found to be in good agreement with the control LDA measurements. Furthermore, excellent agreement was found with the numerical results.

  20. Towards realistic string vacua from branes at singularities

    CERN Document Server

    Conlon, Joseph P; Quevedo, Fernando

    2009-01-01

    We report on progress towards constructing string models incorporating both realistic D-brane matter content and full moduli stabilisation with dynamical low-scale supersymmetry breaking. The general framework is that of local D-brane models embedded into the LARGE volume approach to moduli stabilisation. We review quiver theories on del Pezzo n (dP_n) singularities including both D3 and D7 branes. We provide supersymmetric examples with three quark/lepton families and the gauge symmetries of the Standard, Left-Right Symmetric, Pati-Salam and Trinification models, without unwanted chiral exotics. We describe how the singularity structure leads to family symmetries governing the Yukawa couplings which can give mass hierarchies among the different generations. We outline how these models can be embedded into compact Calabi-Yau compactifications with LARGE volume moduli stabilisation, and state the minimal conditions for this to be possible. We study the general structure of soft supersymmetry breaking. At the s...

  1. On the use of statistics and kriging to the monitoring of hydro generating units at Hydro-Quebec; Application de la statistique et du krigeage a la surveillance de groupes hydroelectriques a Hydro-Quebec

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, F.

    1996-12-31

    Optimizing the preventive monitoring of a hydro generating unit requires a good statistical representation of its behaviour. However, the unit usually operates at one of the three working rates of greatest interest, which results in poor statistical data for any other working rate. Kriging has been used in a simplified and efficient way that takes into account, for each state, the preceding state and the following one (dual kriging). In addition, a novel approach for computing the standard deviation of a distribution is described, as well as a method for estimating the bias of the interpolation used to calculate the latter's range of validity. (D.L.) 32 refs.
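    As a rough illustration of the kriging machinery involved, the sketch below implements plain ordinary kriging in one dimension with a Gaussian covariance. It is not the dual-kriging formulation of the paper, the `ordinary_kriging` helper is hypothetical, and the monitoring data are invented.

    ```python
    import numpy as np

    def ordinary_kriging(x_obs, z_obs, x_new, length=1.0, sill=1.0):
        """Ordinary kriging with Gaussian covariance C(h) = sill * exp(-(h/length)^2)."""
        cov = lambda a, b: sill * np.exp(-((a[:, None] - b[None, :]) / length) ** 2)
        n = len(x_obs)
        # Kriging system: covariance matrix augmented with the unbiasedness constraint.
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = cov(x_obs, x_obs)
        A[n, n] = 0.0
        preds = []
        for x0 in np.atleast_1d(x_new):
            b = np.ones(n + 1)
            b[:n] = cov(x_obs, np.array([x0]))[:, 0]
            w = np.linalg.solve(A, b)[:n]   # kriging weights (Lagrange multiplier dropped)
            preds.append(w @ z_obs)
        return np.array(preds)

    # Toy monitoring signal: observed vibration level vs. unit working rate.
    x = np.array([0.2, 0.5, 0.6, 0.9])
    z = np.array([1.0, 1.8, 2.1, 1.2])
    print(ordinary_kriging(x, z, [0.4, 0.75], length=0.3))
    ```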

  2. Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models

    NARCIS (Netherlands)

    Koppert, M.M.J.; Kalitzin, S.; Lopes da Silva, F.H.; Viergever, M.A.

    2011-01-01

    In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models eliciting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations. This scena

  4. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.

  5. Trends in Alaska temperature data. Towards a more realistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-de-Lacalle, Javier [University of the Basque Country, Department of Applied Economics III (Econometrics and Statistics), Bilbao (Spain)

    2012-06-15

    Time series of seasonal temperatures recorded in Alaska during the past eighty years are analyzed. A common practice for measuring changes in the long-term pattern of a temperature series is to fit a deterministic linear trend. A deterministic trend is not a realistic approach, however, and poses some pitfalls from the statistical point of view. A statistical model that fits a latent time-varying level, independent of the Pacific climate shift, is proposed. The empirical distribution of temperature conditional on the phase of the Pacific Decadal Oscillation is obtained. The results reveal that the switch between the negative and the positive phase leads to differences in temperature of up to 4 °C in a given location and season. Differences across seasons and locations are detected. The effect of the Pacific climate shift is stronger in winter. An overall increase of temperatures is observed in the long term. The estimated trends are not constant but exhibit different patterns that vary in sign and strength over the sample period. (orig.)
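    The latent time-varying level can be illustrated with the local level (random-walk-plus-noise) model, the simplest state-space alternative to a deterministic linear trend. The sketch below filters such a level with a Kalman recursion; the variances and the synthetic series are made-up illustrations, not the paper's specification or data.

    ```python
    import numpy as np

    def local_level_filter(y, q=0.1, r=1.0):
        """Kalman filter for the local level model:
           y_t  = mu_t + eps_t,      eps ~ N(0, r)   (observation noise)
           mu_t = mu_{t-1} + eta_t,  eta ~ N(0, q)   (slowly drifting level)
        Returns the filtered level estimates."""
        mu, p = y[0], r                    # rough initialisation at the first datum
        level = np.empty_like(y, dtype=float)
        for t, obs in enumerate(y):
            p = p + q                      # predict: the level drifts, uncertainty grows
            k = p / (p + r)                # Kalman gain
            mu = mu + k * (obs - mu)       # update with the new observation
            p = (1 - k) * p
            level[t] = mu
        return level

    # Synthetic winter-temperature series with a mid-sample shift (illustration only).
    rng = np.random.default_rng(0)
    y = np.concatenate([rng.normal(-12, 2, 40), rng.normal(-9, 2, 40)])
    print(local_level_filter(y, q=0.05, r=4.0)[[0, 39, 79]])
    ```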

  6. Teaching Practice in a Realistic Context

    Institute of Scientific and Technical Information of China (English)

    1994-01-01

    Introduction For two academic years, Teaching Practice (TP), an important component of the in-service Advanced Teacher Training Course at Chongqing University, was set up in an almost ideal context: students specially recruited, materials selected by the trainees and photo-copying facilities provided. While trainees felt very excited by their performances during TP, when all the excitement had gone, they felt less optimistic at the thought of the constraints which would be placed on their teaching in their own institutions. So their evaluation of TP ended with a "Yes, but" statement. Drawing on insights gained from those two years, the British lecturers and Chinese counterparts generally felt that TP should, in future, operate within a realistic context. In this paper, we shall report how TP was conducted within our Foreign Languages Department.

  7. Realistic Detectability of Close Interstellar Comets

    CERN Document Server

    Cook, Nathaniel V; Granvik, Mikael; Stephens, Denise C

    2016-01-01

    During the planet formation process, billions of comets are created and ejected into interstellar space. The detection and characterization of such interstellar comets (also known as extra-solar planetesimals or extra-solar comets) would give us in situ information about the efficiency and properties of planet formation throughout the galaxy. However, no interstellar comets have ever been detected, despite the fact that their hyperbolic orbits would make them readily identifiable as unrelated to the solar system. Moro-Martín et al. (2009) have made a detailed and reasonable estimate of the properties of the interstellar comet population. We extend their estimates of detectability with a numerical model that allows us to consider "close" interstellar comets, e.g., those that come within the orbit of Jupiter. We include several constraints on a "detectable" object that allow for realistic estimates of the frequency of detections expected from the Large Synoptic Survey Telescope (LSST) and other surveys. The inf...

  8. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    Booming electronic books (e-books), as an extension of the paper book, are popular with readers. Recently, much effort has been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.

  9. Realistic Multimedia Simulations for Informatics Students

    Directory of Open Access Journals (Sweden)

    Ioannis Pachoulaki

    2012-08-01

    Realistic multimedia simulations are effective in helping students overcome their fear of physics and gain fundamental knowledge of physical processes. An elective course has been designed in the Applied Informatics and Multimedia Department at TEI of Crete to help informatics students overcome their physics shyness by hands-on experience on scientific multimedia simulations. The approach is justified in terms of the rich employment opportunities in the game and multimedia industries where a sound basis in physics, mathematics and numerical analysis is a necessity. Student feedback shows that they embrace the adopted approach, which uses open source tools to minimize programming so as to allow both instructor and students to focus on the science and complete a greater number of simulations.

  10. Performance Evaluation of MANET in Realistic Environment

    Directory of Open Access Journals (Sweden)

    Shailender Gupta

    2012-07-01

    In order to facilitate communication in a Mobile Ad hoc Network (MANET), routing protocols are developed. The performance of these protocols depends upon various factors, such as transmission range, the number of nodes deployed, and the mobility of the nodes. Another factor which affects the performance of MANET routing protocols is the environment in which the ad hoc network is deployed. The MANET environment may contain obstacles such as mountains, lakes, buildings, and rivers. These obstacles restrict node movement but may or may not obstruct the effective transmission range of the deployed nodes. This paper is an effort to evaluate the performance of MANET routing protocols in the presence of obstacles, using a simulator designed in MATLAB-10. To make the situation more realistic, obstacles of different shapes, sizes, numbers, and types were introduced in the simulation region. We found that they have a significant impact on the performance of the routing protocols.

  11. E. H. Carr: En kompleks realist

    DEFF Research Database (Denmark)

    Meyn, Carina Ann

    2008-01-01

    Edward Hallett Carr has been called everything from a realist to a utopian to simply "dangerous". This article provides a better understanding of why a central landmark in the discipline of international politics is described in so many, apparently contradictory, ways. Instead of starting from either a realist or a utopian reading of Carr's works, this article contributes a third reading of Carr's theory, one that thinks "the realist Carr" and "the utopian Carr" together. The overall argument of the article is that in order to find meaning and coherence in Carr's political theory, one must necessarily consider Carr's realism and utopianism together with his underlying radicalism.

  12. The Radiative Tail of Realistic Gravitational Collapse

    CERN Document Server

    Hod, S

    2000-01-01

    An astrophysically realistic model of wave dynamics in black-hole spacetimes must involve a non-spherical background geometry with angular momentum. We consider the evolution of gravitational (and electromagnetic) perturbations in rotating Kerr spacetimes. We show that a rotating Kerr black hole becomes "bald" slower than the corresponding spherically-symmetric Schwarzschild black hole. Moreover, our results turn over the traditional belief (which has been widely accepted during the last three decades) that the late-time tail of gravitational collapse is universal. In particular, we show that different fields have different decay rates. Our results are also of importance both to the study of the no-hair conjecture and the mass-inflation scenario (stability of Cauchy horizons).

  13. Helioseismology of a Realistic Magnetoconvective Sunspot Simulation

    Science.gov (United States)

    Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L., Jr.

    2012-01-01

    We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.

  14. Nonsymmetrized hyperspherical harmonics with realistic NN potentials

    CERN Document Server

    Deflorian, Sergio; Leidemann, Winfried; Orlandini, Giuseppina

    2012-01-01

    The Schroedinger equation is solved for an A-nucleon system using an expansion of the wave function in nonsymmetrized hyperspherical harmonics. Our approach is both an extension and a modification of the formalism developed by Gattobigio et al. The extension consists in the inclusion of spin and isospin degrees of freedom, such that a calculation with more realistic NN potential models becomes possible, whereas the modification allows a much simpler determination of the fermionic ground state. The approach is applied to four- and six-body nuclei (4He, 6Li) with various NN potential models. It is shown that the results for ground-state energy and radius agree well with those from the literature.

  15. Presupernova neutrinos: realistic emissivities from stellar evolution

    Science.gov (United States)

    Patton, Kelly; Lunardini, Cecilia; Farmer, Rob; Timmes, Frank

    2017-01-01

    We present a calculation of neutrino emissivities and energy spectra from a presupernova, a massive star going through the advanced stages of nuclear burning before becoming a supernova. Neutrinos produced from beta decay and electron capture, as well as pair annihilation, plasmon decay, and the photoneutrino process are included. We use the state-of-the-art stellar evolution code MESA to obtain realistic conditions for temperature, density, electron fraction, and nuclear isotopic composition. We have found that beta processes contribute significantly to the neutrino flux at potentially detectable energies of a few MeV. Estimates for the number of events at several current and future detectors are presented for the last few hours before collapse.

  16. Design and validation of realistic breast models for use in multiple alternative forced choice virtual clinical trials.

    Science.gov (United States)

    Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R; Young, Kenneth C; Cooke, Victoria; Wilkinson, Louise; Given-Wilson, Rosalind M; Wallis, Matthew G; Wells, Kevin

    2017-04-07

    A novel method has been developed for generating quasi-realistic voxel phantoms which simulate the compressed breast in mammography and digital breast tomosynthesis (DBT). The models are suitable for use in virtual clinical trials requiring realistic anatomy which use the multiple alternative forced choice (AFC) paradigm and patches from the complete breast image. The breast models are produced by extracting features of breast tissue components from DBT clinical images including skin, adipose and fibro-glandular tissue, blood vessels and Cooper's ligaments. A range of different breast models can then be generated by combining these components. Visual realism was validated using a receiver operating characteristic (ROC) study of patches from simulated images calculated using the breast models and from real patient images. Quantitative analysis was undertaken using fractal dimension and power spectrum analysis. The average areas under the ROC curves for 2D and DBT images were 0.51 ± 0.06 and 0.54 ± 0.09, demonstrating that simulated and real images were statistically indistinguishable by expert breast readers (7 observers); errors represented as one standard error of the mean. The average fractal dimensions (2D, DBT) for real and simulated images were (2.72 ± 0.01, 2.75 ± 0.01) and (2.77 ± 0.03, 2.82 ± 0.04) respectively; errors represented as one standard error of the mean. Excellent agreement was found between power spectrum curves of real and simulated images, with average β values (2D, DBT) of (3.10 ± 0.17, 3.21 ± 0.11) and (3.01 ± 0.32, 3.19 ± 0.07) respectively; errors represented as one standard error of the mean. These results demonstrate that radiological images of these breast models realistically represent the complexity of real breast structures and can be used to simulate patches from mammograms and DBT images that are indistinguishable from

  18. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  19. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  20. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  1. Histoplasmosis Statistics

    Science.gov (United States)

    How common is histoplasmosis? In the United States, an estimated 60% to ...

  2. Statistical distributions

    CERN Document Server

    Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.

    2010-01-01

    A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

  3. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  4. Simulation of realistic instrument noise for GRACE follow-on.

    Science.gov (United States)

    Ellmer, Matthias; Mayer-Gürr, Torsten

    2014-05-01

    Computer simulations have been an indispensable tool in assessing and predicting the performance of gravity recovery satellite missions, both present and future. Future satellite missions like GRACE follow-on will measure Earth's gravity with a much higher precision than their predecessors. This increased precision makes it necessary to reevaluate the applicability of current simulation strategies to future gravity missions. In past simulation efforts, effects that are known to be relevant factors for mission performance are often approximated or modeled incompletely. One such effect is the noise applied to simulated observables like precise orbits or K-band ranges. These noisy observables are generated by adding simple white noise of a specific power to noise-free raw measurements. The noisy observables are then used in closed-loop simulations to quantify the performance of specific instruments, or a mission scenario as a whole. This work presents strategies to generate more realistic noise for satellite missions as implemented in the GROOPS (Gravity Recovery Object Oriented Programming System) software package. A generic interface for different noise generators is implemented in GROOPS. This interface is used to add different types of noise, such as white noise, colored or correlated noise, or noise with a given power spectral density, to the generated observables. It is thus possible to study the effect of the chosen noise model on the generated observable and, conversely, on the recovered gravity field as a whole. A better knowledge of the noise characteristics of the instruments on GRACE and GRACE follow-on will allow us to improve our understanding of their complex interactions. It will also allow us to improve our processing strategies for both simulated and real data, and will thus lead to a more precise and better understood recovered gravity field.
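    One standard way to realize "noise with a given power spectral density", as the interface described above allows, is frequency-domain shaping of white Gaussian noise. The sketch below illustrates that generic technique; it is not GROOPS code, the `noise_from_psd` helper is hypothetical, and the normalisation is only schematic.

    ```python
    import numpy as np

    def noise_from_psd(psd_onesided, n, fs=1.0, seed=0):
        """Generate a real noise series whose spectrum follows psd_onesided(f),
        by shaping white Gaussian noise in the frequency domain
        (up to an overall normalisation)."""
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        white = np.fft.rfft(rng.standard_normal(n))
        amp = np.sqrt(psd_onesided(freqs))    # amplitude shaping ~ sqrt(PSD)
        amp[0] = 0.0                          # remove the DC component
        return np.fft.irfft(white * amp, n)

    # Example: 1/f ("flicker") noise, a common instrument-noise model.
    flicker = noise_from_psd(
        lambda f: np.where(f > 0, 1.0 / np.maximum(f, 1e-6), 0.0), n=4096)
    print(flicker[:4])
    ```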

  5. The realistic meaning of Larry’s image in The Razor’s Edge

    Institute of Scientific and Technical Information of China (English)

    王国婧

    2015-01-01

    Larry, a character in the novel The Razor's Edge by the British novelist Maugham, was traumatized by the First World War and set out on a path of exploring the meaning of life. In this paper, the author analyzes the image of Larry as a free explorer and a selfless rescuer, in order to probe further into this image and its corresponding realistic meaning; in particular, his image played an important role in guiding the exploring generation after the Second World War and in representing modern people in realistic society. Larry surmounted the boundaries of time and space, revealing the plight of the people and showing a way toward an ideal life with realistic meaning.

  6. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  7. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  8. Practical Statistics

    CERN Document Server

    Lyons, L

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  9. Performance of Airborne Precision Spacing Under Realistic Wind Conditions

    Science.gov (United States)

    Wieland, Frederick; Santos, Michel; Krueger, William; Houston, Vincent E.

    2011-01-01

    With the expected worldwide increase of air traffic during the coming decade, both the Federal Aviation Administration's (FAA's) Next Generation Air Transportation System (NextGen) and Eurocontrol's Single European Sky ATM Research (SESAR) program have, as part of their plans, air traffic management solutions that can increase performance without requiring time-consuming and expensive infrastructure changes. One such solution involves the ability of both controllers and flight crews to deliver aircraft to the runway with greater accuracy than is possible today. Previous research has shown that time-based spacing techniques, wherein the controller assigns a time spacing to each pair of arriving aircraft, are one way to achieve this goal by providing greater runway delivery accuracy that produces a concomitant increase in system-wide performance. The research described herein focuses on a specific application of time-based spacing, called Airborne Precision Spacing (APS), which has evolved over the past ten years. This research furthers APS understanding by studying its performance with realistic wind conditions obtained from atmospheric sounding data and with realistic wind forecasts obtained from the Rapid Update Cycle (RUC) short-range weather forecast. In addition, this study investigates APS performance with limited surveillance range, as provided by the Automatic Dependent Surveillance-Broadcast (ADS-B) system, and with an algorithm designed to improve APS performance when an ADS-B signal is unavailable. The results presented herein quantify the runway threshold delivery accuracy of APS under these conditions, and also quantify resulting workload metrics such as the number of speed changes required to maintain spacing.

  10. Realistic Scheduling Mechanism for Smart Homes

    Directory of Open Access Journals (Sweden)

    Danish Mahmood

    2016-03-01

    In this work, we propose a Realistic Scheduling Mechanism (RSM) to reduce user frustration and enhance appliance utility by classifying appliances with their respective constraints and times of use. Algorithms are proposed for the functioning of home appliances. A 24-hour time slot is divided into four logical sub-time slots, each composed of 360 min (6 h). In these sub-time slots, only the desired appliances (with respect to the appliance classification) are scheduled, raising appliance utility while restricting power consumption through a dynamically modelled power usage limiter that takes into account not only the electricity consumer but also the electricity supplier. Once the appliances, time slots, and power usage limiter are modelled, we use a nature-inspired heuristic, Binary Particle Swarm Optimization (BPSO), to form optimal schedules under the given constraints for each sub-time slot. These schedules aim to achieve an equilibrium between appliance utility and cost effectiveness. For validation of the proposed RSM, we provide a comparative analysis among unscheduled electrical load usage, scheduling directly by BPSO, and RSM, reflecting user comfort based upon cost effectiveness and appliance utility.
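    A minimal sketch of the BPSO step used for schedule formation within one sub-time slot is given below. The tariff, appliance loads, power limit, and penalty weights are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    SLOTS, APPLIANCES = 6, 4                 # one 6-hour sub-time slot, hourly bins
    price = np.array([2.0, 3.5, 5.0, 4.0, 2.5, 2.0])   # assumed tariff per bin
    load = np.array([1.0, 0.5, 2.0, 1.5])              # assumed kW per appliance
    limit = 3.0                                        # assumed power usage limit (kW)

    def fitness(bits):
        """Lower is better: electricity cost, plus penalties for exceeding the
        power limit and for appliances that never run (lost utility)."""
        x = bits.reshape(APPLIANCES, SLOTS)
        per_bin = load @ x                   # total kW drawn in each hourly bin
        cost = per_bin @ price
        over = np.maximum(per_bin - limit, 0.0).sum()
        idle = (x.sum(axis=1) == 0).sum()
        return cost + 50.0 * over + 20.0 * idle

    def bpso(n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
        dim = APPLIANCES * SLOTS
        x = rng.integers(0, 2, (n_particles, dim)).astype(float)
        v = np.zeros((n_particles, dim))
        pbest, pval = x.copy(), np.array([fitness(p) for p in x])
        gbest = pbest[pval.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(v.shape), rng.random(v.shape)
            v = np.clip(w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x), -6, 6)
            # Binary PSO: a bit is set with probability sigmoid(velocity).
            x = (rng.random(v.shape) < 1.0 / (1.0 + np.exp(-v))).astype(float)
            val = np.array([fitness(p) for p in x])
            better = val < pval
            pbest[better], pval[better] = x[better], val[better]
            gbest = pbest[pval.argmin()].copy()
        return gbest.reshape(APPLIANCES, SLOTS), pval.min()

    schedule, best_cost = bpso()
    print(schedule, best_cost)
    ```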

  11. Towards a realistic description of hadron resonances

    Science.gov (United States)

    Schmidt, R. A.; Canton, L.; Schweiger, W.; Plessas, W.

    2016-08-01

    We report on our attempts of treating excited hadron states as true quantum resonances. Hitherto the spectroscopy of mesons, usually considered as quark-antiquark systems, and of baryons, usually considered as three-quark systems, has been treated through excitation spectra of bound states (namely, confined few-quark systems), corresponding to poles of the quantum-mechanical resolvent at real negative values in the complex energy plane. As a result the wave functions, i.e. the residua of the resolvent, have not exhibited the behaviour as required for hadron resonances with their multiple decay modes. This has led to disturbing shortcomings in the description of hadronic resonance phenomena. We have aimed at a more realistic description of hadron resonances within relativistic constituent-quark models taking into account explicitly meson-decay channels. The corresponding coupled-channels theory is based on a relativistically invariant mass operator capable of producing hadron ground states with real energies and hadron resonances with complex energies, the latter corresponding to poles in the lower half-plane of the unphysical sheet of the complex energy plane. So far we have demonstrated the feasibility of the coupled-channels approach to hadron resonances along model calculations producing indeed the desired properties. The corresponding spectral properties will be discussed in this contribution. More refined studies are under way towards constructing a coupled-channels relativistic constituent-quark model for meson and baryon resonances.

  12. Pricing European Options in Realistic Markets

    CERN Document Server

    Schaden, M

    2002-01-01

    We investigate the relation between the fair price for European-style vanilla options and the distribution of short-term returns on the underlying asset, ignoring transaction and other costs. We compute the risk-neutral probability density conditional on the total variance of the asset's returns when the option expires. If the asset's future price has finite expectation, the option's fair value satisfies a parabolic partial differential equation of the Black-Scholes type in which the variance of the asset's returns, rather than a trading time, is the evolution parameter. By immunizing the portfolio against large-scale price fluctuations of the asset, the valuation of options is extended to the realistic case [St99] of assets whose short-term returns have finite variance but very large, or even infinite, higher moments. A dynamic Delta-hedged portfolio that is statically insured against exceptionally large fluctuations includes at least two different options on the asset. The fair value of an option in this c...
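    For orientation, the textbook Black-Scholes value can be written directly as a function of the total variance w = sigma^2 * T, the evolution parameter emphasized in the abstract. The sketch below assumes zero interest rates and no dividends; it is only the standard special case that the paper generalizes, not the paper's method.

    ```python
    from math import log, sqrt, erf

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def call_value_total_variance(s, k, w):
        """European call value as a function of total variance w = sigma^2 * T
        (zero interest rate and no dividends, for simplicity)."""
        if w <= 0.0:
            return max(s - k, 0.0)
        d1 = (log(s / k) + 0.5 * w) / sqrt(w)
        d2 = d1 - sqrt(w)
        return s * norm_cdf(d1) - k * norm_cdf(d2)

    # The price depends on volatility and maturity only through w:
    print(call_value_total_variance(100, 100, 0.04))   # e.g. sigma = 20%, T = 1 year
    print(call_value_total_variance(100, 100, 0.08))   # doubling w raises the value
    ```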

  13. Determination of Realistic Fire Scenarios in Spacecraft

    Science.gov (United States)

    Dietrich, Daniel L.; Ruff, Gary A.; Urban, David

    2013-01-01

    This paper expands on previous work that examined how large a fire a crew member could successfully survive and extinguish in the confines of a spacecraft. The hazards to the crew and equipment during an accidental fire include excessive pressure rise resulting in a catastrophic rupture of the vehicle skin, excessive temperatures that burn or incapacitate the crew (due to hyperthermia), carbon dioxide build-up or accumulation of other combustion products (e.g. carbon monoxide). The previous work introduced a simplified model that treated the fire primarily as a source of heat and combustion products and sink for oxygen prescribed (input to the model) based on terrestrial standards. The model further treated the spacecraft as a closed system with no capability to vent to the vacuum of space. The model in the present work extends this analysis to more realistically treat the pressure relief system(s) of the spacecraft, include more combustion products (e.g. HF) in the analysis and attempt to predict the fire spread and limiting fire size (based on knowledge of terrestrial fires and the known characteristics of microgravity fires) rather than prescribe them in the analysis. Including the characteristics of vehicle pressure relief systems has a dramatic mitigating effect by eliminating vehicle overpressure for all but very large fires and reducing average gas-phase temperatures.

  14. Sialendoscopy Training: Presentation of a Realistic Model.

    Science.gov (United States)

    Pascoto, Gabriela Robaskewicz; Stamm, Aldo Cassol; Lyra, Marcos

    2017-01-01

    Introduction Several surgical training simulators have been created for residents and young surgeons to gain experience with surgical procedures. Laboratory training is fundamental for acquiring familiarity with the techniques of surgery and skill in handling instruments. Objective The aim of this study is to present a novel simulator for training sialendoscopy. Method This realistic simulator was built with a synthetic thermo-retractile, thermo-sensitive rubber which, when combined with different polymers, produces more than 30 different formulas. These formulas present textures, consistencies, and mechanical resistance similar to many human tissues. Results The authors present a training model to practice sialendoscopy. All aspects of the procedure are simulated: mouth opening, dilatation of the papillae, insertion of the scope, visualization of stones, extraction of the stones with graspers or baskets, and finally, stone fragmentation with a holmium laser. Conclusion This anatomical model for sialendoscopy training should be considerably useful for abbreviating the learning curve of young surgeons while minimizing the consequences of technical errors.

  15. Cerebral blood flow simulations in realistic geometries

    Directory of Open Access Journals (Sweden)

    Szopos Marcela

    2012-04-01

    The aim of this work is to compute the blood flow in the entire cerebral network (arterial and venous), obtained from medical images such as 3D cerebral angiographies. We use free finite element codes such as FreeFEM++. We first test the code on analytical solutions in simplified geometries. Then we study the influence of boundary conditions on the flow, and we finally perform first computations on realistic meshes.

  16. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Huarui, E-mail: huarui.sun@bristol.ac.uk; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin [Center for Device Thermography and Reliability (CDTR), H. H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom)

    2015-01-26

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
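    To make the Weibull statistics concrete, the sketch below draws failure-site formation times with a shape parameter in the reported 0.7-0.9 range and recovers it by maximum likelihood using scipy; the scale and sample size are arbitrary illustrative choices, not the paper's data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    shape_true, scale_true = 0.8, 100.0     # shape < 1: decreasing hazard rate
    # Simulated failure-site formation times (arbitrary units).
    times = scale_true * rng.weibull(shape_true, size=200)

    # Maximum-likelihood fit with the location fixed at zero, as for lifetimes.
    shape_fit, loc, scale_fit = stats.weibull_min.fit(times, floc=0)
    print(f"fitted shape k = {shape_fit:.2f}, scale = {scale_fit:.1f}")

    # A shape parameter below 1 implies "infant mortality": the hazard rate
    # h(t) = (k / scale) * (t / scale) ** (k - 1) decreases with time.
    ```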

  17. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2005-01-01

    In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin

  18. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2010-01-01

    In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this

  19. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  20. Statistical physics

    CERN Document Server

    Wannier, Gregory H

    2010-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  1. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  2. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  3. SEER Statistics

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  4. Cancer Statistics

    Science.gov (United States)


  5. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  6. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved

  7. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  8. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  9. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to solve central problems of multivariate statistics that until now have remained unsolved. Traditional statistical methods, based on the idea of infinite sample size, often break down in the solution of real problems and, depending on the data, can be inefficient, unstable, or even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  10. Realistic Detectability of Close Interstellar Comets

    Science.gov (United States)

    Cook, Nathaniel V.; Ragozzine, Darin; Granvik, Mikael; Stephens, Denise C.

    2016-07-01

    During the planet formation process, billions of comets are created and ejected into interstellar space. The detection and characterization of such interstellar comets (ICs) (also known as extra-solar planetesimals or extra-solar comets) would give us in situ information about the efficiency and properties of planet formation throughout the galaxy. However, no ICs have ever been detected, despite the fact that their hyperbolic orbits would make them readily identifiable as unrelated to the solar system. Moro-Martín et al. have made a detailed and reasonable estimate of the properties of the IC population. We extend their estimates of detectability with a numerical model that allows us to consider “close” ICs, e.g., those that come within the orbit of Jupiter. We include several constraints on a “detectable” object that allow for realistic estimates of the frequency of detections expected from the Large Synoptic Survey Telescope (LSST) and other surveys. The influence of several of the assumed model parameters on the frequency of detections is explored in detail. Based on the expectation from Moro-Martín et al., we expect that LSST will detect 0.001-10 ICs during its nominal 10 year lifetime, with most of the uncertainty from the unknown number density of small (nuclei of ˜0.1-1 km) ICs. Both asteroid and comet cases are considered, where the latter includes various empirical prescriptions of brightening. Using simulated LSST-like astrometric data, we study the problem of orbit determination for these bodies, finding that LSST could identify their orbits as hyperbolic and determine an ephemeris sufficiently accurate for follow-up in about 4-7 days. We give the hyperbolic orbital parameters of the most detectable ICs. Taking the results into consideration, we give recommendations to future searches for ICs.

  11. On the realistic validation of photometric redshifts

    Science.gov (United States)

    Beck, R.; Lin, C.-A.; Ishida, E. E. O.; Gieseke, F.; de Souza, R. S.; Costa-Duarte, M. V.; Hattab, M. W.; Krone-Martins, A.

    2017-07-01

    Two of the main problems encountered in the development and accurate validation of photometric redshift (photo-z) techniques are the lack of spectroscopic coverage in the feature space (e.g. colours and magnitudes) and the mismatch between the photometric error distributions associated with the spectroscopic and photometric samples. Although these issues are well known, there is currently no standard benchmark allowing a quantitative analysis of their impact on the final photo-z estimation. In this work, we present two galaxy catalogues, Teddy and Happy, built to enable a more demanding and realistic test of photo-z methods. Using photometry from the Sloan Digital Sky Survey and spectroscopy from a collection of sources, we constructed data sets that mimic the mismatch between the underlying probability distributions of the real spectroscopic and photometric samples. We demonstrate the potential of these catalogues by submitting them to the scrutiny of different photo-z methods, including machine learning (ML) and template fitting approaches. Beyond the expected poor results from most ML algorithms in cases with missing coverage in the feature space, we were able to recognize the superiority of global models in the same situation, and the general failure across all types of methods when incomplete coverage is convolved with the presence of photometric errors - a data situation which photo-z methods have not been trained to deal with up to now and which must be addressed by future large-scale surveys. Our catalogues represent the first controlled environment allowing a straightforward implementation of such tests. The data are publicly available within the COINtoolbox (https://github.com/COINtoolbox/photoz_catalogues).

  12. Non-tachyonic semi-realistic non-supersymmetric heterotic-string vacua

    Energy Technology Data Exchange (ETDEWEB)

    Ashfaque, Johar M.; Athanasopoulos, Panos; Faraggi, Alon E.; Sonmez, Hasan [University of Liverpool, Department of Mathematical Sciences, Liverpool (United Kingdom)

    2016-04-15

    The heterotic-string models in the free fermionic formulation gave rise to some of the most realistic string models to date, which possess N = 1 spacetime supersymmetry. Lack of evidence for supersymmetry at the LHC instigated recent interest in non-supersymmetric heterotic-string vacua. We explore what may be learned in this context from the quasi-realistic free fermionic models. We show that constructions with a low number of families give rise to a proliferation of a priori tachyon-producing sectors, compared to the non-realistic examples, which typically may contain only one such sector; the reason is that in the realistic cases the internal six-dimensional space is fragmented into smaller units. We present one example of a quasi-realistic, non-supersymmetric, non-tachyonic, heterotic-string vacuum and compare the structure of its massless spectrum to the corresponding supersymmetric vacuum. While in some sectors supersymmetry is broken explicitly, i.e. the bosonic and fermionic sectors produce massless and massive states, other sectors, and in particular those leading to the chiral families, continue to exhibit Fermi-Bose degeneracy. In these sectors the massless spectrum, as compared to the supersymmetric cases, will only differ in some local or global U(1) charges. We discuss the conditions for obtaining n{sub b} = n{sub f} at the massless level in these models. Our example model contains an anomalous U(1) symmetry, which generates a tadpole diagram at one-loop order in string perturbation theory. We speculate that this tadpole diagram may cancel the corresponding diagram generated by the one-loop non-vanishing vacuum energy and that in this respect the supersymmetric and non-supersymmetric vacua should be regarded on an equal footing. Finally we discuss vacua that contain two supersymmetry generating sectors. (orig.)

  14. Validating Time-Distance Helioseismology With Realistic Quiet Sun Simulations

    CERN Document Server

    DeGrave, K; Rempel, M

    2014-01-01

    Linear time-distance helioseismic inversions are carried out for vector flow velocities using travel times measured from two $\sim 100^2\,{\rm Mm^2}\times 20\,{\rm Mm}$ realistic magnetohydrodynamic quiet-Sun simulations of about 20 hr. The goal is to test current seismic methods on these state-of-the-art simulations. Using recent three-dimensional inversion schemes, we find that inverted horizontal flow maps correlate well with the simulations in the upper $\sim 3$ Mm of the domains for several filtering schemes, including phase-speed, ridge, and combined phase-speed and ridge measurements. In several cases, however, the velocity amplitudes from the inversions severely underestimate those of the simulations, possibly indicating nonlinearity of the forward problem. We also find that, while near-surface inversions of the vertical velocities are best using phase-speed filters, in almost all other example cases these flows are irretrievable due to noise, suggesting a need for statistical averaging to obtain better...

  15. Broadband biphoton generation and statistics of quantum light in the UV-visible range in an AlGaN microring resonator.

    Science.gov (United States)

    De Leonardis, Francesco; Soref, Richard A; Soltani, Mohammad; Passaro, Vittorio M N

    2017-09-12

    We present a physical investigation on the generation of correlated photon pairs that are broadly spaced in the ultraviolet (UV) and visible spectrum on an AlGaN/AlN integrated photonic platform which is optically transparent at these wavelengths. Using spontaneous four wave mixing (SFWM) in an AlGaN microring resonator, we show design techniques to satisfy the phase matching condition between the optical pump, the signal, and idler photon pairs, a condition which is essential and is a key hurdle when operating at short wavelengths due to the strong normal dispersion of the material. Such UV-visible photon pairs are quite beneficial for interaction with qubit ions, whose transitions mostly lie in this wavelength range, and will enable heralding the photon-ion interaction. As a target application example, we present the systematic AlGaN microresonator design for generating signal and idler photon pairs using a blue wavelength pump, while the signal appears at the transition of the ytterbium ion (^171Yb^+, 369.5 nm) and the idler appears in the far blue or green range. The photon pairs have minimal crosstalk to the pump power due to their broad spacing in spectral wavelength, thereby relaxing the design of on-chip integrated filters for separating pump, signal and idler.
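
    The phase-matching condition referred to above couples energy conservation (2ω_p = ω_s + ω_i) with momentum conservation (2k_p = k_s + k_i). A minimal sketch of how such a signal/idler pair can be searched for numerically is given below; the effective-index function is a made-up placeholder with normal dispersion, not real AlGaN dispersion data, and nonlinear phase contributions are neglected.

```python
import numpy as np

def n_eff(lam_um):
    """Toy effective index with normal dispersion; NOT real AlGaN data."""
    return 2.1 + 0.05 / lam_um**2

def k(lam_um):
    """Propagation constant in rad/um for a given wavelength in um."""
    return 2.0 * np.pi * n_eff(lam_um) / lam_um

lam_p = 0.450                               # blue pump wavelength, um
lam_s = np.linspace(0.360, 0.445, 2000)     # candidate signal wavelengths, um
lam_i = 1.0 / (2.0 / lam_p - 1.0 / lam_s)   # idler fixed by energy conservation
dk = 2 * k(lam_p) - k(lam_s) - k(lam_i)     # linear phase mismatch
best = np.argmin(np.abs(dk))                # pair closest to phase matching
print(f"best pair: signal {lam_s[best]*1e3:.1f} nm, "
      f"idler {lam_i[best]*1e3:.1f} nm, |dk| = {abs(dk[best]):.3e} rad/um")
```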

  16. Statistical laws in linguistics

    CERN Document Server

    Altmann, Eduardo G

    2015-01-01

    Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws...

  17. Realistic CT simulation using the 4D XCAT phantom.

    Science.gov (United States)

    Segars, W P; Mahesh, M; Beck, T J; Frey, E C; Tsui, B M W

    2008-08-01

    The authors develop a unique CT simulation tool based on the 4D extended cardiac-torso (XCAT) phantom, a whole-body computer model of the human anatomy and physiology based on NURBS surfaces. Unlike current phantoms in CT based on simple mathematical primitives, the 4D XCAT provides an accurate representation of the complex human anatomy and has the advantage, due to its design, that its organ shapes can be changed to realistically model anatomical variations and patient motion. A disadvantage to the NURBS basis of the XCAT, however, is that the mathematical complexity of the surfaces makes the calculation of line integrals through the phantom difficult. They have to be calculated using iterative procedures; therefore, the calculation of CT projections is much slower than for simpler mathematical phantoms. To overcome this limitation, the authors used efficient ray tracing techniques from computer graphics, to develop a fast analytic projection algorithm to accurately calculate CT projections directly from the surface definition of the XCAT phantom given parameters defining the CT scanner and geometry. Using this tool, realistic high-resolution 3D and 4D projection images can be simulated and reconstructed from the XCAT within a reasonable amount of time. In comparison with other simulators with geometrically defined organs, the XCAT-based algorithm was found to be only three times slower in generating a projection data set of the same anatomical structures using a single 3.2 GHz processor. To overcome this decrease in speed would, therefore, only require running the projection algorithm in parallel over three processors. With the ever decreasing cost of computers and the rise of faster processors and multi-processor systems and clusters, this slowdown is basically inconsequential, especially given the vast improvement the XCAT offers in terms of realism and the ability to generate 3D and 4D data from anatomically diverse patients. As such, the authors conclude
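
    As a simplified illustration of the projection step, the sketch below approximates a single CT projection value as a discretely sampled line integral of an attenuation map. A voxel grid stands in for the NURBS-based XCAT anatomy, so this is a toy analogue of the paper's analytic ray tracer, not a reimplementation of it.

```python
import numpy as np

def line_integral(mu, start, end, n_samples=512):
    """Approximate the integral of mu along the segment start->end (voxel units)."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = start[None, :] + ts[:, None] * (end - start)[None, :]
    idx = np.round(pts).astype(int)          # nearest-neighbour sampling
    valid = np.all((idx >= 0) & (idx < np.array(mu.shape)), axis=1)
    step = np.linalg.norm(end - start) / (n_samples - 1)
    return mu[tuple(idx[valid].T)].sum() * step

# Toy phantom: a uniform attenuating sphere (radius 20) in a 64^3 volume.
g = np.indices((64, 64, 64))
mu = (((g - 32) ** 2).sum(axis=0) < 20 ** 2).astype(float) * 0.02

# One ray through the sphere's centre: chord length 40 * mu 0.02 ~ 0.8.
p = line_integral(mu, start=(0, 32, 32), end=(63, 32, 32))
print(f"projection value = {p:.2f}")
```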

  18. Do advanced statistical techniques really help in the diagnosis of the metabolic syndrome in patients treated with second-generation antipsychotics?

    Science.gov (United States)

    Van Schependom, Jeroen; Yu, Weiping; Gielen, Jeroen; Laton, Jorne; De Keyser, Jacques; De Hert, Marc; Nagels, Guy

    2015-10-01

    Metabolic and cardiovascular diseases in patients with schizophrenia have gained a lot of interest in recent years. Developing an algorithm to detect the metabolic syndrome based on readily available variables would eliminate the need for blood sampling, which is considered expensive and inconvenient in this population. All patients fulfilled DSM-IV diagnosis of schizophrenia or schizoaffective disorder. We used the International Diabetes Federation criteria (European population) to diagnose the metabolic syndrome. We used logistic regression and optimized artificial neural networks and support vector machines to detect the metabolic syndrome in a cohort of schizophrenic patients of the University Psychiatric Center Kortenberg, KU Leuven, Belgium. Testing was done on one-third of the included cohort (202 patients); training was performed using a 10-fold stratified cross-validation scheme. The data were collected between 2000 and 2008. All 3 methods yielded similar results, with satisfying accuracies of about 80%. However, none of the advanced statistical methods could improve on the results obtained using a very simple and naive model including only central obesity and information on blood pressure. Although so-called pattern recognition techniques hold great promise for improving clinical decision making, the results should be presented with caution and preferably in comparison with a less complicated technique. © Copyright 2015 Physicians Postgraduate Press, Inc.
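
    A minimal sketch of this kind of comparison is given below, assuming scikit-learn and synthetic stand-in data (the study's clinical variables are not reproduced here): a two-variable "naive" logistic model versus an SVM on the full feature set, scored with 10-fold stratified cross-validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 600
X_full = rng.normal(size=(n, 8))          # placeholder for real clinical features
y = (X_full[:, 0] + 0.8 * X_full[:, 1] + rng.normal(size=n) > 0).astype(int)
X_simple = X_full[:, :2]                  # e.g. central obesity + blood pressure

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, X, model in [
    ("naive logistic (2 vars)", X_simple, LogisticRegression()),
    ("SVM (all vars)", X_full, make_pipeline(StandardScaler(), SVC())),
]:
    acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: {acc.mean():.3f} +/- {acc.std():.3f}")
```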

  19. Statistical evaluation of transcriptomic data generated using the Affymetrix one-cycle, two-cycle and IVT-Express RNA labelling protocols with the Arabidopsis ATH1 microarray

    Directory of Open Access Journals (Sweden)

    Hodgman T

    2010-03-01

    Background: Microarrays are a powerful tool used for the determination of global RNA expression. There is an increasing requirement to focus on profiling gene expression in tissues where it is difficult to obtain large quantities of material, for example individual tissues within organs such as the root, or individual isolated cells. From such samples, it is difficult to produce the amount of RNA required for labelling and hybridisation in microarray experiments, thus a process of amplification is usually adopted. Despite the increasing use of two-cycle amplification for transcriptomic analyses on the Affymetrix ATH1 array, there has been no report investigating any potential bias in gene representation that may occur as a result. Results: Here we compare transcriptomic data generated using Affymetrix one-cycle (standard labelling protocol), two-cycle (small-sample protocol) and IVT-Express protocols with the Affymetrix ATH1 array using Arabidopsis root samples. Results obtained with each protocol are broadly similar. However, we show that there are 35 probe sets (of a total of 22810) that are misrepresented in the two-cycle data sets. Of these, 33 probe sets were classed as mis-amplified when comparisons of two independent publicly available data sets were undertaken. Conclusions: Given the unreliable nature of the highlighted probes, we caution against using data associated with the corresponding genes in analyses involving transcriptomic data generated with two-cycle amplification protocols. We have shown that the Affymetrix IVT-Express labelling protocol produces data with less associated bias than the two-cycle protocol, and as such, would recommend this kit for new experiments that involve small samples.

  20. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  1. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  2. The Bayesian statistical decision theory applied to the optimization of generating set maintenance; La theorie de la decision statistique bayesienne appliquee a l'optimisation de la maintenance des groupes electrogenes

    Energy Technology Data Exchange (ETDEWEB)

    Procaccia, H.; Cordier, R.; Muller, S.

    1994-11-01

    The difficulty in RCM methodology is the allocation of a new periodicity of preventive maintenance for an item of equipment when a critical failure has been identified: until now this new allocation has been based on the engineer's judgment, and one must wait for a full cycle of feedback experience before validating it. Statistical decision theory could be a more rational alternative for the optimization of preventive maintenance periodicity. This methodology has been applied to the inspection and maintenance optimization of cylinders of diesel generator engines of 900 MW nuclear plants, and has shown that the previous preventive maintenance periodicity can be extended. (authors). 8 refs., 5 figs.
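
    As a hedged illustration of the Bayesian ingredient (not the authors' actual model), the sketch below updates a conjugate Gamma prior on a constant failure rate with observed feedback experience, then screens candidate maintenance periodicities against a reliability target; all numbers are invented.

```python
# Gamma-Poisson (exponential failure) conjugate update, illustrative only.
a0, b0 = 2.0, 20000.0            # prior shape and rate-in-hours (judgment)
failures, hours = 3, 120000      # feedback experience for one component family
a, b = a0 + failures, b0 + hours # posterior: lambda ~ Gamma(a, rate=b)

target = 0.95                    # required survival probability per interval
for t in (500, 1000, 2000, 4000):    # candidate periodicities, hours
    # Posterior-averaged survival: E_lambda[exp(-lambda * t)] = (b/(b+t))**a
    p = (b / (b + t)) ** a
    print(f"{t:5d} h: P(no failure) = {p:.3f}  "
          f"{'ok' if p >= target else 'too long'}")
```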

  3. Statistical postprocessing of precipitation generated with the mesoscale model FOOT3DK for the rainy season 2002 in Benin (West Africa)

    Science.gov (United States)

    Ludwig, P.; Krüger, A.; Born, K.; Kerschgens, M.

    2009-04-01

    The presented study deals with various aspects of mesoscale modelling of precipitation in the Sudanian region of the West African subcontinent and is embedded in an interdisciplinary research project called IMPETUS. Environmental conditions of synoptic-scale features, which are responsible for the generation and propagation of several precipitation systems, are mostly influenced by the West African Monsoon system. For investigations of precipitation events during the rainy season of 2002, a total number of 40 precipitation episodes, with durations ranging from 54 to 72 hours for each episode, were simulated using the non-hydrostatic model FOOT3DK. While the spatial resolution is 3 km, the temporal resolution is 1 hour. Input data for these simulations are provided by a model chain consisting of GME-analysis, Lokal-Modell (0.25° resolution) and FOOT3DK (9 km resolution). The investigated area covers a region of 105 km x 105 km (35 x 35 grid points) and is situated in the upper river catchment of the Ouémé in Benin. Comparisons between simulated and observed precipitation have been carried out on the basis of a total number of 50 rainfall recording stations, which are irregularly spaced across the investigated area. This comparison points to the necessity of an adjustment of the rainfall simulated by FOOT3DK. Therefore, an adjustment technique has been developed and is presented within this study. Based on the 1225 grid points, this method fits the hourly simulated towards the hourly observed rainfall rates using a mapping function (see the sketch below). In a first step the station data have to be interpolated to the same underlying model grid. Afterwards, the desired relation between simulated and observed precipitation can be established by fitting a sigmoidal curve to the precipitation data using the Levenberg-Marquardt algorithm. For every single grid mesh a separate Gompertz function can be found and employed.
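
    The fitting step can be sketched with scipy, whose curve_fit routine uses the Levenberg-Marquardt algorithm for unconstrained problems. The paired rain rates below are synthetic stand-ins for one grid mesh; the Gompertz parameterization is one common choice of sigmoid.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, a, b, c):
    """Sigmoidal mapping: asymptote a, offset b, growth rate c."""
    return a * np.exp(-b * np.exp(-c * x))

rng = np.random.default_rng(1)
sim = np.sort(rng.uniform(0, 10, 200))              # simulated rain rate, mm/h
obs = gompertz(sim, 8.0, 4.0, 0.6) + rng.normal(0, 0.3, sim.size)  # "observed"

# method="lm" selects Levenberg-Marquardt (the scipy default when unconstrained).
params, _ = curve_fit(gompertz, sim, obs, p0=(5.0, 2.0, 1.0), method="lm")
print("fitted (a, b, c):", np.round(params, 2))
corrected = gompertz(sim, *params)                  # adjusted model rainfall
```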

  4. Realistic and interactive high-resolution 4D environments for real-time surgeon and patient interaction.

    Science.gov (United States)

    Smith, L N; Farooq, A R; Smith, M L; Ivanov, I E; Orlando, A

    2017-06-01

    Remote consultations that are realistic enough to be useful medically offer considerable clinical, logistical and cost benefits. Despite advances in virtual reality and vision hardware and software, these benefits are currently often unrealised. The proposed approach combines high spatial and temporal resolution 3D and 2D machine vision with virtual reality techniques, in order to develop new environments and instruments that will enable realistic remote consultations and the generation of new types of useful clinical data. New types of clinical data have been generated for skin analysis and respiration measurement; and the combination of 3D with 2D data was found to offer potential for the generation of realistic virtual consultations. An innovative combination of high resolution machine vision data and virtual reality online methods promises to provide advanced functionality and significant medical benefits, particularly in regions where populations are dispersed or access to clinicians is limited. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  6. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  7. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  8. Statistical methods

    CERN Document Server

    Freund, Rudolf J; Wilson, William J

    2010-01-01

    Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, and numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a

  9. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  10. In Vitro Tests for Aerosol Deposition. V: Using Realistic Testing to Estimate Variations in Aerosol Properties at the Trachea.

    Science.gov (United States)

    Wei, Xiangyin; Hindle, Michael; Delvadia, Renishkumar R; Byron, Peter R

    2017-03-23

    The dose and aerodynamic particle size distribution (APSD) of drug aerosols exiting models of the mouth and throat (MT) during a realistic inhalation profile (IP) may be estimated in vitro and designated Total Lung Dose, TLDin vitro, and APSDTLDin vitro, respectively. These aerosol characteristics likely define the drug's regional distribution in the lung. A general method was evaluated to enable the simultaneous determination of TLDin vitro and APSDTLDin vitro for budesonide aerosols exiting small, medium and large VCU-MT models. Following calibration of the modified next generation pharmaceutical impactor (NGI) at 140 L/min, variations in aerosol dose and size exiting MT were determined from Budelin® Novolizer® across the IPs reported by Newman et al., who assessed drug deposition from this inhaler by scintigraphy. Values for TLDin vitro from the test inhaler determined by the general method were found to be statistically comparable to those using a filter capture method. Using new stage cutoffs determined by calibration of the modified NGI at 140 L/min, APSDTLDin vitro profiles and mass median aerodynamic diameters at the MT exit (MMADTLDin vitro) were determined as functions of MT geometric size across Newman's IPs. The range of mean values (n ≥ 5) for TLDin vitro and MMADTLDin vitro for this inhaler extended from 6.2 to 103.0 μg (3.1%-51.5% of label claim) and from 1.7 to 3.6 μm, respectively. The method enables reliable determination of TLDin vitro and APSDTLDin vitro for aerosols likely to enter the trachea of test subjects in the clinic. By simulating realistic IPs and testing in different MT models, the effects of major variables on TLDin vitro and APSDTLDin vitro may be studied using the general method described in this study.
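
    As an illustration of how an MMAD is commonly derived from impactor data, the sketch below interpolates the cumulative mass-undersize curve at 50% in log-diameter. The stage cutoffs and masses are invented, not the paper's calibrated NGI values.

```python
import numpy as np

cutoffs = np.array([8.0, 4.5, 2.8, 1.7, 0.9, 0.5])    # stage cutoffs, um (descending)
mass = np.array([5.0, 12.0, 30.0, 28.0, 15.0, 10.0])  # drug mass per stage, ug

# Fraction of mass on stages below each cutoff (particles smaller than cutoff).
frac_under = np.array([mass[i + 1:].sum() for i in range(len(mass))]) / mass.sum()

# Assumes the 50% point is bracketed within the cutoff range (true here).
i = np.argmax(frac_under < 0.5)           # first cutoff with <50% mass undersize
f_hi, f_lo = frac_under[i - 1], frac_under[i]
ln_d = np.log(cutoffs[i - 1]) + (f_hi - 0.5) / (f_hi - f_lo) * (
    np.log(cutoffs[i]) - np.log(cutoffs[i - 1]))      # log-linear interpolation
print(f"MMAD ~ {np.exp(ln_d):.2f} um")
```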

  11. A statistical approach for rain intensity differentiation using Meteosat Second Generation-Spinning Enhanced Visible and InfraRed Imager observations

    Science.gov (United States)

    Ricciardelli, E.; Cimini, D.; Di Paola, F.; Romano, F.; Viggiano, M.

    2014-07-01

    This study exploits the Meteosat Second Generation (MSG)-Spinning Enhanced Visible and Infrared Imager (SEVIRI) observations to evaluate the rain class at high spatial and temporal resolutions and, to this aim, proposes the Rain Class Evaluation from Infrared and Visible observation (RainCEIV) technique. RainCEIV is composed of two modules: a cloud classification algorithm which individuates and characterizes the cloudy pixels, and a supervised classifier that delineates the rainy areas according to three rainfall intensity classes: the non-rainy, the light-to-moderate rainy, and the heavy-to-very-heavy rainy class (the training rain rates being taken from retrievals based on Atmospheric Microwave Sounder Unit (AMSU)-B observations). RainCEIV's principal aim is that of supplying preliminary qualitative information on the rainy areas within the Mediterranean Basin where there is no radar network coverage. The results of RainCEIV have been validated against radar-derived rainfall measurements from the Italian Operational Weather Radar Network for some case studies limited to the Mediterranean area. The dichotomous assessment related to daytime (nighttime) validation shows that RainCEIV is able to detect rainy/non-rainy areas with an accuracy of about 97% (96%), and when all the rainy classes are considered, it shows a Heidke skill score of 67% (62%), a bias score of 1.36 (1.58), and a probability of detection of rainy areas of 81% (81%).

  12. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  13. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  14. Statistical Mechanics

    CERN Document Server

    Gallavotti, Giovanni

    2011-01-01

    C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: Many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.

  15. A statistical approach for rain class evaluation using Meteosat Second Generation-Spinning Enhanced Visible and InfraRed Imager observations

    Science.gov (United States)

    Ricciardelli, E.; Cimini, D.; Di Paola, F.; Romano, F.; Viggiano, M.

    2013-11-01

    Precipitation measurements are essential for short-term hydrological and long-term climate studies. Operational networks of rain gauges and weather radars provide fairly accurate rain rate measurements, but they leave large areas uncovered. Because of this, satellite remote sensing is a useful tool for the detection and characterization of the raining areas in regions where this information remains missing. This study exploits the Meteosat Second Generation - Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI) observations to evaluate the rain class at high spatial and temporal resolutions. The Rain Class Evaluation from Infrared and Visible observations (RainCEIV) technique is proposed. The purpose of RainCEIV is to supply continuous monitoring of convective as well as of stratiform rainfall events. It applies a supervised classifier to the spectral and textural features of infrared and visible MSG-SEVIRI images to classify the cloudy pixels as non-rainy, light-to-moderate rain, or heavy-to-very-heavy rain. The technique also takes as input the water vapour channel brightness temperature differences for the MSG-SEVIRI images acquired 15/30/45 min before the time of interest. The rainfall rates used in the training phase are obtained with Precipitation Estimation at Microwave frequencies (PEMW), an algorithm for rain rate retrievals based on Atmospheric Microwave Sounder Unit (AMSU)-B observations. The results of RainCEIV have been validated against radar-derived rainfall measurements from the Italian Operational Weather Radar Network for some case studies limited to the Mediterranean area. The dichotomous assessment shows that RainCEIV is able to detect rainy areas with an accuracy of about 91%, a Heidke skill score of 56%, a bias score of 1.16, and a probability of detection of rainy areas of 66%.
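
    The dichotomous scores quoted above follow from a standard 2x2 contingency table of satellite versus radar rain/no-rain flags. A small sketch with invented counts (not the paper's validation data):

```python
def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
    """Standard forecast verification scores from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    accuracy = (a + d) / n
    bias = (a + b) / (a + c)          # >1 means rain is over-forecast
    pod = a / (a + c)                 # probability of detection
    expected = ((a + c) * (a + b) + (b + d) * (c + d)) / n  # chance agreement
    hss = ((a + d) - expected) / (n - expected)             # Heidke skill score
    return accuracy, bias, pod, hss

acc, bias, pod, hss = dichotomous_scores(660, 160, 340, 8840)
print(f"accuracy={acc:.2f} bias={bias:.2f} POD={pod:.2f} HSS={hss:.2f}")
```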

  16. Super stereoscopy technique for comfortable and realistic 3D displays.

    Science.gov (United States)

    Akşit, Kaan; Niaki, Amir Hossein Ghanbari; Ulusoy, Erdem; Urey, Hakan

    2014-12-15

    Two well-known problems of stereoscopic displays are the accommodation-convergence conflict and the lack of natural blur for defocused objects. We present a new technique that we name Super Stereoscopy (SS3D) to provide a convenient solution to these problems. Regular stereoscopic glasses are replaced by SS3D glasses which deliver at least two parallax images per eye through pinholes equipped with light-selective filters. The pinholes generate blur-free retinal images so as to enable correct accommodation, while the delivery of multiple parallax images per eye creates an approximate blur effect for defocused objects. Experiments performed with cameras and human viewers indicate that the technique works as desired. In the case where two pinholes equipped with color filters are used per eye, the technique can be used on a regular stereoscopic display simply by uploading new content, without requiring any change in display hardware, driver, or frame rate. Apart from some tolerable loss in display brightness and a decrease in the natural spatial resolution limit of the eye because of the pinholes, the technique is quite promising for comfortable and realistic 3D vision, especially enabling the display of close objects that are not possible to display and comfortably view on regular 3DTV and cinema.

  17. Improved transcranial magnetic stimulation coil design with realistic head modeling

    Science.gov (United States)

    Crowther, Lawrence; Hadimani, Ravi; Jiles, David

    2013-03-01

    We are investigating transcranial magnetic stimulation (TMS), a noninvasive technique based on electromagnetic induction which causes stimulation of the neurons in the brain. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT), which is still widely implemented for the treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS), which requires surgical implantation of electrodes in the brain. Our new designs allow new applications of the technique to be established for a variety of diagnostic and therapeutic applications in psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method for treatment. In prior work we have implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, allowing the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.

  18. Realistic Data-Driven Traffic Flow Animation Using Texture Synthesis.

    Science.gov (United States)

    Chao, Qianwen; Deng, Zhigang; Ren, Jiaping; Ye, Qianqian; Jin, Xiaogang

    2017-01-11

    We present a novel data-driven approach to populate virtual road networks with realistic traffic flows. Specifically, given a limited set of vehicle trajectories as the input samples, our approach first synthesizes a large set of vehicle trajectories. By taking the spatio-temporal information of traffic flows as a 2D texture, the generation of new traffic flows can be formulated as a texture synthesis process, which is solved by minimizing a newly developed traffic texture energy. The synthesized output captures the spatio-temporal dynamics of the input traffic flows, and the vehicle interactions in it strictly follow traffic rules. After that, we position the synthesized vehicle trajectory data onto virtual road networks using a cage-based registration scheme, where a few traffic-specific constraints are enforced to maintain each vehicle's original spatial location and synchronize its motion in concert with its neighboring vehicles. Our approach is intuitive to control and scalable to the complexity of virtual road networks. We validated our approach through many experiments and paired comparison user studies.

  19. Modelling of synchrotron radiation losses in realistic tokamak plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Albajar, F.; Johner, J.; Granata, G

    2000-08-01

    Synchrotron radiation losses become significant in the power balance of high-temperature plasmas envisaged for next step tokamaks. Due to the complexity of the exact calculation, these losses are usually roughly estimated with expressions derived from a plasma description using simplifying assumptions on the geometry, radiation absorption, and density and temperature profiles. In the present article, the complete formulation of the transport of synchrotron radiation is performed for realistic conditions of toroidal plasma geometry with elongated cross-section, using an exact method for the calculation of the absorption coefficient, and for arbitrary shapes of density and temperature profiles. The effects of toroidicity and temperature profile on synchrotron radiation losses are analyzed in detail. In particular, when the electron temperature profile is almost flat in the plasma center, as for example in ITB confinement regimes, synchrotron losses are found to be much stronger than in the case where the profile is represented by its best generalized parabolic approximation, though both cases give approximately the same thermal energy contents. Such an effect is not included in present approximate expressions. Finally, we propose a seven-variable fit for the fast calculation of synchrotron radiation losses. This fit is derived from a large database, which has been generated using a code implementing the complete formulation and optimized for massively parallel computing. (author)

  20. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores, and the statistical characteristics of pore distance, quantity and size, together with their probability density functions, were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distribution of pore position, distance and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on pore position and distribution defined by the series of random numbers. On the basis of this modelling, Brazilian split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of the pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, the failure mode of material elements and the inosculation of failed elements.
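
    A minimal sketch of the Monte Carlo step, assuming numpy: the pore count, centres and radii are drawn with prescribed statistics. The particular distributions and parameters below are illustrative, not those identified from the CT scans.

```python
import numpy as np

rng = np.random.default_rng(42)
box = np.array([100.0, 100.0, 100.0])      # specimen dimensions, model units

n_pores = rng.poisson(lam=500)             # random pore count
centres = rng.uniform(0.0, 1.0, (n_pores, 3)) * box       # pore positions
radii = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=n_pores)  # pore sizes

# Sanity check: approximate porosity implied by the sampled pores.
porosity = (4.0 / 3.0) * np.pi * (radii ** 3).sum() / box.prod()
print(f"{n_pores} pores, porosity ~ {porosity:.3%}")
# `centres` and `radii` would then parameterize weak elements in a FLAC3D grid.
```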

  1. The use and limitation of realistic evaluation as a tool for evidence-based practice: a critical realist perspective.

    Science.gov (United States)

    Porter, Sam; O'Halloran, Peter

    2012-03-01

    In this paper, we assess realistic evaluation's articulation with evidence-based practice (EBP) from the perspective of critical realism. We argue that the adoption by realistic evaluation of a realist causal ontology means that it is better placed to explain complex healthcare interventions than the traditional method used by EBP, the randomized controlled trial (RCT). However, we do not conclude from this that the use of RCTs is without merit, arguing that it is possible to use both methods in combination under the rubric of realist theory. More negatively, we contend that the rejection of critical theory and utopianism by realistic evaluation in favour of the pragmatism of piecemeal social engineering means that it is vulnerable to accusations that it promotes technocratic interpretations of human problems. We conclude that, insofar as realistic evaluation adheres to the ontology of critical realism, it provides a sound contribution to EBP, but insofar as it rejects the critical turn of Bhaskar's realism, it replicates the technocratic tendencies inherent in EBP. © 2011 Blackwell Publishing Ltd.

  2. Development and validation of a realistic head model for EEG

    Science.gov (United States)

    Bangera, Nitin Bhalchandra

    The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain and thus understanding how current flows within the brain as a result of addition of structures in the forward model. Validation compares calculations using different forward models to the experimental results obtained by stimulating with implanted dipole electrodes. The human brain tissue displays inhomogeneity in electrical conductivity and also displays anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of the brain tissue type is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and computed x-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients

  3. Simulating Longitudinal Brain MRIs with Known Volume Changes and Realistic Variations in Image Intensity.

    Science.gov (United States)

    Khanal, Bishesh; Ayache, Nicholas; Pennec, Xavier

    2017-01-01

    This paper presents a simulator tool that can simulate large databases of visually realistic longitudinal MRIs with known volume changes. The simulator is based on a previously proposed biophysical model of brain deformation due to atrophy in AD. In this work, we propose a novel way of reproducing realistic intensity variation in longitudinal brain MRIs, which is inspired by an approach used for the generation of synthetic cardiac sequence images. This approach combines a deformation field obtained from the biophysical model with a deformation field obtained by a non-rigid registration of two images. The combined deformation field is then used to simulate a new image with specified atrophy from the first image, but with the intensity characteristics of the second image. This makes it possible to generate the realistic variations present in real longitudinal time-series of images, such as the independence of noise between two acquisitions and the potential presence of variable acquisition artifacts. Various options available in the simulator software are briefly explained in this paper. In addition, the software is released as an open-source repository. The availability of the software allows researchers to produce tailored databases of images with ground truth volume changes; we believe this will help develop more robust brain morphometry tools. Additionally, we believe that the scientific community can also use the software to further experiment with the proposed model, and add more complex models of brain deformation and atrophy generation.
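
    The field-combination idea can be sketched in 2D with scipy: a model-derived displacement and a registration-derived displacement are composed, and the composite warps the source image. Everything below (field shapes, magnitudes, the toy image) is illustrative rather than the simulator's implementation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(image, disp):
    """Backward-warp `image` by a displacement field `disp` of shape (2, H, W)."""
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    coords = np.array([yy + disp[0], xx + disp[1]])
    return map_coordinates(image, coords, order=1, mode="nearest")

def compose(u, v):
    """Displacement of x -> x + v(x) + u(x + v(x)), i.e. u applied after v."""
    u_at_v = np.stack([warp(u[0], v), warp(u[1], v)])  # resample u at x + v(x)
    return u_at_v + v

H = W = 64
img = np.zeros((H, W)); img[20:44, 20:44] = 1.0      # toy "anatomy"
u_model = np.zeros((2, H, W)); u_model[0] += 1.5     # e.g. atrophy-driven shift
u_reg = np.zeros((2, H, W)); u_reg[1] -= 0.8         # from non-rigid registration
simulated = warp(img, compose(u_model, u_reg))       # image with combined field
```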

  4. Statistics For Neuroscientists

    Directory of Open Access Journals (Sweden)

    Subbakrishna D.K

    2000-01-01

    The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate, it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of the precious data that are gathered. The article summarises some of the basic statistical concepts, followed by illustrations from live data generated from various research projects in the Department of Neurology of this Institute.

  5. Using realist synthesis to understand the mechanisms of interprofessional teamwork in health and social care.

    Science.gov (United States)

    Hewitt, Gillian; Sims, Sarah; Harris, Ruth

    2014-11-01

    Realist synthesis offers a novel and innovative way to interrogate the large literature on interprofessional teamwork in health and social care teams. This article introduces realist synthesis and its approach to identifying and testing the underpinning processes (or "mechanisms") that make an intervention work, the contexts that trigger those mechanisms and their subsequent outcomes. A realist synthesis of the evidence on interprofessional teamwork is described. Thirteen mechanisms were identified in the synthesis, and findings for one mechanism, called "support and value", are presented in this paper. The evidence for the other twelve mechanisms ("collaboration and coordination", "pooling of resources", "individual learning", "role blurring", "efficient, open and equitable communication", "tactical communication", "shared responsibility and influence", "team behavioural norms", "shared responsibility and influence", "critically reviewing performance and decisions", "generating and implementing new ideas" and "leadership") are reported in a further three papers in this series. The "support and value" mechanism referred to the ways in which team members supported one another, respected each other's skills and abilities and valued each other's contributions. "Support and value" was present in some, but far from all, teams and a number of contexts that explained this variation were identified. The article concludes with a discussion of the challenges and benefits of undertaking this realist synthesis.

  6. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    2005-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  7. Realist Stronghold in the Land of Thucydides? - Appraising and Resisting a Realist Tradition in Greece

    Directory of Open Access Journals (Sweden)

    Kyriakos Mikelis

    2015-10-01

    Given the integration of the discipline of International Relations in Greece into the global discipline over the past few decades, the article addresses the reflection of the 'realism in and for the globe' question in this specific case. Although the argument doesn't go as far as to 'recover' forgotten IR theorists or self-proclaimed realists, a geopolitical dimension of socio-economic thought during the interwar period addressed concerns which could be related to the intricacies of realpolitik. Then again, at current times, certain scholars have been eager to maintain a firm stance in favor of realism, focusing on the work of ancient figures, especially Thucydides or Homer, and on questions of the offensive-defensive realism debate as well as on the connection with the English School, while others have offered fruitful insights matching the broad constructivist agenda. Overall, certain genuine arguments have appeared, reflecting diversified views about sovereignty and its function or mitigation.

  8. Arc Statistics

    CERN Document Server

    Meneghetti, M; Dahle, H; Limousin, M

    2013-01-01

    The existence of an arc statistics problem was at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and the frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool to trace structure formation. However, given the limited size of observational and theoretical data-sets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...

  9. Bayesian statistical modeling of spatially correlated error structure in atmospheric tracer inverse analysis

    Directory of Open Access Journals (Sweden)

    C. Mukherjee

    2011-01-01

    Inverse modeling applications in atmospheric chemistry are increasingly addressing the challenging statistical issues of data synthesis by adopting refined statistical analysis methods. This paper advances this line of research by addressing several central questions in inverse modeling, focusing specifically on Bayesian statistical computation. Motivated by problems of refining bottom-up estimates of source/sink fluxes of trace gas and aerosols based on increasingly high-resolution satellite retrievals of atmospheric chemical concentrations, we address head-on the need for integrating formal spatial statistical methods of residual error structure in global-scale inversion models. We do this using analytically and computationally tractable spatial statistical models, known as conditional autoregressive spatial models, as components of a global inversion framework. We develop Markov chain Monte Carlo methods to explore and fit these spatial structures in an overall statistical framework that simultaneously estimates source fluxes. Additional aspects of the study extend the statistical framework to utilize priors in a more physically realistic manner, and to formally address and deal with missing data in satellite retrievals. We demonstrate the analysis in the context of inferring carbon monoxide (CO) sources constrained by satellite retrievals of column CO from the Measurement of Pollution in the Troposphere (MOPITT) instrument on the TERRA satellite, paying special attention to evaluating performance of the inverse approach using various statistical diagnostic metrics. This is developed using synthetic data generated to resemble MOPITT data to define a proof-of-concept and model assessment, and then in analysis of real MOPITT data.
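
    A minimal sketch of the spatial ingredient, assuming numpy: a proper conditional autoregressive (CAR) prior with precision Q = tau(D - rho W), evaluated on a toy chain-graph neighbourhood rather than a real satellite grid.

```python
import numpy as np

def car_precision(W, tau=1.0, rho=0.9):
    """Proper CAR precision matrix Q = tau * (D - rho * W), D = diag(degrees)."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

def car_logpdf(x, Q):
    """Log density of N(0, Q^{-1}) up to the additive constant."""
    _, logdet = np.linalg.slogdet(Q)
    return 0.5 * (logdet - x @ Q @ x)

n = 50
W = np.zeros((n, n))                         # chain-graph adjacency (toy example)
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0

Q = car_precision(W)
x = np.random.default_rng(3).normal(size=n)  # residual field to score
print("CAR log-prior:", car_logpdf(x, Q))
# Inside an MCMC sampler this term would be added to the flux likelihood,
# with tau and rho explored alongside the source fluxes.
```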

  10. Realistic Visualization of Virtual Views and Virtual Cinema

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of computer...... is then gaining consensus: the augmented actor. Fundamental concepts and examples of methods proposed for realistic view synthesis, based on the transfer of photorealism from reference photographs to novel views, will be presented. The application of methods for realistic image synthesis to virtual cinematography...

  11. Structural Properties of Realistic Cultural Space Distributions

    CERN Document Server

    Babeanu, Alexandru-Ionut; Garlaschelli, Diego

    2015-01-01

    An interesting sociophysical research problem consists of the compatibility between collective social behavior in the short term and cultural diversity in the long term. Recently, it has been shown that, when studying a model of short term collective behavior in parallel with one of long term cultural diversity, one is led to the puzzling conclusion that the two aspects are mutually exclusive. However, the compatibility is restored when switching from the randomly generated cultural space distribution to an empirical one for specifying the initial conditions in those models. This calls for understanding the extent to which such a compatibility restoration is independent of the empirical data set, as well as the relevant structural properties of such data. Firstly, this work shows that the restoration patterns are largely robust across data sets. Secondly, it provides a possible mechanism explaining the restoration, for the special case when the cultural space is formulated only in terms of nominal variables. T...

  12. Realistic Silver Optical Constants for Plasmonics

    Science.gov (United States)

    Jiang, Yajie; Pillai, Supriya; Green, Martin A.

    2016-07-01

    Silver remains the preferred conductor for optical and near-infrared plasmonics. Many high-profile studies focus exclusively on performance simulation in such applications. Almost invariably, these use silver optical data either from Palik's 1985 handbook or, more frequently, an earlier Johnson and Christy (J&C) tabulation. These data are inconsistent, making it difficult to ascertain the reliability of the simulations. The inconsistency stems from challenges in measuring representative properties of pristine silver, due to tarnishing on air exposure. We demonstrate techniques, including the use of silicon-nitride membranes, to access the full capabilities of multiple-angle spectrometric ellipsometry and generate an improved data set, representative of overlayer-protected, freshly deposited silver films on silicon nitride and glass.

  13. Depth statistics

    OpenAIRE

    2012-01-01

    In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...

  14. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  15. A framework for creating realistic synthetic fluorescence microscopy image sequences

    CSIR Research Space (South Africa)

    Mabaso, M

    2016-02-01

    Full Text Available. In: Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies, Rome, Italy, 21-23 February 2016: A Framework for Creating Realistic Synthetic Fluorescence Microscopy Image Sequences, by Matsilele Mabaso, Daniel Withey...

  16. Student Work Experience: A Realistic Approach to Merchandising Education.

    Science.gov (United States)

    Horridge, Patricia; And Others

    1980-01-01

    Relevant and realistic experiences are needed to prepare the student for a future career. Addresses the results of a survey of colleges and universities in the United States in regard to their student work experience (SWE) in fashion merchandising. (Author)

  18. Preliminary Study of Realistic Blast Impact on Cultured Brain Slices

    Science.gov (United States)

    2015-04-01

    US Army Research Laboratory report ARL-TR-7197, Aberdeen Proving Ground, MD 21005-5066, April 2015: Preliminary Study of Realistic Blast Impact on Cultured Brain Slices, by Thuvan Piehler, Rohan Banton, Lars Piehler, Richard Benjamin, Ray... Subject terms: RDX spheres, organotypic cultures of hippocampus, small... The study uses hippocampal slice samples to better understand blast-induced brain damage.

  19. Three-neutron resonance trajectories for realistic interaction models

    CERN Document Server

    Lazauskas, R

    2005-01-01

    Three-neutron resonances are searched for using realistic nucleon-nucleon interaction models. Resonance pole trajectories were explored by artificially binding the three-neutron system and then gradually removing the additional interaction. The final pole positions for three-neutron states up to $|J|$=5/2 finish in the fourth energy quadrant with Re(E)$\leqslant0$ before the additional interaction is fully removed. This study shows that realistic nucleon-nucleon interaction models exclude the possible existence of observable three-neutron resonances.

  20. Toward realistic gauge-Higgs grand unification

    Science.gov (United States)

    Furui, Atsushi; Hosotani, Yutaka; Yamatsu, Naoki

    2016-09-01

    The SO(11) gauge-Higgs grand unification in the Randall-Sundrum warped space is presented. The 4D Higgs field is identified as the zero mode of the fifth-dimensional component of the gauge potentials, or as the fluctuation mode of the Aharonov-Bohm phase θ_H along the fifth dimension. Fermions are introduced in the bulk in the spinor and vector representations of SO(11). SO(11) is broken to SO(4)×SO(6) by the orbifold boundary conditions, which is broken to SU(2)_L×U(1)_Y×SU(3)_C by a brane scalar. Evaluating the effective potential V_eff(θ_H), we show that the electroweak symmetry is dynamically broken to U(1)_EM. The quark-lepton masses are generated by the Hosotani mechanism and brane interactions, with which the observed mass spectrum is reproduced. Proton decay is forbidden thanks to the new fermion number conservation. It is pointed out that there appear light exotic fermions. The Higgs boson mass is determined with the quark-lepton masses given; however, it turns out to be smaller than the observed value.

  1. Toward Realistic Gauge-Higgs Grand Unification

    CERN Document Server

    Furui, Atsushi; Yamatsu, Naoki

    2016-01-01

    The $SO(11)$ gauge-Higgs grand unification in the Randall-Sundrum warped space is presented. The 4D Higgs field is identified as the zero mode of the fifth dimensional component of the gauge potentials, or as the fluctuation mode of the Aharonov-Bohm phase $\\theta_H$ along the fifth dimension. Fermions are introduced in the bulk in the spinor and vector representations of $SO(11)$. $SO(11)$ is broken to $SO(4) \\times SO(6)$ by the orbifold boundary conditions, which is broken to $SU(2)_L \\times U(1)_Y \\times SU(3)_C$ by a brane scalar. Evaluating the effective potential $V_{\\rm eff} (\\theta_H)$, we show that the electroweak symmetry is dynamically broken to $U(1)_{\\rm EM}$. The quark-lepton masses are generated by the Hosotani mechanism and brane interactions, with which the observed mass spectrum is reproduced. The proton decay is forbidden thanks to the new fermion number conservation. It is pointed out that there appear light exotic fermions. The Higgs boson mass is determined with the quark-lepton masses ...

  2. Towards a realistic F-theory GUT

    Science.gov (United States)

    Callaghan, James C.; King, Stephen F.; Leontaris, George K.; Ross, Graham G.

    2012-04-01

    We consider semi-local F-theory GUTs arising from a single E 8 point of local enhancement, leading to simple GUT groups based on E 6, SO(10) and SU(5) with SU(3), SU(4) and SU(5) spectral covers, respectively. Assuming the minimal {{Z}_2} monodromy, we determine the homology classes and the associated spectra after flux breaking for each case. Using these results we construct an E 6 based model that demonstrates, for the first time, that it is possible to construct a phenomenologically viable model which leads to the MSSM at low energies. The exotics that result from flux breaking all get a large mass when singlet fields acquire vacuum expectation values driven by D- and F-flatness. Due to the underlying GUT symmetry and the U(1)s descending from E 8, bare baryon- and lepton-number violating terms are forbidden up to and including dimension 5. As a result nucleon decay is naturally suppressed below present bounds. The μ-term is generated by non-perturbative U(1) breaking effects. After including the effect of flux and instanton corrections acceptable quark and charged lepton masses and mixing angles can be obtained. Neutrinos get a mass from the see-saw mechanism through their coupling to singlet neutrinos that acquire large Majorana mass as a result of the monodromy.

  3. Fluctuations of offshore wind generation: Statistical modelling

    DEFF Research Database (Denmark)

    Pinson, Pierre; Christensen, Lasse E.A.; Madsen, Henrik

    2007-01-01

    The magnitude of power fluctuations at large offshore wind farms has a significant impact on the control and management strategies for their power output. Focusing on the minute scale, one observes successive periods with smaller and larger power fluctuations. It seems that different regimes...... production averaged at a 1, 5, and 10-minute rate. The exercise consists of one-step-ahead forecasting of these time series with the various regime-switching models. It is shown that the MSAR model, for which the succession of regimes is represented by a hidden Markov chain, significantly outperforms...
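
    The MSAR structure mentioned in the record (a hidden Markov chain selecting among autoregressive regimes) can be illustrated with a two-regime simulation; all parameter values below are invented, not fitted to wind farm data:

        import numpy as np

        rng = np.random.default_rng(0)
        P = np.array([[0.98, 0.02],              # regime transition matrix
                      [0.05, 0.95]])
        phi   = (0.95, 0.80)                     # AR(1) coefficient per regime
        sigma = (0.01, 0.10)                     # innovation std dev per regime

        T, s, x = 1000, 0, 0.0
        series = np.empty(T)
        for t in range(T):
            s = rng.choice(2, p=P[s])            # hidden Markov regime switch
            x = phi[s] * x + sigma[s] * rng.standard_normal()
            series[t] = x                        # calm vs fluctuating periods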

  4. Controlling IP Falsifying Using Realistic Simulation

    Directory of Open Access Journals (Sweden)

    Govindavaram Madhusri

    2013-07-01

    Full Text Available The study of Internet-scale events such as worm proliferation, distributed denial-of-service (DDoS) attacks, flash crowds, routing volatilities, and DNS attacks depends on all the networks that generate or forward valid and malevolent traffic. The Distributed Denial of Service (DDoS) attack is a serious threat to the legitimate use of the Internet. Prevention mechanisms are frustrated by the ability of attackers to steal, or spoof, the source addresses in IP packets. IP falsifying is still widespread in network scanning and probes, as well as in denial-of-service floods. IDPFs can limit the falsifying capability of attackers. Moreover, they work on a small number of candidate networks that are easily traceable, thus simplifying the reactive IP traceback process. However, this technique does not scale to a large number of networks, which is a common misapprehension for those unfamiliar with the practice. Current network simulators cannot be used to study Internet-scale events: they are general-purpose, packet-level simulators that reproduce too many details of network communication, which limits scalability. We propose to develop a distributed Internet simulator with the following novel features. It will provide a built-in Internet model, including the topology, routing, link bandwidths and delays. Instead of being a general-purpose simulator, it will provide a common simulation core for traffic generation and message passing, on top of which we will build separate modules that customize messages and the level of simulation detail for the event of interest. Customization modules will ensure that all and only the relevant details of the event of interest are simulated, cutting down the simulation time. We will also provide an interface for new module specification and for existing module modification; this will bring Internet event simulation to the fingertips of all interested researchers. The simulator will promote research in worm detection and defense

  5. Multimodal person authentication on a smartphone under realistic conditions

    Science.gov (United States)

    Morris, Andrew C.; Jassim, Sabah; Sellahewa, Harin; Allano, Lorene; Ehlers, Johan; Wu, Dalei; Koreman, Jacques; Garcia-Salicetti, Sonia; Ly-Van, Bao; Dorizzi, Bernadette

    2006-05-01

    Verification of a person's identity by the combination of more than one biometric trait strongly increases the robustness of person authentication in real applications. This is particularly the case in applications involving signals of degraded quality, as for person authentication on mobile platforms. The context of mobility generates degradations of input signals due to the variety of environments encountered (ambient noise, lighting variations, etc.), while the sensors' lower quality further contributes to decreasing system performance. Our aim in this work is to combine traits from the three biometric modalities of speech, face and handwritten signature in a concrete application, performing non-intrusive biometric verification on a personal mobile device (smartphone/PDA). Most available biometric databases have been acquired in more or less controlled environments, which makes it difficult to predict performance in a real application. Our experiments are performed on a database acquired on a PDA as part of the SecurePhone project (IST-2002-506883 project "Secure Contracts Signed by Mobile Phone"). This database contains 60 virtual subjects balanced in gender and age. Virtual subjects are obtained by coupling audio-visual signals from real English-speaking subjects with signatures from other subjects captured on the touch screen of the PDA. Video data for the PDA database were recorded in 2 recording sessions separated by at least one week. Each session comprises 4 acquisition conditions: 2 indoor and 2 outdoor recordings (with, in each case, a good and a degraded quality recording). Handwritten signatures were captured in one session in realistic conditions. Different scenarios of matching between training and test conditions are tested to measure the resistance of various fusion systems to different types of variability and different amounts of enrolment data.

  6. Statistical Neurodynamics.

    Science.gov (United States)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
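
    The maximum-entropy step described above follows a standard pattern, sketched here in LaTeX under the assumption that G(x) denotes the single constant of motion obtained from the Lagrange function and λ its multiplier (symbols chosen for illustration, not taken from the thesis itself):

        % Maximizing S = -\int p(x) \ln p(x)\, dx subject to normalization
        % and a fixed mean of the conserved quantity G(x) yields
        \begin{equation*}
          p(x) \,=\, \frac{1}{Z(\lambda)}\, e^{-\lambda G(x)},
          \qquad
          Z(\lambda) \,=\, \int e^{-\lambda G(x)}\, dx ,
        \end{equation*}
        % where \lambda is fixed by matching \langle G \rangle to its
        % steady-state value.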

  7. The Statistical Drake Equation

    Science.gov (United States)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
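
    The lognormal claim is easy to check numerically. The sketch below multiplies seven independent positive random variables with arbitrary placeholder distributions and inspects the log of the product; the factor laws are illustrative, not Maccone's calibrated inputs:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        m = 100_000
        factors = [rng.uniform(0.1, 10, m),      # seven positive factors with
                   rng.gamma(2.0, 1.5, m),       # deliberately different laws
                   rng.uniform(0.5, 2, m),
                   rng.lognormal(0.0, 0.5, m),
                   rng.gamma(1.5, 2.0, m),
                   rng.uniform(0.01, 1, m),
                   rng.gamma(3.0, 0.5, m)]
        N = np.prod(factors, axis=0)             # statistical Drake product

        # If N is close to lognormal, log N is close to Gaussian:
        logN = np.log(N)
        print("skew:", stats.skew(logN))         # near 0 for a Gaussian
        print("excess kurtosis:", stats.kurtosis(logN))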

  8. Numerical Modeling of Plasmonic Nanoantennas with Realistic 3D Roughness and Distortion

    Directory of Open Access Journals (Sweden)

    Vladimir P. Drachev

    2011-07-01

    Full Text Available Nanostructured plasmonic metamaterials, including optical nanoantenna arrays, are important for advanced optical sensing and imaging applications including surface-enhanced fluorescence, chemiluminescence, and Raman scattering. Although designs typically use ideally smooth geometries, realistic nanoantennas have nonzero roughness, which typically results in a modified enhancement factor that should be accounted for in their design. Herein we aim to treat roughness by introducing a realistic roughened geometry into the finite element (FE) model. Even if the roughness does not result in significant loss, it does result in a spectral shift and inhomogeneous broadening of the resonance, which could be critical when fitting FE simulations of plasmonic nanoantennas to experiments. Moreover, the proposed approach could be applied to any model, whether mechanical, acoustic, electromagnetic, thermal, etc., in order to simulate a given roughness-generated physical phenomenon.

  9. A realistic polarizing Sagnac topology with DC readout for the Einstein Telescope

    CERN Document Server

    Wang, Mengyao; Brown, Daniel; Brueckner, Frank; Carbone, Ludovico; Palmer, Rebecca; Freise, Andreas

    2013-01-01

    The Einstein Telescope (ET) is a proposed future gravitational wave detector. Its design is original, using a triangular orientation of three detectors and a xylophone configuration, splitting each detector into one high-frequency and one low-frequency system. In other aspects the current design retains the dual-recycled Michelson interferometer typical of current detectors, such as Advanced LIGO. In this paper, we investigate the feasibility of replacing the low-frequency part of the ET detectors with a Sagnac interferometer. We show that a Sagnac interferometer, using realistic optical parameters based on the ET design, could provide a similar level of radiation pressure noise suppression without the need for a signal recycling mirror and the extensive filter cavities. We consider the practical issues of a realistic, power-recycled Sagnac, using linear arm cavities and polarizing optics. In particular we investigate the effects of non-perfect polarizing optics and propose a new method for the generation of ...

  10. A realistic theory of health sector management. The case for critical realism.

    Science.gov (United States)

    Connelly, J

    2000-01-01

    To date the practice of health sector management has not been sufficiently theorised. An adequate theory should be able to answer the pre-eminent critique of managerial rationality and ethics mounted by Alasdair MacIntyre in After Virtue and should also offer robust analytical and ethical resources to identify and engage with the social, political, economic and moral issues underlying health sector management. Critical realism with its ontology of generative mechanisms, agency-structure relationships, valorisation of activity and ideology critique offers such resources in an empirically orientated but adequately theorised realist framework. Rather than negate MacIntyre, critical realism incorporates and transcends his key arguments regarding the rationality and ethics of management. This article introduces the main elements of critical realism and clears a conceptual space for the cumulation of critical realist case-studies and managerial craft knowledge.

  11. Statistical characterization of wave propagation in mine environments

    KAUST Repository

    Bakir, Onur

    2012-07-01

    A computational framework for statistically characterizing electromagnetic (EM) wave propagation through mine tunnels and galleries is presented. The framework combines a multi-element probabilistic collocation (ME-PC) method with a novel domain-decomposition (DD) integral equation-based EM simulator to obtain statistics of electric fields due to wireless transmitters in realistic mine environments. © 2012 IEEE.

  12. Is the number of registered abortions in Serbia realistic?

    Directory of Open Access Journals (Sweden)

    Rašević Mirjana

    2008-01-01

    most cases induced abortions performed in private health clinics are not included. Moreover, nurses and doctors often experience the filling out of the prescribed forms for registering fetal deaths as an unnecessary, imposed, additional obligation, without sensing the meaning or understanding the significance of such data. The abortion problem in Serbia is serious and complex, and demands solving. This assumes carrying out many measures, including solving the matter of induced-abortion registration. Determining the realistic number of abortions in a community is very important, because it draws attention to this health and social problem and enables evaluation of the actions taken to alleviate it. It remains for the state to pay due attention to the problem of abortions in Serbia, to bring the private health clinics in which gynecologists perform abortions under control, and to promote the role and significance of statistics among health workers.

  13. Performance Analysis of Relays in LTE for a Realistic Suburban Deployment Scenario

    DEFF Research Database (Denmark)

    Coletti, Claudio; Mogensen, Preben; Irmer, Ralf

    2011-01-01

    Relays are likely to play an important role in the deployment of Beyond 3G networks, such as LTE-Advanced, thanks to the possibility of effectively extending Macro network coverage and fulfilling the expected high data-rate requirements. Up until now, the relay technology potential and its cost-effectiveness have been widely investigated in the literature, considering mainly statistical deployment scenarios, like regular networks with uniform traffic distribution. This paper is envisaged to illustrate the performances of different relay technologies (In-Band/Out-Band) in a realistic suburban network...... of a lower frequency carrier at the Macro layer guarantee better network coverage and capacity improvements....

  14. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. Th...... with ground truth in form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes....

  15. Universal Grammar, statistics or both?

    Science.gov (United States)

    Yang, Charles D

    2004-10-01

    Recent demonstrations of statistical learning in infants have reinvigorated the innateness versus learning debate in language acquisition. This article addresses these issues from both computational and developmental perspectives. First, I argue that statistical learning using transitional probabilities cannot reliably segment words when scaled to a realistic setting (e.g. child-directed English). To be successful, it must be constrained by knowledge of phonological structure. Then, turning to the bona fide theory of innateness--the Principles and Parameters framework--I argue that a full explanation of children's grammar development must abandon the domain-specific learning model of triggering, in favor of probabilistic learning mechanisms that might be domain-general but nevertheless operate in the domain-specific space of syntactic parameters.
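
    The transitional-probability statistic at issue is simple to state: TP(a -> b) = count(ab) / count(a), with word boundaries posited at local TP minima. A toy sketch (the syllable stream is invented; child-directed English is the realistic setting the article argues about):

        from collections import Counter

        syllables = "go la bu go la bu ti da ro ti da ro go la bu".split()
        pairs = list(zip(syllables, syllables[1:]))
        pair_counts = Counter(pairs)
        first_counts = Counter(a for a, _ in pairs)

        def tp(a, b):
            # Transitional probability P(next = b | current = a)
            return pair_counts[(a, b)] / first_counts[a]

        for a, b in pairs[:6]:
            print(f"TP({a} -> {b}) = {tp(a, b):.2f}")  # low TP suggests a boundary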

  16. Comparative study of the effectiveness of three learning environments: Hyper-realistic virtual simulations, traditional schematic simulations and traditional laboratory

    Directory of Open Access Journals (Sweden)

    Maria Isabel Suero

    2011-10-01

    Full Text Available This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output. This new virtual environment concept, which we call hyper-realistic, transcends basic schematic simulation; it provides the user with a more realistic perception of the physical phenomenon being simulated. We compared the learning achievements of three equivalent, homogeneous groups of undergraduates: an experimental group who used only the hyper-realistic virtual laboratory, a first control group who used a schematic simulation, and a second control group who used the traditional laboratory. The three groups received the same theoretical preparation and carried out equivalent practicals in their respective learning environments. The topic chosen for the experiment was optical aberrations. An analysis of variance applied to the data of the study demonstrated a statistically significant difference (p value < 0.05) between the three groups. The learning achievements attained by the group using the hyper-realistic virtual environment were 6.1 percentage points higher than those for the group using the traditional schematic simulations and 9.5 percentage points higher than those for the group using the traditional laboratory.
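
    The reported comparison is a standard one-way analysis of variance across three independent groups. A minimal sketch with fabricated scores (not the study's data):

        from scipy import stats

        hyper_realistic = [78, 82, 75, 88, 80, 85, 79, 83]
        schematic       = [70, 74, 68, 77, 72, 75, 69, 73]
        traditional_lab = [66, 70, 64, 72, 69, 71, 65, 68]

        f_stat, p_value = stats.f_oneway(hyper_realistic, schematic, traditional_lab)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05: group means differ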

  17. Multisite and multivariable statistical downscaling using a Gaussian copula quantile regression model

    Science.gov (United States)

    Ben Alaya, M. A.; Chebana, F.; Ouarda, T. B. M. J.

    2016-09-01

    Statistical downscaling techniques are required to refine atmosphere-ocean global climate data and provide reliable meteorological information such as a realistic temporal variability and relationships between sites and variables in a changing climate. To this end, the present paper introduces a modular structure combining two statistical tools of increasing interest during the last years: (1) Gaussian copula and (2) quantile regression. The quantile regression tool is employed to specify the entire conditional distribution of downscaled variables and to address the limitations of traditional regression-based approaches whereas the Gaussian copula is performed to describe and preserve the dependence between both variables and sites. A case study based on precipitation and maximum and minimum temperatures from the province of Quebec, Canada, is used to evaluate the performance of the proposed model. Obtained results suggest that this approach is capable of generating series with realistic correlation structures and temporal variability. Furthermore, the proposed model performed better than a classical multisite multivariate statistical downscaling model for most evaluation criteria.
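
    The Gaussian-copula step named above can be sketched compactly: draw correlated standard normals, push them through the normal CDF to obtain dependent uniforms, then map each margin to its own distribution. The correlation matrix and marginal laws below are assumed for illustration only:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        R = np.array([[1.0, 0.7, 0.4],           # assumed inter-site/variable
                      [0.7, 1.0, 0.5],           # correlation structure
                      [0.4, 0.5, 1.0]])
        L = np.linalg.cholesky(R)
        z = L @ rng.standard_normal((3, 5000))   # correlated standard normals
        u = stats.norm.cdf(z)                    # Gaussian copula: uniform margins

        # Map each margin to a placeholder law (gamma precipitation, normal temps)
        precip = stats.gamma.ppf(u[0], a=2.0, scale=3.0)
        tmax   = stats.norm.ppf(u[1], loc=25.0, scale=4.0)
        tmin   = stats.norm.ppf(u[2], loc=12.0, scale=3.0)
        print(np.corrcoef([precip, tmax, tmin]).round(2))  # dependence preserved

    In the paper's model the marginal quantile functions come from quantile regression conditioned on large-scale predictors; the copula only supplies the dependence between sites and variables.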

  18. Radiative neutron capture: Hauser Feshbach vs. statistical resonances

    Science.gov (United States)

    Rochman, D.; Goriely, S.; Koning, A. J.; Ferroukhi, H.

    2017-01-01

    The radiative neutron capture rates for isotopes of astrophysical interest are commonly calculated on the basis of the statistical Hauser Feshbach (HF) reaction model, leading to smooth and monotonically varying temperature-dependent Maxwellian-averaged cross sections (MACS). The HF approximation is known to be valid if the number of resonances in the compound system is relatively high. However, such a condition is hardly fulfilled for keV neutrons captured on light or exotic neutron-rich nuclei. For this reason, a different procedure is proposed here, based on the generation of statistical resonances. This novel technique, called the "High Fidelity Resonance" (HFR) method is shown to provide similar results as the HF approach for nuclei with a high level density but to deviate and be more realistic than HF predictions for light and neutron-rich nuclei or at relatively low sub-keV energies. The MACS derived with the HFR method are systematically compared with the traditional HF calculations for some 3300 neutron-rich nuclei and shown to give rise to significantly larger predictions with respect to the HF approach at energies of astrophysical relevance. For this reason, the HF approach should not be applied to light or neutron-rich nuclei. The Doppler broadening of the generated resonances is also studied and found to have a negligible impact on the calculated MACS.
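
    The quantity being compared is the Maxwellian-averaged cross section, MACS(kT) = (2/√π)(kT)⁻² ∫ σ(E) E e^(−E/kT) dE. The sketch below evaluates it for a toy cross section (a smooth 1/v background plus one narrow Breit-Wigner resonance, all parameters invented) to show how a single resonance moves the average:

        import numpy as np
        from scipy.integrate import quad

        E_R, GAMMA = 0.005, 1e-4                 # toy resonance: 5 keV, 0.1 keV wide

        def sigma(E):
            one_over_v = 0.1 / np.sqrt(E)        # smooth 1/v background (barn)
            lorentz = 5.0 * (GAMMA / 2)**2 / ((E - E_R)**2 + (GAMMA / 2)**2)
            return one_over_v + lorentz

        def macs(kT):
            integrand = lambda E: sigma(E) * E * np.exp(-E / kT)
            # finite upper limit and a breakpoint at the resonance energy
            val, _ = quad(integrand, 1e-12, 50 * kT, points=[E_R], limit=200)
            return 2.0 / np.sqrt(np.pi) * val / kT**2

        for kT in (0.001, 0.008, 0.03):          # thermal energies in MeV
            print(f"kT = {kT * 1e3:4.0f} keV: MACS = {macs(kT):.3f} b")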

  19. Radiative neutron capture: Hauser Feshbach vs. statistical resonances

    Directory of Open Access Journals (Sweden)

    D. Rochman

    2017-01-01

    Full Text Available The radiative neutron capture rates for isotopes of astrophysical interest are commonly calculated on the basis of the statistical Hauser Feshbach (HF) reaction model, leading to smooth and monotonically varying temperature-dependent Maxwellian-averaged cross sections (MACS). The HF approximation is known to be valid if the number of resonances in the compound system is relatively high. However, such a condition is hardly fulfilled for keV neutrons captured on light or exotic neutron-rich nuclei. For this reason, a different procedure is proposed here, based on the generation of statistical resonances. This novel technique, called the “High Fidelity Resonance” (HFR) method, is shown to provide similar results as the HF approach for nuclei with a high level density but to deviate and be more realistic than HF predictions for light and neutron-rich nuclei or at relatively low sub-keV energies. The MACS derived with the HFR method are systematically compared with the traditional HF calculations for some 3300 neutron-rich nuclei and shown to give rise to significantly larger predictions with respect to the HF approach at energies of astrophysical relevance. For this reason, the HF approach should not be applied to light or neutron-rich nuclei. The Doppler broadening of the generated resonances is also studied and found to have a negligible impact on the calculated MACS.

  20. The construction of context-mechanisms-outcomes in realistic evaluation.

    Science.gov (United States)

    Linsley, Paul; Howard, David; Owen, Sara

    2015-01-01

    To discuss the construction of context-mechanisms-outcomes (CMOs) developed as part of a realistic evaluation study of two aggression management training programmes. Realistic evaluation draws on theories and methods derived from the social sciences. It provides a distinctive account of the nature of programmes and how they work. Realistic evaluation is a form of evaluation that is driven by theory, and was grounded by Pawson and Tilley (1997) in the philosophy of critical realism. Critical realism is an important perspective in modern philosophy and social science, but it is largely absent in the field of healthcare research. This paper provides a critical discussion on the construction of CMOs as part of a realistic evaluation study, drawing on the personal experiences of the author in using realistic evaluation to evaluate training in aggression management. Realistic evaluation stresses four key linked concepts for explaining and understanding programmes: 'mechanism', 'context', 'outcome pattern' and 'context-mechanisms-outcomes (CMO) pattern configuration'. A CMO configuration is a proposition stating what it is about an initiative that works, for whom and in what circumstances. In this way, the effectiveness of the programme is understood, with an explanation of why the outcomes developed as they did and how the programme was able to react to underlying mechanisms and in what contexts. Therefore, a realistic evaluation researcher is not just inspecting outcomes to see if an initiative (implementation) works, but is analysing the outcomes to discover if the conjectured mechanism or context theories are confirmed. This analysis provides not only evidence of effectiveness, but also an explanation that helps to develop and improve the content and the targeting of future programmes. The development of CMOs requires a great deal of skill on the part of the researcher and requires a flexibility of approach when collecting and analysing the data and in

  1. Micromechanical Modeling of Fiber-Reinforced Composites with Statistically Equivalent Random Fiber Distribution

    Directory of Open Access Journals (Sweden)

    Wenzhi Wang

    2016-07-01

    Full Text Available Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a statistically equivalent fiber distribution against the actual material microstructure. Realistic statistical data are utilized as inputs of the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. It is found that the proposed method presents a good match with experimental results in all aspects including the nearest neighbor distance, nearest neighbor orientation, Ripley’s K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insights into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs for not only fiber-reinforced composites but also other materials such as foam materials and particle-reinforced composites.
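
    One of the validation statistics named above, the nearest-neighbour distance distribution, is easy to compute once fiber centres exist. The sketch below generates centres by plain random sequential adsorption (hard-disk rejection sampling), a simple stand-in for the paper's algorithm, with invented geometry values:

        import numpy as np

        rng = np.random.default_rng(3)
        r_fib, L_rve, n_fib = 0.002, 0.1, 200    # fiber radius, RVE edge (mm), count

        centres = []
        while len(centres) < n_fib:              # random sequential adsorption
            p = rng.uniform(r_fib, L_rve - r_fib, 2)
            if all(np.hypot(*(p - q)) >= 2 * r_fib for q in centres):
                centres.append(p)                # accept non-overlapping fiber
        centres = np.array(centres)

        d = np.linalg.norm(centres[:, None] - centres[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)              # ignore self-distances
        nearest = d.min(axis=1)                  # nearest-neighbour distances
        print(f"mean NN distance: {nearest.mean():.4f} mm")

    The same distance matrix also feeds Ripley's K function and the radial distribution function, the other statistics the paper matches against micrographs.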

  2. Statistical analysis plan for the WOMAN-ETAPlaT study: Effect of tranexamic acid on platelet function and thrombin generation [version 1; referees: 2 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Kastriot Dallaku

    2016-12-01

    Full Text Available Background. Postpartum haemorrhage (PPH) is a potentially life-threatening complication for women, and the leading cause of maternal mortality. Tranexamic acid (TXA) is an antifibrinolytic used worldwide to treat uterine haemorrhage and to reduce blood loss in general surgery. TXA may have effects on thrombin generation, platelet function and coagulation factors as a result of its inhibition of plasmin.   Methods. WOMAN-ETAPlaT is a sub-study of the World Maternal Antifibrinolytic trial (WOMAN trial). All adult women clinically diagnosed with PPH after a vaginal delivery or caesarean section are eligible for inclusion in the study. Blood samples will be collected at baseline and 30 minutes after the first dose of study treatment is given. Platelet function will be evaluated in whole blood immediately after sampling with Multiplate® tests (ADPtest and TRAPtest). Thrombin generation, fibrinogen, D-dimer, and coagulation factors vW, V and VIII will be analysed using platelet poor plasma.   Results. Recruitment to WOMAN-ETAPlaT started on 04 November 2013 and closed on 13 January 2015; during this time 188 patients were recruited. The final participant follow-up was completed on 04 March 2015. This article introduces the statistical analysis plan for the study, without reference to unblinded data.   Conclusion. The data from this study will provide evidence for the effect of TXA on thrombin generation, platelet function and coagulation factors in women with PPH.   Trial registration: ClinicalTrials.gov Identifier: NCT00872469; ISRCTN76912190

  3. Statistical analysis plan for the WOMAN-ETAPlaT study: Effect of tranexamic acid on platelet function and thrombin generation [version 2; referees: 2 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Kastriot Dallaku

    2017-06-01

    Full Text Available Background. Postpartum haemorrhage (PPH) is a potentially life-threatening complication for women, and the leading cause of maternal mortality. Tranexamic acid (TXA) is an antifibrinolytic used worldwide to treat uterine haemorrhage and to reduce blood loss in general surgery. TXA may have effects on thrombin generation, platelet function and coagulation factors as a result of its inhibition of plasmin.   Methods. WOMAN-ETAPlaT is a sub-study of the World Maternal Antifibrinolytic trial (WOMAN trial). All adult women clinically diagnosed with PPH after a vaginal delivery or caesarean section are eligible for inclusion in the study. Blood samples will be collected at baseline and 30 minutes after the first dose of study treatment is given. Platelet function will be evaluated in whole blood immediately after sampling with Multiplate® tests (ADPtest and TRAPtest). Thrombin generation, fibrinogen, D-dimer, and coagulation factors vW, V and VIII will be analysed using platelet poor plasma.   Results. Recruitment to WOMAN-ETAPlaT started on 04 November 2013 and closed on 13 January 2015; during this time 188 patients were recruited. The final participant follow-up was completed on 04 March 2015. This article introduces the statistical analysis plan for the study, without reference to unblinded data.   Conclusion. The data from this study will provide evidence for the effect of TXA on thrombin generation, platelet function and coagulation factors in women with PPH.   Trial registration: ClinicalTrials.gov Identifier: NCT00872469; ISRCTN76912190

  4. ZERODUR strength modeling with Weibull statistical distributions

    Science.gov (United States)

    Hartmann, Peter

    2016-07-01

    The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential if micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two or three parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a
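
    A minimal sketch of the two-parameter Weibull fit discussed here, with synthetic breakage stresses (scipy's weibull_min stands in for the strength model; floc=0 gives the two-parameter form, while a positive location would encode the threshold stress of the three-parameter form):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        strengths = stats.weibull_min.rvs(c=8.0, scale=120.0, size=60,
                                          random_state=rng)       # MPa, synthetic

        shape, loc, scale = stats.weibull_min.fit(strengths, floc=0.0)
        print(f"Weibull modulus m = {shape:.1f}, "
              f"characteristic strength = {scale:.1f} MPa")

        # Failure probability at a 90 MPa design stress under the fitted model
        p_fail = stats.weibull_min.cdf(90.0, shape, loc=loc, scale=scale)
        print(f"P(failure at 90 MPa) = {p_fail:.3f}")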

  5. Breaking with fun, educational and realistic learning games

    DEFF Research Database (Denmark)

    Duus Henriksen, Thomas

    2009-01-01

    This paper addresses the game conceptions and values that learning games inherit from regular gaming, as well as how they affect the use and development of learning games. Its key points concern the issues of thinking learning games as fun, educative and realistic, which is how learning games...... are commonly conceived as means for staging learning processes, and that thinking learning games so has an inhibiting effect in regard to creating learning processes. The paper draws upon a qualitative study of participants' experiences with 'the EIS Simulation', which is a computer-based learning game...... between the game and other didactic activities that formed the learning process; and, the game might have been intended to be realistic, but it was in the gaps where this realism was critically assessed that learned understanding was forged. While thinking learning games as fun, educative and realistic......

  6. Principles of maximally classical and maximally realistic quantum mechanics

    Indian Academy of Sciences (India)

    S M Roy

    2002-08-01

    Recently Auberson, Mahoux, Roy and Singh have proved a long standing conjecture of Roy and Singh: in 2N-dimensional phase space, a maximally realistic quantum mechanics can have quantum probabilities of no more than N + 1 complete commuting sets (CCS) of observables coexisting as marginals of one positive phase space density. Here I formulate a stationary principle which gives a nonperturbative definition of a maximally classical as well as maximally realistic phase space density. I show that the maximally classical trajectories are in fact exactly classical in the simple examples of coherent states and bound states of an oscillator and Gaussian free particle states. In contrast, it is known that the de Broglie–Bohm realistic theory gives highly nonclassical trajectories.

  7. 2 types of spicules "observed" in 3D realistic models

    CERN Document Server

    Martínez-Sykora, Juan

    2010-01-01

    Realistic numerical 3D models of the outer solar atmosphere show two different kinds of spicule-like phenomena, as also observed on the solar limb. The numerical models are calculated using the Oslo Staggered Code (OSC) to solve the full MHD equations with non-grey and NLTE radiative transfer and thermal conduction along the magnetic field lines. The two types of spicules arise as a natural result of the dynamical evolution in the models. We discuss the different properties of these two types of spicules, their differences from observed spicules and what needs to be improved in the models.

  8. Putting a Realistic Theory of Mind into Agency Theory

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Stea, Diego

    2014-01-01

    Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management of rewards, and trace implications for value creation in principal-agent relations.

  9. The effects of realistic pancake solenoids on particle transport

    Energy Technology Data Exchange (ETDEWEB)

    Gu, X.; Okamura, M.; Pikin, A.; Fischer, W.; Luo, Y.

    2011-02-01

    Solenoids are widely used to transport or focus particle beams. Usually, they are assumed to be ideal solenoids with a highly axially symmetric magnetic field. Using the Vector Field Opera program, we modeled asymmetrical solenoids with realistic geometry defects caused by the finite conductor and current jumpers. Their multipole magnetic components were analyzed with the Fourier fit method, and we present some possible optimization methods for them. We also discuss the effects of 'realistic' solenoids on low-energy particle transport. The findings in this paper may be applicable to the design of some lower-energy particle transport systems.

  10. Evaluating Adverse Effects of Inhaled Nanoparticles by Realistic In Vitro Technology.

    Science.gov (United States)

    Geiser, Marianne; Jeannet, Natalie; Fierz, Martin; Burtscher, Heinz

    2017-02-22

    The number of daily products containing nanoparticles (NP) is rapidly increasing. NP in powders, dispersions, or sprays pose an as yet unknown risk for incidental exposure, especially at workplaces during NP production and processing, and for consumers of any health status and age using NP-containing sprays. We developed the nano aerosol chamber for in vitro toxicity (NACIVT), a portable instrument for realistic safety testing of inhaled NP in vitro, and evaluated effects of silver (Ag) and carbon (C) NP-which belong to the most widely used nanomaterials-on normal and compromised airway epithelia. We review the development, physical performance, and suitability of NACIVT for short- and long-term exposures with air-liquid interface (ALI) cell cultures in regard to the prerequisites of a realistic in vitro test system for inhalation toxicology and in comparison to other commercially available, well-characterized systems. We also review doses applied to cell cultures in vitro and acknowledge that a single exposure to realistic doses of spark-generated 20-nm Ag- or CNP results in small, similar cellular responses to both NP types and that cytokine release generally increased with increasing NP dose.

  11. Protocol: realist synthesis of the impact of unemployment insurance policies on poverty and health.

    Science.gov (United States)

    Molnar, Agnes; O'Campo, Patricia; Ng, Edwin; Mitchell, Christiane; Muntaner, Carles; Renahy, Emilie; St John, Alexander; Shankardass, Ketan

    2015-02-01

    Unemployment insurance is an important social protection policy that buffers unemployed workers against poverty and poor health. Most unemployment insurance studies focus on whether increases in unemployment insurance generosity are predictive of poverty and health outcomes. Less work has used theory-driven approaches to understand and explain how and why unemployment insurance works, for whom, and under what circumstances. Given this, we present a realist synthesis protocol that seeks to unpack how contextual influences trigger relevant mechanisms to generate poverty and health outcomes. In this protocol, we conceptualize unemployment insurance as a key social protection policy; provide a supporting rationale on the need for a realist synthesis; and describe our process on identifying context-mechanism-outcome pattern configurations. Six methodological steps are described: initial theory development, search strategy; selection and appraisal of documents; data extraction; analysis and synthesis process; and presentation and dissemination of revised theory. Our forthcoming realist synthesis will be the first to build and test theory on the intended and unintended outcomes of unemployment insurance policies. Anticipated findings will allow policymakers to move beyond 'black box' approaches to consider 'mechanism-based' explanations that explicate the logic on how and why unemployment insurance matters.

  12. Evaluating Adverse Effects of Inhaled Nanoparticles by Realistic In Vitro Technology

    Science.gov (United States)

    Geiser, Marianne; Jeannet, Natalie; Fierz, Martin; Burtscher, Heinz

    2017-01-01

    The number of daily products containing nanoparticles (NP) is rapidly increasing. NP in powders, dispersions, or sprays pose an as yet unknown risk for incidental exposure, especially at workplaces during NP production and processing, and for consumers of any health status and age using NP-containing sprays. We developed the nano aerosol chamber for in vitro toxicity (NACIVT), a portable instrument for realistic safety testing of inhaled NP in vitro, and evaluated effects of silver (Ag) and carbon (C) NP—which belong to the most widely used nanomaterials—on normal and compromised airway epithelia. We review the development, physical performance, and suitability of NACIVT for short- and long-term exposures with air-liquid interface (ALI) cell cultures in regard to the prerequisites of a realistic in vitro test system for inhalation toxicology and in comparison to other commercially available, well-characterized systems. We also review doses applied to cell cultures in vitro and acknowledge that a single exposure to realistic doses of spark-generated 20-nm Ag- or CNP results in small, similar cellular responses to both NP types and that cytokine release generally increased with increasing NP dose. PMID:28336883

  13. Uncertainty Quantification of Tracer Dispersion with the PMVP Model under Realistic Conditions

    Science.gov (United States)

    Meyer, D. W.; Duenser, S.

    2015-12-01

    The polar Markovian velocity process (PMVP) model provides a computationally efficient method to propagate input uncertainty stemming from unknown permeability fields to output flow and transport statistics [Meyer and Tchelepi, WRR, 2010; Meyer, Jenny, and Tchelepi, WRR, 2010; Meyer et al., WRR, 2013]. Compared with classical Monte Carlo (MC) sampling, the PMVP model provides predictions of tracer concentration statistics at computing times that are three orders of magnitude smaller. Consequently, the PMVP model is also significantly faster than accelerated sampling techniques such as multi-level MC or polynomial chaos expansions. In this work, we further evaluate the PMVP model performance by applying the model to tracer dispersion predictions in a setup derived from the well-known MADE field experiment [Boggs et al., WRR, 1992]. We perform detailed model validations against reference MC simulations and conclude that the model provides overall accurate dispersion predictions under realistic conditions.

  14. Ion size effects on the electric double layer of a spherical particle in a realistic salt-free concentrated suspension.

    Science.gov (United States)

    Roa, Rafael; Carrique, Félix; Ruiz-Reina, Emilio

    2011-05-28

    A new modified Poisson-Boltzmann equation accounting for the finite size of the ions valid for realistic salt-free concentrated suspensions has been derived, extending the formalism developed for pure salt-free suspensions [Roa et al., Phys. Chem. Chem. Phys., 2011, 13, 3960-3968] to real experimental conditions. These realistic suspensions include water dissociation ions and those generated by atmospheric carbon dioxide contamination, in addition to the added counterions released by the particles to the solution. The electric potential at the particle surface will be calculated for different ion sizes and compared with classical Poisson-Boltzmann predictions for point-like ions, as a function of particle charge and volume fraction. The realistic predictions turn out to be essential to achieve a closer picture of real salt-free suspensions, and even more important when ionic size effects are incorporated to the electric double layer description. We think that both corrections have to be taken into account when developing new realistic electrokinetic models, and surely will help in the comparison with experiments for low-salt or realistic salt-free systems.

  15. Boundary conditions towards realistic simulation of jet engine noise

    Science.gov (United States)

    Dhamankar, Nitin S.

    Strict noise regulations at major airports and increasing environmental concerns have made prediction and attenuation of jet noise an active research topic. Large eddy simulation coupled with computational aeroacoustics has the potential to be a significant research tool for this problem. With the emergence of petascale computer clusters, it is now computationally feasible to include the nozzle geometry in jet noise simulations. In high Reynolds number experiments on jet noise, the turbulent boundary layer on the inner surface of the nozzle separates into a turbulent free shear layer. Inclusion of a nozzle with turbulent inlet conditions is necessary to simulate this phenomenon realistically and allows a reasonable comparison of numerically computed noise levels with the experimental results. Two viscous wall boundary conditions are implemented for modeling the nozzle walls. A characteristic-based approach is compared with a computationally cheaper, extrapolation-based formulation. In viscous flow over a circular cylinder under two different regimes, excellent agreement is observed between the results of the two approaches, which also agree reasonably well with reference experimental and numerical results. Both boundary conditions are thus found to be appropriate, the extrapolation-based formulation having an edge with its low cost. This is followed by the crucial step of generating a turbulent boundary layer inside the nozzle. A digital filter-based turbulent inflow condition, extended in a new way to non-uniform curvilinear grids, is implemented to achieve this. A zero pressure gradient flat plate turbulent boundary layer is simulated at a high Reynolds number to show that the method is capable of producing sustained turbulence. The length of the adjustment region necessary for synthetic inlet turbulence to recover from modeling errors is estimated. A low Reynolds number jet simulation including a round nozzle geometry is performed, and the method

  16. Microscopic calculations of elastic scattering between light nuclei based on a realistic nuclear interaction

    Energy Technology Data Exchange (ETDEWEB)

    Dohet-Eraly, Jeremy [F.R.S.-FNRS (Belgium); Sparenberg, Jean-Marc; Baye, Daniel, E-mail: jdoheter@ulb.ac.be, E-mail: jmspar@ulb.ac.be, E-mail: dbaye@ulb.ac.be [Physique Nucleaire et Physique Quantique, CP229, Universite Libre de Bruxelles (ULB), B-1050 Brussels (Belgium)

    2011-09-16

    The elastic phase shifts for the α + α and α + ³He collisions are calculated in a cluster approach by the Generator Coordinate Method coupled with the Microscopic R-matrix Method. Two interactions are derived from the realistic Argonne potentials AV8' and AV18 with the Unitary Correlation Operator Method. With a specific adjustment of correlations on the α + α collision, the phase shifts for the α + α and α + ³He collisions agree rather well with experimental data.

  17. Application of Bayesian statistical decision theory to the optimization of generating set maintenance; Application de la théorie de décision statistique bayésienne à l'optimisation de la maintenance des groupes électrogènes

    Energy Technology Data Exchange (ETDEWEB)

    Procaccia, H.; Cordier, R.; Muller, S.

    1994-07-01

    Statistical decision theory could be an alternative for the optimization of preventive maintenance periodicity. In effect, this theory concerns the situation in which a decision maker has to make a choice between a set of reasonable decisions, and where the loss associated with a given decision depends on a probabilistic risk, called the state of nature. In the case of maintenance optimization, the decisions to be analyzed are the different periodicities proposed by the experts, given the observed feedback experience; the states of nature are the associated failure probabilities; and the losses are the expectations of the induced cost of maintenance and of the consequences of failures. As the failure probabilities concern rare events, at the ultimate state of RCM analysis (failure of a sub-component), and as the expected foreseeable behaviour of equipment has to be evaluated by experts, a Bayesian approach is successfully used to compute the states of nature. In Bayesian decision theory, a prior distribution for the failure probabilities is modeled from expert knowledge and combined with the sparse stochastic information provided by feedback experience, giving a posterior distribution of failure probabilities. The optimized decision is the one that minimizes the expected loss over the posterior distribution. This methodology has been applied to the inspection and maintenance optimization of cylinders of diesel generator engines in 900 MW nuclear plants. In these plants, auxiliary electric power is supplied by 2 redundant diesel generators which are tested every 2 weeks for about 1 hour. Until now, during the yearly refueling of each plant, one endoscopic inspection of the diesel cylinders is performed, and every 5 operating years all cylinders are replaced. RCM has shown that cylinder failures could be critical, so Bayesian decision theory has been applied, taking into account expert opinions and the possibility of aging when the maintenance periodicity is extended. (authors). 8 refs., 5 figs., 1 tab.
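
    A minimal numerical sketch of the decision rule described above, with purely illustrative priors, data, and costs: an expert-elicited Beta prior on the per-year failure probability is updated with feedback experience, and each candidate periodicity is scored by its expected loss over the posterior.

        import numpy as np
        from scipy.stats import beta

        # Expert prior on the yearly failure probability (illustrative values).
        a0, b0 = 2.0, 200.0

        # Hypothetical feedback experience: failures over observed unit-years.
        failures, exposure = 1, 300
        post = beta(a0 + failures, b0 + exposure - failures)

        # Candidate maintenance periodicities (years) -> maintenance cost per year.
        candidates = {1: 50.0, 2: 25.0, 5: 10.0}
        failure_cost = 5000.0        # cost of an in-service failure (illustrative)

        p = post.rvs(100_000, random_state=0)   # Monte Carlo over the posterior
        for years, maint_cost in candidates.items():
            # Probability of at least one failure within the interval, per year.
            risk = np.mean(1.0 - (1.0 - p) ** years) / years
            print(f"periodicity {years} yr: expected loss/yr = "
                  f"{maint_cost + failure_cost * risk:8.2f}")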

  18. A Sense of Responsibility in Realistic Children's Fiction.

    Science.gov (United States)

    Rochelle, Warren

    1991-01-01

    Discusses the qualities of successful realistic children's fiction and examines issues involved in writing and publishing for children. A historical overview of the image of children in fiction is presented; dealing with sensitive, emotional topics is discussed; and concerns of editors and publishers of children's fiction are addressed. (12…

  19. Realistic Fiction and the Social Studies. Children's Literature.

    Science.gov (United States)

    Mitchell-Powell, Brenda, Ed.

    1995-01-01

    Asserts that children's literature is an effective tool to access and present sophisticated social studies concepts in the elementary classroom. Maintains that realistic fiction can integrate the social sciences with philosophy and religion. Presents a bibliographic essay including children's books and teacher resources. (CFR)

  20. Creating a Realistic Context for Team Projects in HCI

    NARCIS (Netherlands)

    Koppelman, H.; Dijk, van E.M.A.G.

    2006-01-01

    Team projects are nowadays common practice in HCI education. This paper focuses on the role of clients and users in team projects in introductory HCI courses. In order to provide projects with a realistic context we invite people from industry to serve as clients for the student teams. Some of them

  1. The Potential and Challenges of Critical Realist Ethnography

    Science.gov (United States)

    Barron, Ian

    2013-01-01

    This article revisits the critical realist ethnographic process that was adopted in my doctoral thesis, which was concerned with the experiences of ethnic identity of white British and Pakistani British children as they started kindergarten in the northwest of England. The article focuses on the ethnography that emerged from the visits that I…

  2. Hope in Janusz Korczak's Pedagogy of Realistic Idealism

    Science.gov (United States)

    Silverman, Marc

    2017-01-01

    This article explores the approach of "Realistic Idealism" to moral education developed by the humanist-progressive moral educator Janusz Korczak, and the role hope plays in it. This pair of terms seems to be an oxymoron. However, their employment is intentional and the article will demonstrate their dialectical interdependence:…

  3. "Shut My Mouth Wide Open": Realistic Fiction and Social Action.

    Science.gov (United States)

    Tyson, Cynthia A.

    1999-01-01

    Shares the responses of seven urban, male, African-American fifth graders to contemporary realistic fiction, discussing how the tying of this literature to the events in the boys' lives had the potential to move them toward social action. The paper examines the following: literature as a catalyst, reader responses to texts, critical literacy, and…

  4. Realistic glottal motion and airflow rate during human breathing.

    Science.gov (United States)

    Scheinherr, Adam; Bailly, Lucie; Boiron, Olivier; Lagier, Aude; Legou, Thierry; Pichelin, Marine; Caillibotte, Georges; Giovanni, Antoine

    2015-09-01

    The glottal geometry is a key factor in the aerosol delivery efficiency for treatment of lung diseases. However, while glottal vibrations were extensively studied during human phonation, the realistic glottal motion during breathing is poorly understood. Therefore, most current studies assume an idealized steady glottis in the context of respiratory dynamics, and thus neglect the flow unsteadiness related to this motion. This is particularly important to assess the aerosol transport mechanisms in upper airways. This article presents a clinical study conducted on 20 volunteers, to examine the realistic glottal motion during several breathing tasks. Nasofibroscopy was used to investigate the glottal geometrical variations simultaneously with accurate airflow rate measurements. In total, 144 breathing sequences of 30 s were recorded. Regarding the whole database, two cases of glottal time-variations were found: "static" or "dynamic" ones. Typically, the peak value of glottal area during slow breathing narrowed from 217 ± 54 mm² (mean ± STD) during inspiration, to 178 ± 35 mm² during expiration. Considering flow unsteadiness, it is shown that the harmonic approximation of the airflow rate underestimates the inertial effects as compared to realistic patterns, especially at the onset of the breathing cycle. These measurements provide input data to conduct realistic numerical simulations of laryngeal airflow and particle deposition.

  5. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  7. Representations of Adoption in Contemporary Realistic Fiction for Young Adults

    Science.gov (United States)

    Parsons, Sue Christian; Fuxa, Robin; Kander, Faryl; Hardy, Dana

    2017-01-01

    In this critical content analysis of thirty-seven contemporary realistic fiction books about adoption, the authors examine how adoption and adoptive families are depicted in young adult (YA) literature. The critical literacy theoretical frame brings into focus significant social implications of these depictions as the researchers illuminate and…

  10. Highly realistic, immersive training for navy corpsmen: preliminary results.

    Science.gov (United States)

    Booth-Kewley, Stephanie; McWhorter, Stephanie K

    2014-12-01

    Highly realistic, immersive training has been developed for Navy corpsmen based on the success of the Infantry Immersion Trainer. This new training is built around scenarios that are designed to depict real-life, operational situations. Each scenario used in the training includes sights, sounds, smells, and distractions to simulate realistic and challenging combat situations. The primary objective of this study was to assess corpsmen participants' satisfaction with highly realistic training. The study sample consisted of 434 male Navy service members attending Field Medical Training Battalion-West, Camp Pendleton, California. Corpsmen participants completed surveys after receiving the training. Participants expressed high levels of satisfaction with the training overall and with several specific elements of the training. The element of the training that the corpsmen rated the highest was the use of live actors. The vast majority of the participants reported that the training had increased their overall confidence about being successful corpsmen and had strengthened their confidence in their ability to provide care under pressure. Additional research should extend highly realistic training to other military medical provider populations.

  11. Using Concrete and Realistic Data in Evaluating Initial Visualization Designs

    DEFF Research Database (Denmark)

    Knudsen, Søren; Pedersen, Jeppe Gerner; Herdal, Thor

    2016-01-01

    We explore means of designing and evaluating initial visualization ideas, with concrete and realistic data in cases where data is not readily available. Our approach is useful in exploring new domains and avenues for visualization, and contrasts other visualization work, which typically operate u...

  12. How outcomes are achieved through patient portals: A realist review

    NARCIS (Netherlands)

    E.T. Otte-Trojel (Eva Terese); A.A. de Bont (Antoinette); T.G. Rundall (Thomas); J.J. van de Klundert (Joris)

    2014-01-01

    textabstractObjective: To examine how patient portals contribute to health service delivery and patient outcomes. The specific aims were to examine how outcomes are produced, and how variations in outcomes can be explained. Methods: We used a realist review method, which aims to describe how 'an

  13. Realistic Visualization of Virtual Views and Virtual Cinema

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    2005-01-01

    Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which virtual photography and cinematography. The use of compute...

  14. Improving Mathematics Teaching in Kindergarten with Realistic Mathematical Education

    Science.gov (United States)

    Papadakis, Stamatios; Kalogiannakis, Michail; Zaranis, Nicholas

    2017-01-01

    The present study investigates and compares the influence of teaching Realistic Mathematics on the development of mathematical competence in kindergarten. The sample consisted of 231 Greek kindergarten students. For the implementation of the survey, we conducted an intervention, which included one experimental and one control group. Children in…

  16. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    the vast amounts of raw data. This task is tackled by computational tools implementing algorithms that match the experimental data to databases, providing the user with lists for downstream analysis. Several platforms for such automated interpretation of mass spectrometric data have been developed, each...... having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
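
    One of the standard statistical criteria alluded to here is target-decoy false discovery rate control. The sketch below estimates q-values from synthetic target and decoy peptide-spectrum-match scores; it is a generic illustration, not the procedure of any specific platform, and it assumes comparably sized target and decoy searches.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic PSM scores: decoys are noise, targets are a mixture.
        decoy = rng.normal(0.0, 1.0, 5000)
        target = np.concatenate([rng.normal(0.0, 1.0, 3000),   # incorrect matches
                                 rng.normal(3.0, 1.0, 2000)])  # correct matches

        def q_values(target_scores, decoy_scores):
            # FDR at each score threshold = decoys above it / targets above it,
            # converted to monotone q-values by taking suffix minima.
            order = np.argsort(target_scores)[::-1]
            scores = target_scores[order]
            decoy_sorted = np.sort(decoy_scores)
            n_decoy = len(decoy_scores) - np.searchsorted(decoy_sorted, scores)
            fdr = n_decoy / np.arange(1, len(scores) + 1)
            q_sorted = np.minimum.accumulate(fdr[::-1])[::-1]
            q = np.empty_like(q_sorted)
            q[order] = q_sorted
            return q

        q = q_values(target, decoy)
        print("PSMs accepted at 1% FDR:", int((q <= 0.01).sum()))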

  17. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
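
    A building block behind "correlations among neighboring observations" is a spatial autocorrelation statistic such as Moran's I. The sketch below computes it for a synthetic grid of nematode counts with rook-contiguity weights, as a simple stand-in for the spatial econometric models the paper reviews.

        import numpy as np

        def morans_i(x, w):
            # I = (n / sum(w)) * z'Wz / z'z, with z the centered observations.
            z = x - x.mean()
            return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

        # Synthetic 10x10 field of nematode counts with a smooth spatial trend.
        rng = np.random.default_rng(2)
        side = 10
        trend = np.add.outer(np.linspace(0, 3, side), np.linspace(0, 3, side))
        x = (trend + rng.normal(0, 0.5, (side, side))).ravel()

        # Rook-contiguity weights: cells sharing an edge are neighbors.
        n = side * side
        w = np.zeros((n, n))
        for i in range(side):
            for j in range(side):
                k = i * side + j
                if i + 1 < side: w[k, k + side] = w[k + side, k] = 1.0
                if j + 1 < side: w[k, k + 1] = w[k + 1, k] = 1.0

        print("Moran's I =", round(morans_i(x, w), 3))  # near +1: strong clustering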

  18. Hydrostratigraphic modelling using multiple-point statistics and airborne transient electromagnetic methods

    DEFF Research Database (Denmark)

    Barfod, Adrian; Straubhaar, Julien; Høyer, Anne-Sophie

    2017-01-01

    Creating increasingly realistic hydrological models involves the inclusion of additional geological and geophysical data in the hydrostratigraphic modelling procedure. Using Multiple Point Statistics (MPS) for stochastic hydrostratigraphic modelling provides a degree of flexibility that allows th...

  19. Statistical properties of indicators of first-year performance at university

    African Journals Online (AJOL)

    Science and Numeracy Skills) have statistical distributions similar to that of average first- ... those admitted to the Arts, Theology, Law or Education Faculties wrote Test Battery 3. ..... predict average first-year university performance realistically.

  20. Development of a realistic, dynamic digital brain phantom for CT perfusion validation

    Science.gov (United States)

    Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2016-03-01

    Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.

  1. Realistic Modeling of Multi-Scale MHD Dynamics of the Solar Atmosphere

    Science.gov (United States)

    Kitiashvili, Irina; Mansour, Nagi N.; Wray, Alan; Couvidat, Sebastian; Yoon, Seokkwan; Kosovichev, Alexander

    2014-01-01

    Realistic 3D radiative MHD simulations open new perspectives for understanding the turbulent dynamics of the solar surface, its coupling to the atmosphere, and the physical mechanisms of generation and transport of non-thermal energy. Traditionally, plasma eruptions and wave phenomena in the solar atmosphere are modeled by prescribing artificial driving mechanisms using magnetic or gas pressure forces that might arise from magnetic field emergence or reconnection instabilities. In contrast, our 'ab initio' simulations provide a realistic description of solar dynamics naturally driven by solar energy flow. By simulating the upper convection zone and the solar atmosphere, we can investigate in detail the physical processes of turbulent magnetoconvection, generation and amplification of magnetic fields, excitation of MHD waves, and plasma eruptions. We present recent simulation results of the multi-scale dynamics of quiet-Sun regions, and energetic effects in the atmosphere and compare with observations. For the comparisons we calculate synthetic spectro-polarimetric data to model observational data of SDO, Hinode, and New Solar Telescope.

  2. Role-playing for more realistic technical skills training.

    Science.gov (United States)

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  3. Towards a Realistic Parsing of the Feynman Path Integral

    Directory of Open Access Journals (Sweden)

    Ken Wharton

    2016-01-01

    The Feynman path integral does not allow a "one real path" interpretation, because the quantum amplitudes contribute to probabilities in a non-separable manner. The opposite extreme, "all paths happen," is not a useful or informative account. In this paper it is shown that an intermediate parsing of the path integral, into realistic non-interfering possibilities, is always available. Each realistic possibility formally corresponds to numerous particle paths, but is arguably best interpreted as a spacetime-valued field. Notably, one actual field history can always be said to occur, although it will generally not have an extremized action. The most obvious concerns with this approach are addressed, indicating necessary follow-up research. But without obvious showstoppers, it seems plausible that the path integral might be reinterpreted to explain quantum phenomena in terms of Lorentz covariant field histories. Quanta 2016; 5: 1–11.

  4. A general realistic treatment of the disk paradox

    CERN Document Server

    Pantazis, George

    2016-01-01

    Mechanical angular momentum is not conserved in systems involving electromagnetic fields with non-zero electromagnetic field angular momentum. Conservation is restored only if the total (mechanical and field) angular momentum is considered. Previous studies have investigated this effect, known as "Feynman's Electromagnetic Paradox" or simply "Disk Paradox," in the context of idealized systems (infinite or infinitesimal solenoids and charged cylinders, etc.). In the present analysis we generalize previous studies by considering more realistic systems with finite components and demonstrating explicitly the conservation of the total angular momentum. This is achieved by expressing both the mechanical and the field angular momentum in terms of charges and magnetic field fluxes through various system components. Using this general expression we demonstrate explicitly the conservation of total angular momentum in both idealized and realistic systems (finite solenoid concentric with two charged long cylinders) taking ...

  5. Bell's inequalities with realistic noise for polarization-entangled photons

    CERN Document Server

    Cabello, A; Lamas-Linares, A; Cabello, Adan; Feito, Alvaro; Lamas-Linares, Antia

    2005-01-01

    Contrary to the usual assumption that the experimental preparation of pure entangled states can be described by mixed states due to white noise, a more realistic description for polarization-entangled states produced by parametric down-conversion is that they are mixed states due to decoherence in a preferred polarization basis. This distinction between white and colored noise is crucial when we look for maximal violations of Bell's inequalities for two-qubit and two-qutrit entangled states. We find that violations of Bell's inequalities with realistic noise for polarization-entangled photons are extremely robust for colored noise, whereas this is not the case for white noise. In addition, we study the difference between white and colored noise for maximal violations of Bell's inequalities for three and four-qubit entangled states.
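
    The white-versus-colored-noise distinction is easy to check numerically for two qubits: form rho = V|phi+><phi+| + (1 - V) rho_noise under both noise models and maximize the CHSH value over measurement angles in the z-x plane. The parametrization below follows common conventions and is not necessarily the paper's exact one.

        import numpy as np
        from scipy.optimize import minimize

        sx = np.array([[0.0, 1.0], [1.0, 0.0]])
        sz = np.array([[1.0, 0.0], [0.0, -1.0]])

        phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
        pure = np.outer(phi_plus, phi_plus)
        white = np.eye(4) / 4.0                      # isotropic (white) noise
        colored = np.diag([0.5, 0.0, 0.0, 0.5])      # decoherence in the H/V basis

        def obs(theta):                              # spin observable in z-x plane
            return np.cos(theta) * sz + np.sin(theta) * sx

        def chsh(angles, rho):
            a1, a2, b1, b2 = angles
            E = lambda a, b: np.trace(rho @ np.kron(obs(a), obs(b)))
            return E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)

        starts = np.random.default_rng(3).uniform(0.0, np.pi, (8, 4))
        for name, noise in [("white", white), ("colored", colored)]:
            for V in (0.9, 0.7):
                rho = V * pure + (1.0 - V) * noise
                best = max(-minimize(lambda x: -chsh(x, rho), x0).fun
                           for x0 in starts)
                print(f"{name} noise, V={V}: max CHSH = {best:.3f}")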

  6. Modeling of biofuel pellets torrefaction in a realistic geometry

    Directory of Open Access Journals (Sweden)

    Artiukhina Ekaterina

    2016-01-01

    Low temperature pyrolysis, also known as torrefaction, is considered a promising pretreatment technology for conversion of biomass into a solid biofuel with enhanced properties in terms of lower moisture and volatile matter content, hydrophobicity, and increased heating value. A thermal treatment leads to a non-uniform temperature field and chemical reactions proceeding unevenly within the pellets. However, the temperature is assumed to be uniform in the pellets in the majority of models. Here we report on a model of single-pellet biomass torrefaction, taking into account the heat transfer and chemical kinetics in a realistic geometry. The evolution of temperature and material density in the non-stationary thermo-chemical process is described by a system of non-linear partial differential equations. A model describing the high-temperature drying of a biomass pellet was also introduced. The importance of boundary effects in realistic simulations of biomass pellet torrefaction is underlined in this work.

  7. Applying a realistic evaluation model to occupational safety interventions

    DEFF Research Database (Denmark)

    Pedersen, Louise Møller

    2017-01-01

    Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal characteristics of key actors (defined mechanisms), and the interplay between them, and can be categorized as expected or unexpected. However, little is known about 'how' to include context and mechanisms in evaluations of intervention effectiveness. A revised realistic evaluation model has been introduced ... of occupational safety interventions. Conclusion: The revised realistic evaluation model can help safety science forward in identifying key factors for the success of occupational safety interventions. However, future research should strengthen the link between the immediate intervention results and outcome.

  8. Realistic simulation of reduced-dose CT with noise modeling and sinogram synthesis using DICOM CT images

    Energy Technology Data Exchange (ETDEWEB)

    Won Kim, Chang [Interdisciplinary Program of Bioengineering Major Seoul National University College of Engineering, San 56-1, Silim-dong, Gwanak-gu, Seoul 152-742, South Korea and Institute of Radiation Medicine, Seoul National University College of Medicine, 28, Yongon-dong, Chongno-gu, Seoul 110-744 (Korea, Republic of); Kim, Jong Hyo, E-mail: kimjhyo@snu.ac.kr [Department of Radiology, Institute of Radiation Medicine, Seoul National University College of Medicine, 28, Yongon-dong, Chongno-gu, Seoul, 110-744 (Korea, Republic of); Department of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon, Gyeonggi-do, 443-270 (Korea, Republic of); Advanced Institutes of Convergence Technology, Seoul National University, Suwon, Gyeonggi-do, 443-270 (Korea, Republic of)

    2014-01-15

    Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than ±3.2% in
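
    The central noise-insertion trick can be illustrated in image space: shape white Gaussian noise in the frequency domain so that its power spectrum follows a target NPS. The radially symmetric NPS below is an assumed stand-in for a measured one; the sinogram synthesis, bow-tie, and attenuation modeling of the actual method are omitted.

        import numpy as np

        rng = np.random.default_rng(4)
        n, pix = 512, 0.5                 # image size, pixel spacing (mm)

        # Radial spatial-frequency grid (cycles/mm).
        f = np.fft.fftfreq(n, d=pix)
        fr = np.hypot(*np.meshgrid(f, f))

        # Assumed target NPS: ramp-like rise with an apodization roll-off (HU^2 mm^2).
        nps = 40.0 * fr * np.exp(-((fr / 0.35) ** 2))

        # Filter white noise so its power spectrum follows the target NPS; the
        # 1/pix factor matches the discrete NPS normalization.
        white = rng.standard_normal((n, n))
        shaped = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(nps) / pix).real

        # The noise variance equals the integral of the NPS over frequency space.
        print("noise std (HU): %.2f" % shaped.std())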

  10. Music therapy for palliative care: A realist review.

    OpenAIRE

    McConnell, T; Porter, Samuel

    2016-01-01

    OBJECTIVE: Music therapy has experienced a rising demand as an adjunct therapy for symptom management among palliative care patients. We conducted a realist review of the literature to develop a greater understanding of how music therapy might benefit palliative care patients and the contextual mechanisms that promote or inhibit its successful implementation. METHOD: We searched electronic databases (CINAHL, Embase, Medline, and PsychINFO) for literature containing information on music therap...

  12. Simulation of human ischemic stroke in realistic 3D geometry

    Science.gov (United States)

    Dumont, Thierry; Duarte, Max; Descombes, Stéphane; Dronne, Marie-Aimée; Massot, Marc; Louvet, Violaine

    2013-06-01

    In silico research in medicine is thought to reduce the need for expensive clinical trials, provided that reliable mathematical models and accurate, efficient numerical methods are available. In the present work, we tackle the numerical simulation of reaction-diffusion equations modeling human ischemic stroke. This problem presents peculiar difficulties, such as potentially large stiffness, which stems from the broad spectrum of temporal scales in the nonlinear chemical source term as well as from the presence of steep, spatially very localized gradients in the reaction fronts. Furthermore, simulations on realistic 3D geometries are mandatory in order to describe this type of phenomenon correctly. The main goal of this article is to obtain, for the first time, 3D simulations on realistic geometries and to show that the simulation results are consistent with those obtained in experimental studies or observed on MRI images of stroke patients. For this purpose, we introduce a new resolution strategy based mainly on time operator splitting that takes into account complex geometry, coupled with a well-conceived parallelization strategy for shared memory architectures. We then consider a high order implicit time integration for the reaction and an explicit one for the diffusion term in order to build a time operator splitting scheme that efficiently exploits the special features of each problem. Thus, we aim at solving complete and realistic models including all time and space scales with conventional computing resources, that is, on a reasonably powerful workstation. Consequently and as expected, 2D and also fully 3D numerical simulations of ischemic stroke for a realistic brain geometry are conducted for the first time and shown to reproduce the dynamics observed on MRI images of stroke patients. Beyond this major step, in order to improve accuracy and computational efficiency of the simulations, we indicate how the present numerical strategy can be coupled with spatial
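
    The time-operator-splitting strategy can be sketched on a 1-D toy problem: a Strang scheme alternating a stiff reaction half-step (integrated with an implicit stiff solver) and an explicit diffusion step. The bistable reaction term is a generic stand-in for the ionic stroke model, and the line geometry replaces the 3-D brain.

        import numpy as np
        from scipy.integrate import solve_ivp

        n, dx, dt, D = 200, 0.1, 0.05, 0.01
        u = np.where(np.arange(n) < 20, 1.0, 0.0)      # "depolarized" region at left

        def react(t, u):                               # stiff bistable kinetics (toy)
            return 50.0 * u * (1.0 - u) * (u - 0.3)

        def react_half(u):                             # implicit (BDF) reaction substep
            sol = solve_ivp(react, (0.0, dt / 2), u, method="BDF", t_eval=[dt / 2])
            return sol.y[:, -1]

        lap = lambda v: (np.roll(v, 1) - 2 * v + np.roll(v, -1)) / dx**2

        for _ in range(100):                           # Strang: R/2 then D then R/2
            u = react_half(u)
            u = u + dt * D * lap(u)                    # stable: dt*D/dx^2 = 0.05 < 0.5
            u = react_half(u)

        print("front position:", np.argmax(u < 0.5) * dx)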

  13. An Argument Against the Realistic Interpretation of the Wave Function

    Science.gov (United States)

    Rovelli, Carlo

    2016-10-01

    Testable predictions of quantum mechanics are invariant under time reversal. But the evolution of the quantum state in time is not so, neither in the collapse nor in the no-collapse interpretations of the theory. This is a fact that challenges any realistic interpretation of the quantum state. On the other hand, this fact raises no difficulty if we interpret the quantum state as a mere calculation device, bookkeeping past real quantum events.

  14. Extending AI Planning to Solve more Realistic Problems

    OpenAIRE

    Zalaket, Joseph

    2008-01-01

    In this chapter, we present multiple extensions of classical planning algorithms that allow them to solve more realistic problems. Such problems can contain any type of knowledge and can require complex handling that is not yet supported by existing planning algorithms. Some complicated problems can be expressed with the recent extensions to the PDDL language, but the main limitation remains the inability of current planners to handle them. We have suggested and ...

  15. Investigation of the Airflow inside Realistic and Semi-Realistic Replicas of Human Airways

    Directory of Open Access Journals (Sweden)

    Lizal Frantisek

    2015-01-01

    Measurement of velocity in human lungs during the breathing cycle is a challenging task for researchers, since the measuring location is accessible only with significant difficulty. A special measuring rig consisting of an optically transparent replica of human lungs, a breathing simulator, a particle generator and a Laser-Doppler anemometer was developed and used to investigate the velocity in specific locations of the lungs during a simulated breathing cycle. Experiments were performed on two different replicas of human lungs in corresponding measuring points to facilitate the analysis of the influence of the geometry and its simplification on the flow. The analysis of velocity course and turbulence intensity revealed that special attention should be devoted to the modelling of vocal cord position during breathing, as the position of the laryngeal jet created by the vocal cords significantly influences velocity profiles in the trachea. The shapes of the velocity courses during expiration proved to be consistent for both replicas; however, the magnitudes of peak expiratory velocity differed between the corresponding measuring points in the two replicas.

  16. MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance.

    Science.gov (United States)

    Wissmann, Lukas; Santelli, Claudio; Segars, William P; Kozerke, Sebastian

    2014-08-20

    Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions.

  17. An anatomically realistic temperature phantom for radiofrequency heating measurements.

    Science.gov (United States)

    Graedel, Nadine N; Polimeni, Jonathan R; Guerin, Bastien; Gagoski, Borjan; Wald, Lawrence L

    2015-01-01

    An anthropomorphic phantom with realistic electrical properties allows for a more accurate reproduction of tissue current patterns during excitation. A temperature map can then probe the worst-case heating expected in the unperfused case. We describe an anatomically realistic human head phantom that allows rapid three-dimensional (3D) temperature mapping at 7T. The phantom was based on hand-labeled anatomical imaging data and consists of four compartments matching the corresponding human tissues in geometry and electrical properties. The increases in temperature resulting from radiofrequency excitation were measured with MR thermometry using a temperature-sensitive contrast agent (TmDOTMA(-)) validated by direct fiber optic temperature measurements. Acquisition of 3D temperature maps of the full phantom with a temperature accuracy better than 0.1°C was achieved with an isotropic resolution of 5 mm and acquisition times of 2-4 minutes. Our results demonstrate the feasibility of constructing anatomically realistic phantoms with complex geometries incorporating the ability to measure accurate temperature maps in the phantom. The anthropomorphic temperature phantom is expected to provide a useful tool for the evaluation of the heating effects of both conventional and parallel transmit pulses and help validate electromagnetic and temperature simulations.

  18. Depigmented Skin and Phantom Color Measurements for Realistic Prostheses

    Science.gov (United States)

    Tanner, Paul; Leachman, Sancy; Boucher, Kenneth; Ozçelik, Tunçer Burak

    2013-01-01

    Purpose: The purpose of this study was to test the hypothesis that, regardless of human skin phototype, areas of depigmented skin, as seen in vitiligo, are optically indistinguishable among skin phototypes. The average of the depigmented skin measurements can be used to develop the base color of realistic prostheses. Methods and Materials: Data from 20 of 32 recruited vitiligo study participants were analyzed. Diffuse reflectance spectroscopy measurements were made from depigmented skin and adjacent pigmented skin, then compared to 66 pigmented polydimethylsiloxane phantoms to determine pigment concentrations in turbid media for making realistic facial prostheses. Results: The Area Under spectral intensity Curve (AUC) was calculated for average spectroscopy measurements of pigmented sites in relation to skin phototype (p=0.0505) and depigmented skin in relation to skin phototype (p=0.59). No significant relationship exists between skin phototypes and depigmented skin spectroscopy measurements. The average of the depigmented skin measurements (AUC 19,129) was the closest match to phantom 6.4 (AUC 19,162). Conclusions: Areas of depigmented skin are visibly indistinguishable per skin phototype, yet spectrometry shows that depigmented skin measurements varied and were unrelated to skin phototype. Possible sources of optical variation of depigmented skin include age, body site, blood flow, quantity/quality of collagen, and other chromophores. The average of all depigmented skin measurements can be used to derive the pigment composition and concentration for realistic facial prostheses. PMID:23750920
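
    For concreteness, the AUC statistic used here amounts to numerically integrating the reflectance spectrum over wavelength; a short sketch with a made-up spectrum follows.

        import numpy as np

        wavelength = np.linspace(400, 700, 301)            # nm (assumed range)
        spectrum = 60.0 + 5.0 * np.sin(wavelength / 40.0)  # made-up reflectance (%)

        # Trapezoidal area under the spectral intensity curve.
        auc = np.sum(0.5 * (spectrum[1:] + spectrum[:-1]) * np.diff(wavelength))
        print(f"AUC = {auc:,.0f}")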

  19. Relating realist metatheory to issues of gender and mental health.

    Science.gov (United States)

    Bergin, M; Wells, John S G; Owen, Sara

    2010-06-01

    This paper seeks to advance the debate that considers critical realism as an alternative approach for understanding gender and mental health and its relatedness to mental health research and practice. The knowledge base of how 'sex' and 'gender' affect mental health and illness is expanding. However, the way we conceptualize gender is significant and challenging, as we seldom manage to think about 'gender' as independent of 'sex'. The influences and interplay of how sex (biological) and gender (social) affect mental health and illness require consideration. Critical realism suggests a shared ontology and epistemology for the natural and social sciences. While much of the debate surrounding gender is guided within a constructivist discourse, the concept of 'gender' is explored and some key realist propositions are considered for mental health research and practice. This is achieved through the works of some key realist theorists. Critical realism offers potential for research and practice in relation to gender and mental health because it facilitates changes in our understanding while simultaneously not discarding what is already known. In so doing, it allows the biological (sex) and social (gender) domains of knowledge for mental health and illness to coexist, without either being reduced to or defined by the other. Arguably, greater depth and explanations for gender and mental health issues are presented within a realist metatheory.

  20. Exposure render: an interactive photo-realistic volume rendering framework.

    Directory of Open Access Journals (Sweden)

    Thomas Kroes

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, the more realistic lighting in particular has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques to direct volume rendering (DVR). With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made our framework available under a permissive open source license.

  1. Behaviorly realistic simulations of stock market traders with a soul

    Science.gov (United States)

    Solomon, Sorin

    1999-09-01

    The price fluctuations of the stocks in the financial markets are the result of the individual operations by many individual investors. However, for many decades financial theory did not directly use this "microscopic representation" of the markets. The main difficulties preventing this approach were solved recently with the advent of modern computer technology: massive detailed data on individual market operations became available, and "microscopic simulations" of the stock markets in terms of their individual participating agents allow a very realistic treatment of the problem. By taking advantage of modern computer processing and simulation techniques, we are now able to confront real market data with the results of simulating "microscopic" realistic models of the markets. These models have the potential to include and study the effects on the market of any desired feature in the investors' behavior: departures from rationality, herding effects, heterogeneous investor-specific trading strategies. We propose to use the comparison of computer simulations of microscopic models with the actual market data in order to validate and enhance the knowledge on the financial behavior of individuals. Moreover, we hope to explain, understand (and perhaps predict and control) macroscopic market dynamical features (e.g., cycles of booms and crashes, investors' wealth distribution, market return probability distribution, etc.) based on realistic models using this knowledge.
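
    A deliberately minimal microscopic market in the spirit described above (not the author's actual model): agents switch between buying and selling with a herding-prone probability, and the log-price responds to excess demand. Stylized facts such as the kurtosis of returns can then be scored directly against market data.

        import numpy as np

        rng = np.random.default_rng(5)
        n_agents, n_steps = 500, 5000
        state = rng.choice([-1, 1], n_agents)    # -1 = sell, +1 = buy
        price, prices = 100.0, []
        herd = 0.6                               # assumed herding strength

        for _ in range(n_steps):
            # Agents lean toward the current majority opinion (herding).
            p_buy = 0.5 + 0.5 * herd * state.mean()
            state = np.where(rng.random(n_agents) < p_buy, 1, -1)
            # Log-price responds to excess demand plus small exogenous noise.
            price *= np.exp(0.01 * state.mean() + 0.002 * rng.standard_normal())
            prices.append(price)

        r = np.diff(np.log(prices))
        kurt = ((r - r.mean()) ** 4).mean() / r.var() ** 2
        print(f"return std = {r.std():.4f}, kurtosis = {kurt:.2f}")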

  2. Towards Performance Evaluation of Cognitive Radio Network in Realistic Environment

    Directory of Open Access Journals (Sweden)

    Vivek Kukreja

    2013-11-01

    The scarcity of free spectrum compels us to look for alternatives for ever increasing wireless applications. Cognitive Radio (CR) is one such alternative that can solve this problem. A network of nodes with CR capability is termed a Cognitive Radio Network (CRN). Communication in a CRN requires a routing protocol, the primary goal of which is to provide a route from source to destination. Various routing protocols have been proposed and tested in idealistic environments using simulation software such as NS-2 and QualNet. This paper is an effort in the same direction, but the efficacy is evaluated under realistic conditions by designing a simulator in MATLAB-7. To make the network scenario realistic, obstacles of different shapes, types, sizes and numbers have been introduced. In addition, the shape of the periphery is also varied to find its impact on routing protocols. From the results it is observed that the outcomes in the realistic and idealistic settings vary significantly. The reason for this is also discussed in this paper.

  3. Fast Rendering of Realistic Virtual Character in Game Scene

    Directory of Open Access Journals (Sweden)

    Mengzhao Yang

    2013-07-01

    Human skin is made up of multiple translucent layers, and rendering of skin appearance usually requires complex modeling and massive computation. In some practical applications such as 3D game development, we must not only approximate realistic-looking skin but also develop an efficient method that is easy to implement and meets the needs of real-time rendering. In this study, we solve the problem of wrap lighting and introduce a surface-detail approximation method to give realistic rendering of a virtual character. Our method considers that different thicknesses of geometry on the skin surface result in different degrees of scattering of incident light, and so pre-calculates the diffuse falloff into a look-up texture. Also, we note that scattering is strongly color dependent and small bumps are common on the skin surface, and so pre-soften the finer details on the skin surface according to the R/G/B channel. Finally, we linearly interpolate the diffuse lighting with different scattering degrees from the look-up texture, sampled with the curvature and NdotL. Experimental results show that the proposed approach yields realistic virtual characters and achieves high frame rates in real-time rendering.
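
    A sketch of the pre-computation step: build a per-channel diffuse-falloff look-up texture indexed by curvature and NdotL using wrap lighting, with the wrap widest on the red channel (which scatters farthest in skin) and widened at high curvature. The wrap widths and scaling are assumed values, not those used by the authors.

        import numpy as np

        size = 64
        ndotl = np.linspace(-1.0, 1.0, size)        # cosine of light angle
        curvature = np.linspace(0.0, 1.0, size)     # normalized surface curvature

        wrap = [0.4, 0.15, 0.075]                   # assumed R, G, B wrap widths

        # lut[curv, ndotl, channel]: clamped wrap-lighting diffuse falloff.
        lut = np.zeros((size, size, 3))
        for ch, w in enumerate(wrap):
            w_eff = w * (0.2 + 0.8 * curvature)[:, None]   # curvature-scaled wrap
            lut[:, :, ch] = np.clip((ndotl[None, :] + w_eff) / (1.0 + w_eff), 0.0, 1.0)

        # At shading time: color = albedo * lut[curv_idx, ndotl_idx] * light color.
        print(lut[32, 28], lut[32, 40])             # red rolls off more softly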

  4. Modeling short-term dynamics and variability for realistic interactive facial animation.

    Science.gov (United States)

    Stoiber, Nicolas; Breton, Gaspard; Seguier, Renaud

    2010-01-01

    Modern modeling and rendering techniques have produced nearly photorealistic face models, but truly expressive digital faces also require natural-looking movements. Virtual characters in today's applications often display unrealistic facial expressions. Indeed, facial animation with traditional schemes such as keyframing and motion capture demands expertise. Moreover, the traditional schemes aren't adapted to interactive applications that require the real-time generation of context-dependent movements. A new animation system produces realistic expressive facial motion at interactive speed. The system relies on a set of motion models controlling facial-expression dynamics. The models are fitted on captured motion data and therefore retain the dynamic signature of human facial expressions. They also contain a nondeterministic component that ensures the variety of the long-term visual behavior. This system can efficiently animate any synthetic face. The video illustrates interactive use of a system that generates facial-animation sequences.

  5. Plastic Surgery Statistics

    Science.gov (United States)

    Plastic surgery procedural statistics from the American Society of Plastic Surgeons, published by year (e.g., 2015 and 2016 reports).

  6. MQSA National Statistics

    Science.gov (United States)

    Scorecard statistics from the Mammography Quality Standards Act (MQSA) program, archived by year.

  7. Statistical Computing in Information Society

    Directory of Open Access Journals (Sweden)

    Domański Czesław

    2015-12-01

    In the presence of massive, highly heterogeneous data we need to change our statistical thinking and statistical education in order to adapt both classical statistics and software developments to the new challenges. Significant developments include open data, big data, and data visualisation, and they are changing the nature of the evidence that is available, the ways in which it is presented, and the skills needed for its interpretation. The amount of information is not the most important issue – the real challenge is the combination of the amount and the complexity of the data. Moreover, a need arises to know how uncertain situations should be dealt with and what decisions should be taken when information is insufficient (which can also be observed for large datasets). In the paper we discuss the idea of computational statistics as a new approach to statistical teaching and we try to answer the question of how best to prepare the next generation of statisticians.

  8. Adaptation and applications of a realistic digital phantom based on patient lung tumor trajectories

    Science.gov (United States)

    Mishra, Pankaj; St. James, Sara; Segars, W Paul; Berbeco, Ross I; Lewis, John H

    2012-01-01

    Digital phantoms continue to play a significant role in modeling and characterizing medical imaging. The currently available XCAT phantom incorporates both the flexibility of mathematical phantoms and the realistic nature of voxelized phantoms. This phantom generates images based on a regular breathing pattern and can include arbitrary lung tumor trajectories. In this work, we present an algorithm that modifies the current XCAT phantom to generate 4D imaging data based on irregular breathing. First, a parameter is added to the existing XCAT phantom to include any arbitrary tumor motion. This modification introduces the desired tumor motion but comes at the cost of decoupled diaphragm, chest wall and lung motion. To remedy this problem, diaphragm and chest wall motion is first modified based on the initial tumor location and then input to the XCAT phantom. This generates a phantom with synchronized respiratory motion. Mapping of tumor motion trajectories to diaphragm and chest wall motion is done by adaptively calculating a scale factor based on the tumor-to-lung-contour distance. The distance is calculated by projecting the initial tumor location onto lung edge contours characterized by quadratic polynomials. Data from 10 patients were used to evaluate the accuracy between the actual independent tumor location and the location obtained from the modified XCAT phantom. The RMSE and standard deviations for the 10 patients in the x, y, and z directions are: (0.29 ± 0.04, 0.54 ± 0.17, and 0.39 ± 0.06) mm. To demonstrate the utility of the phantom, we use the new phantom to simulate a 4DCT acquisition as well as a recently published method for phase sorting. The modified XCAT phantom can be used to generate more realistic imaging data for enhanced testing of algorithms for CT reconstruction, tumor tracking, and dose reconstruction. PMID:22595980
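
    The accuracy evaluation quoted above reduces to a per-axis RMSE between measured tumor positions and those reproduced by the phantom; a minimal sketch with hypothetical trajectories follows.

        import numpy as np

        rng = np.random.default_rng(6)
        t = np.linspace(0.0, 30.0, 600)                 # 30 s trajectory at 20 Hz

        # Hypothetical measured tumor positions (mm); sup-inf motion dominates.
        actual = np.stack([0.3 * np.sin(0.8 * t),       # x (left-right)
                           0.6 * np.sin(0.8 * t + 0.2), # y (ant-post)
                           8.0 * np.sin(0.8 * t)])      # z (sup-inf)
        phantom = actual + rng.normal(0.0, [[0.3], [0.5], [0.4]], actual.shape)

        rmse = np.sqrt(np.mean((actual - phantom) ** 2, axis=1))
        for axis, val in zip("xyz", rmse):
            print(f"rmse_{axis} = {val:.2f} mm")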

  9. A realistic evaluation: the case of protocol-based care

    Science.gov (United States)

    2010-01-01

    Background: 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, in this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, while sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways. Methods: Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances. Results: In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis, through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs). Conclusions: As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on

  10. Deep Generative Models of Galaxy Images for the Calibration of the Next Generation of Weak Lensing Surveys

    Science.gov (United States)

    Lanusse, Francois; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Schneider, Jeff; Poczos, Barnabas

    2017-01-01

    Weak gravitational lensing has long been identified as one of the most powerful probes to investigate the nature of dark energy. As such, weak lensing is at the heart of the next generation of cosmological surveys such as LSST, Euclid or WFIRST. One particularly critical source of systematic errors in these surveys comes from the shape measurement algorithms tasked with estimating galaxy shapes. GREAT3, the last community challenge to assess the quality of state-of-the-art shape measurement algorithms, demonstrated in particular that all current methods are biased to various degrees and, more importantly, that these biases depend on the details of the galaxy morphologies. These biases can be measured and calibrated by generating mock observations where a known lensing signal has been introduced and comparing the resulting measurements to the ground truth. Producing these mock observations however requires input galaxy images of higher resolution and S/N than the simulated survey, which typically implies acquiring extremely expensive space-based observations. The goal of this work is to train a deep generative model on already available Hubble Space Telescope data which can then be used to sample new galaxy images conditioned on parameters such as magnitude, size or redshift and exhibiting complex morphologies. Such a model allows us to inexpensively produce large sets of realistic images for calibration purposes. We implement a conditional generative model based on state-of-the-art deep learning methods and fit it to deep galaxy images from the COSMOS survey. The quality of the model is assessed by computing an extensive set of galaxy morphology statistics on the generated images. Beyond simple second moment statistics such as size and ellipticity, we apply more complex statistics specifically designed to be sensitive to disturbed galaxy morphologies. We find excellent agreement between the morphologies of real and model generated galaxies. Our results
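
    As a small illustration of the simplest morphology statistics mentioned here, the sketch below computes flux-weighted second moments (an rms size and two ellipticity components) of an image array; the toy elliptical Gaussian stands in for a model-generated galaxy and none of this is COSMOS data.

```python
import numpy as np

def second_moment_stats(img):
    """Size and ellipticity from flux-weighted second moments, the simplest
    of the morphology statistics used to compare real and generated galaxies."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    flux = img.sum()
    xc, yc = (img * x).sum() / flux, (img * y).sum() / flux
    qxx = (img * (x - xc) ** 2).sum() / flux
    qyy = (img * (y - yc) ** 2).sum() / flux
    qxy = (img * (x - xc) * (y - yc)).sum() / flux
    size = np.sqrt(qxx + qyy)          # rms size in pixels
    e1 = (qxx - qyy) / (qxx + qyy)     # ellipticity components
    e2 = 2.0 * qxy / (qxx + qyy)
    return size, e1, e2

# Toy elliptical Gaussian standing in for a model-generated galaxy image.
y, x = np.mgrid[0:64, 0:64]
img = np.exp(-0.5 * (((x - 32) / 6.0) ** 2 + ((y - 32) / 3.0) ** 2))
print(second_moment_stats(img))
```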

  11. Study of statistical properties of hybrid statistic in coherent multidetector compact binary coalescences search

    Science.gov (United States)

    Haris, K.; Pai, Archana

    2016-05-01

    In this article, we revisit the coherent gravitational wave search problem of compact binary coalescences with a multidetector network consisting of advanced interferometers like LIGO-Virgo. Based on the loss of the optimal multidetector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this hybrid statistic is compared with that of the coherent MLR statistic for generic inclination angles. Owing to the single synthetic data stream, the hybrid statistic gives fewer false alarms than the multidetector MLR statistic and a small fractional loss in the optimum SNR for a large range of binary inclinations. We demonstrate that, for a LIGO-Virgo network and binary inclinations ε < 70° and ε > 110°, the hybrid statistic captures more than 98% of the network optimum matched filter SNR with a low false alarm rate. Monte Carlo exercises with two distributions of incoming inclination angles—namely, U[cos ε] and a more realistic distribution proposed by B. F. Schutz [Classical Quantum Gravity 28, 125023 (2011)]—are performed with the hybrid statistic and give approximately 5% and 7% higher detection probabilities, respectively, compared to the two stream multidetector MLR statistic for a fixed false alarm probability of 10^-5.
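
    The kind of comparison reported here (detection probability at a fixed false-alarm probability) can be mimicked with a toy Monte Carlo. In the sketch below, the noise-only and with-signal statistic distributions are placeholder Gaussians whose offsets are invented, not values from the paper; the threshold is the empirical noise quantile.

```python
import numpy as np

rng = np.random.default_rng(0)

def detection_probability(signal_stats, noise_stats, p_fa):
    """Empirical detection probability at a fixed false-alarm probability:
    the threshold is the (1 - p_fa) quantile of the noise-only statistic."""
    threshold = np.quantile(noise_stats, 1.0 - p_fa)
    return np.mean(signal_stats > threshold)

# Toy stand-ins for two detection statistics evaluated over many trials:
# a "hybrid" statistic with a slightly higher mean under signal (hypothetical).
n = 200_000
noise = rng.normal(0.0, 1.0, n)
mlr_signal = rng.normal(4.0, 1.0, n)
hybrid_signal = rng.normal(4.3, 1.0, n)

for name, s in [("MLR", mlr_signal), ("hybrid", hybrid_signal)]:
    print(name, detection_probability(s, noise, p_fa=1e-3))
```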

  12. Feasible Path Generation of Weak Mutation Testing Based on Statistical Analysis

    Institute of Scientific and Technical Information of China (English)

    党向盈; 巩敦卫; 姚香娟

    2016-01-01

    Mutation testing is a fault-based testing technique. Its high cost, however, limits its widespread application in practical testing. Papadakis et al. transformed the problem of weak mutation testing of a program into that of covering the true branches of mutant statements of another program, with the purpose of generating mutation test data by using previous methods of branch coverage. The converted program, however, contains a great number of mutant branches, which makes it difficult to generate test data that cover these branches. If appropriate methods are employed to reduce the mutant branches in the converted program, and the reduced mutant branches are grouped according to the paths to which they belong, mutation test data of high quality can be generated by using previous methods of path coverage, hence improving the efficiency of weak mutation testing. Effective methods for generating feasible paths based on a program and its mutants are, however, still lacking. In view of this, this paper proposes an approach to generate feasible paths for weak mutation testing by considering the correlation of the true branches of mutant statements, with the purpose of killing all mutants by test data that cover all these feasible paths. To fulfill this task, the dominance relation of the true branches of mutant statements is first determined and employed to reduce the dominated true branches. Following that, the non-dominated branches of mutant statements are instrumented into the program to form another program. The true branches of mutant statements that are generated by mutating the same statement are transformed into a new one based on their correlation. Then feasible sub-paths that contain these new branches and the original statement are produced based on the correlation among the original statement and the new branches. Finally, a correlation matrix is generated and reduced based on the execution of these sub-paths using statistical analysis, and one
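
    The dominance-based reduction step can be pictured in a few lines, under the assumption that the dominance relation between mutant branches has already been computed from the control-flow graph; the branch names and pairs below are hypothetical.

```python
def reduce_dominated(branches, dominates):
    """Keep only non-dominated mutant branches: if branch a dominates branch b,
    covering a's side of the relation also decides b, so b need not be
    instrumented separately (schematic reduction rule)."""
    dominated = {b for (_, b) in dominates}
    return [br for br in branches if br not in dominated]

# Hypothetical mutant branches and dominance pairs (a, b) meaning "a dominates b".
branches = ["m1", "m2", "m3", "m4", "m5"]
dominates = [("m1", "m2"), ("m3", "m4")]
print(reduce_dominated(branches, dominates))  # ['m1', 'm3', 'm5']
```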

  13. Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure

    Science.gov (United States)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. Previous work verified the feasibility of the PDS method on a simple seven-degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
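
    A Monte Carlo flavour of the approach is easy to sketch: treat the measured modal parameters as random inputs and propagate them to a response quantity. The single-mode resonant receptance below assumes unit modal mass, and every numerical value is illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Propagate modal-test statistics to a response quantity. For a single mode
# with unit modal mass, the resonant receptance magnitude is
# |H| = 1 / (2 * zeta * omega_n**2).
n = 100_000
omega_n = rng.normal(120.0, 5.0, n)                       # natural frequency [rad/s]
zeta = np.clip(rng.normal(0.02, 0.004, n), 1e-4, None)    # damping ratio, kept physical
peak_response = 1.0 / (2.0 * zeta * omega_n**2)
print(peak_response.mean(), np.percentile(peak_response, 99))
```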

  14. Development of a realistic in vivo bone metastasis model of human renal cell carcinoma.

    Science.gov (United States)

    Valta, Maija P; Zhao, Hongjuan; Ingels, Alexandre; Thong, Alan E; Nolley, Rosalie; Saar, Matthias; Peehl, Donna M

    2014-06-01

    About one-third of patients with advanced renal cell carcinoma (RCC) have bone metastases. The incidence of RCC is increasing and bone metastatic RCC merits greater focus. Realistic preclinical bone metastasis models of RCC are lacking, hampering the development of effective therapies. We developed a realistic in vivo bone metastasis model of human RCC by implanting precision-cut tissue slices under the renal capsule of immunodeficient mice. The presence of disseminated cells in bone marrow of tissue slice graft (TSG)-bearing mice was screened by human-specific polymerase chain reaction and confirmed by immunohistology using a human-specific antibody. Disseminated tumor cells in bone marrow of TSG-bearing mice derived from three of seven RCC patients were detected as early as 1 month after tissue implantation at a high frequency, with close resemblance to parent tumors (e.g., CAIX expression and high vascularity). The metastatic patterns of TSGs correlated with disease progression in patients. In addition, TSGs retained the capacity to metastasize to bone at high frequency after serial passaging and cryopreservation. Moreover, bone metastases in mice responded to temsirolimus treatment. Intratibial injections of single cells generated from TSGs showed 100% engraftment and produced X-ray-visible tumors as early as 3 weeks after cancer cell inoculation. Micro-computed tomography (μCT) and histological analysis revealed osteolytic characteristics of these lesions. Our results demonstrated that orthotopic RCC TSGs have the potential to develop bone metastases that respond to standard therapy. This first reported primary RCC bone metastasis model provides a realistic setting to test therapeutics to prevent or treat bone metastases in RCC.

  15. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan

    2012-01-01

    The wind speed represents the main exogenous signal applied to a Wind Energy Conversion System (WECS) and determines its behavior. The erratic variation of the wind speed, highly dependent on the given site and on the atmospheric conditions, makes the wind speed quite difficult to model. Moreover...

  16. Temperature simulations in tissue with a realistic computer generated vessel network.

    Science.gov (United States)

    Van Leeuwen, G M; Kotte, A N; Raaymakers, B W; Lagendijk, J J

    2000-04-01

    The practical use of a discrete vessel thermal model for hyperthermia treatment planning requires a number of choices with respect to the unknown part of the patient's vasculature. This work presents a study of the thermal effects of blood flow in a simple tissue geometry with a detailed artificial vessel network. The simulations presented here demonstrate that an incomplete discrete description of the detailed network results in a better prediction of the temperature distribution than is obtained using the conventional bio-heatsink equation. Therefore, efforts to obtain information on the positions of the large vessels in an individual hyperthermia patient will be rewarded with a more accurate prediction of the temperature distribution.

  17. Predict! Teaching Statistics Using Informal Statistical Inference

    Science.gov (United States)

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  18. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  19. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...... and combined effects of several parameters on key responses. Results supported selection of production parameters meeting specified quality and cost targets, as well as substantial improvements....

  20. Generation of higher order Gauss-Laguerre modes in single-pass 2nd harmonic generation

    DEFF Research Database (Denmark)

    Buchhave, Preben; Tidemand-Lichtenberg, Peter

    2008-01-01

    We present a realistic method for dynamic simulation of the development of higher order modes in second harmonic generation. The deformation of the wave fronts due to the nonlinear interaction is expressed by expansion in higher order Gauss-Laguerre modes.

  1. Classical Statistics and Statistical Learning in Imaging Neuroscience

    Directory of Open Access Journals (Sweden)

    Danilo Bzdok

    2017-10-01

    Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-test and ANOVA. Throughout recent years, statistical learning methods have enjoyed increasing popularity, especially for applications in rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques.
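
    The contrast drawn here is concrete enough to demonstrate on toy data. The sketch below runs a classical two-sample t-test and a cross-validated out-of-sample classifier on the same simulated feature; it assumes scipy and scikit-learn are available and is only a schematic of the two inferential regimes.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Toy "imaging" feature for two groups with a small mean difference.
x_a = rng.normal(0.0, 1.0, (100, 1))
x_b = rng.normal(0.4, 1.0, (100, 1))

# Classical statistics: null-hypothesis test on the group means.
t, p = stats.ttest_ind(x_a.ravel(), x_b.ravel())
print(f"t = {t:.2f}, p = {p:.4f}")

# Statistical learning: cross-validated out-of-sample prediction accuracy.
X = np.vstack([x_a, x_b])
y = np.r_[np.zeros(100), np.ones(100)]
acc = cross_val_score(LogisticRegression(), X, y, cv=5).mean()
print(f"cross-validated accuracy = {acc:.2f}")
```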

  2. Parsing statistical machine translation output

    NARCIS (Netherlands)

    Carter, S.; Monz, C.; Vetulani, Z.

    2009-01-01

    Despite increasing research into the use of syntax during statistical machine translation, the incorporation of syntax into language models has seen limited success. We present a study of the discriminative abilities of generative syntax-based language models, over and above standard n-gram models,

  3. Statistical Topics Concerning Radiometer Theory

    CERN Document Server

    Hunter, Todd R

    2015-01-01

    We present a derivation of the radiometer equation based on the original references and fundamental statistical concepts. We then perform numerical simulations of white noise to illustrate the radiometer equation in action. Finally, we generate 1/f and 1/f^2 noise, demonstrate that it is non-stationary, and use it to simulate the effect of gain fluctuations on radiometer performance.
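
    Both numerical demonstrations described here are straightforward to reproduce in outline. The sketch below shows the radiometer-equation scaling (fluctuations of the mean shrinking as 1/sqrt(N) under averaging of white noise) and the standard spectral-shaping recipe for 1/f^alpha noise; it follows the generic techniques, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Radiometer equation in action: averaging N independent white-noise samples
# reduces the fractional fluctuation of the estimate as 1/sqrt(N), the analogue
# of 1/sqrt(bandwidth * integration time).
for n in (100, 10_000, 1_000_000):
    x = 1.0 + rng.normal(0.0, 1.0, n)      # "system temperature" plus white noise
    print(n, x.mean(), 1.0 / np.sqrt(n))   # scatter of the mean ~ sigma/sqrt(N)

# 1/f^alpha noise by spectral shaping: draw white Fourier coefficients, scale
# them by f^(-alpha/2), and transform back to the time domain.
def one_over_f(n, alpha):
    f = np.fft.rfftfreq(n, d=1.0)
    spec = rng.normal(size=f.size) + 1j * rng.normal(size=f.size)
    spec[1:] *= f[1:] ** (-alpha / 2.0)
    spec[0] = 0.0                          # drop the divergent DC term
    return np.fft.irfft(spec, n)

gain_fluctuations = one_over_f(2**16, alpha=1.0)   # non-stationary 1/f noise
```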

  4. Using digital colour to increase the realistic appearance of SEM micrographs of bloodstains.

    Science.gov (United States)

    Hortolà, Policarp

    2010-10-01

    Although in the scientific-research literature the micrographs from scanning electron microscopes (SEMs) are usually displayed in greyscale, the potential of the colour resources provided by SEM-coupled image-acquiring systems and, subsidiarily, by free image-manipulation software deserves to be explored as a tool for colouring SEM micrographs of bloodstains. After greyscale SEM micrographs of a human blood smear (dark red to the naked eye) on grey chert were acquired, red-tone versions were produced manually using both the SEM-coupled image-acquiring system and free image-manipulation software, and thermal-tone versions were generated automatically using the SEM-coupled system. Red images obtained by the SEM-coupled system demonstrated lower visual-discrimination capability than the other coloured images, whereas the red images generated by the free software conveyed more visual information than those generated by the SEM-coupled system. Thermal-tone images, although further from the real sample colour than the red ones, not only increased their realistic appearance over the greyscale images, but also yielded the best visual-discrimination capability among all the coloured SEM micrographs, and fairly enhanced the relief effect of the SEM micrographs over both the greyscale and the red images. The application of digital colour by means of the facilities provided by an SEM-coupled image-acquiring system or, when required, by free image-manipulation software provides a user-friendly, quick and inexpensive way of obtaining coloured SEM micrographs of bloodstains, avoiding sophisticated, time-consuming colouring procedures. Although this work was focused on bloodstains, quite probably the realistic appearance of other monochromatic or quasi-monochromatic samples can also be increased by colouring them using the simple methods utilized in this study.
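
    As a minimal illustration of the software-based colouring route, the sketch below maps a greyscale array to a red tone by simple channel weighting; the weights are arbitrary choices for illustration, not the settings used in the study.

```python
import numpy as np

def tint_red(gray):
    """Map a greyscale image (values in 0..1) to a red tone: the grey value
    drives the red channel while green/blue are strongly suppressed."""
    rgb = np.zeros(gray.shape + (3,))
    rgb[..., 0] = gray          # red follows the SEM signal
    rgb[..., 1] = 0.15 * gray   # small green/blue leak keeps shadows readable
    rgb[..., 2] = 0.15 * gray
    return rgb

gray = np.random.default_rng(0).uniform(0.0, 1.0, (8, 8))  # stand-in micrograph
red = tint_red(gray)
print(red.shape)
```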

  5. Statistics in action a Canadian outlook

    CERN Document Server

    Lawless, Jerald F

    2014-01-01

    Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.The first two c

  6. Scaling up complex interventions: insights from a realist synthesis.

    Science.gov (United States)

    Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan

    2016-12-19

    Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant

  7. A continuous family of realistic Susy SU(5) GUTs

    Science.gov (United States)

    Bajc, Borut

    2016-06-01

    It is shown that the minimal renormalizable supersymmetric SU(5) is still realistic provided the supersymmetric scale is at least a few tens of TeV or large R-parity violating terms are considered. In the first case the vacuum is metastable, and different consistency constraints can give a bounded allowed region in the tan β – m_susy plane. In the second case the mass eigenstate electron (down quark) is a linear combination of the original electron (down quark) and Higgsino (heavy colour triplet), and the mass ratio of bino and wino is determined. Both limits lead to light gravitino dark matter.

  8. ROC Analysis and a Realistic Model of Heart Rate Variability

    CERN Document Server

    Thurner, S; Teich, M C; Thurner, Stefan; Feurstein, Markus C.; Teich, Malvin C.

    1998-01-01

    We have carried out a pilot study on a standard collection of electrocardiograms from patients who suffer from congestive heart failure, and subjects without cardiac pathology, using receiver-operating-characteristic (ROC) analysis. The scale-dependent wavelet-coefficient standard deviation proves superior to two commonly used measures of cardiac dysfunction when the two classes of patients cannot be completely separated. A jittered integrate-and-fire model with a fractal-Gaussian-noise kernel provides a realistic simulation of heartbeat sequences for both heart-failure patients and normal subjects.
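
    An empirical ROC curve of the kind used in this study is simple to construct. The sketch below sweeps a threshold over hypothetical wavelet-coefficient standard deviations for the two classes; the distributions are invented stand-ins, not the study's data.

```python
import numpy as np

def roc_curve(scores_pos, scores_neg):
    """Empirical ROC: sweep a threshold over all observed scores, recording
    (false positive rate, true positive rate) pairs."""
    thresholds = np.sort(np.concatenate([scores_pos, scores_neg]))[::-1]
    tpr = np.array([(scores_pos >= t).mean() for t in thresholds])
    fpr = np.array([(scores_neg >= t).mean() for t in thresholds])
    return fpr, tpr

rng = np.random.default_rng(0)
# Hypothetical wavelet-coefficient standard deviations for the two classes;
# the means and spreads are invented, not the study's values.
sigma_normal = rng.normal(1.2, 0.2, 200)   # "positive" class: normal subjects
sigma_chf = rng.normal(0.8, 0.2, 200)      # heart-failure patients
fpr, tpr = roc_curve(sigma_normal, sigma_chf)
print("area under ROC curve ~", np.trapz(tpr, fpr))
```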

  9. Recovering the quantum formalism from physically realist axioms.

    Science.gov (United States)

    Auffèves, Alexia; Grangier, Philippe

    2017-03-03

    We present a heuristic derivation of Born's rule and unitary transforms in Quantum Mechanics, from a simple set of axioms built upon a physical phenomenology of quantization. This approach naturally leads to the usual quantum formalism, within a new realistic conceptual framework that is discussed in detail. Physically, the structure of Quantum Mechanics appears as a result of the interplay between the quantized number of "modalities" accessible to a quantum system, and the continuum of "contexts" that are required to define these modalities. Mathematically, the Hilbert space structure appears as a consequence of a specific "extra-contextuality" of modalities, closely related to the hypothesis of Gleason's theorem, and consistent with its conclusions.

  10. Centralized Cooperative Positioning and Tracking with Realistic Communications Constraints

    DEFF Research Database (Denmark)

    Mensing, Christian; Nielsen, Jimmy Jessen

    2010-01-01

    In this paper, we investigate the performance of centralized cooperative positioning algorithms. Compared to traditional positioning algorithms which solely exploit ranging information from anchor nodes, cooperative positioning additionally uses measurements from peer-to-peer links between...... on the overall performance will be assessed. As we are considering a dynamic scenario, the cooperative positioning algorithms are based on extended Kalman filtering for position estimation and tracking. Simulation results for ultra-wideband based ranging information and WLAN based communications infrastructure...... show the benefits of cooperative positioning and tracking for realistic measurement and mobility models....

  11. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.;

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed...... by Froese et al. are realistic and consistent. We further show that the assumption about density-dependence being described by a stock recruitment relationship is responsible for determining whether a peak in the cohort biomass of a population occurs late or early in life. Finally, we argue...

  12. Mode-Coupling in Realistic Rotating Gravitational Collapse

    CERN Document Server

    Hod, S

    2000-01-01

    We analyze the mode-coupling phenomenon in realistic rotating gravitational collapse. Physically, this phenomenon is caused by the dragging of reference frames, due to the black-hole (or star's) rotation. It is shown that different modes become coupled during the rotating collapse. As a consequence, the asymptotic late-time tails are dominated by modes which, in general, have an angular distribution different from the original one. We show that a rotating Kerr black hole becomes ``bald'' more slowly than a spherically-symmetric Schwarzschild black hole. This paper considers gravitational, electromagnetic and neutrino fields propagating on a Kerr background.

  13. Ultra-Reliable Communications in Failure-Prone Realistic Networks

    DEFF Research Database (Denmark)

    Gerardino, Guillermo Andrés Pocovi; Lauridsen, Mads; Alvarez, Beatriz Soret

    2016-01-01

    We investigate the potential of different diversity and interference management techniques to achieve the required downlink SINR outage probability for ultra-reliable communications. The evaluation is performed in a realistic network deployment based on site-specific data from a European capital..... Micro- and macroscopic diversity techniques prove to be important enablers of ultra-reliable communications. Particularly, it is shown how a 4x4 MIMO scheme with three orders of macroscopic diversity can achieve the required SINR outage performance. Smaller gains are obtained from interference...

  14. Dynamic Enhanced Inter-Cell Interference Coordination for Realistic Networks

    DEFF Research Database (Denmark)

    Pedersen, Klaus I.; Alvarez, Beatriz Soret; Barcos, Sonia;

    2016-01-01

    Enhanced Inter-Cell Interference Coordination (eICIC) is a key ingredient to boost the performance of co-channel Heterogeneous Networks (HetNets). eICIC encompasses two main techniques: Almost Blank Subframes (ABS), during which the macro cell remains silent to reduce the interference, and biased...... and an opportunistic approach exploiting the varying cell conditions. Moreover, an autonomous fast distributed muting algorithm is presented, which is simple, robust, and well suited for irregular network deployments. Performance results for realistic network deployments show that the traditional semi-static e...

  15. Can realistic nuclear interactions tolerate a resonant tetraneutron?

    CERN Document Server

    Lazauskas, R

    2005-01-01

    The possible existence of four-neutron resonances close to the physical energy region is explored. Faddeev-Yakubovsky equations have been solved in configuration space using realistic nucleon-nucleon interaction models. Complex Scaling and Analytical Continuation in the Coupling Constant methods were used to follow the resonance pole trajectories, which emerge out of artificially bound tetraneutron states. The final pole positions for four-neutron states lie in the third energy quadrant with negative real energy parts and should thus not be physically observable.

  16. The KM phase in semi-realistic heterotic orbifold models

    Energy Technology Data Exchange (ETDEWEB)

    Giedt, Joel

    2000-07-05

    In string-inspired semi-realistic heterotic orbifold models with an anomalous U(1)_X, a nonzero Kobayashi-Maskawa (KM) phase is shown to arise generically from the expectation values of complex scalar fields, which appear in nonrenormalizable quark mass couplings. Modular covariant nonrenormalizable superpotential couplings are constructed. A toy Z_3 orbifold model is analyzed in some detail. Modular symmetries and orbifold selection rules are taken into account and do not lead to a cancellation of the KM phase. We also discuss attempts to obtain the KM phase solely from renormalizable interactions.

  17. A Modern Syllogistic Method in Intuitionistic Fuzzy Logic with Realistic Tautology

    Directory of Open Access Journals (Sweden)

    Ali Muhammad Rushdi

    2015-01-01

    The Modern Syllogistic Method (MSM) of propositional logic ferrets out from a set of premises all that can be concluded from it in the most compact form. The MSM combines the premises into a single function equated to 1 and then produces the complete product of this function. Two fuzzy versions of MSM are developed in Ordinary Fuzzy Logic (OFL) and in Intuitionistic Fuzzy Logic (IFL), with these logics augmented by the concept of Realistic Fuzzy Tautology (RFT), which is a variable whose truth exceeds 0.5. The paper formally proves each of the steps needed in the conversion of the ordinary MSM into a fuzzy one. The proofs rely mainly on the successful replacement of logic 1 (or ordinary tautology) by an RFT. An improved version of the Blake-Tison algorithm for generating the complete product of a logical function is also presented and shown to be applicable to both crisp and fuzzy versions of the MSM. The fuzzy MSM methodology is illustrated by three specific examples, which delineate differences with the crisp MSM, address the question of validity values of consequences, tackle the problem of inconsistency when it arises, and demonstrate the utility of the concept of Realistic Fuzzy Tautology.

  18. Realistic multi-cellular dosimetry for (177)Lu-labelled antibodies: model and application.

    Science.gov (United States)

    Marcatili, S; Pichard, A; Courteau, A; Ladjohounlou, R; Navarro-Teulon, I; Repetto-Llamazares, A; Heyerdahl, H; Dahle, J; Pouget, J P; Bardiès, M

    2016-10-07

    Current preclinical dosimetric models often fail to take account of the complex nature of absorbed dose distribution typical of in vitro clonogenic experiments in targeted radionuclide therapy. For this reason, clonogenic survival is often expressed as a function of added activity rather than the absorbed dose delivered to cells/cell nuclei. We designed a multi-cellular dosimetry model that takes into account the realistic distributions of cells in the Petri dish, for the establishment of survival curves as a function of the absorbed dose. General-purpose software tools were used for the generation of realistic, randomised 3D cell culture geometries based on experimentally determined parameters (cell size, cell density, cluster density, average cluster size, cell cumulated activity). A mixture of Monte Carlo and analytical approaches was implemented in order to achieve results that are as accurate as possible while reducing calculation time. The model was here applied to clonogenic survival experiments carried out to compare the efficacy of Betalutin®, a novel (177)Lu-labelled antibody radionuclide conjugate for the treatment of non-Hodgkin lymphoma, to that of (177)Lu-labelled CD20-specific (rituximab) and non-specific antibodies (Erbitux) on lymphocyte B cells. The 3D cellular model developed allowed a better understanding of the radiative and non-radiative processes associated with cellular death. Our approach is generic and can also be applied to other radiopharmaceuticals and cell distributions.
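
    The randomised-geometry step can be outlined briefly. The sketch below scatters cluster centres uniformly over a dish and cells normally around each centre; every parameter value (dish radius, cluster spread, counts) is a placeholder rather than one of the experimentally determined inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_cell_layout(n_clusters, cells_per_cluster, dish_radius_um, cluster_sigma_um):
    """Randomised cell positions in a dish: cluster centres uniform over the
    disc, cells scattered normally around each centre and settled near the
    floor (a simplified stand-in for the fitted geometry parameters)."""
    cells = []
    for _ in range(n_clusters):
        r = dish_radius_um * np.sqrt(rng.uniform())    # uniform density over a disc
        theta = rng.uniform(0.0, 2.0 * np.pi)
        centre = np.array([r * np.cos(theta), r * np.sin(theta), 0.0])
        n = rng.poisson(cells_per_cluster)
        offsets = rng.normal(0.0, cluster_sigma_um, (n, 3))
        offsets[:, 2] = np.abs(offsets[:, 2]) * 0.1    # flatten onto the dish floor
        cells.append(centre + offsets)
    return np.vstack(cells)

positions = random_cell_layout(n_clusters=50, cells_per_cluster=20,
                               dish_radius_um=17_500.0, cluster_sigma_um=40.0)
print(positions.shape)
```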

  19. Realistic multi-cellular dosimetry for 177Lu-labelled antibodies: model and application

    Science.gov (United States)

    Marcatili, S.; Pichard, A.; Courteau, A.; Ladjohounlou, R.; Navarro-Teulon, I.; Repetto-Llamazares, A.; Heyerdahl, H.; Dahle, J.; Pouget, J. P.; Bardiès, M.

    2016-10-01

    Current preclinical dosimetric models often fail to take account of the complex nature of absorbed dose distribution typical of in vitro clonogenic experiments in targeted radionuclide therapy. For this reason, clonogenic survival is often expressed as a function of added activity rather than the absorbed dose delivered to cells/cell nuclei. We designed a multi-cellular dosimetry model that takes into account the realistic distributions of cells in the Petri dish, for the establishment of survival curves as a function of the absorbed dose. General-purpose software tools were used for the generation of realistic, randomised 3D cell culture geometries based on experimentally determined parameters (cell size, cell density, cluster density, average cluster size, cell cumulated activity). A mixture of Monte Carlo and analytical approaches was implemented in order to achieve results that are as accurate as possible while reducing calculation time. The model was here applied to clonogenic survival experiments carried out to compare the efficacy of Betalutin®, a novel 177Lu-labelled antibody radionuclide conjugate for the treatment of non-Hodgkin lymphoma, to that of 177Lu-labelled CD20-specific (rituximab) and non-specific antibodies (Erbitux) on lymphocyte B cells. The 3D cellular model developed allowed a better understanding of the radiative and non-radiative processes associated with cellular death. Our approach is generic and can also be applied to other radiopharmaceuticals and cell distributions.

  20. A Modern Syllogistic Method in Intuitionistic Fuzzy Logic with Realistic Tautology.

    Science.gov (United States)

    Rushdi, Ali Muhammad; Zarouan, Mohamed; Alshehri, Taleb Mansour; Rushdi, Muhammad Ali

    2015-01-01

    The Modern Syllogistic Method (MSM) of propositional logic ferrets out from a set of premises all that can be concluded from it in the most compact form. The MSM combines the premises into a single function equated to 1 and then produces the complete product of this function. Two fuzzy versions of MSM are developed in Ordinary Fuzzy Logic (OFL) and in Intuitionistic Fuzzy Logic (IFL) with these logics augmented by the concept of Realistic Fuzzy Tautology (RFT) which is a variable whose truth exceeds 0.5. The paper formally proves each of the steps needed in the conversion of the ordinary MSM into a fuzzy one. The proofs rely mainly on the successful replacement of logic 1 (or ordinary tautology) by an RFT. An improved version of Blake-Tison algorithm for generating the complete product of a logical function is also presented and shown to be applicable to both crisp and fuzzy versions of the MSM. The fuzzy MSM methodology is illustrated by three specific examples, which delineate differences with the crisp MSM, address the question of validity values of consequences, tackle the problem of inconsistency when it arises, and demonstrate the utility of the concept of Realistic Fuzzy Tautology.

  1. Revised rate coefficients for H$_2$ and H$^-$ destruction by realistic stellar spectra

    CERN Document Server

    Agarwal, Bhaskar

    2014-01-01

    Understanding the processes that can destroy H$_2$ and H$^-$ species is quintessential in governing the formation of the first stars, black holes and galaxies. In this study we compute the reaction rate coefficients for H$_2$ photo-dissociation by Lyman-Werner photons ($11.2 - 13.6$ eV), and H$^-$ photo-detachment by 0.76 eV photons emanating from self-consistent stellar populations that we model using publicly available stellar synthesis codes. So far studies that include chemical networks for the formation of molecular hydrogen take these processes into account by assuming that the source spectra can be approximated by a power-law dependency or a black-body spectrum at $10^4$ or $10^5$ K. We show that using spectra generated from realistic stellar population models can alter the reaction rates for photo-dissociation, $k_{\rm di}$, and photo-detachment, $k_{\rm de}$, significantly. In particular, $k_{\rm de}$ can be up to $\sim 2-4$ orders of magnitude lower in the case of realistic stellar...
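
    The underlying computation is a cross-section-weighted integral of the photon flux, $k = \int \sigma(E)\, F(E)\, dE$. The sketch below evaluates it numerically for a toy cross section and two toy spectra (a blackbody-like shape and a softened stand-in for a population spectrum); the shapes and normalisations are placeholders, not fitted data.

```python
import numpy as np

def rate_coefficient(energy_eV, sigma_cm2, flux):
    """Photo-reaction rate k = integral of sigma(E) * F(E) dE, with F the
    photon flux per unit energy of the illuminating spectrum."""
    return np.trapz(sigma_cm2 * flux, energy_eV)

# Toy grid from the H^- photo-detachment threshold up to the Lyman limit.
E = np.linspace(0.76, 13.6, 2000)                   # photon energy [eV]
sigma = 4e-17 * np.exp(-((E - 1.5) / 1.0) ** 2)     # toy cross section [cm^2]
flux_blackbody = E**2 / (np.exp(E / 0.86) - 1.0)    # ~1e4 K blackbody shape (kT ~ 0.86 eV)
flux_population = flux_blackbody * np.exp(-E / 3.0) # softened "stellar population" stand-in

k_bb = rate_coefficient(E, sigma, flux_blackbody)
k_pop = rate_coefficient(E, sigma, flux_population)
print("ratio of toy rates:", k_pop / k_bb)
```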

  2. Recent advances on thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Alexandro S., E-mail: alexandrossilva@ifba.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia da Bahia (IFBA), Vitoria da Conquista, BA (Brazil); Mazaira, Leorlen Y.R., E-mail: leored1984@gmail.com, E-mail: cgh@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas (INSTEC), La Habana (Cuba); Dominguez, Dany S.; Hernandez, Carlos R.G., E-mail: alexandrossilva@gmail.com, E-mail: dsdominguez@gmail.com [Universidade Estadual de Santa Cruz (UESC), Ilheus, BA (Brazil). Programa de Pos-Graduacao em Modelagem Computacional; Lira, Carlos A.B.O., E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)

    2015-07-01

    High-temperature gas-cooled reactors (HTGRs) have the potential to be used as possible energy generation sources in the near future, owing to their inherently safe performance by using a large amount of graphite, low power density design, and high conversion efficiency. However, safety is the most important issue for its commercialization in the nuclear energy industry. It is very important for the safety design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, the thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR (High Temperature Reactor)-10 was performed using Computational Fluid Dynamics (CFD). The realistic approach was used, where every closely packed pebble is realistically modelled considering a graphite layer and a sphere of fuel. Owing to the high computational cost, it is impossible to simulate the full core; therefore, the geometry used is an FCC (Face Centered Cubic) cell with half the height of the core, with 21 layers and 95 pebbles. The input data used were taken from the thermal-hydraulic IAEA Benchmark. The results show the profiles of velocity and temperature of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)

  3. Measurement of time delays in gated radiotherapy for realistic respiratory motions.

    Science.gov (United States)

    Chugh, Brige P; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L

    2014-09-01

    Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients with highly irregular patterns of motion is
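
    Motion shapes like those listed here are easy to generate programmatically. Below is a sketch producing a piecewise-sinusoidal trace whose amplitude and period are redrawn each breathing cycle, with optional baseline drift; parameter values are illustrative and only loosely mirror the percentages quoted above.

```python
import numpy as np

rng = np.random.default_rng(7)

def breathing_trace(t, period_s=4.0, amp_mm=10.0,
                    amp_jitter=0.0, period_jitter=0.0, drift_mm=0.0):
    """Piecewise-sinusoidal breathing: amplitude and period are redrawn at the
    start of every cycle; an optional linear baseline drift is added."""
    def draw(base, jitter):
        # Redraw a parameter with fractional Gaussian jitter, kept positive.
        return base * max(0.2, 1.0 + jitter * rng.standard_normal())

    dt = t[1] - t[0]
    z = np.zeros_like(t)
    phase = 0.0
    amp, per = draw(amp_mm, amp_jitter), draw(period_s, period_jitter)
    for k in range(t.size):
        z[k] = amp * np.sin(phase) + drift_mm * t[k] / t[-1]
        phase += 2.0 * np.pi * dt / per
        if phase >= 2.0 * np.pi:           # new breathing cycle: redraw parameters
            phase -= 2.0 * np.pi
            amp, per = draw(amp_mm, amp_jitter), draw(period_s, period_jitter)
    return z

t = np.linspace(0.0, 60.0, 6000)
regular = breathing_trace(t)                                        # sinusoidal
irregular = breathing_trace(t, amp_jitter=0.4, period_jitter=0.4)   # highly irregular
drifting = breathing_trace(t, drift_mm=3.0)                         # baseline drift
```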

  4. Statistical methods for forecasting

    CERN Document Server

    Abraham, Bovas

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists."This book, it must be said, lives up to the words on its advertising cover: ''Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.'' It does just that!"-Journal of the Royal Statistical Society"A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...

  5. Realistic simulation of the Space-borne Compton Polarimeter POLAR

    Science.gov (United States)

    Xiao, Hualin

    2016-07-01

    POLAR is a compact wide field space-borne detector dedicated to precise measurements of the linear polarization of hard x-rays emitted by transient sources. Its energy range sensitivity is optimized for the detection of the prompt emission of gamma-ray bursts (GRBs). POLAR is developed by an international collaboration of China, Switzerland and Poland. It is planned to be launched into space in 2016 onboard the Chinese space laboratory TG2. The energy range of POLAR spans between 50 keV and 500 keV. POLAR detects gamma rays with an array of 1600 plastic scintillator bars read out by 25 multi-anode PMTs (MAPMTs). Polarization measurements use the Compton scattering process and are based on detection of energy depositions in the scintillator bars. Reconstruction of the polarization degree and polarization angle of GRBs requires comparison of experimental modulation curves with realistic simulations of the full instrument response. In this paper we present a method to model and parameterize the detector response including efficiency of the light collection, contributions from crosstalk and non-uniformity of MAPMTs, as well as dependency on low energy detection thresholds and noise from readout electronics. The performance of POLAR for determination of polarization is predicted with such realistic simulations and carefully cross-checked with dedicated laboratory tests.
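
    Polarization reconstruction from Compton scattering amounts to fitting an azimuthal modulation curve, N(φ) = A[1 + μ cos 2(φ − φ0)], to a scattering-angle histogram. The sketch below does this with scipy on toy Poisson counts; the true parameters are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def modulation(phi, amplitude, mu, phi0):
    """Compton azimuthal modulation N(phi) = A * (1 + mu*cos(2*(phi - phi0)));
    mu is the modulation factor, phi0 encodes the polarization angle."""
    return amplitude * (1.0 + mu * np.cos(2.0 * (phi - phi0)))

rng = np.random.default_rng(3)
phi = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)   # azimuthal bins
truth = modulation(phi, 1000.0, 0.3, np.pi / 6)           # invented true curve
counts = rng.poisson(truth)                               # toy histogram

popt, pcov = curve_fit(modulation, phi, counts,
                       p0=(counts.mean(), 0.1, 0.0),
                       sigma=np.sqrt(np.maximum(counts, 1)))
print("mu = %.3f, phi0 = %.3f rad" % (popt[1], popt[2]))
```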

  6. Neuronize: a tool for building realistic neuronal cell morphologies

    Science.gov (United States)

    Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis; DeFelipe, Javier; Benavides-Piccione, Ruth

    2013-01-01

    This study presents a tool, Neuronize, for building realistic three-dimensional models of neuronal cells from the morphological information extracted through computer-aided tracing applications. Neuronize consists of a set of methods designed to build 3D neural meshes that approximate the cell membrane at different resolution levels, allowing a balance to be reached between the complexity and the quality of the final model. The main contribution of the present study is the proposal of a novel approach to build a realistic and accurate 3D shape of the soma from the incomplete information stored in the digitally traced neuron, which usually consists of a 2D cell body contour. This technique is based on the deformation of an initial shape driven by the position and thickness of the first order dendrites. The addition of a set of spines along the dendrites completes the model, building a final 3D neuronal cell suitable for its visualization in a wide range of 3D environments. PMID:23761740

  7. Diffusive and dynamical radiating stars with realistic equations of state

    Science.gov (United States)

    Brassel, Byron P.; Maharaj, Sunil D.; Goswami, Rituparno

    2017-03-01

    We model the dynamics of a spherically symmetric radiating dynamical star with three spacetime regions. The local internal atmosphere is a two-component system consisting of standard pressure-free, null radiation and an additional string fluid with energy density and nonzero pressure obeying all physically realistic energy conditions. The middle region is purely radiative, which matches to a third region which is the Schwarzschild exterior. A large family of solutions to the field equations is presented for various realistic equations of state. We demonstrate that it is possible to obtain solutions via a direct integration of the second order equations resulting from the assumption of an equation of state. A comparison of our solutions with earlier well known results is undertaken, and we show that all these solutions, including those of Husain, are contained in our family. We then generalise our class of solutions to higher dimensions. Finally we consider the effects of diffusive transport and transparently derive the specific equations of state for which this diffusive behaviour is possible.

  8. CHARMM-GUI Membrane Builder toward realistic biological membrane simulations.

    Science.gov (United States)

    Wu, Emilia L; Cheng, Xi; Jo, Sunhwan; Rui, Huan; Song, Kevin C; Dávila-Contreras, Eder M; Qi, Yifei; Lee, Jumin; Monje-Galvan, Viviana; Venable, Richard M; Klauda, Jeffery B; Im, Wonpil

    2014-10-15

    CHARMM-GUI Membrane Builder, http://www.charmm-gui.org/input/membrane, is a web-based user interface designed to interactively build all-atom protein/membrane or membrane-only systems for molecular dynamics simulations through an automated optimized process. In this work, we describe the new features and major improvements in Membrane Builder that allow users to robustly build realistic biological membrane systems, including (1) addition of new lipid types, such as phosphoinositides, cardiolipin (CL), sphingolipids, bacterial lipids, and ergosterol, yielding more than 180 lipid types, (2) enhanced building procedure for lipid packing around protein, (3) reliable algorithm to detect lipid tail penetration to ring structures and protein surface, (4) distance-based algorithm for faster initial ion displacement, (5) CHARMM inputs for P21 image transformation, and (6) NAMD equilibration and production inputs. The robustness of these new features is illustrated by building and simulating a membrane model of the polar and septal regions of E. coli membrane, which contains five lipid types: CL lipids with two types of acyl chains and phosphatidylethanolamine lipids with three types of acyl chains. It is our hope that CHARMM-GUI Membrane Builder becomes a useful tool for simulation studies to better understand the structure and dynamics of proteins and lipids in realistic biological membrane environments.

  9. Music therapy for palliative care: A realist review.

    Science.gov (United States)

    McConnell, Tracey; Porter, Sam

    2017-08-01

    Music therapy has experienced a rising demand as an adjunct therapy for symptom management among palliative care patients. We conducted a realist review of the literature to develop a greater understanding of how music therapy might benefit palliative care patients and the contextual mechanisms that promote or inhibit its successful implementation. We searched electronic databases (CINAHL, Embase, Medline, and PsychINFO) for literature containing information on music therapy for palliative care. In keeping with the realist approach, we examined all relevant literature to develop theories that could explain how music therapy works. A total of 51 articles were included in the review. Music therapy was found to have a therapeutic effect on the physical, psychological, emotional, and spiritual suffering of palliative care patients. We also identified program mechanisms that help explain music therapy's therapeutic effects, along with facilitating contexts for implementation. Music therapy may be an effective nonpharmacological approach to managing distressing symptoms in palliative care patients. The findings also suggest that group music therapy may be a cost-efficient and effective way to support staff caring for palliative care patients. We encourage others to continue developing the evidence base in order to expand our understanding of how music therapy works, with the aim of informing and improving the provision of music therapy for palliative care patients.

  10. Radiation-Spray Coupling for Realistic Flow Configurations

    Science.gov (United States)

    El-Asrag, Hossam; Iannetti, Anthony C.

    2011-01-01

    Three Large Eddy Simulations (LES) for a lean-direct injection (LDI) combustor are performed and compared. In addition to the cold flow simulation, the effect of radiation coupling with the multi-physics reactive flow is analyzed. The flamelet progress variable approach is used as a subgrid combustion model, combined with a stochastic subgrid model for spray atomization and an optically thin radiation model. For accurate chemistry modeling, a detailed Jet-A surrogate mechanism is utilized. To achieve realistic inflow, a simple recycling technique is performed at the inflow section upstream of the swirler. Good comparison is shown with the experimental data mean and root mean square profiles. The effect of combustion is found to change the shape and size of the central recirculation zone. Radiation is found to change the spray dynamics and atomization by changing the heat release distribution and the local temperature values impacting the evaporation process. The simulation with radiation modeling shows a wider range of droplet size distribution by altering the evaporation rate. The current study proves the importance of radiation modeling for accurate prediction in realistic spray combustion configurations, even for low pressure systems.

  11. Validation of a realistic simulator for veterinary gastrointestinal endoscopy training.

    Science.gov (United States)

    Usón-Gargallo, Jesús; Usón-Casaús, Jesús M; Pérez-Merino, Eva M; Soria-Gálvez, Federico; Morcillo, Esther; Enciso, Silvia; Sánchez-Margallo, Francisco M

    2014-01-01

    This article reports on the face, content, and construct validity of a new realistic composite simulator (Simuldog) used to provide training in canine gastrointestinal flexible endoscopy. The basic endoscopic procedures performed on the simulator were esophagogastroduodenoscopy (EGD), gastric biopsy (GB), and gastric foreign body removal (FBR). Construct validity was assessed by comparing the performance of novices (final-year veterinary students and recent graduates without endoscopic experience, n=30) versus experienced subjects (doctors in veterinary medicine who had performed more than 50 clinical upper gastrointestinal endoscopic procedures as a surgeon, n=15). Tasks were scored based on completion time, and specific rating scales were developed to assess performance. Internal consistency and inter-rater agreement were assessed. Face and content validity were determined using a 5-point Likert-type scale questionnaire. The novices needed considerably more time than the experts to perform EGD, GB, and FBR, and their performance scores were significantly lower. The endoscopy scenarios were rated as very realistic. The experts highly valued the usefulness of Simuldog for veterinary training and as a tool for assessing endoscopic skills. Simuldog is the first validated model specifically developed to be used as a training tool for endoscopy techniques in small animals.

  12. A general realistic treatment of the disk paradox

    Science.gov (United States)

    Pantazis, George; Perivolaropoulos, Leandros

    2017-01-01

    Mechanical angular momentum is not conserved in systems involving electromagnetic fields with non-zero electromagnetic field angular momentum. Conservation is restored only if the total (mechanical and field) angular momentum is considered. Previous studies have investigated this effect, known as ‘Feynman’s Electromagnetic Paradox’ or simply the ‘Disk Paradox’, in the context of idealized systems (infinite or infinitesimal solenoids, charged cylinders, etc.). In the present analysis we generalize previous studies by considering more realistic systems with finite components and demonstrating explicitly the conservation of the total angular momentum. This is achieved by expressing both the mechanical and the field angular momentum in terms of charges and magnetic field fluxes through various system components. Using this general expression and the closure of magnetic field lines, we demonstrate explicitly the conservation of total angular momentum in both idealized and realistic systems (finite solenoid concentric with two charged cylinders, much longer than the solenoid), taking all the relevant terms into account, including the flux outside of the solenoid. This formalism has the potential to facilitate a simpler and more accurate demonstration of total angular momentum conservation in undergraduate physics electromagnetism laboratories.

  13. Time variability of α from realistic models of Oklo reactors

    Science.gov (United States)

    Gould, C. R.; Sharapov, E. I.; Lamoreaux, S. K.

    2006-08-01

    We reanalyze Oklo Sm149 data using realistic models of the natural nuclear reactors. Disagreements among recent Oklo determinations of the time evolution of α, the electromagnetic fine structure constant, are shown to be due to different reactor models, which led to different neutron spectra used in the calculations. We use known Oklo reactor epithermal spectral indices as criteria for selecting realistic reactor models. Two Oklo reactors, RZ2 and RZ10, were modeled with MCNP. The resulting neutron spectra were used to calculate the change in the Sm149 effective neutron capture cross section as a function of a possible shift in the energy of the 97.3-meV resonance. We independently deduce ancient Sm149 effective cross sections and use these values to set limits on the time variation of α. Our study resolves a contradictory situation with previous Oklo α results. Our suggested 2σ bound on a possible time variation of α over 2 billion years is stringent: −0.11 ≤ Δα/α ≤ 0.24, in units of 10^-7, but model dependent in that it assumes only α has varied over time.

  14. Neural Correlates of Realistic and Unrealistic Auditory Space Perception

    Directory of Open Access Journals (Sweden)

    Akiko Callan

    2011-10-01

    Binaural recordings can simulate externalized auditory space perception over headphones. However, if the orientation of the recorder's head and the orientation of the listener's head are incongruent, the simulated auditory space is not realistic. For example, if a person lying flat on a bed listens to an environmental sound that was recorded by microphones inserted in the ears of a person who was in an upright position, the sound simulates an auditory space rotated 90 degrees to the real-world horizontal axis. Our question is whether brain activation patterns differ between the unrealistic auditory space (i.e., the orientation of the listener's head and the orientation of the recorder's head are incongruent) and the realistic auditory space (i.e., the orientations are congruent). River sounds that were binaurally recorded either in a supine position or in an upright body position served as auditory stimuli. During fMRI experiments, participants listened to the stimuli and pressed one of two buttons indicating the direction of the water flow (horizontal/vertical). Behavioral results indicated that participants could not differentiate between the congruent and the incongruent conditions. However, neuroimaging results showed that the congruent condition activated the planum temporale significantly more than the incongruent condition.

  15. Functional consequences of realistic biodiversity changes in a marine ecosystem.

    Science.gov (United States)

    Bracken, Matthew E S; Friberg, Sara E; Gonzalez-Dorantes, Cirse A; Williams, Susan L

    2008-01-22

    Declines in biodiversity have prompted concern over the consequences of species loss for the goods and services provided by natural ecosystems. However, relatively few studies have evaluated the functional consequences of realistic, nonrandom changes in biodiversity. Instead, most designs have used randomly selected assemblages from a local species pool to construct diversity gradients. It is therefore difficult, based on current evidence, to predict the functional consequences of realistic declines in biodiversity. In this study, we used tide pool microcosms to demonstrate that the effects of real-world changes in biodiversity may be very different from those of random diversity changes. Specifically, we measured the relationship between the diversity of a seaweed assemblage and its ability to use nitrogen, a key limiting nutrient in nearshore marine systems. We quantified nitrogen uptake using both experimental and model seaweed assemblages and found that natural increases in diversity resulted in enhanced rates of nitrogen use, whereas random diversity changes had no effect on nitrogen uptake. Our results suggest that understanding the real-world consequences of declining biodiversity will require addressing changes in species performance along natural diversity gradients and understanding the relationships between species' susceptibility to loss and their contributions to ecosystem functioning.

  16. Ultra-realistic 3-D imaging based on colour holography

    Science.gov (United States)

    Bjelkhagen, H. I.

    2013-02-01

    A review of recent progress in colour holography is provided, with new applications. Colour holography recording techniques in silver-halide emulsions are discussed. Both analogue (mainly Denisyuk) colour holograms and digitally-printed colour holograms are described, along with their recent improvements. Panchromatic photopolymer materials, such as the DuPont and Bayer photopolymers, are covered as an alternative to silver-halide materials. The light sources used to illuminate the recorded holograms are very important for obtaining ultra-realistic 3-D images. In particular, the new light sources based on RGB LEDs are described; they show improved image quality over today's commonly used halogen lights. Recent work in colour holography by holographers and companies in different countries around the world is included. Recording and displaying ultra-realistic 3-D images with perfect colour rendering depends on the correct recording technique using the optimal recording laser wavelengths, the availability of improved panchromatic recording materials, and new display light sources.

  17. Perceived Strengths and Weaknesses of Highly Realistic Training and Live Tissue Training for Navy Corpsmen

    Science.gov (United States)

    2015-04-08

    Naval Health Research Center; author: Stephanie... ABSTRACT: The U.S. Navy currently employs two types of trauma care training for Navy corpsmen: highly

  18. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  19. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  20. Blood Facts and Statistics

    Science.gov (United States)


  1. Algebraic statistics computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

    Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  2. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    Subject descriptors: statistical analysis; probability; reports; information theory; differential equations; statistical processes; stochastic processes; multivariate analysis; distribution theory; decision theory; measure theory; optimization.

  3. A Non-Perturbative Gauge-Invariant QCD: Ideal vs. Realistic QCD

    CERN Document Server

    Fried, H M; Sheu, Y -M

    2011-01-01

    A basic distinction, long overlooked, between the conventional, "idealistic" formulation of QCD, and a more "realistic" formulation is brought into focus by a rigorous, non-perturbative, gauge-invariant evaluation of the Schwinger solution for the QCD generating functional in terms of exact Fradkin representations for the Green's functional $\\mathbf{G}_{c}(x,y|A)$ and the vacuum functional $\\mathbf{L}[A]$. The quanta of all (Abelian) quantized fields may be expected to obey standard quantum-mechanical measurement properties, perfect position dependence at the cost of unknown momenta, and vice-versa, but this is impossible for quarks since they always appear asymptotically in bound states, and their transverse position or momenta can never, in principle, be exactly measured. Violation of this principle produces an absurdity in the exact evaluation of each and every QCD amplitude. We here suggest a phenomenological change in the basic QCD Lagrangian, such that a limitation of transverse precision is automatical...

  4. Dipole response in neutron-rich nuclei within self-consistent approaches using realistic potentials

    Directory of Open Access Journals (Sweden)

    Lo Iudice N.

    2015-01-01

    Full Text Available A nucleon-nucleon chiral potential with a corrective density-dependent term simulating a three-body force is used in a self-consistent calculation of the dipole strength distribution in neutron-rich nuclei, with special attention to the low-lying spectra associated with the pygmy resonance. A Hartree-Fock-Bogoliubov basis is generated and adopted in Tamm-Dancoff and random-phase approximations and then in an equation-of-motion approach which includes a basis of two-phonon states. The direct use of the mentioned chiral potential improves the description of both giant and pygmy dipole modes with respect to other realistic interactions. Moreover, the inclusion of the two-phonon states induces a pronounced fragmentation of the giant resonance and enhances the density of the low-lying levels in the pygmy region, in agreement with recent experiments.

  5. Toward Affordable, Theory-and-Simulation-Inspired, Models for Realistic Wind Turbine Aerodynamics and Noise

    Science.gov (United States)

    Ladeinde, Foluso; Alabi, Ken; Li, Wenhai

    2015-11-01

    The problem of generating design data for the operation of a farm of wind turbines for clean energy production is quite complicated if done properly. Potential flow theories provide some models, but these are not suitable for the massive aerodynamic separation and turbulence that characterize many realistic wind turbine applications. Procedures such as computational fluid dynamics (CFD), which can potentially resolve some of the accuracy problems with the purely theoretical approach, are quite expensive to use and often prohibit real-time design and control. In our work, we seek affordable and acceptably accurate models derived from the foregoing approaches. The simulation used in our study is based on high-fidelity CFD, meaning that we use high-order (compact-scheme based), mostly large-eddy simulation methods, with due regard for the proper treatment of the stochastic inflow turbulence data. Progress on the project described herein will be presented.

  6. Study of airflow during respiratory cycle in semi-realistic model of human tracheobronchial tree

    Science.gov (United States)

    Elcner, Jakub; Zaremba, M.; Maly, M.; Jedelsky, J.; Lizal, F.; Jicha, M.

    2016-06-01

    This article deals with the study of airflow during the breathing process, which is characterized by unsteady behavior. Simulations based on computational fluid dynamics (CFD) were compared with experiments performed on a similar geometry of the human upper airways. This geometry was represented by a mouth cavity of realistic shape connected to an idealized tracheobronchial tree up to the fourth generation of branching. The commercial CFD software Star-CCM+ was used to calculate the airflow inside the investigated geometry, and the Reynolds-averaged Navier-Stokes method was used to describe the turbulent behavior throughout the model geometry. Conditions corresponding to the resting state were considered. Comparisons with experiments were made at several points along the trachea and bronchial tree, and the results for the inspiratory and expiratory parts of the breathing cycle were discussed.

  7. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    Science.gov (United States)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students who learned through the Realistic Mathematics Education (RME) approach and students who learned through the problem solving approach. This study was a quasi-experimental research with a non-equivalent group design. The population was all grade VII students in one junior high school in Palopo, in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the sample: class VII-5, with 28 students, as experiment group I, and class VII-6, with 23 students, as experiment group II. The treatment in experiment group I was learning through the RME approach, whereas experiment group II learned through the problem solving approach. Data were collected through a pretest and a posttest. The analysis used descriptive statistics and inferential statistics based on a t-test. Based on the descriptive statistics, the average mathematics score of students taught using the problem solving approach was similar to that of students taught using the RME approach, both at the high category. In addition, it can be concluded that (1) there was no difference in the mathematics learning achievement of students taught using the RME approach and students taught using the problem solving approach, and (2) the quality of learning achievement was the same for both approaches, at the high category.
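    For readers unfamiliar with the inferential step, a minimal sketch of an independent two-sample t-test of the kind described; the scores below are simulated placeholders, since the study's raw data are not reproduced here, and only the group sizes (28 and 23) follow the abstract:

```python
import numpy as np
from scipy import stats

# Simulated placeholder scores; group sizes match the study: 28 (RME) and 23 (PS).
rng = np.random.default_rng(0)
rme_scores = rng.normal(loc=82.0, scale=8.0, size=28)   # experiment group I
ps_scores = rng.normal(loc=81.0, scale=8.0, size=23)    # experiment group II

# Descriptive statistics, then an independent two-sample t-test (Welch variant).
print("mean RME:", round(rme_scores.mean(), 2), " mean PS:", round(ps_scores.mean(), 2))
t, p = stats.ttest_ind(rme_scores, ps_scores, equal_var=False)
print(f"t = {t:.3f}, p = {p:.3f}")   # p > 0.05 -> no significant difference
```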

  8. Waste statistics 2003

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The 2003 reporting to the ISAG comprises 403 plants owned by 273 enterprises. In 2002, reports covered 407 plants owned by 296 enterprises. Waste generation in 2003 is compared to targets for 2008 in the government's Waste Strategy 2005-2008. The following can be said to summarise waste generation in 2003: 1) In 2003, total reported waste arisings amounted to 12,835,000 tonnes, which is 270,000 tonnes, or 2 per cent, less than in 2002. 2) If amounts of residues from coal-fired power plants are excluded from statistics, waste arisings in 2003 were 11,597,000 tonnes, which is a 2 per cent increase from 2002. 3) If amounts of residues from coal-fired power plants and waste from the building and construction sector are excluded from statistics, total waste generation in 2003 amounted to 7,814,000 tonnes, which is 19,000 tonnes, or 1 per cent, less than in 2002. In other words, there has been a fall in total waste arisings, if residues and waste from building and construction are excluded. 4) The overall rate of recycling amounted to 66 per cent, which is one percentage point above the overall recycling target of 65 per cent for 2008. In 2002 the total rate of recycling was 64 per cent. 5) The total amount of waste led to incineration amounted to 26 per cent, plus an additional 1 per cent left in temporary storage to be incinerated at a later time. The 2008 target for incineration is 26 per cent. These are the same percentage figures as applied to incineration and storage in 2002. 6) The total amount of waste led to landfills amounted to 8 per cent, which is one percentage point below the overall landfill target of a maximum of 9 per cent landfilling in 2008. In 2002, 9 per cent was led to landfill. 7) The targets for treatment of waste from individual sectors are still not being met: too little waste from households and the service sector is being recycled, and too much waste from industry is being led to landfill. (au)

  9. Waste statistics 2004

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-04-07

    The 2004 reporting to the ISAG comprises 394 plants owned by 256 enterprises. In 2003, reports covered 403 plants owned by 273 enterprises. Waste generation in 2004 is compared to targets for 2008 in the government's Waste Strategy 2005-2008. The following summarises waste generation in 2004: 1) In 2004, total reported waste arisings amounted to 13,359,000 tonnes, which is 745,000 tonnes, or 6 per cent, more than in 2003. 2) If amounts of residues from coal-fired power plants are excluded from statistics, waste arisings in 2004 were 12,179,000 tonnes, which is a 9 per cent increase from 2003. 3) If amounts of residues from coal-fired power plants and waste from the building and construction sector are excluded from statistics, total waste generation in 2004 amounted to 7,684,000 tonnes, which is 328,000 tonnes, or 4 per cent, more than in 2002. In other words, there has been an increase in total waste arisings, if residues and waste from building and construction are excluded. Waste from the building and construction sector is more sensitive to economic change than most other waste. 4) The total rate of recycling was 65 per cent. The 2008 target for recycling is 65 per cent. The rate of recycling in 2003 was also 65 per cent. 5) The total amount of waste led to incineration amounted to 26 per cent, plus an additional 1 per cent left in temporary storage to be incinerated at a later time. The 2008 target for incineration is 26 per cent. These are the same percentage figures as applied to incineration and storage in 2003. 6) The total amount of waste led to landfills amounted to 8 per cent, which is one percentage point better than the overall landfill target of a maximum of 9 per cent landfilling in 2008. Also in 2003, 8 per cent of the waste was landfilled. 7) The targets for treatment of waste from individual sectors are still not being met: too little waste from households and the service sector is being recycled, and too much waste from industry is being

  10. Realistic Reflections for Marine Environments in Augmented Reality Training Systems

    Science.gov (United States)

    2009-09-01

    Figure 26. Two Tested Static Backgrounds. Top: Agua Background. Bottom: Blue Background. Figure 27. Ship Textures Used to Generate Reflections.

  11. Photo-Realistic 3D Modelling of Sculptures on Open-Air Museums

    Directory of Open Access Journals (Sweden)

    Francesca Duca

    2011-12-01

    Full Text Available Laser scanning is a high-end technology with possibilities far beyond the well-known civil engineering and industrial applications. Current geomatic technologies and methodologies for cultural heritage documentation allow the generation of very realistic 3D results used for many purposes, such as archaeological documentation, digital conservation, 3D repositories, etc. The fast acquisition of large numbers of 3D point clouds opens up many capabilities for documenting and keeping alive cultural heritage, moving forward the generation of virtual animated replicas of great value and smooth multimedia dissemination. This paper presents the use of terrestrial laser scanning (TLS) as a valuable tool for 3D documentation of large outdoor cultural heritage sculptures, such as two of those inside the “Campus de Vera” of the UPV: “Defensas I” and “Mentoring”. The processing of the TLS data is discussed in detail in order to create photo-realistic digital models. Data acquisition was conducted with a time-of-flight scanner, characterized by its high accuracy, small beam, and ultra-fine scanning. Data processing was performed using Leica Geosystems Cyclone software for data registration and 3DReshaper software for modelling and texturing. High-resolution images, after calibration and orientation of an off-the-shelf digital camera, are draped onto the models to achieve the right appearance in colour and texture. A discussion of the differences found when modelling sculptures with different deviation errors is presented. Processing steps such as normal smoothing and vertex recalculation are found appropriate to achieve continuous meshes around the objects.

  12. A realistic validation study of a new nitrogen multiple-breath washout system.

    Directory of Open Access Journals (Sweden)

    Florian Singer

    Full Text Available BACKGROUND: For reliable assessment of ventilation inhomogeneity, multiple-breath washout (MBW) systems should be realistically validated. We describe a new lung model for in vitro validation under physiological conditions and the assessment of a new nitrogen (N2) MBW system. METHODS: The N2MBW setup indirectly measures the N2 fraction (FN2) from main-stream carbon dioxide (CO2) and side-stream oxygen (O2) signals: FN2 = 1 − FO2 − FCO2 − FArgon. For in vitro N2MBW, a double-chamber plastic lung model was filled with water, heated to 37°C, and ventilated at various lung volumes, respiratory rates, and FCO2. In vivo N2MBW was undertaken in triplets on two occasions in 30 healthy adults. The primary N2MBW outcome was functional residual capacity (FRC). We assessed the in vitro error (√[(measured − model FRC)²], for model FRCs of 100-4174 mL), and the error between tests of in vivo FRC, lung clearance index (LCI), and normalized phase III slope indices (Sacin and Scond). RESULTS: The model generated 145 FRCs under BTPS conditions and various breathing patterns. The mean (SD) error was 2.3 (1.7)%. For FRCs of 500 to 4174 mL, 121 (98%) of FRCs were within 5%. For FRCs of 100 to 400 mL, the error was better than 7%. The in vivo FRC error between tests was 10.1 (8.2)%. LCI was the most reproducible ventilation inhomogeneity index. CONCLUSION: The lung model generates lung volumes under the conditions encountered during clinical MBW testing and enables realistic validation of MBW systems. The new N2MBW system reliably measures lung volumes and delivers reproducible LCI values.
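    A minimal sketch of the two relations used above, the indirect N2 fraction and the FRC error; the gas fractions in the example call are hypothetical, and the fixed argon fraction is an assumption of this sketch:

```python
ARGON_FRACTION = 0.0093   # assumed constant argon fraction for this sketch

def f_n2(f_o2, f_co2, f_argon=ARGON_FRACTION):
    """Indirectly measured N2 fraction: FN2 = 1 - FO2 - FCO2 - FArgon."""
    return 1.0 - f_o2 - f_co2 - f_argon

def frc_error_percent(frc_measured_ml, frc_model_ml):
    """Error between measured and model FRC, sqrt[(difference)^2], in per cent."""
    return 100.0 * ((frc_measured_ml - frc_model_ml) ** 2) ** 0.5 / frc_model_ml

print(f_n2(f_o2=0.17, f_co2=0.045))        # hypothetical end-tidal gas fractions
print(frc_error_percent(2046.0, 2000.0))   # 2.3, the scale of the mean in vitro error
```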

  13. Water or realistic compositions in proton radiotherapy? An analytical study.

    Science.gov (United States)

    Rasouli, Fatemeh S; Masoudi, S Farhad

    2017-03-01

    Pre-clinical tests and simulation studies for radiotherapy are generally carried out using water or simplified materials. Investigating the effects of defining compositionally realistic media in proton transport studies was the objective of this work. Accurate modeling of the Bragg curve is a fundamental requirement for such a study. An equation previously validated by experiments provides an appropriate analytical method for proton dose calculation with depth in the target. Owing to its dependence on proton ranges and on the probability of undergoing non-elastic nuclear interactions (NNI), this formula comprises three parameters with values specified for the initial proton energy and for the target material. As a result, knowledge of the depth-dose distribution using this analytical model is limited to the materials for which the data have been provided in nuclear data tables. In this study, we used our general formulas for calculating proton ranges and the probability of undergoing NNI in desired compounds and mixtures with an arbitrary number of constituent elements. Furthermore, the proton dose distribution in the depth of these targets was obtained by determining the parameters appearing in the employed model through mathematically simple relations. For a number of tissues which may be of interest in proton radiotherapy studies but are missing from reference data tables, these parameters were calculated. Moreover, the resulting values for proton ranges and the probability of undergoing NNIs were compared with those in water. The results showed that the differences between the positions of the Bragg peaks in water and in the realistic media considered in this study were energy dependent and amounted to a few millimeters. For proton beams of arbitrarily chosen initial energies, the maximum dose delivered to the realistic media varied between about −0.02% and 4.42% in comparison with that to water. The effects observed (both in penetration and in the
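    A rough sketch of the water-versus-realistic-medium comparison; it uses the commonly quoted Bragg-Kleeman range fit and a simple stopping-power scaling as assumptions of this sketch, not the paper's own formulas or parameter values:

```python
# Bragg-Kleeman range fit for protons in water, R = alpha * E**p, with commonly
# quoted fit constants (assumptions of this sketch).
ALPHA_CM_PER_MEV_P = 2.2e-3
P_EXPONENT = 1.77

def range_water_cm(E_MeV):
    """CSDA-like proton range in water from the Bragg-Kleeman rule."""
    return ALPHA_CM_PER_MEV_P * E_MeV ** P_EXPONENT

def range_medium_cm(E_MeV, rel_mass_stopping_power, density_g_cm3):
    """First-order range in a realistic medium: scale the water range by the
    medium's relative mass stopping power and density."""
    return range_water_cm(E_MeV) / (rel_mass_stopping_power * density_g_cm3)

# Hypothetical soft tissue: mass stopping power 0.98 of water, density 1.04 g/cm3.
for E in (100.0, 150.0, 200.0):
    shift_mm = 10.0 * (range_water_cm(E) - range_medium_cm(E, 0.98, 1.04))
    print(f"{E:5.0f} MeV: R_water = {range_water_cm(E):6.2f} cm, "
          f"Bragg-peak shift vs. tissue = {shift_mm:4.1f} mm")
```

    Consistent with the abstract, this first-order scaling already produces energy-dependent peak shifts on the order of a few millimeters.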

  14. A simple simulation approach to generate complex rainfall fields conditioned by elevation: example of the eastern Mediterranean region

    Science.gov (United States)

    Oriani, Fabio; Ohana-Levi, Noa; Straubhaar, Julien; Renard, Philippe; Karnieli, Arnon; Mariethoz, Grégoire; Morin, Efrat; Marra, Francesco

    2016-04-01

    Stochastically generating realistic rainfall fields is useful for studying the uncertainty related to catchment recharge and its propagation to distributed hydrological models. To this end, it is critical to use weather radar images as training data, as they are the single most informative source for rainfall spatial heterogeneity. Generating realistic simulations is particularly important in regions like the eastern Mediterranean, where the synoptic conditions can lead to rainfall fields presenting various morphology, anisotropy and non-stationarity. The Direct Sampling (DS) technique [Mariethoz2010] is proposed here as a stochastic generator of spatial daily rainfall fields relying on the simulation of radar imagery. The technique is based on resampling of a training data set (in this case, a stack of radar images) and the generation of patterns similar to those found in the data. The strong point of DS, which makes it an attractive simulation approach for rainfall, is its capability to preserve the high-order statistical features present in the training image (e.g., rainfall cell shape, spatial non-stationarity) with minimal parameterization. Moreover, factors influencing rainfall, like elevation, can be used as conditioning variables, without the need for a complex statistical dependence model. A DS setup for radar image simulation is presented and tested for the simulation of daily rainfall fields using a 10-year radar-image record from the central region of Israel. Using a synoptic weather classification to train the model, the algorithm can generate realistic spatial fields for different rainfall types, preserving the variability and the covariance structure of the reference reasonably well. Moreover, the simulation is conditioned using the digital elevation model to preserve the complex relation between rainfall intensity and altitude that is characteristic of this region. [Mariethoz2010] G. Mariethoz, P. Renard, and J. Straubhaar. The direct sampling method to
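    A minimal, unconditional sketch of the Direct Sampling idea (random simulation path, random scan of the training image, paste-on-match); the elevation conditioning and multivariate setup described above are omitted, and the function name and parameter values are illustrative:

```python
import numpy as np

def direct_sampling(training, sim_shape, n_neighbors=12, max_scan=2000,
                    threshold=0.05, rng=None):
    """Unconditional Direct Sampling sketch: visit simulation nodes along a
    random path; for each node, scan random training-image locations until a
    pattern matching the already-simulated neighborhood (mean absolute
    difference <= threshold) is found, then paste its central value."""
    rng = rng or np.random.default_rng()
    sim = np.full(sim_shape, np.nan)
    ti_h, ti_w = training.shape
    for flat in rng.permutation(sim.size):
        i, j = np.unravel_index(flat, sim_shape)
        known = np.argwhere(~np.isnan(sim))              # already-informed nodes
        if len(known):
            nearest = known[np.abs(known - [i, j]).sum(1).argsort()[:n_neighbors]]
            offs = nearest - [i, j]                      # data-event offsets
            vals = sim[nearest[:, 0], nearest[:, 1]]     # data-event values
        else:
            offs, vals = np.empty((0, 2), dtype=int), np.empty(0)
        best_val = training[rng.integers(ti_h), rng.integers(ti_w)]
        best_err = np.inf
        for _ in range(max_scan):
            x, y = rng.integers(ti_h), rng.integers(ti_w)
            pts = offs + [x, y]
            if len(pts) and ((pts < 0) | (pts >= [ti_h, ti_w])).any():
                continue                                 # pattern leaves the image
            err = np.abs(training[pts[:, 0], pts[:, 1]] - vals).mean() if len(vals) else 0.0
            if err < best_err:
                best_val, best_err = training[x, y], err
            if best_err <= threshold:
                break                                    # good-enough match found
        sim[i, j] = best_val
    return sim

# Example: simulate a 50x50 field from a stand-in training image scaled to [0, 1].
field = direct_sampling(np.random.default_rng(1).random((200, 200)), (50, 50))
```

    Normalizing the training radar image to [0, 1] keeps the distance threshold meaningful across rainfall types.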

  15. Explorations in statistics: statistical facets of reproducibility.

    Science.gov (United States)

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  16. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob

  17. Towards a Realistic Model for Failure Propagation in Interdependent Networks

    CERN Document Server

    Sturaro, Agostino; Conti, Mauro; Das, Sajal K

    2015-01-01

    Modern networks are becoming increasingly interdependent. As a prominent example, the smart grid is an electrical grid controlled through a communications network, which in turn is powered by the electrical grid. Such interdependencies create new vulnerabilities and make these networks more susceptible to failures. In particular, failures can easily spread across these networks due to their interdependencies, possibly causing cascade effects with a devastating impact on their functionalities. In this paper we focus on the interdependence between the power grid and the communications network, and propose a novel realistic model, HINT (Heterogeneous Interdependent NeTworks), to study the evolution of cascading failures. Our model takes into account the heterogeneity of such networks as well as their complex interdependencies. We compare HINT with previously proposed models both on synthetic and real network topologies. Experimental results show that existing models oversimplify the failure evolution and network...
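    A toy sketch of how failures propagate through interdependency links, far simpler than HINT itself (no heterogeneity, one-to-one dependencies); the names and failure rules below are assumptions of this sketch:

```python
import networkx as nx

def cascade(power, comm, deps, initial_failures):
    """Toy interdependent-failure cascade: a node fails if it failed initially,
    has no surviving neighbor in its own network, or loses the node it
    depends on in the other network (deps maps each node to its dependency)."""
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for g in (power, comm):
            for n in g.nodes:
                if n in failed:
                    continue
                has_live_neighbor = any(m not in failed for m in g.neighbors(n))
                dep = deps.get(n)
                dep_alive = dep is None or dep not in failed
                if not has_live_neighbor or not dep_alive:
                    failed.add(n)
                    changed = True
    return failed

# Tiny example: 4-node ring power grid, 4-node path comm network, 1-to-1 coupling.
power = nx.cycle_graph([f"p{i}" for i in range(4)])
comm = nx.path_graph([f"c{i}" for i in range(4)])
deps = {f"p{i}": f"c{i}" for i in range(4)}
deps.update({f"c{i}": f"p{i}" for i in range(4)})
print(sorted(cascade(power, comm, deps, {"c1"})))   # cascade spreads to p1, c0, p0
```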

  18. DEVELOPMENT OF REALISTIC QUALITY LOSS FUNCTIONS FOR INDUSTRIAL APPLICATIONS

    Institute of Scientific and Technical Information of China (English)

    Abdul-Baasit SHAIBU; Byung Rae CHO

    2006-01-01

    A number of quality loss functions, most recently the Taguchi loss function, have been developed to quantify the loss due to the deviation of product performance from the desired target value. All these loss functions assume the same loss at the specified specification limits. In many real-life industrial applications, however, the losses at the two specification limits are often not the same. Further, current loss functions assume a product should be reworked or scrapped if product performance falls outside the specification limits. It is common practice in many industries to replace a defective item rather than spend resources to repair it, especially if a considerable amount of time is required. To rectify these two potential problems, this paper proposes more realistic quality loss functions for proper application to real-world industrial problems. This paper also carries out a comparison study of all the loss functions it considers.
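    For concreteness, the symmetric Taguchi loss and one generic asymmetric variant of the kind motivated above; the coefficients k, k1, k2 are free parameters, and this illustrative form is not necessarily the paper's proposed function:

```latex
% Symmetric Taguchi loss about target T: the same coefficient k applies on
% both sides, so the losses at the two specification limits coincide.
L(y) \;=\; k\,(y - T)^2
% A generic asymmetric variant with different coefficients on each side of the
% target, so the losses at the lower and upper specification limits can differ:
L(y) \;=\;
\begin{cases}
  k_1\,(y - T)^2, & y \le T,\\[2pt]
  k_2\,(y - T)^2, & y > T.
\end{cases}
```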

  19. Realistic Mobility Modeling for Vehicular Ad Hoc Networks

    Science.gov (United States)

    Akay, Hilal; Tugcu, Tuna

    2009-08-01

    Simulations used for evaluating the performance of routing protocols for Vehicular Ad Hoc Networks (VANET) are mostly based on random mobility and fail to consider the individual behaviors of the vehicles. Unrealistic assumptions about mobility produce misleading results about the behavior of routing protocols in real deployments. In this paper, a realistic mobility modeling tool, Mobility for Vehicles (MOVE), which considers the basic mobility behaviors of vehicles, is proposed for a more accurate evaluation. The proposed model is tested against the Random Waypoint (RWP) model using the AODV and OLSR protocols. The results show that the mobility model significantly affects the number of nodes within the transmission range of a node, the volume of control traffic, and the number of collisions. It is shown that the number of intersections, the grid size, and the node density are important parameters when dealing with VANET performance.

  20. Hiding a Realistic Object Using a Broadband Terahertz Invisibility Cloak

    CERN Document Server

    Zhou, Fan; Cao, Wei; Stuart, Colin T; Gu, Jianqiang; Zhang, Weili; Sun, Cheng

    2011-01-01

    The invisibility cloak has been a long-standing dream for many researchers over the decades. The introduction of transformational optics has revitalized this field by providing a general method to design material distributions to hide the subject from detection. By transforming space and light propagation, a three-dimensional (3D) object is perceived as having a reduced number of dimensions, in the form of points, lines, and thin sheets, making it "undetectable" judging from the scattered field. Although a variety of cloaking devices have been reported at microwave and optical frequencies, the spectroscopically important Terahertz (THz) domain remains unexplored. Moreover, due to the difficulties in fabricating cloaking devices that are optically large in all three dimensions, hiding realistic 3D objects has yet to be demonstrated. Here, we report the first experimental demonstration of a 3D THz cloaking device fabricated using a scalable Projection Microstereolithography process. The cloak operates at a broa...