Simple and Realistic Data Generation
DEFF Research Database (Denmark)
Pedersen, Kenneth Houkjær; Torp, Kristian; Wind, Rico
2006-01-01
This paper presents a generic, DBMS independent, and highly extensible relational data generation tool. The tool can efficiently generate realistic test data for OLTP, OLAP, and data streaming applications. The tool uses a graph model to direct the data generation. This model makes it very simple to generate data even for large database schemas with complex inter- and intra-table relationships. The model also makes it possible to generate data with very accurate characteristics.
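The graph-directed idea the abstract describes can be sketched as follows: treat tables as nodes and foreign-key dependencies as edges, then fill tables in topological order so every generated child row references an already-generated parent row. This is a minimal illustration; the schema, table names and sizes are invented, not from the paper's tool.

```python
import random

# Hypothetical mini-schema: edges point from a child table to the parent
# table its foreign key references (names are illustrative only).
SCHEMA = {
    "customer": [],          # no dependencies
    "product": [],
    "order": ["customer"],   # order.customer_id -> customer.id
    "order_line": ["order", "product"],
}

def topo_order(schema):
    """Return the tables so that every parent precedes its children."""
    order, seen = [], set()
    def visit(t):
        if t in seen:
            return
        seen.add(t)
        for parent in schema[t]:
            visit(parent)
        order.append(t)
    for t in schema:
        visit(t)
    return order

def generate(schema, rows_per_table, seed=0):
    rng = random.Random(seed)
    data = {}
    for table in topo_order(schema):
        rows = []
        for pk in range(rows_per_table):
            row = {"id": pk}
            # Foreign keys point at already-generated parent rows,
            # so referential integrity holds by construction.
            for parent in schema[table]:
                row[parent + "_id"] = rng.randrange(len(data[parent]))
            rows.append(row)
        data[table] = rows
    return data

tables = generate(SCHEMA, rows_per_table=10)
```

Real tools additionally attach value distributions and cardinalities to the graph's nodes and edges; the skeleton above only shows why the graph makes dependency handling simple.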
Generating realistic images using Kray
Tanski, Grzegorz
2004-07-01
Kray is an application for creating realistic images. It is written in the C++ programming language, has a text-based interface, and solves the global illumination problem using techniques such as radiosity, path tracing and photon mapping.
RenderGAN: Generating Realistic Labeled Data
Directory of Open Access Journals (Sweden)
Leon Sixt
2018-06-01
Deep Convolutional Neural Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.
Survey of Approaches to Generate Realistic Synthetic Graphs
Energy Technology Data Exchange (ETDEWEB)
Lim, Seung-Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shankar, Mallikarjun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Imam, Neena [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-10-01
A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
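As one concrete example of the generative-model families such surveys cover, a preferential-attachment process reproduces the heavy-tailed degree distributions seen in many real-world graphs. This is a generic textbook sketch, not an algorithm taken from the report.

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a graph in which each new vertex attaches to m distinct
    existing vertices chosen proportionally to their current degree."""
    rng = random.Random(seed)
    # Start from a small clique of m + 1 vertices.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Listing each endpoint once per incident edge makes uniform
    # sampling from this list equivalent to degree-biased sampling.
    endpoints = [v for e in edges for v in e]
    for v in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(endpoints))
        for t in targets:
            edges.append((v, t))
            endpoints.extend((v, t))
    return edges

edges = preferential_attachment(200, 2)
```

Fitting such a model to a sensitive real-world graph (e.g., matching its degree exponent) and then sampling from it is the basic workflow the survey's "fitted graph models" refer to.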
Realistic generation cost of solar photovoltaic electricity
International Nuclear Information System (INIS)
Singh, Parm Pal; Singh, Sukhmeet
2010-01-01
Solar photovoltaic (SPV) power plants have a long working life with zero fuel cost and negligible maintenance cost, but require a huge initial investment. The generation cost of solar electricity is mainly the cost of financing the initial investment. Therefore, the generation cost of solar electricity in different years depends on the method of repaying the loan. Currently a levelized cost based on an equated payment loan is being used. The static levelized generation cost of solar electricity is compared with the current value of the variable generation cost of grid electricity. This improper cost comparison is inhibiting the growth of SPV electricity by creating the wrong perception that solar electricity is very expensive. In this paper a new method of loan repayment has been developed, resulting in a generation cost of SPV electricity that increases with time like that of grid electricity. A generalized capital recovery factor has been developed for a graduated payment loan, in which the capital and interest payment in each installment are calculated by treating each loan installment as an independent loan for the relevant years. Generalized results have been calculated which can be used to determine the cost of SPV electricity for a given system at different places. Results show that for an SPV system with a specific initial investment of 5.00 $/kWh/year, a loan period of 30 years and a loan interest rate of 4%, the levelized generation cost of SPV electricity with an equated payment loan turns out to be 28.92 cents/kWh, while the corresponding generation cost with a graduated payment loan with an escalation in annual installments of 8% varies from 9.51 cents/kWh in the base year to 88.63 cents/kWh in the 30th year. So, in this case, the realistic current generation cost of SPV electricity is 9.51 cents/kWh and not 28.92 cents/kWh. Further, with a graduated payment loan, an extension of the loan period results in a sharp decline in the cost of SPV electricity in the base year. Hence, a policy change is required
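The abstract's three headline numbers can be reproduced from standard annuity formulas, assuming the specific investment is 5.00 $ per kWh of annual output (the value that reproduces the quoted costs) and that the graduated scheme fixes the first installment so the present value of the escalating payments exactly repays the loan:

```python
# Sketch of the two repayment schemes described above; costs come out
# in $/kWh (x100 for cents/kWh).
investment = 5.00   # $ per (kWh/year) of annual output (assumed unit)
i, n = 0.04, 30     # loan interest rate, loan period in years

# Equated (level) payment: cost = investment * capital recovery factor.
crf = i * (1 + i) ** n / ((1 + i) ** n - 1)
levelized = investment * crf

# Graduated payment escalating at rate e per year: choose the first-year
# installment so the present value of all installments repays the loan.
e = 0.08
pv_factor = sum((1 + e) ** (t - 1) / (1 + i) ** t for t in range(1, n + 1))
first_year = investment / pv_factor
last_year = first_year * (1 + e) ** (n - 1)

print(round(levelized * 100, 2))   # ~28.92 cents/kWh
print(round(first_year * 100, 2))  # ~9.51 cents/kWh
print(round(last_year * 100, 2))   # ~88.63 cents/kWh
```

The escalating schedule mirrors grid tariffs that also rise over time, which is the paper's argument for calling the 9.51 cents/kWh base-year figure the realistic one.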
Interferometric data modelling: issues in realistic data generation
International Nuclear Information System (INIS)
Mukherjee, Soma
2004-01-01
This study describes algorithms developed for modelling interferometric noise in a realistic manner, i.e. incorporating the non-stationarity that can be seen in the data from the present generation of interferometers. The noise model is based on individual component models (ICMs) with the application of auto-regressive moving average (ARMA) models. The data obtained from the model are validated by standard statistical tests, e.g. the KS test and the Akaike minimum criterion. The results indicate a very good fit. The advantage of using ARMA for ICMs is that the model parameters can be controlled, and hence injection and efficiency studies can be conducted in a more controlled environment. This realistic non-stationary noise generator is intended to be integrated within the data monitoring tool framework.
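A minimal ARMA generator of the kind an individual component model would use can be written in a few lines; the model orders and coefficients below are illustrative, not those fitted in the study.

```python
import random

def arma(n, phi, theta, sigma=1.0, seed=0):
    """Generate n samples of an ARMA(p, q) process:
    x[t] = sum_i phi[i]*x[t-1-i] + e[t] + sum_j theta[j]*e[t-1-j]."""
    rng = random.Random(seed)
    p, q = len(phi), len(theta)
    x, e = [], []
    for t in range(n):
        et = rng.gauss(0.0, sigma)
        xt = et
        xt += sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        xt += sum(theta[j] * e[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x.append(xt)
        e.append(et)
    return x

# Example: a stationary ARMA(2,1) component; non-stationarity of the
# kind the study models can be mimicked by slowly varying phi, theta
# or sigma between data segments.
noise = arma(5000, phi=[0.6, -0.2], theta=[0.3])
```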
Generating realistic roofs over a rectilinear polygon
Ahn, Heekap
2011-01-01
Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs, and show a connection with the straight skeleton of P. We show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient ( (n-4)/2 choose ⌊(n-4)/4⌋ ) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2011 Springer-Verlag.
Generating realistic environments for cyber operations development, testing, and training
Berk, Vincent H.; Gregorio-de Souza, Ian; Murphy, John P.
2012-06-01
Training effective cyber operatives requires realistic network environments that incorporate the structural and social complexities representative of the real world. Network traffic generators facilitate repeatable experiments for the development, training and testing of cyber operations. However, current network traffic generators, ranging from simple load testers to complex frameworks, fail to capture the realism inherent in actual environments. In order to improve the realism of network traffic generated by these systems, it is necessary to quantitatively measure the level of realism in generated traffic with respect to the environment being mimicked. We categorize realism measures into statistical, content, and behavioral measurements, and propose various metrics that can be applied at each level to indicate how effectively the generated traffic mimics the real world.
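At the statistical level, one simple realism metric is a divergence between the empirical packet-size distributions of captured and generated traffic. The Jensen-Shannon score below is a generic illustration of that idea, not a metric proposed in the paper; the packet sizes and bin width are invented.

```python
import math
from collections import Counter

def histogram(samples, bin_w):
    """Empirical distribution over fixed-width bins."""
    counts = Counter(s // bin_w for s in samples)
    total = len(samples)
    return {b: c / total for b, c in counts.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence (natural log, so bounded by ln 2);
    0 means the two distributions are identical."""
    def kl(a, m):
        return sum(pa * math.log(pa / m[b]) for b, pa in a.items())
    keys = set(p) | set(q)
    m = {b: 0.5 * (p.get(b, 0.0) + q.get(b, 0.0)) for b in keys}
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy packet-size samples (bytes): a 'real' capture vs. generator output.
real = [64, 64, 1500, 1500, 1500, 576, 64, 1500]
generated = [64, 64, 64, 64, 576, 576, 576, 64]
score = js_divergence(histogram(real, 256), histogram(generated, 256))
```

Content and behavioral realism need richer features (payloads, session timing), but they can be scored with the same compare-distributions pattern.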
Generating realistic roofs over a rectilinear polygon
Ahn, Heekap; Bae, Sangwon; Knauer, Christian; Lee, Mira; Shin, Chansu; Vigneron, Antoine E.
2011-01-01
Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. In this paper, we introduce realistic roofs by imposing
Realistic thermodynamic and statistical-mechanical measures for neural synchronization.
Kim, Sang-Yoon; Lim, Woochang
2014-04-15
Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential V_G in computational neuroscience. The time-averaged fluctuation of V_G plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure M_s, introduced by considering the occupation and pacing patterns of spikes. The global potential V_G is also used to give a reference global cycle for the calculation of M_s. Hence, V_G becomes an important collective quantity because it is associated with the calculation of both O and M_s. However, it is practically difficult to obtain V_G directly in real experiments. To overcome this difficulty, instead of V_G, we employ the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, for practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, a more accurate characterization of weak sparse spike synchronization can be achieved in terms of the realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on V_G. Copyright © 2014. Published by Elsevier B.V.
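An IPSR can be estimated directly from recorded spike times by Gaussian-kernel smoothing. The sketch below is a generic implementation of that standard estimator, with toy spike trains and an illustrative bandwidth; it is not the paper's code.

```python
import math

def ipsr(spike_trains, t_grid, bandwidth):
    """Instantaneous population spike rate: kernel-smoothed rate of all
    spikes, averaged over neurons (spikes per unit time per neuron)."""
    n = len(spike_trains)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    rates = []
    for t in t_grid:
        s = 0.0
        for train in spike_trains:
            for ts in train:
                s += math.exp(-0.5 * ((t - ts) / bandwidth) ** 2)
        rates.append(norm * s)
    return rates

# Toy example: 3 neurons firing together near t = 10 (arbitrary time
# units), 2 near t = 30 -> the IPSR peaks at the stronger event.
trains = [[9.8, 30.1], [10.1, 29.7], [10.0]]
grid = [i * 0.5 for i in range(81)]   # t = 0 .. 40
r = ipsr(trains, grid, bandwidth=1.0)
```

The synchrony measures described in the abstract are then computed from peaks and troughs of such an IPSR curve rather than from the experimentally inaccessible V_G.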
Generating Realistic Labelled, Weighted Random Graphs
Directory of Open Access Journals (Sweden)
Michael Charles Davis
2015-12-01
Generative algorithms for random graphs have yielded insights into the structure and evolution of real-world networks. Most networks exhibit a well-known set of properties, such as heavy-tailed degree distributions, clustering and community formation. Usually, random graph models consider only structural information, but many real-world networks also have labelled vertices and weighted edges. In this paper, we present a generative model for random graphs with discrete vertex labels and numeric edge weights. The weights are represented as a set of Beta Mixture Models (BMMs) with an arbitrary number of mixtures, which are learned from real-world networks. We propose a Bayesian Variational Inference (VI) approach, which yields an accurate estimation while keeping computation times tractable. We compare our approach to state-of-the-art random labelled graph generators and an earlier approach based on Gaussian Mixture Models (GMMs). Our results allow us to draw conclusions about the contribution of vertex labels and edge weights to graph structure.
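Once the Beta mixture parameters have been learned, assigning an edge weight reduces to ancestral sampling: pick a component by its mixing weight, then draw from that component's Beta distribution. The parameters below are illustrative, not learned from any network.

```python
import random

def sample_bmm(components, n, seed=0):
    """Draw n edge weights in (0, 1) from a Beta mixture model given
    as (mixing weight, alpha, beta) triples."""
    rng = random.Random(seed)
    probs = [c[0] for c in components]
    out = []
    for _ in range(n):
        # Choose a mixture component, then sample its Beta density.
        u, acc, k = rng.random(), 0.0, 0
        for k, p in enumerate(probs):
            acc += p
            if u <= acc:
                break
        _, a, b = components[k]
        out.append(rng.betavariate(a, b))
    return out

# Two-component mixture: many light edges plus a heavy-edge mode.
weights = sample_bmm([(0.7, 2.0, 8.0), (0.3, 8.0, 2.0)], 1000)
```

Betas are a natural choice when weights are normalized to (0, 1); an arbitrary weight range can be handled by rescaling before fitting.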
International Nuclear Information System (INIS)
Kangas, H.
2001-01-01
The frost in February increased the power demand in Finland significantly. The total power consumption in Finland during January-February 2001 was about 4% higher than a year before. In January 2001 the average temperature in Finland was only about -4 deg C, which is nearly 2 degrees higher than in 2000 and about 6 degrees higher than the long-term average. Power demand in January was slightly less than 7.9 TWh, being about 0.5% less than in 2000. The power consumption in Finland during the past 12 months exceeded 79.3 TWh, which is less than 2% higher than during the previous 12 months. In February 2001 the average temperature was -10 deg C, which was about 5 degrees lower than in February 2000. Because of this the power consumption in February 2001 increased by 5%. Power consumption in February was 7.5 TWh. The maximum hourly output of power plants in Finland was 13310 MW. Power consumption of Finnish households in February 2001 was about 10% higher than in February 2000, and in industry the increase was nearly zero. The utilization rate in the forest industry in February 2001 decreased from the value of February 2000 by 5%, being only about 89%. The power consumption of the past 12 months (Feb. 2000 - Feb. 2001) was 79.6 TWh. Generation of hydroelectric power in Finland during January - February 2001 was 10% higher than a year before. The generation of hydroelectric power in Jan. - Feb. 2001 was nearly 2.7 TWh, corresponding to 17% of the power demand in Finland. The output of hydroelectric power in Finland during the past 12 months was 14.7 TWh. The increase from the previous 12 months was 17%, corresponding to over 18% of the power demand in Finland. Wind power generation in Jan. - Feb. 2001 slightly exceeded 10 GWh, while in 2000 the corresponding output was 20 GWh. The degree of utilization of Finnish nuclear power plants in Jan. - Feb. 2001 was high. The output of these plants was 3.8 TWh, being about 1% less than in Jan. - Feb. 2000. The main cause for the
Optimizing Wind And Hydropower Generation Within Realistic Reservoir Operating Policy
Magee, T. M.; Clement, M. A.; Zagona, E. A.
2012-12-01
Previous studies have evaluated the benefits of utilizing the flexibility of hydropower systems to balance the variability and uncertainty of wind generation. However, previous hydropower and wind coordination studies have simplified non-power constraints on reservoir systems. For example, some studies have only included hydropower constraints on minimum and maximum storage volumes and minimum and maximum plant discharges. The methodology presented here utilizes the pre-emptive linear goal programming optimization solver in RiverWare to model hydropower operations with a set of prioritized policy constraints and objectives based on realistic policies that govern the operation of actual hydropower systems, including licensing constraints, environmental constraints, water management and power objectives. This approach accounts for the fact that not all policy constraints are of equal importance. For example, target environmental flow levels may not be satisfied if doing so would require violating license minimum or maximum storages (pool elevations), but environmental flow constraints will be satisfied before optimizing power generation. Additionally, this work not only models the economic value of energy from the combined hydropower and wind system, it also captures the economic value of ancillary services provided by the hydropower resources. It is recognized that the increased variability and uncertainty inherent in increased wind penetration levels requires an increase in ancillary services. In regions with liberalized markets for ancillary services, a significant portion of hydropower revenue can result from providing ancillary services. Thus, ancillary services should be accounted for when determining the total value of a hydropower system integrated with wind generation. This research shows that the end value of integrated hydropower and wind generation is dependent on a number of factors that can vary by location. Wind factors include wind penetration level
Caple, Jodi; Stephan, Carl N
2017-05-01
Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars, and their statistical exaggerations or extremes, retain the high-resolution detail of the original photographic dataset, making them ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.
A realistic way for graduating from nuclear power generation
International Nuclear Information System (INIS)
Kikkawa, Takeo
2012-01-01
After the Fukushima Daiichi Nuclear Power Plant accident, a fundamental reform of Japanese energy policy was under way. As for the reform of the future power generation share, the nuclear power share should be decided by progress in three independent elements: (1) extension of power generation using renewable energy, (2) reduction of power usage through electricity saving, and (3) technical innovation toward zero-emission coal-fired thermal power. In 2030, the nuclear power share would still remain about 20%, obtained by this 'subtraction', but in the long run nuclear power would be shut down, judging from the difficulties in solving the back-end problems of spent fuel disposal. (T. Tanaka)
Realistic generation of natural phenomena based on video synthesis
Wang, Changbo; Quan, Hongyan; Li, Chenhui; Xiao, Zhao; Chen, Xiao; Li, Peng; Shen, Liuwei
2009-10-01
Research on the generation of natural phenomena has many applications in movie special effects, battlefield simulation, virtual reality, etc. Based on video synthesis techniques, a new approach is proposed for the synthesis of natural phenomena, including flowing water and fire flames. From fire and flow video, seamless video of arbitrary length is generated. Then, the interaction between wind and the fire flame is achieved through the skeleton of the flame. Later, the flow is also synthesized by extending the video textures using an edge resampling method. Finally, we can integrate the synthesized natural phenomena into a virtual scene.
StackGAN++: Realistic Image Synthesis with Stacked Generative Adversarial Networks
Zhang, Han; Xu, Tao; Li, Hongsheng; Zhang, Shaoting; Wang, Xiaogang; Huang, Xiaolei; Metaxas, Dimitris
2017-01-01
Although Generative Adversarial Networks (GANs) have shown remarkable success in various tasks, they still face challenges in generating high quality images. In this paper, we propose Stacked Generative Adversarial Networks (StackGAN) aiming at generating high-resolution photo-realistic images. First, we propose a two-stage generative adversarial network architecture, StackGAN-v1, for text-to-image synthesis. The Stage-I GAN sketches the primitive shape and colors of the object based on given...
A Low-cost System for Generating Near-realistic Virtual Actors
Afifi, Mahmoud; Hussain, Khaled F.; Ibrahim, Hosny M.; Omar, Nagwa M.
2015-06-01
Generating virtual actors is one of the most challenging fields in computer graphics. The reconstruction of realistic virtual actors has received attention from both academic research and the film industry in order to generate human-like virtual actors. Many movies have been acted by human-like virtual actors, where the audience cannot distinguish between real and virtual actors. The synthesis of realistic virtual actors is considered a complex process. Many techniques are used to generate a realistic virtual actor; however, they usually require expensive hardware equipment. In this paper, a low-cost system that generates near-realistic virtual actors is presented. The facial features of the real actor are blended with a virtual head that is attached to the actor's body. Compared with other techniques that generate virtual actors, the proposed system is a low-cost system that requires only one camera to record the scene, without any expensive hardware equipment. The results show that the system generates good near-realistic virtual actors that can be used in many applications.
Complete methodology on generating realistic wind speed profiles based on measurements
DEFF Research Database (Denmark)
Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan
2012-01-01
, wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided into a low-frequency component (describing long term variations...
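A sketch of the low-frequency-plus-turbulence decomposition the abstract describes: a slow synoptic/diurnal component plus an AR(1) high-frequency term. All amplitudes and time constants below are illustrative, not the paper's measurement-fitted values.

```python
import math, random

def wind_profile(hours, seed=0):
    """Hourly wind speed as a slow (synoptic + diurnal) component plus
    an AR(1) turbulence term; all parameters are illustrative."""
    rng = random.Random(seed)
    speeds, turb = [], 0.0
    for h in range(hours):
        low = (8.0                                       # mean level, m/s
               + 2.0 * math.sin(2 * math.pi * h / 96.0)  # ~4-day synoptic swing
               + 1.0 * math.sin(2 * math.pi * h / 24.0)) # diurnal cycle
        turb = 0.8 * turb + rng.gauss(0.0, 0.5)          # high-frequency part
        speeds.append(max(0.0, low + turb))              # wind speed >= 0
    return speeds

profile = wind_profile(24 * 30)   # one month of hourly values
```

In a measurement-based method, the low-frequency curve and the turbulence statistics would each be fitted to recorded wind data instead of being hand-picked as here.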
Using Microsoft Excel to Generate Usage Statistics
Spellman, Rosemary
2011-01-01
At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…
First-Generation Transgenic Plants and Statistics
Nap, Jan-Peter; Keizer, Paul; Jansen, Ritsert
1993-01-01
The statistical analyses of populations of first-generation transgenic plants are commonly based on mean and variance and generally require a test of normality. Since in many cases the assumptions of normality are not met, analyses can result in erroneous conclusions. Transformation of data to
Statistical analysis of next generation sequencing data
Nettleton, Dan
2014-01-01
Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...
DEFF Research Database (Denmark)
Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob
2014-01-01
In order to provide a vehicle fleet that realistically represents the predicted Electric Vehicle (EV) penetration for the future, a model is required that mimics people's driving behaviour rather than simply playing back collected data. When the focus is broadened from a traditional user...... scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs, based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where...
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
International Nuclear Information System (INIS)
Frepoli, Cesare; Oriani, Luca
2006-01-01
In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as the 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers, which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
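The 124-run figure follows from distribution-free order statistics: find the smallest n such that the top-ranked values jointly bound the 95th percentile of each of the three outputs with 95% confidence. The sketch below uses the usual one-sided tolerance-limit formula; for a single output it reduces to Wilks' criterion 1 - gamma^n >= beta, which gives 59 runs.

```python
from math import comb

def min_runs(gamma=0.95, beta=0.95, n_outputs=1, n_max=1000):
    """Smallest number of code runs such that the n_outputs highest
    ranked values bound the gamma-quantile of every output with
    confidence beta (one-sided, distribution-free order statistics)."""
    for n in range(n_outputs, n_max):
        conf = sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
                   for j in range(0, n - n_outputs + 1))
        if conf >= beta:
            return n
    raise ValueError("n_max too small")

print(min_runs())             # 59  (Wilks' classic single-output result)
print(min_runs(n_outputs=3))  # 124 (PCT, LMO and CWO bounded jointly)
```

The industry debate the abstract alludes to is largely about whether the multi-output count (124) or the single-output count (59) is the appropriate requirement.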
Model-generated air quality statistics for application in vegetation response models in Alberta
International Nuclear Information System (INIS)
McVehil, G.E.; Nosal, M.
1990-01-01
To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models.
Vermeeren, Günter; Joseph, Wout; Martens, Luc
2013-04-01
Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.
Kolmogorov complexity, pseudorandom generators and statistical models testing
Czech Academy of Sciences Publication Activity Database
Šindelář, Jan; Boček, Pavel
2002-01-01
Vol. 38, No. 6 (2002), pp. 747-759. ISSN 0023-5954. R&D Projects: GA ČR GA102/99/1564. Institutional research plan: CEZ:AV0Z1075907. Keywords: Kolmogorov complexity; pseudorandom generators; statistical models testing. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.341, year: 2002
Realistic analysis of steam generator tube rupture accident in Angra-1 reactor
International Nuclear Information System (INIS)
Fontes, S.W.F.
1989-01-01
This paper presents the analysis of different scenarios for a Steam Generator Tube Rupture (SGTR) accident in the Angra-1 NPP. The results and conclusions will be used as support in the preparation of the emergency response programs for the plant. For the analysis, an SGTR simulation was performed with the RETRAN-02 code. The results indicated that core integrity and the plant itself will not be affected by small ruptures in SG tubes. For large ruptures, the analysis demonstrated that the accident may have harmful consequences if the operator does not act effectively from the initial moments of the accident. (author) [pt
Toward the realistic three-generation model in the (2,0) heterotic string compactification
International Nuclear Information System (INIS)
Asatryan, H.M.; Murayama, A.
1992-01-01
In this paper, the three-generation models with SUSY SO(10) or SU(5) GUTs derived from the (2,0) compactification of the E8 x E'8 heterotic string, the massless matter field spectra at the GUT scale M_X, and the breaking directions of the GUT symmetries are discussed. A pseudo-left-right symmetric Pati-Salam model is naturally deduced in the SUSY SO(10) GUT and shown to have an interesting property, M_X ≅ M_Pl, M_R ≅ 10^10 GeV and M_S (the scale of superpartner masses) ≅ 10^4 GeV, as a result of a renormalization group equation analysis using the new precise LEP data.
Primordial statistical anisotropy generated at the end of inflation
International Nuclear Information System (INIS)
Yokoyama, Shuichiro; Soda, Jiro
2008-01-01
We present a new mechanism for generating primordial statistical anisotropy of curvature perturbations. We introduce a vector field which has a non-minimal kinetic term and couples with a waterfall field in a hybrid inflation model. In such a system, the vector field gives fluctuations of the end of inflation and hence induces a subcomponent of curvature perturbations. Since the vector has a preferred direction, the statistical anisotropy could appear in the fluctuations. We present the explicit formula for the statistical anisotropy in the primordial power spectrum and the bispectrum of curvature perturbations. Interestingly, there is the possibility that the statistical anisotropy does not appear in the power spectrum but does appear in the bispectrum. We also find that the statistical anisotropy provides the shape dependence to the bispectrum
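For orientation, direction-dependent spectra of this kind are commonly written in the Ackerman-Carroll-Wise parametrization; this generic form is background knowledge, not necessarily the paper's exact formula:

```latex
P_\zeta(\mathbf{k}) \;=\; P_{\mathrm{iso}}(k)\left[\,1 + g_* \left(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}}\right)^{2}\right],
```

where n̂ is the preferred direction singled out by the vector field and g_* sets the amplitude of the statistical anisotropy. The abstract's observation is that g_*-type terms can cancel at this (power spectrum) level while a directional signature survives in the bispectrum.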
Primordial statistical anisotropy generated at the end of inflation
Energy Technology Data Exchange (ETDEWEB)
Yokoyama, Shuichiro [Department of Physics and Astrophysics, Nagoya University, Aichi 464-8602 (Japan); Soda, Jiro, E-mail: shu@a.phys.nagoya-u.ac.jp, E-mail: jiro@tap.scphys.kyoto-u.ac.jp [Department of Physics, Kyoto University, Kyoto 606-8501 (Japan)
2008-08-15
We present a new mechanism for generating primordial statistical anisotropy of curvature perturbations. We introduce a vector field which has a non-minimal kinetic term and couples with a waterfall field in a hybrid inflation model. In such a system, the vector field gives fluctuations of the end of inflation and hence induces a subcomponent of curvature perturbations. Since the vector has a preferred direction, the statistical anisotropy could appear in the fluctuations. We present the explicit formula for the statistical anisotropy in the primordial power spectrum and the bispectrum of curvature perturbations. Interestingly, there is the possibility that the statistical anisotropy does not appear in the power spectrum but does appear in the bispectrum. We also find that the statistical anisotropy provides the shape dependence to the bispectrum.
International Nuclear Information System (INIS)
Upadhyay, Ranjit Kumar; Kumari, Nitu; Rai, Vikas
2009-01-01
We show that wave of chaos (WOC) can generate two-dimensional time-independent spatial patterns which can be a potential candidate for understanding planktonic patchiness observed in marine environments. These spatio-temporal patterns were obtained in computer simulations of a minimal model of phytoplankton-zooplankton dynamics driven by forces of diffusion. We also attempt to figure out the average lifetimes of these non-linear non-equilibrium patterns. These spatial patterns serve as a realistic model for patchiness found in aquatic systems (e.g., marine and oceanic). Additionally, spatio-temporal chaos produced by bi-directional WOCs is robust to changes in key parameters of the system; e.g., intra-specific competition among individuals of phytoplankton and the rate of fish predation. The ideas contained in the present paper may find applications in diverse fields of human endeavor.
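The diffusion-driven plankton dynamics described above can be sketched in one dimension. This is a minimal illustrative sketch with hypothetical parameters and a generic Holling type-II grazing term, not the authors' actual model:

```python
import numpy as np

# Minimal 1D sketch of diffusion-driven phytoplankton (P) / zooplankton (Z)
# dynamics. All parameters are hypothetical choices for illustration only.
n, dx, dt = 100, 1.0, 0.01
rng = np.random.default_rng(0)
P = 0.5 + 0.01 * rng.standard_normal(n)   # small random perturbation
Z = 0.3 + 0.01 * rng.standard_normal(n)
Dp, Dz = 0.05, 0.5                        # diffusion coefficients

def laplacian(u):
    # second difference with periodic boundary conditions
    return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

for _ in range(5000):
    grazing = P * Z / (1.0 + P)           # Holling type-II functional response
    dP = P * (1 - P) - grazing + Dp * laplacian(P)
    dZ = 0.5 * grazing - 0.2 * Z + Dz * laplacian(Z)
    P, Z = P + dt * dP, Z + dt * dZ

# inspect the remaining spatial variability of both fields
print(float(P.std()), float(Z.std()))
```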
Narita, Akihiro; Ohkubo, Masaki; Murao, Kohei; Matsumoto, Toru; Wada, Shinichi
2017-10-01
The aim of this feasibility study using phantoms was to propose a novel method for obtaining computer-generated realistic virtual nodules in lung computed tomography (CT). In the proposed methodology, pulmonary nodule images obtained with a CT scanner are deconvolved with the point spread function (PSF) in the scan plane and the slice sensitivity profile (SSP) measured for the scanner; the resultant images are referred to as nodule-like object functions. Next, by convolving the nodule-like object function with the PSF and SSP of another (target) scanner, the virtual nodule can be generated so that it has the characteristics of the spatial resolution of the target scanner. To validate the methodology, the authors applied the method to physical nodules of 5-, 7- and 10-mm diameter (uniform spheres) included in a commercial CT test phantom. The nodule-like object functions were calculated from the sphere images obtained with two scanners (Scanner A and Scanner B); these functions were referred to as nodule-like object functions A and B, respectively. From these, virtual nodules were generated based on the spatial resolution of another scanner (Scanner C). By investigating the agreement of the virtual nodules generated from the nodule-like object functions A and B, the equivalence of the nodule-like object functions obtained from different scanners could be assessed. In addition, these virtual nodules were compared with the real (true) sphere images obtained with Scanner C. As a practical validation, five types of laboratory-made physical nodules with various complicated shapes and heterogeneous densities, similar to real lesions, were used. The nodule-like object functions were calculated from the images of these laboratory-made nodules obtained with Scanner A. From them, virtual nodules were generated based on the spatial resolution of Scanner C and compared with the real images of laboratory-made nodules obtained with Scanner C. Good agreement of the virtual nodules generated from
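The deconvolve-then-reconvolve pipeline above can be sketched in one dimension with Gaussian PSFs. The PSF widths and the Wiener regularisation constant are assumptions for illustration, not values from the study:

```python
import numpy as np

# 1D sketch: deconvolve a "measured" profile by scanner A's PSF to recover
# an object function, then convolve with scanner C's PSF to synthesise a
# virtual measurement. PSF sigmas and epsilon are hypothetical.
n = 256
x = np.arange(n) - n // 2

def gaussian_psf(sigma):
    p = np.exp(-0.5 * (x / sigma) ** 2)
    return p / p.sum()

true_object = (np.abs(x) < 20).astype(float)   # ideal 1D "nodule"
psf_a, psf_c = gaussian_psf(3.0), gaussian_psf(5.0)

def convolve(u, psf):
    # circular convolution via FFT; ifftshift centres the PSF at index 0
    return np.real(np.fft.ifft(np.fft.fft(u) * np.fft.fft(np.fft.ifftshift(psf))))

measured_a = convolve(true_object, psf_a)      # image from scanner A

# Wiener-regularised deconvolution (epsilon avoids dividing by ~0)
H = np.fft.fft(np.fft.ifftshift(psf_a))
eps = 1e-3
object_fn = np.real(np.fft.ifft(np.fft.fft(measured_a) * np.conj(H)
                                / (np.abs(H) ** 2 + eps)))

virtual_c = convolve(object_fn, psf_c)         # virtual nodule for scanner C
direct_c = convolve(true_object, psf_c)        # ground truth on scanner C
print(np.max(np.abs(virtual_c - direct_c)))    # residual should be small
```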
Energy Technology Data Exchange (ETDEWEB)
Brenner, David J.
2009-07-21
The 16th ASA Conference on Radiation and Health, held June 27-30, 2004 in Beaver Creek, CO, offered a unique forum for discussing research related to the effects of radiation exposures on human health in a multidisciplinary setting. The Conference furnishes investigators in health-related disciplines the opportunity to learn about new quantitative approaches to their problems and furnishes statisticians the opportunity to learn about new applications for their discipline. The Conference was attended by about 60 scientists including statisticians, epidemiologists, biologists and physicists interested in radiation research. For the first time, ten recipients of Young Investigator Awards participated in the conference. The Conference began with a debate on the question: “Do radiation doses below 1 cGy increase cancer risks?” The keynote speaker was Dr. Martin Lavin, who gave a banquet presentation on the timely topic “How important is ATM?” The focus of the 2004 Conference on Radiation and Health was Radiation in Realistic Environments: Interactions Between Radiation and Other Risk Modifiers. The sessions of the conference included: Radiation, Smoking, and Lung Cancer; Interactions of Radiation with Genetic Factors: ATM; Radiation, Genetics, and Epigenetics; and Radiotherapeutic Interactions. The Conference on Radiation and Health is held bi-annually, and participants are looking forward to the 17th conference, to be held in 2006.
International Nuclear Information System (INIS)
Wallin, K.; Voskamp, R.; Schmibauer, J.; Ostermeyer, H.; Nagel, G.
2011-01-01
The cost of steam generator inspections in nuclear power plants is high. A new quantitative assessment methodology for the accumulation of flaws due to stochastic causes like fretting has been developed for cases where limited inspection data is available. Additionally, a new quantitative assessment methodology has been developed for the accumulation of environment-related flaws, caused e.g. by corrosion in steam generator tubes. The method, which combines deterministic information regarding flaw initiation and growth with stochastic elements connected to environmental aspects, requires only knowledge of the experimental flaw accumulation history. Combining both flaw types, the method provides a complete description of the flaw accumulation, and there are several possible uses for it. The method can be used to evaluate the total life expectancy of the steam generator, and simple statistically defined plugging criteria can be established based on flaw behaviour. This way the inspection interval and inspection coverage can be optimized with respect to allowable flaws, and the method can recognize flaw-type subsets requiring more frequent inspection intervals. The method can also be used to develop statistically realistic safety factors accounting for uncertainties in inspection flaw sizing and detection. The statistical assessment method has been shown to be robust and insensitive to different assessments of plugged tubes. Because the procedure is re-calibrated after each inspection, it reacts effectively to possible changes in the steam generator environment. Validation of the assessment method is provided for real steam generators, both in the case of stochastic damage as well as environment-related flaws. (authors)
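The stochastic flaw-accumulation idea can be sketched as a toy simulation: flaws initiate as a Poisson process in time and then grow deterministically, so the simulated population can be compared against a plugging criterion. All rates and limits below are illustrative assumptions, not plant data:

```python
import numpy as np

# Toy flaw-accumulation sketch (illustrative parameters only).
rng = np.random.default_rng(9)
years, rate, growth = 30, 2.0, 0.03       # flaws/year, depth fraction/year
counts = rng.poisson(rate, years)         # flaws initiated in each year
births = np.repeat(np.arange(years), counts)  # initiation year of each flaw
depths = growth * (years - births)        # deterministic growth since birth
plugging_limit = 0.4                      # assumed fraction of wall thickness
n_to_plug = int(np.sum(depths > plugging_limit))
print(len(depths), n_to_plug)
```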
Quantum Statistical Testing of a Quantum Random Number Generator
Energy Technology Data Exchange (ETDEWEB)
Humble, Travis S [ORNL
2014-01-01
The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.
Automated robust generation of compact 3D statistical shape models
Vrtovec, Tomaz; Likar, Bostjan; Tomazevic, Dejan; Pernus, Franjo
2004-05-01
Ascertaining the detailed shape and spatial arrangement of anatomical structures is important not only within diagnostic settings but also in the areas of planning, simulation, intraoperative navigation, and tracking of pathology. Robust, accurate and efficient automated segmentation of anatomical structures is difficult because of their complexity and inter-patient variability. Furthermore, the position of the patient during image acquisition, the imaging device and protocol, image resolution, and other factors induce additional variations in shape and appearance. Statistical shape models (SSMs) have proven quite successful in capturing structural variability. A possible approach to obtain a 3D SSM is to extract reference voxels by precisely segmenting the structure in one, reference image. The corresponding voxels in other images are determined by registering the reference image to each other image. The SSM obtained in this way describes statistically plausible shape variations over the given population as well as variations due to imperfect registration. In this paper, we present a completely automated method that significantly reduces shape variations induced by imperfect registration, thus allowing a more accurate description of variations. At each iteration, the derived SSM is used for coarse registration, which is further improved by describing finer variations of the structure. The method was tested on 64 lumbar spinal column CT scans, from which 23, 38, 45, 46 and 42 volumes of interest containing vertebra L1, L2, L3, L4 and L5, respectively, were extracted. Separate SSMs were generated for each vertebra. The results show that the method is capable of reducing the variations induced by registration errors.
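The core of a point-based statistical shape model is a PCA over vectors of corresponding landmark coordinates. A minimal sketch on synthetic data (ellipses with varying radii, not the vertebra CT data from the paper):

```python
import numpy as np

# Sketch of a statistical shape model (SSM): each training shape is a
# vector of corresponding landmarks; PCA captures the main variations.
# The synthetic "population" here is a hypothetical stand-in.
rng = np.random.default_rng(1)
n_shapes, n_points = 40, 30
t = np.linspace(0, 2 * np.pi, n_points, endpoint=False)

shapes = []
for _ in range(n_shapes):
    a = 1.0 + 0.2 * rng.standard_normal()   # varying semi-axes
    b = 0.5 + 0.1 * rng.standard_normal()
    shapes.append(np.concatenate([a * np.cos(t), b * np.sin(t)]))
X = np.array(shapes)                        # (n_shapes, 2 * n_points)

mean_shape = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# a new plausible shape = mean + weighted sum of the leading modes
new_shape = mean_shape + 2.0 * s[0] / np.sqrt(n_shapes - 1) * Vt[0]
print(explained[:2])                        # two modes dominate by design
```

Since the synthetic population has exactly two degrees of freedom, the first two modes capture essentially all variance; real anatomical data needs more modes.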
Sherman, Christopher Scott
Naturally occurring geologic heterogeneity is an important, but often overlooked, aspect of seismic wave propagation. This dissertation presents a strategy for modeling the effects of heterogeneity using a combination of geostatistics and Finite Difference simulation. In the first chapter, I discuss my motivations for studying geologic heterogeneity and seismic wave propagation. Models based upon fractal statistics are powerful tools in geophysics for modeling heterogeneity. The important features of these fractal models are illustrated using borehole log data from an oil well and geomorphological observations from a site in Death Valley, California. A large part of the computational work presented in this dissertation was completed using the Finite Difference Code E3D. I discuss the Python-based user interface for E3D and the computational strategies for working with heterogeneous models developed over the course of this research. The second chapter explores a phenomenon observed for wave propagation in heterogeneous media - the generation of unexpected shear wave phases in the near-source region. In spite of their popularity amongst seismic researchers, approximate methods for modeling wave propagation in these media, such as the Born and Rytov methods or Radiative Transfer Theory, are incapable of explaining these shear waves. This is primarily due to these methods' assumptions regarding the coupling of near-source terms with the heterogeneities and mode conversion. To determine the source of these shear waves, I generate a suite of 3D synthetic heterogeneous fractal geologic models and use E3D to simulate the wave propagation for a vertical point force on the surface of the models. I also present a methodology for calculating the effective source radiation patterns from the models. The numerical results show that, due to a combination of mode conversion and coupling with near-source heterogeneity, shear wave energy on the order of 10% of the
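A common way to build the fractal heterogeneity models mentioned above is to shape white noise with a power-law spectrum in the Fourier domain. A 1D sketch, with a spectral exponent chosen for illustration rather than taken from the dissertation:

```python
import numpy as np

# Generate a 1D random medium with a power-law (fractal-like) spectrum by
# imposing |FT| ~ k^(-beta/2), i.e. PSD ~ k^(-beta), on random phases.
# beta = 1.8 is a hypothetical choice.
rng = np.random.default_rng(2)
n, beta = 4096, 1.8
k = np.fft.rfftfreq(n)
amplitude = np.zeros_like(k)
amplitude[1:] = k[1:] ** (-beta / 2)          # leave the DC term at zero
phase = rng.uniform(0, 2 * np.pi, len(k))
spectrum = amplitude * np.exp(1j * phase)
field = np.fft.irfft(spectrum, n)
field = (field - field.mean()) / field.std()  # zero mean, unit variance
print(field.shape)
```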
Statistical Analysis of Wind Speed for Electrical Power Generation
African Journals Online (AJOL)
HOD
sites are suitable for the generation of electrical energy. Also, the results ... Nigerian Journal of Technology (NIJOTECH). Vol. 36, No. ... parameter in the wind-power generation system. ..... [3] A. Zaharim, A. M Razali, R. Z Abidin, and K Sopian,.
Romanteau, T; Collard, Caroline; Debraine, A; Decotigny, D; Dobrzynski, L; Karar, A; Regnault, N
2005-01-01
The CMS [1] electromagnetic calorimeter (ECAL) [2] uses 3 132 Front-End boards (FE) performing both trigger and data readout functions. Prior to their integration at CERN, the FE boards have to be validated by dedicated test bench systems. The final one, called "XFEST" (eXtended Front-End System Test) and for which the present developments have been performed, is located at Laboratoire Leprince-Ringuet. In this contribution, a solution is described to efficiently test a large set of complex electronics boards characterized by a large number of input ports and a high throughput data rate. To perform it, an algorithm to simulate the Very Front End signals has been emulated. The project firmwares use VHDL embedded into XILINX Field Programmable Gate Array circuits (FPGA). This contribution describes the solutions developed in order to create a realistic real-time digital input pattern emulator working at 40 MHz. The implementation of a real-time comparison of the FE output streams as well as the test bench wil...
Investigating the statistical properties of user-generated documents
Inches, Giacomo; Carman, Mark J.; Crestani, Fabio
2011-01-01
The importance of the Internet as a communication medium is reflected in the large amount of documents being generated every day by users of the different services that take place online. In this work we aim at analyzing the properties of these online user-generated documents for some of the established services over the Internet (Kongregate, Twitter, Myspace and Slashdot) and comparing them with a consolidated collection of standard information retrieval documents (from the Wall Street...
Investigating the Statistical Properties of User-Generated Documents
Inches Giacomo; Carman Mark James
2011-01-01
The importance of the Internet as a communication medium is reflected in the large amount of documents being generated every day by users of the different services that take place online. In this work we aim at analyzing the properties of these online user-generated documents for some of the established services over the Internet (Kongregate, Twitter, Myspace and Slashdot) and comparing them with a consolidated collection of standard information retrieval documents (from the Wall Street Journal...
Physical and statistical models for steam generator clogging diagnosis
Girard, Sylvain
2014-01-01
Clogging of steam generators in nuclear power plants is a highly sensitive issue in terms of performance and safety and this book proposes a completely novel methodology for diagnosing this phenomenon. It demonstrates real-life industrial applications of this approach to French steam generators and applies the approach to operational data gathered from French nuclear power plants. The book presents a detailed review of in situ diagnosis techniques and assesses existing methodologies for clogging diagnosis, whilst examining their limitations. It also addresses numerical modelling of the dynamic
Statistical Approaches for Next-Generation Sequencing Data
Qiao, Dandi
2012-01-01
During the last two decades, genotyping technology has advanced rapidly, which enabled the tremendous success of genome-wide association studies (GWAS) in the search for disease susceptibility loci (DSLs). However, only a small fraction of the overall predicted heritability can be explained by the DSLs discovered. One possible explanation for this "missing heritability" phenomenon is that many causal variants are rare. The recent development of high-throughput next-generation sequencing (NGS) ...
Steam generators clogging diagnosis through physical and statistical modelling
International Nuclear Information System (INIS)
Girard, S.
2012-01-01
Steam generators are massive heat exchangers feeding the turbines of pressurised water nuclear power plants. Internal parts of steam generators foul up with iron oxides, which gradually close some of the holes intended for the passage of the fluid. This phenomenon, called clogging, causes safety issues, and means to assess it are needed to optimise the maintenance strategy. The approach investigated in this thesis is the analysis of steam generator dynamic behaviour during power transients with a one-dimensional physical model. Two improvements to the model have been implemented. One was taking into account flows orthogonal to the modelling axis; the other was introducing a slip between phases, accounting for the velocity difference between liquid water and steam. These two elements increased the model's degrees of freedom and improved the agreement of the simulation with plant data. A new calibration and validation methodology has been proposed to assess the robustness of the model. The initial inverse problem was ill-posed: different clogging spatial configurations can produce identical responses. The relative importance of clogging, depending on its localisation, has been estimated by sensitivity analysis with the Sobol' method. The dimension of the model's functional output had previously been reduced by principal component analysis. Finally, the input dimension has been reduced by a technique called sliced inverse regression. Based on this new framework, a new diagnosis methodology, more robust and better understood than the existing one, has been proposed. (author)
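Variance-based sensitivity analysis of the kind mentioned above can be illustrated with a pick-freeze estimator of a first-order Sobol' index. The toy model below is a linear stand-in with a known analytic answer, not the steam generator model:

```python
import numpy as np

# Pick-freeze estimate of the first-order Sobol' index of X1 for the toy
# model Y = 2*X1 + X2 with independent unit-variance inputs; the exact
# index is Var(2*X1)/Var(Y) = 4/5.
rng = np.random.default_rng(10)
n = 200000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x2b = rng.standard_normal(n)              # fresh draw for the "frozen" pair

def model(a, b):
    return 2.0 * a + b

y = model(x1, x2)
y_frozen = model(x1, x2b)                 # same X1, resampled X2
s1 = np.cov(y, y_frozen)[0, 1] / np.var(y, ddof=1)
print(s1)                                 # close to 0.8
```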
Use of MCNP + GADRAS in Generating More Realistic Gamma-Ray Spectra for Plutonium and HEU Objects
International Nuclear Information System (INIS)
Rawool-Sullivan, Mohini; Mattingly, John; Mitchell, Dean
2012-01-01
The ability to accurately simulate high-resolution gamma spectra from materials that emit both neutrons and gammas is very important to the analysis of special nuclear materials (SNM), e.g., uranium and plutonium. One approach under consideration has been to combine MCNP and GADRAS. This approach is expected to generate more accurate gamma ray spectra for complex three-dimensional geometries than can be obtained from one-dimensional deterministic transport simulations (e.g., ONEDANT). This presentation describes application of combining MCNP and GADRAS in simulating plutonium and uranium spectra.
Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach
DEFF Research Database (Denmark)
Wan, Can; Lin, Jin; Song, Yonghua
2017-01-01
This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through the numerical studies on PV data from Denmark.
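Prediction intervals built from conditional quantiles rest on the pinball (quantile) loss. A simplified sketch on synthetic forecast errors: for a constant model, minimising the pinball loss reduces to the empirical quantile, which directly gives the interval bounds. The data and nominal levels are assumptions, not the letter's method in full:

```python
import numpy as np

# Quantile-based 90% prediction interval on synthetic, skewed forecast
# errors (illustrative stand-in for PV forecast residuals).
rng = np.random.default_rng(3)
errors = rng.gamma(2.0, 1.0, 2000) - 2.0

def pinball(q, y, tau):
    # mean pinball loss of a constant quantile estimate q at level tau
    r = y - q
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

lo, hi = np.quantile(errors, [0.05, 0.95])   # interval bounds
coverage = np.mean((errors >= lo) & (errors <= hi))
print(lo, hi, coverage)                      # empirical coverage near 0.90
```

The empirical tau-quantile minimises the pinball loss, so `pinball(lo, errors, 0.05)` is no larger than the loss at any shifted value.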
Margin improvement initiatives: realistic approaches
Energy Technology Data Exchange (ETDEWEB)
Chan, P.K.; Paquette, S. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Cunning, T.A. [Department of National Defence, Ottawa, ON (Canada); French, C.; Bonin, H.W. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Pandey, M. [Univ. of Waterloo, Waterloo, ON (Canada); Murchie, M. [Cameco Fuel Manufacturing, Port Hope, ON (Canada)
2014-07-01
With reactor core aging, safety margins are particularly tight. Two realistic and practical approaches are proposed here to recover margins. The first project is related to the use of a small amount of neutron absorbers in CANDU Natural Uranium (NU) fuel bundles. Preliminary results indicate that the fuelling transient and subsequent reactivity peak can be lowered to improve the reactor's operating margins, with minimal impact on burnup when less than 1000 mg of absorbers is added to a fuel bundle. The second project involves the statistical analysis of fuel manufacturing data to demonstrate safety margins. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to generate input for ELESTRES and ELOCA. It is found that the fuel response distributions are far below industrial failure limits, implying that margin exists in the current fuel design. (author)
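The statistical step of the second project, fitting distributions to manufacturing data and sampling them as Monte Carlo inputs for downstream codes, can be sketched as follows. The numbers are synthetic, not Cameco data, and no actual ELESTRES/ELOCA interface is shown:

```python
import numpy as np

# Fit a normal distribution to a manufacturing parameter and draw Monte
# Carlo samples as inputs for a downstream fuel-performance code.
# All values here are hypothetical.
rng = np.random.default_rng(4)
pellet_density = rng.normal(10.6, 0.05, 500)   # pretend measurements, g/cm^3

mu = pellet_density.mean()                     # fitted mean
sigma = pellet_density.std(ddof=1)             # fitted standard deviation
samples = rng.normal(mu, sigma, 10000)         # Monte Carlo input samples

# a simple margin statement: fraction of sampled inputs beyond a limit
limit = 10.8                                   # hypothetical spec limit
exceedance = np.mean(samples > limit)
print(mu, sigma, exceedance)
```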
Statistical evaluation of PACSTAT random number generation capabilities
Energy Technology Data Exchange (ETDEWEB)
Piepel, G.F.; Toland, M.R.; Harty, H.; Budden, M.J.; Bartley, C.L.
1988-05-01
This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
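A minimal example of the kind of statistical check applied when verifying a random number generator is a chi-square goodness-of-fit test for uniformity. This is illustrative only; PACSTAT's actual test suite is not reproduced here:

```python
import numpy as np

# Chi-square uniformity test: bin uniform samples and compare observed
# bin counts against the expected flat profile.
rng = np.random.default_rng(5)
samples = rng.random(100000)
n_bins = 20
observed, _ = np.histogram(samples, bins=n_bins, range=(0.0, 1.0))
expected = len(samples) / n_bins
chi2 = np.sum((observed - expected) ** 2 / expected)

# For 19 degrees of freedom the 99% critical value is about 36.2; a
# healthy uniform generator should normally fall well below it.
print(chi2)
```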
A methodology to generate statistically dependent wind speed scenarios
Energy Technology Data Exchange (ETDEWEB)
Morales, J.M.; Conejo, A.J. [Department of Electrical Engineering, Univ. Castilla - La Mancha, Campus Universitario s/n, 13071 Ciudad Real (Spain); Minguez, R. [Environmental Hydraulics Institute ' ' IH Cantabria' ' , Univ. Cantabria, Avenida de los Castros s/n, 39005 Santander (Spain)
2010-03-15
Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each one of these scenarios embodies time dependencies and is spatially dependent of the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The methodology proposed is accurate in reproducing wind speed historical series as well as computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn. (author)
A methodology to generate statistically dependent wind speed scenarios
International Nuclear Information System (INIS)
Morales, J.M.; Minguez, R.; Conejo, A.J.
2010-01-01
Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each one of these scenarios embodies time dependencies and is spatially dependent of the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The methodology proposed is accurate in reproducing wind speed historical series as well as computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn.
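The spatial-dependence part of such scenario generation is often handled with a Gaussian copula: draw correlated Gaussians via a Cholesky factor, then map each margin to a wind speed distribution. A sketch with an assumed site-to-site correlation matrix and assumed Weibull margins, not the paper's calibrated processes:

```python
import numpy as np
from math import erf

# Correlated wind speed scenarios for three hypothetical sites.
rng = np.random.default_rng(6)
corr = np.array([[1.0, 0.7, 0.4],
                 [0.7, 1.0, 0.5],
                 [0.4, 0.5, 1.0]])          # assumed spatial correlation
L = np.linalg.cholesky(corr)
n_scen = 20000
z = rng.standard_normal((n_scen, 3)) @ L.T  # correlated standard normals

# Gaussian copula: push through the normal CDF, then a Weibull quantile
u = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
k, lam = 2.0, 8.0                           # assumed Weibull shape/scale (m/s)
wind = lam * (-np.log(1.0 - u)) ** (1.0 / k)
print(np.corrcoef(wind, rowvar=False)[0, 1])  # dependence carried through
```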
International Nuclear Information System (INIS)
Miller, A.I.; Duffey, R.B.
2005-01-01
Can electricity from high-capacity nuclear reactors be blended with the variable output of wind turbines to produce electrolytic hydrogen competitively? To be competitive with alternative sources, hydrogen produced by conventional electrolysis requires low-cost electricity (likely <2.5 cents US/kW.h). One approach is to operate interruptibly, allowing an installation to sell electricity when the grid price is high and to make hydrogen when it is low. Our previous studies show that this could be cost-competitive using nuclear power generation producing electricity at around 3 cents US/kW.h. Although similar unit costs are projected for wind-generated electricity, idleness of the electrolysis facility due to the variability of wind-generated electricity imposes a significant cost penalty. This paper reports on ongoing work on the economics of blending electricity from nuclear and wind sources by using wind-generated power, when available, to augment the current through electrolysis equipment that is primarily nuclear-powered - a concept we call NuWind. A voltage penalty accompanies the higher current. A 10% increase in capital cost for electrolysis equipment to enable it to accommodate the higher rate of hydrogen generation is still substantially cheaper than the capital cost of wind-dedicated electrolysis. Real-time data for electricity costs have been combined with real-time wind variability. The variability in wind fields between sites was accommodated by assigning average wind speeds that produced an average electricity generation from wind of between 32 and 42% of peak capacity, which is typical of the expectations for superior wind-generation sites. (author)
Directory of Open Access Journals (Sweden)
Ferdinand C. Mukumbang
2017-05-01
Full Text Available Abstract Background Poor retention in care and non-adherence to antiretroviral therapy (ART) continue to undermine the success of HIV treatment and care programmes across the world. There is a growing recognition that multifaceted interventions – application of two or more adherence-enhancing strategies – may be useful to improve ART adherence and retention in care among people living with HIV/AIDS. Empirical evidence shows that multifaceted interventions produce better results than interventions based on a singular perspective. Nevertheless, the bundle of mechanisms by which multifaceted interventions promote ART adherence are poorly understood. In this paper, we reviewed theories on ART adherence to identify candidate/potential mechanisms by which the adherence club intervention works. Methods We searched five electronic databases (PubMed, EBSCOhost, CINAHL, PsycARTICLES and Google Scholar) using Medical Subject Headings (MeSH) terms. A manual search of citations from the reference list of the studies identified from the electronic databases was also done. Twenty-six articles that adopted a theory-guided inquiry of antiretroviral adherence behaviour were included for the review. Eleven cognitive and behavioural theories underpinning these studies were explored. We examined each theory for possible ‘generative causality’ using the realist evaluation heuristic (Context-Mechanism-Outcome) configuration, then we selected candidate mechanisms thematically. Results We identified three major sets of theories: Information-Motivation-Behaviour, Social Action Theory and Health Behaviour Model, which explain ART adherence. Although they show potential in explaining adherence behaviours, they fall short in explaining exactly why and how the various elements they outline combine to explain positive or negative outcomes. Candidate mechanisms identified were motivation, self-efficacy, perceived social support, empowerment, perceived threat, perceived
Mukumbang, Ferdinand C; Van Belle, Sara; Marchal, Bruno; van Wyk, Brian
2017-05-04
Poor retention in care and non-adherence to antiretroviral therapy (ART) continue to undermine the success of HIV treatment and care programmes across the world. There is a growing recognition that multifaceted interventions - application of two or more adherence-enhancing strategies - may be useful to improve ART adherence and retention in care among people living with HIV/AIDS. Empirical evidence shows that multifaceted interventions produce better results than interventions based on a singular perspective. Nevertheless, the bundle of mechanisms by which multifaceted interventions promote ART adherence are poorly understood. In this paper, we reviewed theories on ART adherence to identify candidate/potential mechanisms by which the adherence club intervention works. We searched five electronic databases (PubMed, EBSCOhost, CINAHL, PsycARTICLES and Google Scholar) using Medical Subject Headings (MeSH) terms. A manual search of citations from the reference list of the studies identified from the electronic databases was also done. Twenty-six articles that adopted a theory-guided inquiry of antiretroviral adherence behaviour were included for the review. Eleven cognitive and behavioural theories underpinning these studies were explored. We examined each theory for possible 'generative causality' using the realist evaluation heuristic (Context-Mechanism-Outcome) configuration, then we selected candidate mechanisms thematically. We identified three major sets of theories: Information-Motivation-Behaviour, Social Action Theory and Health Behaviour Model, which explain ART adherence. Although they show potential in explaining adherence behaviours, they fall short in explaining exactly why and how the various elements they outline combine to explain positive or negative outcomes. Candidate mechanisms identified were motivation, self-efficacy, perceived social support, empowerment, perceived threat, perceived benefits and perceived barriers. Although these candidate
Higher-Order Moment Characterisation of Rogue Wave Statistics in Supercontinuum Generation
DEFF Research Database (Denmark)
Sørensen, Simon Toft; Bang, Ole; Wetzel, Benjamin
2012-01-01
The noise characteristics of supercontinuum generation are characterized using higher-order statistical moments. Measures of skew and kurtosis, and the coefficient of variation, allow quantitative identification of spectral regions dominated by rogue wave like behaviour.
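The moment-based characterisation above can be sketched on synthetic data: heavy-tailed (rogue-wave-like) fluctuations show strongly positive skew and excess kurtosis, while well-behaved regions stay near Gaussian values. The distributions below are illustrative stand-ins, not supercontinuum spectra:

```python
import numpy as np

# Skew, excess kurtosis, and coefficient of variation for two synthetic
# ensembles: near-Gaussian vs long-tailed fluctuations.
rng = np.random.default_rng(7)
gaussian_like = rng.normal(10.0, 1.0, 50000)        # stable region
rogue_like = rng.lognormal(0.0, 1.0, 50000) + 9.0   # long-tailed region

def moments(x):
    m, s = x.mean(), x.std()
    skew = np.mean(((x - m) / s) ** 3)
    kurt = np.mean(((x - m) / s) ** 4) - 3.0        # excess kurtosis
    cov = s / m                                     # coefficient of variation
    return skew, kurt, cov

print(moments(gaussian_like))   # skew and excess kurtosis near 0
print(moments(rogue_like))      # strongly positive skew and kurtosis
```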
Patch-based generative shape model and MDL model selection for statistical analysis of archipelagos
DEFF Research Database (Denmark)
Ganz, Melanie; Nielsen, Mads; Brandt, Sami
2010-01-01
We propose a statistical generative shape model for archipelago-like structures. These kinds of structures occur, for instance, in medical images, where our intention is to model the appearance and shapes of calcifications in x-ray radiographs. The generative model is constructed by (1) learning ...
Energy Technology Data Exchange (ETDEWEB)
Curley, G. Michael [North American Electric Reliability Corporation (United States); Mandula, Jiri [International Atomic Energy Agency (IAEA)
2008-05-15
The WEC Committee on the Performance of Generating Plant (PGP) has been collecting and analysing power plant performance statistics worldwide for more than 30 years and has produced regular reports, which include examples of advanced techniques and methods for improving power plant performance through benchmarking. A series of reports from the various working groups was issued in 2008. This reference presents the results of Working Group 2 (WG2). WG2's main task is to facilitate the collection and input on an annual basis of power plant performance data (unit-by-unit and aggregated data) into the WEC PGP database. The statistics will be collected for steam, nuclear, gas turbine and combined cycle, hydro and pump storage plant. WG2 will also oversee the ongoing development of the availability statistics database, including the contents, the required software, security issues and other important information. The report is divided into two sections: Thermal generating, combined cycle/co-generation, combustion turbine, hydro and pumped storage unavailability factors and availability statistics; and nuclear power generating units.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
International Nuclear Information System (INIS)
Saito, Toki; Nakajima, Yoshikazu; Sugita, Naohiko; Mitsuishi, Mamoru; Hashizume, Hiroyuki; Kuramoto, Kouichi; Nakashima, Yosio
2011-01-01
Statistical deformable model based two-dimensional/three-dimensional (2-D/3-D) registration is a promising method for estimating the position and shape of patient bone in the surgical space. Since its accuracy depends on the statistical model capacity, we propose a method for accurately generating a statistical bone model from a CT volume. Our method employs the Sphere-Attribute-Image (SAI) and improves the accuracy of the corresponding point search in statistical model generation. First, target bone surfaces are extracted as SAIs from the CT volume. Then the textures of the SAIs are classified into regions using the Maximally Stable Extremal Regions (MSER) method. Next, corresponding regions are determined using normalized cross-correlation (NCC). Finally, corresponding points in each corresponding region are determined using NCC. We applied the method to femur bone models, and it worked well in the experiments. (author)
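The NCC-based matching step mentioned above can be sketched generically. The toy exhaustive search below is not the authors' implementation and operates on a random 2-D array rather than SAI textures; it simply recovers the known location of a patch by maximizing normalized cross-correlation.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_match(template, image):
    """Exhaustive search: position in `image` where `template` correlates best."""
    h, w = template.shape
    best, best_score = (0, 0), -2.0
    for i in range(image.shape[0] - h + 1):
        for j in range(image.shape[1] - w + 1):
            s = ncc(template, image[i:i + h, j:j + w])
            if s > best_score:
                best, best_score = (i, j), s
    return best, best_score

rng = np.random.default_rng(1)
image = rng.normal(size=(32, 32))
template = image[10:15, 20:25].copy()   # patch cut from a known location (10, 20)
loc, score = best_match(template, image)
print(loc, round(score, 3))
```

Real registration pipelines would use an FFT-based correlation rather than this O(N^2) scan, but the score being maximized is the same.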
Peng, Fei; Li, Jiao-ting; Long, Min
2015-03-01
To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoint of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme achieves an identification accuracy of 97.89% for computer-generated graphics and 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has great potential to be implemented for the identification of natural images and computer-generated graphics. © 2014 American Academy of Forensic Sciences.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Algorithm for the generation of nuclear spin species and nuclear spin statistical weights
International Nuclear Information System (INIS)
Balasubramanian, K.
1982-01-01
A set of algorithms for the computer generation of nuclear spin species and nuclear spin statistical weights potentially useful in molecular spectroscopy is developed. These algorithms generate the nuclear spin species from group structures known as generalized character cycle indices (GCCIs). Thus the required input for these algorithms is just the set of all GCCIs for the symmetry group of the molecule, which can be computed easily from the character table. The algorithms are executed and illustrated with examples.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees
Michaela Kreyenfeld; Rembrandt D. Scholz; Frederik Peters; Ines Wlosnewski
2010-01-01
Until 2008, Germany’s vital statistics did not include information on the biological order of each birth. This resulted in a dearth of important demographic indicators, such as the mean age at first birth and the level of childlessness. Researchers have tried to fill this gap by generating order-specific birth rates from survey data, and by combining survey data with vital statistics. This paper takes a different approach by using hospital statistics on births to generate birth order-specific...
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
A testing procedure for wind turbine generators based on the power grid statistical model
DEFF Research Database (Denmark)
Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter
2017-01-01
In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space...
Jacobson generators, Fock representations and statistics of sl(n + 1)
International Nuclear Information System (INIS)
Palev, T.D.; Jeugt, J. van der
2000-10-01
The properties of A-statistics, related to the class of simple Lie algebras sl(n + 1), n ∈ Z_+ (Palev, T.D.: Preprint JINR E17-10550 (1977); hep-th/9705032), are further investigated. The description of each sl(n + 1) is carried out via generators and their relations (see eq. (2.5)), first introduced by Jacobson. The related Fock spaces W_p, p ∈ N, are finite-dimensional irreducible sl(n + 1)-modules. The Pauli principle of the underlying statistics is formulated. In addition the paper contains the following new results: (a) the A-statistics are interpreted as exclusion statistics; (b) within each W_p, operators B(p)_1^±, ..., B(p)_n^±, proportional to the Jacobson generators, are introduced. It is proved that in an appropriate topology (Definition 2), lim_{p→∞} B(p)_i^± = B_i^±, where B_i^± are Bose creation and annihilation operators; (c) it is shown that the local statistics of the degenerated hard-core Bose models and of the related Heisenberg spin models is p = 1 A-statistics. (author)
A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model
Energy Technology Data Exchange (ETDEWEB)
Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-11
This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and the implementation of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm's key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a purely stochastic approach.
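A purely stochastic track generator of the general kind described above can be sketched in a few lines. To be clear, this is not the SynHurG algorithm: the genesis box, heading distribution, and step size below are invented illustrative assumptions, showing only the idea of sampling genesis points and correlated random-walk motion.

```python
import numpy as np

def synthetic_tracks(n_tracks, n_steps, rng):
    """Toy stochastic cyclone tracks: random genesis plus a correlated random walk.

    All distributions here (genesis box, heading drift/noise, step size) are
    illustrative assumptions, not fitted to historical records.
    """
    lon0 = rng.uniform(-60.0, -20.0, n_tracks)   # Atlantic-like genesis longitudes
    lat0 = rng.uniform(8.0, 20.0, n_tracks)      # tropical genesis latitudes
    heading = rng.uniform(260.0, 300.0, n_tracks)  # initial heading, degrees
    speed = 0.5                                    # degrees of arc per time step
    tracks = np.zeros((n_tracks, n_steps, 2))      # (lon, lat) per step
    tracks[:, 0, 0], tracks[:, 0, 1] = lon0, lat0
    for t in range(1, n_steps):
        # Slow systematic turning plus random heading noise (AR(1)-like motion).
        heading = heading + rng.normal(2.0, 5.0, n_tracks)
        rad = np.deg2rad(heading)
        tracks[:, t, 0] = tracks[:, t - 1, 0] + speed * np.cos(rad)
        tracks[:, t, 1] = tracks[:, t - 1, 1] + speed * np.sin(rad)
    return tracks

tracks = synthetic_tracks(100, 20, np.random.default_rng(2))
print(tracks.shape)
```

A production model would fit the genesis density and motion statistics to historical best-track data rather than hard-coding them.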
Generation of statistical scenarios of short-term wind power production
DEFF Research Database (Denmark)
Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd
2007-01-01
Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform...... on the development of the forecast uncertainty through forecast series. This issue is addressed here by describing a method that permits the generation of statistical scenarios of wind generation that account for the interdependence structure of prediction errors, in addition to respecting the predictive distributions of wind...
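One standard way to obtain scenarios that respect both per-horizon predictive distributions and the interdependence of prediction errors is a Gaussian copula. The sketch below is illustrative only: the exponential correlation structure and the toy linear quantile functions are assumptions of this example, not the paper's fitted model.

```python
import numpy as np
from math import erf

def wind_scenarios(quantile_fns, n_scen, corr_length=4.0, seed=0):
    """Draw scenarios respecting per-horizon marginals AND inter-horizon correlation.

    quantile_fns: one inverse CDF per look-ahead horizon (the predictive marginals).
    A Gaussian copula with correlation exp(-|h-k|/corr_length) ties horizons
    together; this correlation structure is an illustrative assumption.
    """
    H = len(quantile_fns)
    rng = np.random.default_rng(seed)
    idx = np.arange(H)
    cov = np.exp(-np.abs(idx[:, None] - idx[None, :]) / corr_length)
    z = rng.multivariate_normal(np.zeros(H), cov, size=n_scen)
    u = (1.0 + np.vectorize(erf)(z / np.sqrt(2.0))) / 2.0   # standard normal CDF
    return np.column_stack([quantile_fns[h](u[:, h]) for h in range(H)])

# Toy marginals: power fraction rising slowly with horizon, spread 0.2 around it.
qfns = [lambda p, h=h: np.clip(0.3 + 0.02 * h + 0.2 * (p - 0.5), 0.0, 1.0)
        for h in range(24)]
scen = wind_scenarios(qfns, n_scen=200)
print(scen.shape)
```

Each row of `scen` is one temporally coherent 24-horizon trajectory; sampling each horizon independently would destroy the correlation between successive lead times.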
Meijs, J.W.H.; Bosch, F.G.C.; Peters, M.J.; Lopes da Silva, F.H.
1987-01-01
The magnetic field distribution around the head is simulated using a realistically shaped compartment model of the head. The model is based on magnetic resonance images. The 3 compartments describe the brain, the skull and the scalp. The source is represented by a current dipole situated in the
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana
2015-05-01
Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of design process and production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data on eighteen residential buildings. The resulting statistical model relates a dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data resulted in an adjusted R² value of 0.694, which means that it explains approximately 69% of the variation in waste generation in similar constructions. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
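The regression machinery behind such a study (ordinary least squares plus adjusted R²) can be sketched in a few lines. The data and the three predictors below are synthetic stand-ins, not the study's eighteen buildings or its fitted coefficients.

```python
import numpy as np

def fit_waste_model(X, y):
    """OLS with intercept, returning coefficients, R^2 and adjusted R^2."""
    n, k = X.shape
    A = np.column_stack([np.ones(n), X])          # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = (resid ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)
    return beta, r2, adj_r2

rng = np.random.default_rng(3)
# 18 synthetic "buildings", 3 hypothetical predictors (e.g. floor area,
# number of design changes, prefabrication share -- invented names).
X = rng.normal(size=(18, 3))
y = 5.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0.0, 0.5, 18)
beta, r2, adj_r2 = fit_waste_model(X, y)
print(round(r2, 3), round(adj_r2, 3))
```

Note that adjusted R² penalizes the number of predictors relative to the small sample size, which matters for a study with only eighteen observations.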
Teshima, Tara Lynn; Patel, Vaibhav; Mainprize, James G; Edwards, Glenn; Antonyshyn, Oleh M
2015-07-01
The utilization of three-dimensional modeling technology in craniomaxillofacial surgery has grown exponentially during the last decade. Future development, however, is hindered by the lack of a normative three-dimensional anatomic dataset and a statistical mean three-dimensional virtual model. The purpose of this study is to develop and validate a protocol to generate a statistical three-dimensional virtual model based on a normative dataset of adult skulls. Two hundred adult skull CT images were reviewed. The average three-dimensional skull was computed by processing each CT image in the series using thin-plate spline geometric morphometric protocol. Our statistical average three-dimensional skull was validated by reconstructing patient-specific topography in cranial defects. The experiment was repeated 4 times. In each case, computer-generated cranioplasties were compared directly to the original intact skull. The errors describing the difference between the prediction and the original were calculated. A normative database of 33 adult human skulls was collected. Using 21 anthropometric landmark points, a protocol for three-dimensional skull landmarking and data reduction was developed and a statistical average three-dimensional skull was generated. Our results show the root mean square error (RMSE) for restoration of a known defect using the native best match skull, our statistical average skull, and worst match skull was 0.58, 0.74, and 4.4 mm, respectively. The ability to statistically average craniofacial surface topography will be a valuable instrument for deriving missing anatomy in complex craniofacial defects and deficiencies as well as in evaluating morphologic results of surgery.
Statistical Modeling of Large Wind Plant System's Generation - A Case Study
International Nuclear Information System (INIS)
Sabolic, D.
2014-01-01
This paper presents simple, yet very accurate, descriptive statistical models of various static and dynamic parameters of the energy output from a large system of wind plants operated by the Bonneville Power Administration (BPA), USA. The system's size at the end of 2013 was 4515 MW of installed capacity. The 5-minute readings from the beginning of 2007 to the end of 2013, recorded and published by BPA, were used to derive a number of experimental distributions, which were then used to devise theoretical statistical models with merely one or two parameters. In spite of their simplicity, they reproduced the experimental data with great accuracy, which was checked by rigorous goodness-of-fit tests. Statistical distribution functions were obtained for the following wind generation-related quantities: total generation as a percentage of total installed capacity; change in total generation power in 5, 10, 15, 20, 25, 30, 45, and 60 minutes as a percentage of total installed capacity; and duration of intervals with total generated power, expressed as a percentage of total installed capacity, lower than certain pre-specified levels. Limitation of total installed wind plant capacity, when it is determined by the regulation demand on wind plants, is discussed too. The models presented here can be utilized in analyses related to power system economics and policy, which is also briefly discussed in the paper. (author)
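The kind of one- or two-parameter model checking described above can be sketched with a Kolmogorov-Smirnov distance. The Laplace-distributed toy data below stand in for 5-minute generation changes; they are not the actual BPA records, and the Laplace model is an illustrative choice, not the paper's fitted family.

```python
import numpy as np

def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov distance between an empirical sample and a model CDF."""
    s = np.sort(sample)
    n = len(s)
    f = cdf(s)
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - f), np.max(f - ecdf_lo))

rng = np.random.default_rng(6)
# Toy stand-in for 5-minute generation changes: heavy-tailed, zero-centred data.
changes = rng.laplace(0.0, 1.5, size=20000)
b = np.mean(np.abs(changes))   # maximum-likelihood estimate of the Laplace scale
laplace_cdf = lambda x: np.where(x < 0, 0.5 * np.exp(x / b),
                                 1.0 - 0.5 * np.exp(-x / b))
ks = ks_statistic(changes, laplace_cdf)
print(round(ks, 4))
```

A small KS distance relative to the critical value (about 1.36/sqrt(n) at the 5% level) indicates the one-parameter model reproduces the empirical distribution well.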
Poppe, L.J.; Eliason, A.H.; Hastings, M.E.
2004-01-01
Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next-to-last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and nearshore marine sediments, samples from many deeper-water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft
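The two families of measures contrasted above can be sketched directly. The phi classes and weight percentages below are invented illustrative data, and only a subset of the Folk measures is shown; GSSTAT itself computes a fuller set.

```python
import numpy as np

def moment_stats(phi, weight_pct):
    """Method-of-moments mean, sorting (std) and skewness of a grain-size
    distribution given phi class midpoints and weight percentages."""
    w = np.asarray(weight_pct, dtype=float)
    w = w / w.sum()
    mean = (w * phi).sum()
    sd = np.sqrt((w * (phi - mean) ** 2).sum())
    skew = (w * (phi - mean) ** 3).sum() / sd ** 3
    return mean, sd, skew

def folk_graphic_stats(phi, weight_pct):
    """Folk (1974) inclusive graphic mean and sorting from interpolated percentiles."""
    w = np.asarray(weight_pct, dtype=float)
    cum = np.cumsum(w) / w.sum() * 100.0          # cumulative percent, increasing
    pct = lambda p: np.interp(p, cum, phi)        # phi value at a given percentile
    p5, p16, p50, p84, p95 = (pct(p) for p in (5, 16, 50, 84, 95))
    mean = (p16 + p50 + p84) / 3.0
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
    return mean, sorting

phi = np.arange(0, 8)                              # phi class midpoints
wt = np.array([2, 8, 20, 30, 20, 12, 6, 2], float)  # weight percent per class
mm = moment_stats(phi, wt)
fg = folk_graphic_stats(phi, wt)
print(mm)
print(fg)
```

For a well-behaved unimodal distribution the two mean estimates agree closely; they diverge when the tails are truncated, which is exactly why tail extrapolation matters.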
Xiang-Guo, Meng; Hong-Yi, Fan; Ji-Suo, Wang
2018-04-01
This paper proposes a kind of displaced thermal states (DTS) and explores how this kind of optical field emerges, using the entangled state representation. The results show that the DTS can be generated by a coherent state passing through a diffusion channel with diffusion coefficient κ only when κt = (e^{ħν/k_B T} − 1)^{-1}. Its statistical properties, such as the mean photon number, Wigner function and entropy, are also investigated.
Statistical spatial properties of speckle patterns generated by multiple laser beams
International Nuclear Information System (INIS)
Le Cain, A.; Sajer, J. M.; Riazuelo, G.
2011-01-01
This paper investigates hot spot characteristics generated by the superposition of multiple laser beams. First, properties of speckle statistics are studied in the context of only one laser beam by computing the autocorrelation function. The case of multiple laser beams is then considered. In certain conditions, it is shown that speckles have an ellipsoidal shape. Analytical expressions of hot spot radii generated by multiple laser beams are derived and compared to numerical estimates made from the autocorrelation function. They are also compared to numerical simulations performed within the paraxial approximation. Excellent agreement is found for the speckle width as well as for the speckle length. Application to the speckle patterns generated in the Laser MegaJoule configuration in the zone where all the beams overlap is presented. Influence of polarization on the size of the speckles as well as on their abundance is studied.
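The autocorrelation route to speckle size used in this abstract can be reproduced on a synthetic pattern. The sketch below generates speckle from random phases passed through a circular aperture (the grid size and aperture radius are arbitrary choices, and this single-beam case is far simpler than the multi-beam overlap geometry studied in the paper) and reads off a speckle half-width from the intensity autocorrelation.

```python
import numpy as np

def autocorr(field):
    """2-D autocorrelation of the fluctuations via FFT (Wiener-Khinchin), peak = 1."""
    f = field - field.mean()
    F = np.fft.fft2(f)
    ac = np.fft.ifft2(np.abs(F) ** 2).real
    ac = np.fft.fftshift(ac)          # move zero lag to the array centre
    return ac / ac.max()

def hwhm(profile):
    """Half-width at half-maximum of a centred 1-D correlation profile, in pixels."""
    c = len(profile) // 2
    below = np.where(profile[c:] < 0.5)[0]
    return int(below[0]) if below.size else len(profile) - c

# Synthetic speckle: random phases through a finite circular aperture.
rng = np.random.default_rng(4)
N, R = 256, 12                        # grid size; aperture radius sets speckle size
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
aperture = (x ** 2 + y ** 2) <= R ** 2
phase = np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N)))
field = np.fft.fft2(np.fft.ifftshift(aperture * phase))
intensity = np.abs(field) ** 2
ac = autocorr(intensity)
width = hwhm(ac[N // 2, :])           # speckle half-width along one axis, pixels
print(width)
```

Shrinking the aperture radius R enlarges the speckles, mirroring how the overlap geometry of multiple beams sets the hot-spot radii in the paper's analysis.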
Automatic generation of statistical pose and shape models for articulated joints.
Xin Chen; Graham, Jim; Hutchinson, Charles; Muir, Lindsay
2014-02-01
Statistical analysis of the motion patterns of body joints is potentially useful for detecting and quantifying pathologies. However, building a statistical motion model across different subjects remains a challenging task, especially for a complex joint like the wrist. We present a novel framework for simultaneous registration and segmentation of multiple 3-D (CT or MR) volumes of different subjects at various articulated positions. The framework starts with a pose model generated from 3-D volumes captured at different articulated positions of a single subject (template). This initial pose model is used to register the template volume to image volumes from new subjects. During this process, the Grow-Cut algorithm is used in an iterative refinement of the segmentation of the bone along with the pose parameters. As each new subject is registered and segmented, the pose model is updated, improving the accuracy of successive registrations. We applied the algorithm to CT images of the wrist from 25 subjects, each at five different wrist positions, and demonstrated that it performed robustly and accurately. More importantly, the resulting segmentations allowed a statistical pose model of the carpal bones to be generated automatically without interaction. The evaluation results show that our proposed framework achieved accurate registration with a mean target registration error of 0.34 ± 0.27 mm. The automatic segmentation results also show high consistency with the ground truth obtained semi-automatically. Furthermore, we demonstrated the capability of the resulting statistical pose and shape models by using them to generate a measurement tool for scaphoid-lunate dissociation diagnosis, which achieved 90% sensitivity and specificity.
Cuesta-Frau, David; Miró-Martínez, Pau; Jordán Núñez, Jorge; Oltra-Crespo, Sandra; Molina Picó, Antonio
2017-08-01
This paper evaluates the performance of first-generation entropy metrics, represented by the well known and widely used Approximate Entropy (ApEn) and Sample Entropy (SampEn) metrics, and of what can be considered an evolution of these, Fuzzy Entropy (FuzzyEn), in the electroencephalogram (EEG) signal classification context. The study uses the commonest artifacts found in real EEGs, such as white noise, and muscular, cardiac, and ocular artifacts. Using two different sets of publicly available EEG records, and a realistic range of amplitudes for interfering artifacts, this work optimises these metrics and assesses their robustness against artifacts in terms of class segmentation probability. The results show that the qualitative behaviour of the two datasets is similar, with SampEn and FuzzyEn performing best, and that the noise and muscular artifacts are the most confounding factors. On the contrary, there is wide variability as regards initialization parameters. The poor performance achieved by ApEn suggests that this metric should not be used in these contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.
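Sample Entropy, one of the best performers reported here, has a compact definition: the negative log of the ratio of (m+1)-point to m-point template matches within a tolerance r. The sketch below is a plain O(N²) reference implementation for illustration, not the paper's optimised code, and it handles the template counts slightly loosely at the boundary.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample Entropy: -log(A/B) with B = m-point and A = (m+1)-point template
    matches (Chebyshev distance <= r * SD), self-matches excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def match_pairs(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        n = len(templ)
        return ((d <= tol).sum() - n) / 2      # unordered pairs, no self-matches
    b = match_pairs(m)
    a = match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(5)
t = np.arange(1000)
regular = np.sin(2 * np.pi * t / 50)    # predictable signal: low SampEn
noise = rng.normal(size=1000)           # white noise: high SampEn
s_reg, s_noise = sampen(regular), sampen(noise)
print(round(s_reg, 3), round(s_noise, 3))
```

The separation between the two signals is the property exploited for EEG segmentation; the paper's point is how robustly that separation survives added artifacts.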
Directory of Open Access Journals (Sweden)
Rochelle E. Tractenberg
2016-12-01
Full Text Available Statistical literacy is essential to an informed citizenry, and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards “big” data: while automated analyses can exploit massive amounts of data, the interpretation—and possibly more importantly, the replication—of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient/inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool—the Mastery Rubric—is integrated with a new, developmental, model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence, and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.
Statistical inference of the generation probability of T-cell receptors from sequence repertoires.
Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G
2012-10-02
Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
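The core inference problem in the record above, hidden generation events that cannot be read off the observed sequences, can be illustrated with a toy expectation-maximization example. Everything here is invented for illustration (two generation routes, Poisson length distributions with means 10 and 20, true mixture weight 0.7); the paper's actual model over gene choices, insertions and deletions is far richer.

```python
import math
import numpy as np

def pois_pmf(k, lam):
    """Poisson pmf, vectorized over integer k."""
    k = np.asarray(k, dtype=float)
    lg = np.array([math.lgamma(v + 1.0) for v in k])
    return np.exp(k * np.log(lam) - lam - lg)

rng = np.random.default_rng(0)
true_w = 0.7                       # P(route 1): the hidden quantity to recover
route = rng.random(5000) < true_w
lengths = np.where(route, rng.poisson(10, 5000), rng.poisson(20, 5000))

pmf1 = pois_pmf(lengths, 10.0)     # P(length | route 1)
pmf2 = pois_pmf(lengths, 20.0)     # P(length | route 2)
w = 0.5                            # initial guess
for _ in range(100):               # EM on the marginal likelihood
    resp = w * pmf1 / (w * pmf1 + (1.0 - w) * pmf2)   # P(route 1 | length)
    w = resp.mean()                                   # M-step update
```

Because each observed length could have come from either route, the route probability must be fitted through the marginal likelihood rather than counted directly, which is the same structural problem the paper solves at much larger scale.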
Association testing for next-generation sequencing data using score statistics
DEFF Research Database (Denmark)
Skotte, Line; Korneliussen, Thorfinn Sand; Albrechtsen, Anders
2012-01-01
computationally feasible due to the use of score statistics. As part of the joint likelihood, we model the distribution of the phenotypes using a generalized linear model framework, which works for both quantitative and discrete phenotypes. Thus, the method presented here is applicable to case-control studies...... of genotype calls into account have been proposed; most require numerical optimization which for large-scale data is not always computationally feasible. We show that using a score statistic for the joint likelihood of observed phenotypes and observed sequencing data provides an attractive approach...... to association testing for next-generation sequencing data. The joint model accounts for the genotype classification uncertainty via the posterior probabilities of the genotypes given the observed sequencing data, which gives the approach higher power than methods based on called genotypes. This strategy remains...
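A minimal sketch of the score-statistic idea from the record above, simplified to a quantitative phenotype with no covariates (the paper's framework uses a full generalized linear model and the joint likelihood of phenotypes and sequencing data): the genotype enters only through its posterior expectation given the reads, and the statistic needs no numerical optimization under the alternative.

```python
import numpy as np

def score_test(pheno, geno_post):
    """Score statistic for association between a quantitative phenotype and a
    variant observed through sequencing, using the n x 3 matrix of genotype
    posterior probabilities (genotypes 0/1/2) instead of hard calls."""
    y = np.asarray(pheno, dtype=float)
    dosage = geno_post @ np.array([0.0, 1.0, 2.0])   # E[genotype | reads]
    r = y - y.mean()                                 # residuals under the null
    g = dosage - dosage.mean()
    u = r @ g                                        # score for the genetic effect
    v = r.var() * (g ** 2).sum()                     # variance of the score under H0
    return u * u / v                                 # ~ chi-square(1) under H0
```

Only the null model (here just the phenotype mean) is fitted, which is what keeps the approach computationally feasible for large-scale data.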
The Bayesian statistical decision theory applied to the optimization of generating set maintenance
International Nuclear Information System (INIS)
Procaccia, H.; Cordier, R.; Muller, S.
1994-11-01
A difficulty in the RCM methodology is the allocation of a new preventive maintenance periodicity for a piece of equipment once a critical failure has been identified: until now this allocation has been based on engineering judgment, and a full cycle of feedback experience is needed before it can be validated. Statistical decision theory could be a more rational alternative for the optimization of preventive maintenance periodicity. This methodology has been applied to the inspection and maintenance optimization of cylinders of diesel generator engines of 900 MW nuclear plants, and has shown that the previous preventive maintenance periodicity can be extended. (authors). 8 refs., 5 figs
International Nuclear Information System (INIS)
Boning, Duane S.; Chung, James E.
1998-01-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal
DEFF Research Database (Denmark)
Korneliussen, Thorfinn Sand; Moltke, Ida; Albrechtsen, Anders
2013-01-01
A number of different statistics are used for detecting natural selection using DNA sequencing data, including statistics that are summaries of the frequency spectrum, such as Tajima's D. These statistics are now often being applied in the analysis of Next Generation Sequencing (NGS) data. However......, estimates of frequency spectra from NGS data are strongly affected by low sequencing coverage; the inherent technology-dependent variation in sequencing depth causes systematic differences in the value of the statistic among genomic regions....
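Tajima's D itself is straightforward to compute once genotypes are known; the record above concerns the harder problem of estimating it from low-coverage NGS data. For reference, a plain implementation of the statistic from a 0/1 genotype matrix, using the standard constants:

```python
import numpy as np

def tajimas_d(geno):
    """Tajima's D from a (sites x chromosomes) 0/1 genotype matrix: compares
    the mean pairwise difference (pi) with Watterson's estimator derived from
    the number of segregating sites S."""
    geno = np.asarray(geno)
    n = geno.shape[1]
    ac = geno.sum(axis=1)                         # derived-allele counts
    S = ((ac > 0) & (ac < n)).sum()               # segregating sites
    pi = (2.0 * ac * (n - ac) / (n * (n - 1))).sum()
    a1 = (1.0 / np.arange(1, n)).sum()
    a2 = (1.0 / np.arange(1, n) ** 2).sum()
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n * n + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))
```

Applied to allele counts drawn from the neutral 1/i frequency spectrum, pi and Watterson's estimator agree in expectation, so D is close to zero; coverage-driven bias in the estimated spectrum shifts it, which is the problem the record addresses.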
Steam Generator Group Project. Progress report on data acquisition/statistical analysis
International Nuclear Information System (INIS)
Doctor, P.G.; Buchanan, J.A.; McIntyre, J.M.; Hof, P.J.; Ercanbrack, S.S.
1984-01-01
A major task of the Steam Generator Group Project (SGGP) is to establish the reliability of the eddy current inservice inspections of PWR steam generator tubing, by comparing the eddy current data to the actual physical condition of the tubes via destructive analyses. This report describes the plans for the computer systems needed to acquire, store and analyze the diverse data to be collected during the project. The real-time acquisition of the baseline eddy current inspection data will be handled using a specially designed data acquisition computer system based on a Digital Equipment Corporation (DEC) PDP-11/44. The data will be archived in digital form for use after the project is completed. Data base management and statistical analyses will be done on a DEC VAX-11/780. Color graphics will be heavily used to summarize the data and the results of the analyses. The report describes the data that will be taken during the project and the statistical methods that will be used to analyze the data. 7 figures, 2 tables
Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano
Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.
2018-05-01
To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for application of flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However, a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes, or at volcanoes that are classified as persistently active, it is not appropriate to make short-term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density currents from Soufrière Hills Volcano (SHV) on Montserrat, including their respective collapse directions and flow volumes, based on the 1996-2008 flow datasets. This approach allows short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminates in a larger collapse, after which the directionality of the flows changes. Such models enable short-term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing northwest toward that valley as it is for domes pointing east toward the Tar River Valley. As rich multi-parametric volcano monitoring datasets become
Energy Technology Data Exchange (ETDEWEB)
Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)
2014-07-01
Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is tightly related to the performing of statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except of unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analysis aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing
Statistically generated weighted curve fit of residual functions for modal analysis of structures
Bookout, P. S.
1995-01-01
A statistically generated weighting function for a second-order polynomial curve fit of residual functions has been developed. The residual flexibility test method, from which a residual function is generated, is a procedure for modal testing large structures in an external constraint-free environment to measure the effects of higher order modes and interface stiffness. This test method is applicable to structures with distinct degree-of-freedom interfaces to other system components. A theoretical residual function in the displacement/force domain has the characteristics of a relatively flat line in the lower frequencies and a slight upward curvature in the higher frequency range. In the test residual function, the above-mentioned characteristics can be seen in the data, but due to the present limitations in the modal parameter evaluation (natural frequencies and mode shapes) of test data, the residual function has regions of ragged data. A second order polynomial curve fit is required to obtain the residual flexibility term. A weighting function of the data is generated by examining the variances between neighboring data points. From a weighted second-order polynomial curve fit, an accurate residual flexibility value can be obtained. The residual flexibility value and free-free modes from testing are used to improve a mathematical model of the structure. The residual flexibility modal test method is applied to a straight beam with a trunnion appendage and a space shuttle payload pallet simulator.
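The variance-based weighting idea in the record above can be sketched as follows. All data are synthetic and the numbers invented: the residual function is nearly flat with slight upward curvature, one region is artificially "ragged", and each point is weighted by its inverse local scatter before the second-order polynomial fit.

```python
import numpy as np

rng = np.random.default_rng(1)
freq = np.linspace(1.0, 50.0, 200)          # Hz (invented frequency grid)
true = 2e-6 + 2e-9 * freq ** 2              # flat line plus slight upward curvature
noise = rng.normal(0.0, 1e-8, freq.size)
noise[80:100] *= 20.0                       # a region of ragged data
resid = true + noise

# local scatter estimated from first differences (insensitive to the trend)
d = np.diff(resid, prepend=resid[0])
half = 5
local_sd = np.array([d[max(0, i - half):i + half + 1].std()
                     for i in range(d.size)])
w = 1.0 / (local_sd + 1e-12)    # np.polyfit weights multiply the residuals

coeffs = np.polyfit(freq, resid, deg=2, w=w)
residual_flexibility = np.polyval(coeffs, 0.0)   # value extrapolated to 0 Hz
```

The ragged region receives roughly one twentieth of the weight of the clean data, so the extrapolated residual flexibility value stays close to the underlying flat level despite the noisy stretch.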
International Nuclear Information System (INIS)
Nelson, K.; Sokkappa, P.
2008-01-01
This report describes an approach for generating a simulated population of plausible nuclear threat radiation signatures spanning a range of variability that could be encountered by radiation detection systems. In this approach, we develop a statistical model for generating random instances of smuggled nuclear material. The model is based on physics principles and bounding cases rather than on intelligence information or actual threat device designs. For this initial stage of work, we focus on random models using fissile material and do not address scenarios using non-fissile materials. The model has several uses. It may be used as a component in a radiation detection system performance simulation to generate threat samples for injection studies. It may also be used to generate a threat population to be used for training classification algorithms. In addition, we intend to use this model to generate an unclassified 'benchmark' threat population that can be openly shared with other organizations, including vendors, for use in radiation detection systems performance studies and algorithm development and evaluation activities. We assume that a quantity of fissile material is being smuggled into the country for final assembly and that shielding may have been placed around the fissile material. In terms of radiation signature, a nuclear weapon is basically a quantity of fissile material surrounded by various layers of shielding. Thus, our model of smuggled material is expected to span the space of potential nuclear weapon signatures as well. For computational efficiency, we use a generic 1-dimensional spherical model consisting of a fissile material core surrounded by various layers of shielding. The shielding layers and their configuration are defined such that the model can represent the potential range of attenuation and scattering that might occur. The materials in each layer and the associated parameters are selected from probability distributions that span the
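Sampling random instances from the 1-dimensional spherical model might look like the sketch below. The materials list and every distribution here are invented placeholders, not the report's actual parameter choices.

```python
import numpy as np

rng = np.random.default_rng(7)
MATERIALS = ["lead", "steel", "polyethylene"]   # illustrative shielding set

def random_instance():
    """One random model instance: a fissile core radius plus a random number
    of shielding layers, each with a material and thickness drawn from
    assumed (invented) probability distributions."""
    core_radius_cm = rng.uniform(3.0, 8.0)
    n_layers = rng.integers(0, 4)               # 0 to 3 shielding layers
    layers = [(rng.choice(MATERIALS), rng.uniform(0.5, 5.0))
              for _ in range(n_layers)]         # (material, thickness in cm)
    return {"core_radius_cm": core_radius_cm, "layers": layers}

population = [random_instance() for _ in range(1000)]
```

A population generated this way could serve as injection samples or classifier training data, as the report describes, once each instance is pushed through a radiation transport calculation.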
Simulating European wind power generation applying statistical downscaling to reanalysis data
International Nuclear Information System (INIS)
González-Aparicio, I.; Monforti, F.; Volker, P.; Zucker, A.; Careri, F.; Huld, T.; Badger, J.
2017-01-01
Highlights: •Wind speed spatial resolution highly influences calculated wind power peaks and ramps. •Reduction of wind power generation uncertainties using statistical downscaling. •Publicly available dataset of wind power generation hourly time series at NUTS2. -- Abstract: The growing share of electricity production from solar and mainly wind resources constantly increases the stochastic nature of the power system. Modelling the high share of renewable energy sources, and in particular wind power, crucially depends on an adequate representation of the intermittency and characteristics of the wind resource, which is related to the accuracy of the approach in converting wind speed data into power values. One of the main factors contributing to the uncertainty in these conversion methods is the selection of the spatial resolution. Although numerical weather prediction models can simulate wind speeds at higher spatial resolution (up to 1 × 1 km) than a reanalysis (generally ranging from about 25 km to 70 km), they require high computational resources and massive storage systems; therefore, the most common alternative is to use reanalysis data. However, local wind features may not be captured by a reanalysis, which can translate into misinterpretation of wind power peaks, ramping capacities, the behaviour of power prices, and bidding strategies for the electricity market. This study contributes to understanding what is captured by wind speed datasets of different spatial resolutions, the importance of using high-resolution data for the conversion into power, and the implications for power system analyses. A methodology is proposed to increase the spatial resolution of a reanalysis. This study presents an open-access renewable generation time series dataset for the EU-28 and neighbouring countries at hourly intervals and at different geographical aggregation levels (country, bidding zone and administrative
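The speed-to-power conversion step, and the reason spatial resolution matters for it, can be sketched with a generic turbine power curve. The cut-in, rated and cut-out speeds below are illustrative defaults, not values taken from the study.

```python
import numpy as np

def wind_to_power(speed, cut_in=3.0, rated=12.0, cut_out=25.0):
    """Normalized output of a generic turbine power curve: cubic ramp between
    cut-in and rated speed (m/s), constant at rated output, zero beyond
    cut-out. Threshold values are illustrative."""
    speed = np.asarray(speed, dtype=float)
    power = np.clip((speed ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3),
                    0.0, 1.0)
    power[speed >= cut_out] = 0.0
    return power

# two hours x two sites: converting per site and then aggregating ("fine")
# keeps peaks that converting a pre-averaged speed field ("coarse") loses
sites = np.array([[14.0, 2.0], [13.0, 1.0]])
fine = wind_to_power(sites).mean(axis=1)
coarse = wind_to_power(sites.mean(axis=1))
```

Because the power curve is strongly nonlinear, the order of averaging and conversion matters: the coarse (pre-averaged) field underestimates the power peaks here, which is the resolution effect highlighted in the record.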
CDFTBL: A statistical program for generating cumulative distribution functions from data
International Nuclear Information System (INIS)
Eslinger, P.W.
1991-06-01
This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs
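The round trip described above, turning field data into a probability table and then sampling that table from a Monte Carlo code, can be sketched as follows. The function names and the 20-row table size are invented; CDFTBL's actual command language and fitting options are richer.

```python
import numpy as np

def cdf_table(data, rows=20):
    """Build a (value, cumulative probability) table from field data:
    sort the data, attach the empirical CDF, and thin to a fixed row count."""
    values = np.sort(np.asarray(data, dtype=float))
    probs = np.arange(1, values.size + 1) / values.size
    idx = np.linspace(0, values.size - 1, rows).astype(int)
    return values[idx], probs[idx]

def sample_from_table(values, probs, n, rng):
    """Inverse-transform sampling with linear interpolation in the table,
    as a Monte Carlo code would do with the tabulated CDF."""
    u = rng.uniform(probs[0], 1.0, size=n)
    return np.interp(u, probs, values)

rng = np.random.default_rng(4)
field_data = rng.lognormal(mean=0.0, sigma=0.5, size=10000)
v, p = cdf_table(field_data)
draws = sample_from_table(v, p, 10000, rng)
```

Samples drawn from the table reproduce the bulk of the field-data distribution; the table resolution controls how faithfully the tails are captured.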
nQuire: a statistical framework for ploidy estimation using next generation sequencing.
Weiß, Clemens L; Pais, Marina; Cano, Liliana M; Kamoun, Sophien; Burbano, Hernán A
2018-04-04
Intraspecific variation in ploidy occurs in a wide range of species including pathogenic and nonpathogenic eukaryotes such as yeasts and oomycetes. Ploidy can be inferred indirectly - without measuring DNA content - from experiments using next-generation sequencing (NGS). We present nQuire, a statistical framework that distinguishes between diploids, triploids and tetraploids using NGS. The command-line tool models the distribution of base frequencies at variable sites using a Gaussian Mixture Model, and uses maximum likelihood to select the most plausible ploidy model. nQuire handles large genomes at high coverage efficiently and uses standard input file formats. We demonstrate the utility of nQuire analyzing individual samples of the pathogenic oomycete Phytophthora infestans and the Baker's yeast Saccharomyces cerevisiae. Using these organisms we show the dependence between reliability of the ploidy assignment and sequencing depth. Additionally, we employ normalized maximized log-likelihoods generated by nQuire to ascertain ploidy level in a population of samples with ploidy heterogeneity. Using these normalized values we cluster samples in three dimensions using multivariate Gaussian mixtures. The cluster assignments retrieved from a S. cerevisiae population recovered the true ploidy level in over 96% of samples. Finally, we show that nQuire can be used regionally to identify chromosomal aneuploidies. nQuire provides a statistical framework to study organisms with intraspecific variation in ploidy. nQuire is likely to be useful in epidemiological studies of pathogens, artificial selection experiments, and for historical or ancient samples where intact nuclei are not preserved. It is implemented as a stand-alone Linux command line tool in the C programming language and is available at https://github.com/clwgg/nQuire under the MIT license.
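A stripped-down version of the base-frequency mixture idea: a diploid puts variable sites near allele frequency 1/2, a triploid near 1/3 and 2/3, a tetraploid near 1/4, 1/2 and 3/4, and the model with the highest likelihood wins. This sketch fixes the component means and weights and assumes a dispersion rather than fitting it; nQuire itself fits the mixtures by maximum likelihood and also compares against a free model.

```python
import numpy as np

def log_likelihood(freqs, means, sigma=0.05):
    """Log-likelihood of allele frequencies at variable sites under an
    equal-weight Gaussian mixture with fixed component means; sigma is an
    assumed (not fitted) dispersion."""
    f = np.asarray(freqs, dtype=float)[:, None]
    mu = np.asarray(means, dtype=float)[None, :]
    comp = np.exp(-0.5 * ((f - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return np.log(comp.mean(axis=1)).sum()

MODELS = {
    "diploid": [0.5],
    "triploid": [1 / 3, 2 / 3],
    "tetraploid": [0.25, 0.5, 0.75],
}

def call_ploidy(freqs):
    """Pick the ploidy model with the highest log-likelihood."""
    return max(MODELS, key=lambda name: log_likelihood(freqs, MODELS[name]))
```

Note that diploid data are not mistaken for tetraploid even though 1/2 is a tetraploid component: spreading weight over unused components costs likelihood, so the simpler model wins.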
Statistical downscaling and future scenario generation of temperatures for Pakistan Region
Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas
2015-04-01
Finer spatial-scale climate change information is required for impact studies than is presently provided by global or regional climate models. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, has been adopted. The Statistical DownScaling Model (SDSM) was employed for downscaling daily minimum and maximum temperature data from 44 national stations for the base period (1961-1990), and future scenarios were then generated up to 2099. Observed data as well as predictors (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested on an individual/multiple basis through linear regression. Future scenarios were generated from HadCM3 daily data for the A2 and B2 storylines. The downscaled data have been tested and have shown a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but the results of this study project that, in future, the northern belt in particular would face a possible threat of an increasing tendency in air temperature. In particular, the northern areas (hosting the third largest ice reserves after the Polar Regions), an important feeding source for the Indus River, are projected to be vulnerable in terms of increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in future.
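The regression step at the heart of statistical downscaling can be sketched as below. All series are synthetic (invented predictors and station data); SDSM additionally performs predictor screening, adds stochastic weather-generator terms and applies bias correction.

```python
import numpy as np

rng = np.random.default_rng(3)
days = 3650
# synthetic large-scale predictors, e.g. 850 hPa temperature and pressure
predictors = rng.standard_normal((days, 2))
station_tmax = (25.0 + 3.0 * predictors[:, 0] - 1.0 * predictors[:, 1]
                + rng.normal(0.0, 1.5, days))      # observed station series

# calibrate on the base period via linear regression (with intercept)
X = np.column_stack([np.ones(days), predictors])
coef, *_ = np.linalg.lstsq(X, station_tmax, rcond=None)

# apply the calibrated transfer function to GCM-derived predictors
gcm_future = rng.standard_normal((100, 2)) + np.array([1.0, 0.0])  # warmer state
scenario = np.column_stack([np.ones(100), gcm_future]) @ coef
```

The calibrated transfer function maps the coarse predictors to station scale, so a shift in the GCM predictors propagates into a warmer downscaled scenario.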
International Nuclear Information System (INIS)
Miller, A.I.; Duffey, R.
2005-01-01
Can electricity from high-capacity nuclear reactors be blended with the variable output of wind turbines to produce electrolytic hydrogen competitively? Future energy hopes and emissions-reduction scenarios place significant reliance on renewables, in practice meaning largely new wind power both onshore and offshore. The opportunity exists for a synergy between high-capacity-factor nuclear plants and wind power, with both using hydrogen as a 'currency' for transportation and industrial processing. But this use of hydrogen needs to be introduced soon. To be competitive with alternative sources, hydrogen produced by conventional electrolysis requires low-cost electricity (likely <2.5 Cent US/kW.h). One approach is to operate interruptibly, allowing an installation to sell electricity when the grid price is high and to make hydrogen when it is low. Our previous studies have shown that this could be a cost-competitive approach with a nuclear power generator producing electricity around 3 Cent US/kW.h. Although similar unit costs are projected for wind-generated electricity, idleness of the hydrogen production (electrolysis) facility due to the variability of wind-generated electricity imposes a serious cost penalty. This paper reports our latest results on the potential economics of blending electricity from nuclear and wind sources by using wind-generated power, when available, to augment the current through electrolysis equipment that is primarily nuclear-powered. A voltage penalty accompanies the higher current. A 10% increase in capital cost for electrolysis equipment enables it to accommodate the higher rate of hydrogen generation, while still being substantially cheaper than the capital cost of wind-dedicated electrolysis. Real-time data for electricity costs have been combined with real-time wind variability in our NuWind model. The variability in wind fields between sites was accommodated by assuming an average wind speed that produced an average electricity
Predicting tube repair at French nuclear steam generators using statistical modeling
Energy Technology Data Exchange (ETDEWEB)
Mathon, C., E-mail: cedric.mathon@edf.fr [EDF Generation, Basic Design Department (SEPTEN), 69628 Villeurbanne (France); Chaudhary, A. [EDF Generation, Basic Design Department (SEPTEN), 69628 Villeurbanne (France); Gay, N.; Pitner, P. [EDF Generation, Nuclear Operation Division (UNIE), Saint-Denis (France)
2014-04-01
Electricité de France (EDF) currently operates a total of 58 Nuclear Pressurized Water Reactors (PWR) which are composed of 34 units of 900 MWe, 20 units of 1300 MWe and 4 units of 1450 MWe. This report provides an overall status of SG tube bundles on the 1300 MWe units. These units are 4 loop reactors using the AREVA 68/19 type SG model which are equipped either with Alloy 600 thermally treated (TT) tubes or Alloy 690 TT tubes. As of 2011, the effective full power years of operation (EFPY) ranges from 13 to 20 and during this time, the main degradation mechanisms observed on SG tubes are primary water stress corrosion cracking (PWSCC) and wear at anti-vibration bars (AVB) level. Statistical models have been developed for each type of degradation in order to predict the growth rate and number of affected tubes. Additional plugging is also performed to prevent other degradations such as tube wear due to foreign objects or high-cycle flow-induced fatigue. The contribution of these degradation mechanisms on the rate of tube plugging is described. The results from the statistical models are then used in predicting the long-term life of the steam generators and therefore providing a useful tool toward their effective life management and possible replacement.
Im, Chang-Hwan; Park, Ji-Hye; Shim, Miseon; Chang, Won Hyuk; Kim, Yun-Hee
2012-04-01
In this study, local electric field distributions generated by transcranial direct current stimulation (tDCS) with an extracephalic reference electrode were evaluated to address extracephalic tDCS safety issues. To this aim, we generated a numerical model of an adult male human upper body and applied the 3D finite element method to electric current conduction analysis. In our simulations, the active electrode was placed over the left primary motor cortex (M1) and the reference electrode was placed at six different locations: over the right temporal lobe, on the right supraorbital region, on the right deltoid, on the left deltoid, under the chin, and on the right buccinator muscle. The maximum current density and electric field intensity values in the brainstem generated by the extracephalic reference electrodes were comparable to, or even less than, those generated by the cephalic reference electrodes. These results suggest that extracephalic reference electrodes do not lead to unwanted modulation of the brainstem cardio-respiratory and autonomic centers, as indicated by recent experimental studies. The volume energy density was concentrated at the neck area by the use of deltoid reference electrodes, but was still smaller than that around the active electrode locations. In addition, the distributions of elicited cortical electric fields demonstrated that the use of extracephalic reference electrodes might allow for the robust prediction of cortical modulations with little dependence on the reference electrode locations.
Directory of Open Access Journals (Sweden)
Michel Ghins
1998-06-01
Full Text Available Although Kuhn is much more an antirealist than a realist, the earlier and later articulations of realist and antirealist ingredients in his views merit close scrutiny. What are the constituents of the real invariant World posited by Kuhn, and what is its relation to the mutable paradigm-related worlds? Various proposed solutions to this problem (dubbed the "new-world problem" by Ian Hacking) are examined and shown to be unsatisfactory. In The Structure of Scientific Revolutions, the stable World can reasonably be taken to be made up of ordinary perceived objects, whereas in Kuhn's later works the transparadigmatic World is identified with something akin to the Kantian world-in-itself. It is argued that both proposals are beset with insuperable difficulties which render Kuhn's earlier and later versions of antirealism implausible.
Realistic Material Appearance Modelling
Czech Academy of Sciences Publication Activity Database
Haindl, Michal; Filip, Jiří; Hatka, Martin
2010-01-01
Roč. 2010, č. 81 (2010), s. 13-14 ISSN 0926-4981 R&D Projects: GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords : bidirectional texture function * texture modelling Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2010/RO/haindl-realistic material appearance modelling.pdf
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems
Substorm associated radar auroral surges: a statistical study and possible generation model
Directory of Open Access Journals (Sweden)
B. A. Shand
Full Text Available Substorm-associated radar auroral surges (SARAS) are short-lived (15–90 minutes) and spatially localised (~5° of latitude) perturbations of the plasma convection pattern observed within the auroral E-region. The understanding of such phenomena has important ramifications for the investigation of the larger-scale plasma convection and ultimately the coupling of the solar wind, magnetosphere and ionosphere system. A statistical investigation of SARAS, observed by the Sweden And Britain Radar Experiment (SABRE), is undertaken in order to provide a more extensive examination of the local time occurrence and propagation characteristics of the events. The statistical analysis has determined a local time occurrence of observations between 1420 MLT and 2200 MLT with a maximum occurrence centred around 1700 MLT. The propagation velocity of the SARAS feature through the SABRE field of view was found to be predominantly L-shell aligned, with a velocity centred around 1750 m s^{–1} and within the range 500 m s^{–1} to 3500 m s^{–1}. This comprehensive examination of SARAS provides the opportunity to discuss, qualitatively, a possible generation mechanism for SARAS based on a proposed model for the production of a similar phenomenon referred to as sub-auroral ion drifts (SAIDs). The results of the comparison suggest that SARAS may result from a similar geophysical mechanism to that which produces SAID events, but probably occurring at a different time in the evolution of the event.
Key words. Substorms · Auroral surges · Plasma convection · Sub-auroral ion drifts
International Nuclear Information System (INIS)
Procaccia, H.; Cordier, R.; Muller, S.
1994-07-01
Statistical decision theory offers an alternative for the optimization of preventive maintenance periodicity. In effect, this theory concerns the situation in which a decision maker has to make a choice between a set of reasonable decisions, and where the loss associated with a given decision depends on a probabilistic risk, called the state of nature. In the case of maintenance optimization, the decisions to be analyzed are the different periodicities proposed by the experts, given the observed feedback experience; the states of nature are the associated failure probabilities; and the losses are the expectations of the induced cost of maintenance and of the consequences of failures. As failure probabilities concern rare events, at the ultimate state of RCM analysis (failure of a sub-component), and as the expected foreseeable behaviour of equipment has to be evaluated by experts, a Bayesian approach is successfully used to compute the states of nature. In Bayesian decision theory, a prior distribution for failure probabilities is modeled from expert knowledge and is combined with the sparse stochastic information provided by feedback experience, giving a posterior distribution of failure probabilities. The optimized decision is the decision that minimizes the expected loss over the posterior distribution. This methodology has been applied to inspection and maintenance optimization of cylinders of diesel generator engines of 900 MW nuclear plants. In these plants, auxiliary electric power is supplied by 2 redundant diesel generators which are tested every 2 weeks for about 1 hour. Until now, during the yearly refueling of each plant, one endoscopic inspection of diesel cylinders is performed, and every 5 operating years all cylinders are replaced. RCM has shown that cylinder failures could be critical. So Bayesian decision theory has been applied, taking into account expert opinions and the possibility of aging when the maintenance periodicity is extended. (authors). 8 refs., 5 figs., 1 tab
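The recipe described above (a conjugate prior from expert judgment, updated with feedback data, then choosing the periodicity that minimizes expected posterior loss) can be sketched in a few lines. All numbers, the Beta-binomial model, and the loss structure below are hypothetical illustrations, not values or forms from the study:

```python
def posterior_beta(a0, b0, failures, trials):
    # Conjugate update: Beta(a0, b0) prior combined with binomial feedback data
    return a0 + failures, b0 + trials - failures

def expected_loss(a, b, cost_maint, cost_fail, n_maint_per_year):
    # E[loss] = maintenance cost + failure cost * posterior mean failure probability
    p_mean = a / (a + b)
    return n_maint_per_year * cost_maint + cost_fail * p_mean

# Hypothetical: expert prior Beta(1, 99), 1 failure seen in 200 demands
a, b = posterior_beta(a0=1.0, b0=99.0, failures=1, trials=200)

# Candidate periodicities: (inspections/year, expert aging factor on failure cost)
periodicities = {
    "yearly": (1.0, 1.0),
    "every_2_years": (0.5, 1.8),
}
losses = {d: expected_loss(a, b, cost_maint=10.0, cost_fail=1000.0 * k,
                           n_maint_per_year=n)
          for d, (n, k) in periodicities.items()}
best = min(losses, key=losses.get)  # decision minimizing expected posterior loss
```

The Beta prior is chosen here only because it makes the posterior update a one-line conjugate formula; the study's actual prior elicitation and loss model are richer.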
Automatic generation of 3D statistical shape models with optimal landmark distributions.
Heimann, T; Wolf, I; Meinzer, H-P
2007-01-01
To point out the problem of non-uniform landmark placement in statistical shape modeling, to present an improved method for generating landmarks in the 3D case, and to propose an unbiased evaluation metric to determine model quality. Our approach minimizes a cost function based on the minimum description length (MDL) of the shape model to optimize landmark correspondences over the training set. In addition to the standard technique, we employ an extended remeshing method to change the landmark distribution without losing correspondences, thus ensuring a uniform distribution over all training samples. To break the dependency of the established evaluation measures generalization and specificity on the landmark distribution, we change the internal metric from landmark distance to volumetric overlap. Redistributing landmarks to an equally spaced distribution during the model construction phase improves the quality of the resulting models significantly if the shapes feature prominent bulges or other complex geometry. The distribution of landmarks on the training shapes is, beyond the correspondence issue, a crucial point in model construction.
Symmetries, invariants and generating functions: higher-order statistics of biased tracers
Munshi, Dipak
2018-01-01
Gravitationally collapsed objects are known to be biased tracers of an underlying density contrast. Using symmetry arguments, generalised biasing schemes have recently been developed to relate the halo density contrast δh with the underlying density contrast δ, the divergence of velocity θ, and their higher-order derivatives. This is done by constructing invariants such as s, t, ψ, η. We show how the generating function formalism in Eulerian standard perturbation theory (SPT) can be used to show that many of the additional terms based on extended Galilean and Lifshitz symmetry actually do not make any contribution to the higher-order statistics of biased tracers. Other terms can also be drastically simplified, allowing us to write the vertices associated with δh in terms of the vertices of δ and θ, the higher-order derivatives, and the bias coefficients. We also compute the cumulant correlators (CCs) for two different tracer populations. These perturbative results are valid for tree-level contributions but at arbitrary order. We also take into account the stochastic nature of bias in our analysis. Extending previous results of a local polynomial model of bias, we express the one-point cumulants S_N and their two-point counterparts, the CCs C_{pq}, of biased tracers in terms of those of their underlying density contrast counterparts. As a by-product of our calculation we also discuss the results using approximations based on Lagrangian perturbation theory (LPT).
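For reference, the normalized one- and two-point cumulants mentioned here have standard definitions in perturbation theory; the notation below follows the common convention and may differ in detail from the paper:

```latex
S_N \equiv \frac{\langle \delta^N \rangle_c}{\langle \delta^2 \rangle^{N-1}},
\qquad
C_{pq} \equiv \frac{\langle \delta_1^{\,p}\,\delta_2^{\,q}\rangle_c}
{\langle \delta^2\rangle^{p+q-2}\,\langle \delta_1\delta_2\rangle}.
```

For a local polynomial bias expansion, δh = Σ_k (b_k/k!) δ^k, the classic tree-level relation S_3^h = b_1^{-1}(S_3 + 3c_2) with c_2 = b_2/b_1 illustrates how tracer cumulants are expressed through those of the underlying field and the bias coefficients.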
Chapter 3 – Phenomenology of Tsunamis: Statistical Properties from Generation to Runup
Geist, Eric L.
2015-01-01
Observations related to tsunami generation, propagation, and runup are reviewed and described in a phenomenological framework. In the three coastal regimes considered (near-field broadside, near-field oblique, and far field), the observed maximum wave amplitude is associated with different parts of the tsunami wavefield. The maximum amplitude in the near-field broadside regime is most often associated with the direct arrival from the source, whereas in the near-field oblique regime, the maximum amplitude is most often associated with the propagation of edge waves. In the far field, the maximum amplitude is most often caused by the interaction of the tsunami coda that develops during basin-wide propagation and the nearshore response, including the excitation of edge waves, shelf modes, and resonance. Statistical distributions that describe tsunami observations are also reviewed, both in terms of spatial distributions, such as coseismic slip on the fault plane and near-field runup, and temporal distributions, such as wave amplitudes in the far field. In each case, fundamental theories of tsunami physics are heuristically used to explain the observations.
International Nuclear Information System (INIS)
Ferng, Y.-M.; Fan, C.N.; Pei, B.S.; Li, H.-N.
2008-01-01
A steam generator (SG) plays a significant role not only with respect to the primary-to-secondary heat transfer but also as a fission product barrier to prevent the release of radionuclides. Tube plugging is an efficient way to avoid releasing radionuclides when SG tubes are severely degraded. However, this remedial action may cause a decrease of SG heat transfer capability, especially in transient or accident conditions. It is therefore crucial for the plant staff to understand the trend of plugged tubes for SG operation and maintenance. Statistical methodologies are proposed in this paper to predict this trend. The accumulated numbers of SG plugged tubes versus operation time are predicted using the Weibull and log-normal distributions, which correspond well with the plant measured data from a selected pressurized water reactor (PWR). With the help of these predictions, the accumulated number of SG plugged tubes can be reasonably extrapolated to the 40-year operation lifetime (or even longer than 40 years) of a PWR. This information can assist the plant policymakers to determine whether and when an SG must be replaced.
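The kind of extrapolation described can be sketched with a Weibull cumulative distribution fitted by linearized least squares (ln(-ln(1-F)) is linear in ln t). The inspection data below are invented for illustration; the paper's actual fitting procedure and data differ:

```python
import math

def fit_weibull_cdf(times, cum_fraction):
    # Linearized least-squares fit of F(t) = 1 - exp(-(t/eta)^beta):
    # ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(1.0 - f)) for f in cum_fraction]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    beta = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))
    eta = math.exp(xbar - ybar / beta)
    return beta, eta

def plugged_tubes(t, beta, eta, n_tubes):
    # Expected accumulated plugged tubes at time t (years)
    return n_tubes * (1.0 - math.exp(-(t / eta) ** beta))

# Hypothetical inspection history: years of operation vs. fraction of tubes plugged
years = [5, 10, 15, 20]
frac = [0.001, 0.004, 0.009, 0.016]
beta, eta = fit_weibull_cdf(years, frac)
forecast_40y = plugged_tubes(40, beta, eta, n_tubes=5000)  # 40-year extrapolation
```

A log-normal CDF could be fitted the same way by probit-transforming F instead.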
Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy
2016-02-01
Because the number and diversity of genetically modified (GM) crops has significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
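In its simplest form, predicting the number of reads needed reduces to a binomial detection question: how many reads N give at least one read covering the target sequence with a chosen confidence? The sketch below illustrates only that calculation; the genome size, transgene length, and GM fraction are hypothetical, and the paper's framework accounts for further influential factors (processing, integration proof, event identification):

```python
import math

def reads_for_detection(target_fraction, confidence=0.99):
    # Smallest N such that P(at least one target read) = 1 - (1 - p)^N >= confidence,
    # where p is the fraction of sequenced bases/reads hitting the target
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - target_fraction))

# Hypothetical: a 5 kb transgene in a 400 Mb rice genome, sample is 10% GM
p = (5e3 / 4e8) * 0.10
n_reads = reads_for_detection(p, confidence=0.99)
```

The steep growth of N as the GM fraction shrinks is one reason cost is the limiting factor the abstract mentions.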
Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance
Whitley, Cameron T.; Dietz, Thomas
2018-01-01
Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…
Getting realistic; Endstation Demut
Energy Technology Data Exchange (ETDEWEB)
Meyer, J.P.
2004-01-28
The fuel cell hype of the turn of the millennium has reached its end. The industry is getting realistic. If at all, fuel cell systems for private single-family and multiple dwellings will not be available until the next decade. With a Europe-wide field test, Vaillant intends to advance the PEM technology. [German original, translated: The fuel cell hype of the turn of the millennium has evaporated. The industry is practicing modesty. Market readiness of systems for single- and multi-family houses will, if at all, not be reached until the next decade. Vaillant wants to drive the development of PEM technology forward through a Europe-wide field test.] (orig.)
Directory of Open Access Journals (Sweden)
Qian eWang
2015-04-01
Full Text Available Results from numerous linkage and association studies have greatly deepened scientists' understanding of the genetic basis of many human diseases, yet some important questions remain unanswered. For example, although a large number of disease-associated loci have been identified from genome-wide association studies (GWAS) in the past 10 years, it is challenging to interpret these results as most disease-associated markers have no clear functional roles in disease etiology, and all the identified genomic factors only explain a small portion of disease heritability. With the help of next-generation sequencing (NGS), diverse types of genomic and epigenetic variations can be detected with high accuracy. More importantly, instead of using linkage disequilibrium to detect association signals based on a set of pre-set probes, NGS allows researchers to directly study all the variants in each individual, and therefore promises opportunities for identifying functional variants and a more comprehensive dissection of disease heritability. Although the current scale of NGS studies is still limited due to the high cost, the success of several recent studies suggests the great potential for applying NGS in genomic epidemiology, especially as the cost of sequencing continues to drop. In this review, we discuss several pioneer applications of NGS, summarize scientific discoveries for rare and complex diseases, and compare various study designs including targeted sequencing and whole-genome sequencing using population-based and family-based cohorts. Finally, we highlight recent advancements in statistical methods proposed for sequencing analysis, including group-based association tests, meta-analysis techniques, and annotation tools for variant prioritization.
Statistical analysis of the ratio of electric and magnetic fields in random fields generators
Serra, R.; Nijenhuis, J.
2013-01-01
In this paper we present statistical models of the ratio of random electric and magnetic fields in mode-stirred reverberation chambers. This ratio is based on the electric and magnetic field statistics derived for ideal reverberation conditions. It provides a further performance indicator for
Realistic Simulation of Rice Plant
Directory of Open Access Journals (Sweden)
Wei-long DING
2011-09-01
Full Text Available Existing research results on virtual modeling of the rice plant, however, are far from perfect compared with those for other crops, due to its complex structure and growth process. Techniques to visually simulate the architecture of the rice plant and its growth process are presented based on an analysis of the morphological characteristics at different stages. Firstly, simulations of the geometrical shape, the bending status and the structural distortion of rice leaves are conducted. Then, by using an improved model for bending deformation, the curved patterns of the panicle axis and various types of panicle branches are generated, and the spatial shape of the rice panicle is thereby created. A parametric L-system is employed to generate its topological structures, and a finite-state automaton is adopted to describe the development of geometrical structures. Finally, the computer visualization of three-dimensional morphologies of the rice plant at both organ and individual levels is achieved. The experimental results showed that the proposed methods of modeling the three-dimensional shapes of organs and simulating the growth of the rice plant are feasible and effective, and the generated three-dimensional images are realistic.
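A parametric L-system, as mentioned, generates topology by parallel string rewriting. The minimal sketch below uses invented symbols and rules (A = apex, I = internode, L = leaf, brackets delimit branches) purely to illustrate the mechanism; the paper's actual production rules for rice are more elaborate and stage-dependent:

```python
def lsystem(axiom, rules, iterations):
    # Rewrite every symbol in parallel at each step; symbols without a rule
    # (here the branch brackets and L) are copied unchanged.
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical rules: the apex lays down an internode and a leaf each step,
# and internodes elongate by doubling.
rules = {"A": "I[L]A", "I": "II"}
topology = lsystem("A", rules, 3)
```

A turtle-graphics interpreter would then map the resulting string to 3D geometry, with the finite-state automaton switching rule sets between growth stages.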
AutoBayes: A System for Generating Data Analysis Programs from Statistical Models
Fischer, Bernd; Schumann, Johann
2003-01-01
Data analysis is an important scientific task which is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: the development of a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis...
Statistics for PV, wind and biomass generators and their impact on distribution grid planning
Nykamp, Stefan; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria
2012-01-01
The integration of renewable energy generation leads to major challenges for distribution grid operators. When the feed-in of photovoltaic (PV), biomass and wind generators exceed significantly the local consumption, large investments are needed. To improve the knowledge on the interaction between
A Statistical Model for Hourly Large-Scale Wind and Photovoltaic Generation in New Locations
DEFF Research Database (Denmark)
Ekstrom, Jussi; Koivisto, Matti Juhani; Mellin, Ilkka
2017-01-01
The analysis of large-scale wind and photovoltaic (PV) energy generation is of vital importance in power systems where their penetration is high. This paper presents a modular methodology to assess the power generation and volatility of a system consisting of both PV plants (PVPs) and wind power...... of new PVPs and WPPs in system planning. The model is verified against hourly measured wind speed and solar irradiance data from Finland. A case study assessing the impact of the geographical distribution of the PVPs and WPPs on aggregate power generation and its variability is presented....
Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke
2007-01-01
Ultra pure water supplied inside the Fab is used in different tools at different stages of processing. Data of the particles measured in ultra pure water was compared with the defect density on wafers processed on these tools and a statistical relation is found Keywords— Yield, defect density,
One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
Using student models to generate feedback in a university course on statistical sampling
Tacoma, S.G.|info:eu-repo/dai/nl/411923080; Drijvers, P.H.M.|info:eu-repo/dai/nl/074302922; Boon, P.B.J.|info:eu-repo/dai/nl/203374207
2017-01-01
Due to the complexity of the topic and a lack of individual guidance, introductory statistics courses at university are often challenging. Automated feedback might help to address this issue. In this study, we explore the use of student models to provide feedback. The research question is how
International Nuclear Information System (INIS)
Guan, Dong; Wu, Jiu Hui; Jing, Li
2015-01-01
Highlights: • A random internal morphology and structure generation-growth method, termed the quartet structure generation set (QSGS), has been utilized based on stochastic cluster growth theory for numerically generating the various microstructures of porous metal materials. • Effects of different parameters such as thickness and porosity on the sound absorption performance of the generated structures are studied by the present method, and the obtained results are validated against an empirical model as well. • This method could be utilized to guide the design and fabrication of sound-absorbing porous metal materials. - Abstract: In this paper, a statistical method for predicting the sound absorption properties of porous metal materials is presented. To reflect the stochastic distribution characteristics of porous metal materials, a random internal morphology and structure generation-growth method, termed the quartet structure generation set (QSGS), has been utilized based on stochastic cluster growth theory for numerically generating the various microstructures of porous metal materials. Then, by using the transfer-function approach along with the QSGS tool, we investigate the sound absorbing performance of porous metal materials with complex stochastic geometries. The statistical method has been validated by the good agreement among the numerical results from this method, a previous empirical model, and the corresponding experimental data. Furthermore, the effects of different parameters such as thickness and porosity on the sound absorption performance of the generated structures are studied by the present method, and the obtained results are validated against an empirical model as well. Therefore, the present method is a reliable and robust method for predicting the sound absorption performance of porous metal materials, and could be utilized to guide the design and fabrication of sound-absorbing porous metal materials.
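The QSGS idea (random nucleation of solid cores, then probabilistic cluster growth until a target solid fraction is reached) can be illustrated with a simplified 2D sketch. All parameters are invented, and the uniform isotropic growth rule is an assumption for brevity; the published method typically uses direction-dependent growth probabilities:

```python
import random

def qsgs_2d(nx, ny, core_prob, growth_prob, target_solid_fraction, seed=42):
    # 1) Scatter solid cores at random with probability core_prob per cell.
    # 2) Repeatedly pick a cell; if solid, try to grow onto a random
    #    neighbour with probability growth_prob (periodic boundaries),
    #    until the target solid fraction is reached.
    rng = random.Random(seed)
    grid = [[rng.random() < core_prob for _ in range(nx)] for _ in range(ny)]
    solid = sum(map(sum, grid))
    target = target_solid_fraction * nx * ny
    while solid < target:
        y, x = rng.randrange(ny), rng.randrange(nx)
        if grid[y][x]:
            dy, dx = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            y2, x2 = (y + dy) % ny, (x + dx) % nx
            if not grid[y2][x2] and rng.random() < growth_prob:
                grid[y2][x2] = True
                solid += 1
    return grid

structure = qsgs_2d(64, 64, core_prob=0.02, growth_prob=0.5,
                    target_solid_fraction=0.30)
porosity = 1.0 - sum(map(sum, structure)) / (64 * 64)
```

The resulting binary microstructure would then feed an acoustic solver such as the transfer-function approach mentioned in the abstract.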
International Nuclear Information System (INIS)
Lee, Jae Bong; Park, Jae Hak; Kim, Hong Deok; Chung, Han Sub; Kim, Tae Ryong
2005-01-01
The growth of AVB wear in Model F steam generator tubes is predicted using the Monte Carlo method and statistical approaches. The statistical parameters that represent the characteristics of wear growth and wear initiation are derived from In-Service Inspection (ISI) Non-Destructive Evaluation (NDE) data. Based on the statistical approaches, wear growth models are proposed and applied to predict the wear distribution at the End Of Cycle (EOC). Probabilistic distributions of the number of wear flaws and the maximum wear depth at EOC are obtained from the analysis. Comparing the predicted EOC wear flaw data with the known EOC data, the usefulness of the proposed method is examined and satisfactory results are obtained
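A Monte Carlo scheme of the kind described (statistical models for flaw initiation and growth, propagated to EOC distributions) might be sketched as below. The Poisson initiation count, lognormal growth increments, and all numbers are hypothetical stand-ins for the ISI/NDE-derived parameters:

```python
import math
import random

def poisson(rng, lam):
    # Knuth's algorithm, adequate for small lambda
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_eoc(n_trials, existing_depths, init_rate, mu, sigma, seed=7):
    # One operating cycle per trial: existing flaws grow by a lognormal
    # increment; new flaws initiate with a Poisson-distributed count.
    # Returns (number of flaws, maximum depth) per trial.
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        depths = [d + rng.lognormvariate(mu, sigma) for d in existing_depths]
        depths += [rng.lognormvariate(mu, sigma)
                   for _ in range(poisson(rng, init_rate))]
        results.append((len(depths), max(depths)))
    return results

# Hypothetical ISI data: flaw depths in % through-wall
eoc = simulate_eoc(10000, existing_depths=[5.0, 8.0, 12.0],
                   init_rate=0.8, mu=0.0, sigma=0.5)
mean_flaws = sum(n for n, _ in eoc) / len(eoc)
```

Histogramming the two tuple components gives the EOC distributions of flaw count and maximum depth discussed in the abstract.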
Energy Technology Data Exchange (ETDEWEB)
Lee, Jae Bong; Park, Jae Hak [Chungbuk National Univ., Cheongju (Korea, Republic of); Kim, Hong Deok; Chung, Han Sub; Kim, Tae Ryong [Korea Electtric Power Research Institute, Daejeon (Korea, Republic of)
2005-07-01
The growth of AVB wear in Model F steam generator tubes is predicted using the Monte Carlo method and statistical approaches. The statistical parameters that represent the characteristics of wear growth and wear initiation are derived from In-Service Inspection (ISI) Non-Destructive Evaluation (NDE) data. Based on the statistical approaches, wear growth models are proposed and applied to predict the wear distribution at the End Of Cycle (EOC). Probabilistic distributions of the number of wear flaws and the maximum wear depth at EOC are obtained from the analysis. Comparing the predicted EOC wear flaw data with the known EOC data, the usefulness of the proposed method is examined and satisfactory results are obtained.
Awédikian , Roy; Yannou , Bernard
2012-01-01
International audience; With the growing complexity of industrial software applications, practitioners are looking for efficient and practical methods to validate their software. This paper develops a model-based statistical testing approach that automatically generates online and offline test cases for embedded software. It discusses an integrated framework that combines solutions for three major software testing research questions: (i) how to select test inputs; (ii) how to predict the expected...
Triangulating and guarding realistic polygons
Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.
2014-01-01
We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating
Simulating European wind power generation applying statistical downscaling to reanalysis data
DEFF Research Database (Denmark)
Gonzalez-Aparicio, I.; Monforti, F.; Volker, Patrick
2017-01-01
generation time series dataset for the EU-28 and neighbouring countries at hourly intervals and at different geographical aggregation levels (country, bidding zone and administrative territorial unit), for a 30 year period taking into account the wind generating fleet at the end of 2015. (C) 2017 The Authors...... and characteristics of the wind resource which is related to the accuracy of the approach in converting wind speed data into power values. One of the main factors contributing to the uncertainty in these conversion methods is the selection of the spatial resolution. Although numerical weather prediction models can...... could not be captured by the use of a reanalysis technique and could be translated into misinterpretations of the wind power peaks, ramping capacities, the behaviour of power prices, as well as bidding strategies for the electricity market. This study contributes to the understanding what is captured...
Statistical Evaluation of the Emissions Level Of CO, CO2 and HC Generated by Passenger Cars
Directory of Open Access Journals (Sweden)
Claudiu Ursu
2014-12-01
Full Text Available This paper aims to evaluate the differences in the levels of CO, CO2 and HC emissions generated by passenger cars under different operating regimes and at different times, in order to identify measures for reducing pollution. A sample of Dacia Logan passenger cars (n = 515), manufactured during 2004-2007, equipped with spark-ignition engines and assigned to emission standards EURO 3 (E3) and EURO 4 (E4), was analyzed. These cars were evaluated at periodic technical inspection (ITP) twice, in two idle regimes (slow idle and accelerated idle). Using the t test for paired samples (Paired Samples T Test), the results showed that there are significant differences between the emission levels (CO, CO2, HC) generated by the Dacia Logan passenger cars at the two assessments, and regression analysis showed that these differences are not significantly influenced by turnover differences.
Statistical properties of bidimensional patterns generated from delayed and extended maps
International Nuclear Information System (INIS)
Giacomelli, G.; Lepri, S.; Politi, A.
1995-01-01
The space-time chaotic patterns associated with a class of dynamical systems ranging from delayed to extended maps are investigated. All the systems are constructed in such a way that the corresponding two-dimensional (2D) representation is characterized by the same updating rule in the bulk. The main difference among them is the direction of the "time" axis in the plane. Despite the different causality relations among the various models, the resulting patterns are shown to be statistically equivalent. In particular, the Kolmogorov-Sinai entropy density assumes always the same value. Therefore, it can be considered as an absolute indicator, measuring the amount of disorder of a 2D pattern. The Kaplan-Yorke dimension density is instead rule dependent: this indicator alone cannot be used to quantify the degrees of freedom of a given pattern; one must further specify the direction of propagation in the plane
Quantum cryptography: towards realization in realistic conditions
Energy Technology Data Exchange (ETDEWEB)
Imoto, M; Koashi, M; Shimizu, K [NTT Basic Research Laboratories, 3-1 Morinosato-Wakamiya, Atsugi-shi, Kanagawa 243-01 (Japan); Huttner, B [Universite de Geneve, GAP-optique, 20, Rue de l'Ecole de Medecine CH-1211, Geneve 4 (Switzerland)
1997-05-11
Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography that takes realistic conditions into account. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author) 15 refs., 1 fig., 1 tab.
Quantum cryptography: towards realization in realistic conditions
International Nuclear Information System (INIS)
Imoto, M.; Koashi, M.; Shimizu, K.; Huttner, B.
1997-01-01
Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography that takes realistic conditions into account. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author)
Kato, Takeyoshi; Sugimoto, Hiroyuki; Suzuoki, Yasuo
We established a procedure for estimating regional electricity demand and regional potential capacity of distributed generators (DGs) by using a grid square statistics data set. A photovoltaic power system (PV system) for residential use and a co-generation system (CGS) for both residential and commercial use were taken into account. As an example, the result regarding Aichi prefecture was presented in this paper. The statistical data of the number of households by family-type and the number of employees by business category for about 4000 grid-square with 1km × 1km area was used to estimate the floor space or the electricity demand distribution. The rooftop area available for installing PV systems was also estimated with the grid-square statistics data set. Considering the relation between a capacity of existing CGS and a scale-index of building where CGS is installed, the potential capacity of CGS was estimated for three business categories, i.e. hotel, hospital, store. In some regions, the potential capacity of PV systems was estimated to be about 10,000kW/km2, which corresponds to the density of the existing area with intensive installation of PV systems. Finally, we discussed the ratio of regional potential capacity of DGs to regional maximum electricity demand for deducing the appropriate capacity of DGs in the model of future electricity distribution system.
Realistic Visualization of Virtual Views
DEFF Research Database (Denmark)
Livatino, Salvatore
2005-01-01
that can be impractical and sometime impossible. In addition, the artificial nature of data often makes visualized virtual scenarios not realistic enough. Not realistic in the sense that a synthetic scene is easy to discriminate visually from a natural scene. A new field of research has consequently...... developed and received much attention in recent years: Realistic Virtual View Synthesis. The main goal is a high fidelity representation of virtual scenarios while easing modeling and physical phenomena simulation. In particular, realism is achieved by the transfer to the novel view of all the physical...... phenomena captured in the reference photographs, (i.e. the transfer of photographic-realism). An overview of most prominent approaches in realistic virtual view synthesis will be presented and briefly discussed. Applications of proposed methods to visual survey, virtual cinematography, as well as mobile...
Directory of Open Access Journals (Sweden)
B. Azzouz
2007-01-01
Full Text Available A textile fibre mixture, as a multicomponent blend of variable fibres, raises the question of the proper method for predicting the characteristics of the final blend. The length diagram and the fibrogram of cotton are generated. Then the length distribution, the length diagram, and the fibrogram of a blend of different categories of cotton are determined. The length distributions by weight of five different categories of cotton (Egyptian, USA (Pima), Brazilian, USA (Upland), and Uzbekistani) are measured by AFIS. From these distributions, the length distribution, the length diagram, and the fibrogram by weight of four binary blends are expressed. The length parameters of these cotton blends are calculated and their variations are plotted against the mass fraction x of one component in the blend. These calculated parameters are compared to those of real blends. Finally, the selection of the optimal blends using the linear programming method, based on the hypothesis that the cotton blend parameters vary linearly as a function of the component ratios, is proved insufficient.
Energy Technology Data Exchange (ETDEWEB)
Li, Xuebao, E-mail: lxb08357x@ncepu.edu.cn; Cui, Xiang, E-mail: x.cui@ncepu.edu.cn; Ma, Wenzuo; Bian, Xingming; Wang, Donglai [State Key Laboratory of Alternate Electrical Power System with Renewable Energy Sources, North China Electric Power University, Beijing 102206 (China); Lu, Tiebing, E-mail: tiebinglu@ncepu.edu.cn [Beijing Key Laboratory of High Voltage and EMC, North China Electric Power University, Beijing 102206 (China); Hiziroglu, Huseyin [Department of Electrical and Computer Engineering, Kettering University, Flint, Michigan 48504 (United States)
2016-03-15
Corona-generated audible noise (AN) has become one of the decisive factors in the design of high-voltage direct-current (HVDC) transmission lines. The AN from transmission lines can be attributed to sound pressure pulses generated by the multiple corona sources formed on the conductor. In this paper, the time-domain characteristics of the sound pressure pulses generated by DC corona discharges formed over the surface of a stranded conductor are investigated systematically in a laboratory setting using a corona cage structure. The amplitudes of the sound pressure pulses and their time intervals are extracted by observing a direct correlation between corona current pulses and corona-generated sound pressure pulses. Based on the statistical characteristics, a stochastic model is presented for simulating the sound pressure pulses due to DC corona discharges occurring on conductors. The proposed stochastic model is validated by comparing the calculated and measured A-weighted sound pressure level (SPL). The proposed model is then used to analyze the influence of the pulse amplitudes and pulse rate on the SPL. Furthermore, a mathematical relationship is found between the SPL and the conductor diameter, electric field, and radial distance.
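A minimal numeric sketch of this kind of stochastic pulse model, assuming Poisson pulse arrivals and exponentially distributed amplitudes (illustrative choices, not the distributions fitted from the measurements in the paper), together with a crude rectangular-pulse SPL estimate:

```python
import math
import random

def simulate_pulse_train(rate_hz, duration_s, amp_mean_pa, seed=0):
    """Generate (time, amplitude) pairs for a corona pulse train.

    Inter-pulse intervals are exponential (Poisson arrivals) and pulse
    amplitudes are exponentially distributed; both are illustrative
    assumptions, not the paper's fitted statistics.
    """
    rng = random.Random(seed)
    t, pulses = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            break
        pulses.append((t, rng.expovariate(1.0 / amp_mean_pa)))
    return pulses

def sound_pressure_level(pulses, duration_s, pulse_width_s=1e-4):
    """Crude SPL estimate treating each pulse as a rectangle of fixed width."""
    p_ref = 20e-6  # reference sound pressure, Pa
    energy = sum(a * a * pulse_width_s for _, a in pulses)
    p_rms = math.sqrt(energy / duration_s)
    return 20.0 * math.log10(p_rms / p_ref)
```

Because the level is logarithmic in the mean-square pressure, doubling all pulse amplitudes in this sketch raises the SPL by about 6 dB and doubling the pulse rate by about 3 dB, the kind of amplitude/rate dependence the model is used to analyze.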
Bolotin, Arkady
2014-01-01
It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.
Dorfman, Kevin D
2018-02-01
The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.
Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin
2018-01-01
A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow for this task is to export the clinical data from the electronic data capture system and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who must implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement, and evaluate an open-source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies were used to evaluate the application's performance and functionality. The system is implemented as an open-source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data is stored in the application only as long as the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for obtaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analyses of statisticians, but it can serve as a starting point for their examination and reporting.
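The per-item descriptive statistics such a tool generates can be sketched roughly as follows; `describe_item` and its output layout are hypothetical illustrations, not the application's actual API:

```python
import math
from collections import Counter

def describe_item(values):
    """Basic per-item statistics of the kind the application reports.

    Numeric items get count/mean/standard deviation; anything else gets a
    frequency table. None and empty strings count as missing. Function
    name and output layout are hypothetical, not the tool's actual API.
    """
    present = [v for v in values if v not in (None, "")]
    summary = {"n": len(present), "missing": len(values) - len(present)}
    if not present:
        return summary
    try:
        nums = [float(v) for v in present]
    except ValueError:
        # non-numeric item: report category frequencies instead
        summary["frequencies"] = dict(Counter(present))
        return summary
    mean = sum(nums) / len(nums)
    var = (sum((x - mean) ** 2 for x in nums) / (len(nums) - 1)
           if len(nums) > 1 else 0.0)
    summary["mean"] = round(mean, 3)
    summary["sd"] = round(math.sqrt(var), 3)
    return summary
```

Applied to each item extracted from an ODM file, this yields the kind of per-item overview described, without per-item scripting by a statistician.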
Realistic and efficient 2D crack simulation
Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek
2010-04-01
Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and efficient runtime 2D crack/fracture simulation system, applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides an efficient neighbor-retrieval mechanism used for mesh element splitting and merging, with the minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors, including impact angle, impact energy, and material properties, are taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters from prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted for various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation) with diverse potential in military and civil M&S applications such as training and mission planning.
TMS modeling toolbox for realistic simulation.
Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong
2010-01-01
Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate TMS models, due to the difficulty of realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows creating FE models of the head with isotropic and anisotropic electrical conductivities in five different tissues of the head, and coils in 3D. The generated TMS model can be imported into FE software packages such as ANSYS for further efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.
Sommer, Philipp S.; Kaplan, Jed O.
2017-10-01
While a wide range of Earth system processes occur at daily and even subdaily timescales, many global vegetation and other terrestrial dynamics models historically used monthly meteorological forcing both to reduce computational demand and because global datasets were lacking. Recently, dynamic land surface modeling has moved towards resolving daily and subdaily processes, and global datasets containing daily and subdaily meteorology have become available. These meteorological datasets, however, cover only the instrumental era of the last approximately 120 years at best, are subject to considerable uncertainty, and represent extremely large data files with associated computational costs of data input/output and file transfer. For periods before the recent past or in the future, global meteorological forcing can be provided by climate model output, but the quality of these data at high temporal resolution is low, particularly for daily precipitation frequency and amount. Here, we present GWGEN, a globally applicable statistical weather generator for the temporal downscaling of monthly climatology to daily meteorology. Our weather generator is parameterized using a global meteorological database and simulates daily values of five common variables: minimum and maximum temperature, precipitation, cloud cover, and wind speed. GWGEN is lightweight, modular, and requires a minimal set of monthly mean variables as input. The weather generator may be used in a range of applications, for example, in global vegetation, crop, soil erosion, or hydrological models. While GWGEN does not currently perform spatially autocorrelated multi-point downscaling of daily weather, this additional functionality could be implemented in future versions.
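A toy version of such monthly-to-daily precipitation downscaling can be sketched with a first-order Markov chain for wet/dry occurrence; the transition probabilities and the exponential amount distribution below are illustrative assumptions, not GWGEN's parameters fitted from the global database:

```python
import random

def downscale_precipitation(monthly_total_mm, n_days, p_wet_after_wet=0.6,
                            p_wet_after_dry=0.2, seed=0):
    """Disaggregate a monthly precipitation total into daily amounts.

    A first-order Markov chain decides wet/dry occurrence; wet-day
    amounts are drawn from an exponential distribution and then rescaled
    so the days sum exactly to the monthly total. Transition
    probabilities and the amount distribution are illustrative
    assumptions, not GWGEN's fitted parameters.
    """
    rng = random.Random(seed)
    wet = rng.random() < p_wet_after_dry  # start from a "dry" prior day
    amounts = []
    for _ in range(n_days):
        amounts.append(rng.expovariate(1.0) if wet else 0.0)
        wet = rng.random() < (p_wet_after_wet if wet else p_wet_after_dry)
    total = sum(amounts)
    if total == 0.0:  # degenerate all-dry draw: put the total on one day
        amounts[rng.randrange(n_days)] = 1.0
        total = 1.0
    return [a * monthly_total_mm / total for a in amounts]
```

The rescaling step guarantees consistency with the monthly forcing, while the Markov chain controls the daily precipitation frequency that raw climate model output tends to get wrong.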
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
International Nuclear Information System (INIS)
Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min
2011-01-01
Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally there are two kinds of uncertainties required to be identified and quantified, which involve model uncertainties and plant status uncertainties. Particularly, it will take huge effort to systematically quantify individual model uncertainty of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full ranged BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. Regarding the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while CSAU methodology is applied to quantify the effect of plant status uncertainty on PCT calculation. Generally, DRHM methodology can generate about 80-100 K margin on PCT as compared to Appendix K bounding state LOCA analysis.
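The statistical treatment of plant-status uncertainty in such a methodology can be illustrated with a Wilks-style Monte Carlo sketch; the surrogate PCT function and the parameter ranges below are hypothetical stand-ins, since in DRHM the model evaluated would be a conservative Appendix K evaluation model:

```python
import random

def wilks_bounding_pct(surrogate_pct, n_runs=59, seed=0):
    """Statistical treatment of plant-status uncertainty, CSAU style.

    Samples uncertain plant parameters, evaluates a surrogate PCT model,
    and returns the largest of 59 results, which by Wilks' one-sided
    formula bounds the 95th-percentile PCT with 95% confidence. The
    parameter ranges here are hypothetical illustrations.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        power_fraction = rng.uniform(0.98, 1.02)        # initial core power
        accumulator_pressure = rng.uniform(0.95, 1.05)  # relative to nominal
        results.append(surrogate_pct(power_fraction, accumulator_pressure))
    return max(results)

# Hypothetical surrogate: PCT (K) rises with initial power and falls
# slightly with accumulator pressure.
def surrogate(power_fraction, accumulator_pressure):
    return (1000.0 + 2000.0 * (power_fraction - 1.0)
            - 300.0 * (accumulator_pressure - 1.0))
```

The 59-run maximum is the standard non-parametric way to claim a 95/95 bound without assuming a distribution for the PCT.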
International Nuclear Information System (INIS)
Hufnagel, Heike; Pennec, Xavier; Ayache, Nicholas; Ehrhardt, Jan; Handels, Heinz
2008-01-01
The novel algorithm for building a generative statistical shape model (gSSM) does not need one-to-one point correspondences but relies solely on point correspondence probabilities for the computation of mean shape and eigenmodes. It is well-suited for shape analysis on unstructured point sets. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Beranger, R [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1962-07-01
This report describes a random pulse generator adapted to nuclear instrumentation. After a short survey of the statistical nature of electronic signals, the different ways of generating pulses with a Poisson time distribution are studied. The final generator, built from a gaseous thyratron in a magnetic field, is then described. Several tests are indicated: counting-rate stability, Pearson's criterion, distribution of time intervals. Applications of the generator to 'whole testing' of nuclear instrumentation are then indicated for scalers, dead-time measurements, and time analyzers. In this application, pulse-height spectra have been produced by Poissonian sampling of a recurrent or random low-frequency signal. (author) [French] Cette etude decrit un generateur d'impulsions aleatoires et ses applications a l'instrumentation nucleaire. Apres un bref rappel sur la nature statistique des signaux en electronique nucleaire, sont passes en revue les principaux moyens d'obtenir des impulsions distribuees en temps suivant une loi de Poisson. Le generateur utilisant un thyratron a gaz dans un champ magnetique est ensuite decrit; diverses methodes de test sont appliquees (stabilite du taux de comptage, criterium de Pearson, spectre des intervalles ds temps). Les applications du generateur a l'electronique nucleaire dans le domaine des 'essais globaux' sont indiques: test des echelles de comptage et mesure des temps morts, test des analyseurs en temps apres division du taux de comptage par une puissance de deux, test des analyseurs multicanaux en amplitude. Pour cette derniere application, on a realise des spectres d'amplitudes suivant une loi connue, par echantillonnage poissonien d'un signal basse frequence recurrent ou aleatoire. (auteur)
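The Poisson character and the Pearson-criterion test mentioned above can be sketched in software; this is an illustrative check of a simulated pulse train, not the report's hardware generator:

```python
import random

def poisson_pulse_times(rate_hz, duration_s, seed=0):
    """Pulse times with exponential inter-arrival intervals (Poisson process)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            break
        times.append(t)
    return times

def pearson_statistic(times, duration_s, n_bins):
    """Pearson chi-square statistic of bin counts against a flat expectation.

    For a genuine Poisson pulse train the statistic should fall near
    n_bins - 1 (the degrees of freedom); a drifting or periodic
    generator inflates it.
    """
    counts = [0] * n_bins
    for t in times:
        counts[min(int(t / duration_s * n_bins), n_bins - 1)] += 1
    expected = len(times) / n_bins
    return sum((c - expected) ** 2 / expected for c in counts)
```

The same bin counts also give the counting-rate stability check, and a histogram of successive time differences reproduces the interval-distribution test.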
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is monitoring data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, yielding good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated by a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.
Progress in realistic LOCA analysis
Energy Technology Data Exchange (ETDEWEB)
Young, M Y; Bajorek, S M; Ohkawa, K [Westinghouse Electric Corporation, Pittsburgh, PA (United States)
1994-12-31
While the LOCA is a complex transient to simulate, the state of the art in thermal hydraulics has advanced sufficiently to allow its realistic prediction and the application of advanced methods to actual reactor design, as demonstrated by the methodology described in this paper. 6 refs, 5 figs, 3 tabs
Time management: a realistic approach.
Jackson, Valerie P
2009-06-01
Realistic time management and organization plans can improve productivity and the quality of life. However, these skills can be difficult to develop and maintain. The key elements of time management are goals, organization, delegation, and relaxation. The author addresses each of these components and provides suggestions for successful time management.
Triangulating and guarding realistic polygons
Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.
2008-01-01
We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes
Should scientific realists be platonists?
DEFF Research Database (Denmark)
Busch, Jacob; Morrison, Joe
2015-01-01
an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...
Cabell, R.; Delle Monache, L.; Alessandrini, S.; Rodriguez, L.
2015-12-01
Climate-based studies require large amounts of data in order to produce accurate and reliable results. Many of these studies have used 30-plus-year data sets in order to produce stable and high-quality results, and as a result, many such data sets are available, generally in the form of global reanalyses. While the analysis of these data leads to high-fidelity results, processing them can be very computationally expensive. This computational burden prevents the utilization of these data sets for certain applications, e.g., when rapid response is needed in crisis management and disaster planning scenarios resulting from the release of toxic material into the atmosphere. We have developed a methodology to reduce large climate datasets to more manageable sizes while retaining statistically similar results when used to produce ensembles of possible outcomes. We do this by employing a Self-Organizing Map (SOM) algorithm to analyze general patterns of meteorological fields over a regional domain of interest and produce a small set of "typical days" with which to generate the model ensemble. The SOM algorithm takes as input a set of vectors and generates a 2D map of representative vectors deemed most similar to the input set and to each other. Input predictors are selected that are correlated with the model output, which in our case is an atmospheric transport and dispersion (T&D) model that is highly dependent on surface winds and boundary-layer depth. To choose a subset of "typical days," each input day is assigned to its closest SOM map node vector and then ranked by distance. Each node vector is treated as a distribution, and days are sampled from it by percentile. Using a 30-node SOM, with sampling every 20th percentile, we have been able to reduce 30 years of Climate Forecast System Reanalysis (CFSR) data for the month of October to 150 "typical days." To estimate the skill of this approach, the "Measure of Effectiveness" (MOE) metric is used to compare area and overlap
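The "typical days" selection step can be sketched as follows, assuming the SOM node vectors have already been trained; `typical_days` is an illustrative reconstruction of the assign-rank-sample idea, not the authors' code:

```python
def typical_days(day_vectors, node_vectors, percentiles=(0, 20, 40, 60, 80)):
    """Pick representative day indices via nearest-node percentile sampling.

    Each day is assigned to its nearest node vector (squared Euclidean
    distance), days within a node are ranked by that distance, and one
    day index is taken at each requested percentile. Training the SOM is
    assumed to have happened already; node_vectors stand in for its
    output.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    buckets = [[] for _ in node_vectors]
    for day_index, day in enumerate(day_vectors):
        nearest = min(range(len(node_vectors)),
                      key=lambda i: dist2(day, node_vectors[i]))
        buckets[nearest].append((dist2(day, node_vectors[nearest]), day_index))

    chosen = set()
    for members in buckets:
        if not members:
            continue
        members.sort()  # rank by distance to the node vector
        for p in percentiles:
            rank = min(len(members) - 1, p * len(members) // 100)
            chosen.add(members[rank][1])
    return sorted(chosen)
```

With a 30-node map and every-20th-percentile sampling this yields at most 150 representative days, matching the reduction described in the abstract.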
The non-stationarity is a major concern for statistically downscaling climate change scenarios for impact assessment. This study is to evaluate whether a statistical downscaling method is fully applicable to generate daily precipitation under non-stationary conditions in a wide range of climatic zo...
Progress in realistic LOCA analysis
International Nuclear Information System (INIS)
Young, M.Y.; Bajorek, S.M.; Ohkawa, K.
2004-01-01
In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to calculate the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several millions of dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for
A Realistic Seizure Prediction Study Based on Multiclass SVM.
Direito, Bruno; Teixeira, César A; Sales, Francisco; Castelo-Branco, Miguel; Dourado, António
2017-05-01
A patient-specific algorithm for epileptic seizure prediction, based on multiclass support vector machines (SVM) and multi-channel high-dimensional feature sets, is presented. The feature sets, combined with multiclass classification and post-processing schemes, aim at the generation of alarms and a reduced influence of false positives. This study considers 216 patients from the European Epilepsy Database, including 185 patients with scalp EEG recordings and 31 with intracranial data. The strategy was tested over a total of 16,729.80 h of inter-ictal data, including 1206 seizures. We found an overall sensitivity of 38.47% and a false positive rate of 0.20 per hour. The performance of the method achieved statistical significance in 24 patients (11% of the patients). Despite the encouraging results previously reported on specific datasets, prospective demonstration on long-term EEG recordings has been limited. Our study presents a prospective analysis of a large, heterogeneous, multicentric dataset. The statistical framework, based on conservative assumptions, reflects a realistic approach compared to constrained datasets and/or in-sample evaluations. Improving these results, by defining an appropriate set of features able to improve the distinction between the pre-ictal and non-pre-ictal states and hence minimize the effect of confounding variables, remains a key aspect.
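The alarm-generation post-processing can be sketched as a k-of-n smoothing of per-epoch classifier outputs; the specific window, threshold, and refractory values below are illustrative assumptions, not the paper's tuned per-patient settings:

```python
def raise_alarms(predictions, preictal_label, window=10, threshold=7,
                 refractory=30):
    """Turn per-epoch classifier outputs into seizure-prediction alarms.

    An alarm fires when at least `threshold` of the last `window` epochs
    were classified as pre-ictal; a refractory period then suppresses
    further alarms. Window, threshold, and refractory values here are
    illustrative assumptions.
    """
    alarms = []
    recent = []          # sliding window of 0/1 pre-ictal flags
    silent_until = -1    # index until which alarms are suppressed
    for i, label in enumerate(predictions):
        recent.append(1 if label == preictal_label else 0)
        if len(recent) > window:
            recent.pop(0)
        if i > silent_until and sum(recent) >= threshold:
            alarms.append(i)
            silent_until = i + refractory
            recent.clear()
    return alarms
```

Requiring several pre-ictal votes in a row before alarming is one standard way to trade a little sensitivity for the reduced false-positive rate the study targets.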
Replicate This! Creating Individual-Level Data from Summary Statistics Using R
Morse, Brendan J.
2013-01-01
Incorporating realistic data and research examples into quantitative (e.g., statistics and research methods) courses has been widely recommended for enhancing student engagement and comprehension. One way to achieve these ends is to use a data generator to emulate the data in published research articles. "MorseGen" is a free data generator that…
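One way such a generator can emulate published descriptives is to re-standardize random draws so the sample mean and SD match the summary exactly; this is a sketch of the idea, not MorseGen itself:

```python
import math
import random

def emulate_sample(n, mean, sd, seed=0):
    """Generate n values whose sample mean and SD match a published summary.

    Random normal draws are z-scored and rescaled, so any standard
    analysis of the generated data reproduces the published descriptives
    exactly. A sketch of the technique, not MorseGen's implementation.
    """
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    return [mean + sd * (x - m) / s for x in xs]
```

Students analyzing the generated individual-level data then recover the article's reported mean and SD to within floating-point error, which is what makes the emulation useful for teaching.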
Realistic rhetoric and legal decision
Directory of Open Access Journals (Sweden)
João Maurício Adeodato
2017-06-01
Full Text Available The text aims to lay the foundations of a realistic rhetoric, from the descriptive perspective of how the legal decision actually takes place, without normative considerations. Aristotle's rhetorical idealism and its later prestige reduced rhetoric to the art of persuasion, eliminating important elements of sophistry, especially with regard to legal decision. It concludes with a rhetorical perspective of judicial activism in complex societies.
Realist cinema as world cinema
Nagib, Lucia
2017-01-01
The idea that "realism" is the common denominator across the vast range of productions normally labelled as "world cinema" is widespread and seemingly uncontroversial. Leaving aside oppositional binaries that define world cinema as the other of Hollywood or of classical cinema, this chapter will test the realist premise by locating it in the mode of production. It will define this mode as an ethics that engages filmmakers, at cinema's creative peaks, with the physical and historical environment,...
International Nuclear Information System (INIS)
O'Carroll, M.
1993-01-01
The author considers models of statistical mechanics and quantum field theory (in the Euclidean formulation) which are treated using renormalization group methods and where the action is a small perturbation of a quadratic action. The author obtains multiscale formulas for the generating and correlation functions after n renormalization group transformations which bring out the relation with the nth effective action. The author derives and compares the formulas for different RGs. The formulas for correlation functions involve (1) two propagators which are determined by a sequence of approximate wave function renormalization constants and renormalization group operators associated with the decomposition into scales of the quadratic form and (2) field derivatives of the nth effective action. For the case of the block field "δ-function" RG the formulas are especially simple and for asymptotically free theories only the derivatives at zero field are needed; the formulas have been previously used directly to obtain bounds on correlation functions using information obtained from the analysis of effective actions. The simplicity can be traced to an "orthogonality of scales" property which follows from an implicit wavelet structure. Other commonly used RGs do not have the "orthogonality of scales" property. 19 refs
Generalized Warburg impedance on realistic self-affine fractals ...
Indian Academy of Sciences (India)
Administrator
Generalized Warburg impedance on realistic self-affine fractals: Comparative study of statistically corrugated and isotropic roughness. RAJESH KUMAR and RAMA KANT, Journal of Chemical Sciences, Vol. 121, No. 5, September 2009, pp. 579–588. The expression for R_L^c(ω) on page 582, column 2, para 2, after eq. (8) should read as ...
Horvath , E.A.; Fosnight, E.A.; Klingebiel, A.A.; Moore, D.G.; Stone, J.E.; Reybold, W.U.; Petersen, G.W.
1987-01-01
A methodology has been developed to create a spatial database by referencing digital elevation, Landsat multispectral scanner data, and digitized soil premap delineations of a number of adjacent 7.5-min quadrangle areas to a 30-m Universal Transverse Mercator projection. Slope and aspect transformations are calculated from elevation data and grouped according to field office specifications. An unsupervised classification is performed on a brightness and greenness transformation of the spectral data. The resulting spectral, slope, and aspect maps of each of the 7.5-min quadrangle areas are then plotted and submitted to the field office to be incorporated into the soil premapping stages of a soil survey. A tabular database is created from spatial data by generating descriptive statistics for each data layer within each soil premap delineation. The tabular database is then entered into a database management system to be accessed by the field office personnel during the soil survey and to be used for subsequent resource management decisions.

Large amounts of data are collected and archived during resource inventories for public land management. Often these data are stored as stacks of maps or folders in a file system in someone's office, with the maps in a variety of formats and scales, and with various standards of accuracy depending on their purpose. This system of information storage and retrieval is cumbersome at best when several categories of information are needed simultaneously for analysis or as input to resource management models. Computers now provide the resource scientist with the opportunity to design increasingly complex models that require even more categories of resource-related information, thus compounding the problem.

Recently there has been much emphasis on the use of geographic information systems (GIS) as an alternative method for map data archives and as a resource management tool. Considerable effort has been devoted to the generation of tabular
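The slope and aspect transformations mentioned above can be computed directly from the elevation grid. A sketch assuming a 30 m cell size, rows indexed south-to-north, and columns west-to-east (axis conventions vary between datasets):

```python
import numpy as np

def slope_aspect(elev, cell_size=30.0):
    """Slope (degrees) and aspect (compass bearing of steepest descent,
    degrees clockwise from north) from a gridded DEM.
    Assumes axis 0 increases northward and axis 1 increases eastward."""
    dz_dy, dz_dx = np.gradient(elev, cell_size)   # d(elev)/d(north), d(elev)/d(east)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
    return slope, aspect

# A plane rising 1 m per cell toward the east: downslope faces west (270 deg)
elev = np.tile(np.arange(5, dtype=float), (5, 1))
slope, aspect = slope_aspect(elev)
```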
Tuukka Kaidesoja on Critical Realist Transcendental Realism
Directory of Open Access Journals (Sweden)
Groff Ruth
2015-09-01
Full Text Available I argue that critical realists think pretty much what Tuukka Kaidesoja says that he himself thinks, but also that Kaidesoja's objections to the views that he attributes to critical realists are not persuasive.
Realistic Real-Time Outdoor Rendering in Augmented Reality
Kolivand, Hoshang; Sunar, Mohd Shahrizal
2014-01-01
Realistic rendering of outdoor Augmented Reality (AR) has been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, the sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realism in AR systems. PMID:25268480
Bayesian inversion using a geologically realistic and discrete model space
Jaeggli, C.; Julien, S.; Renard, P.
2017-12-01
Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient handling of data by ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool; nonetheless, the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
International Nuclear Information System (INIS)
Yamaoka, Naoto; Watanabe, Wataru; Hontani, Hidekata
2010-01-01
When constructing a statistical point cloud model, one usually needs to calculate corresponding points, and the resulting statistical model differs depending on the method used to calculate them. This article examines how the choice of corresponding-point method affects statistical models of human organs. We validated the performance of the statistical models by registering an organ surface in a 3D medical image. We compare two methods for calculating corresponding points. The first, Generalized Multi-Dimensional Scaling (GMDS), determines the corresponding points from the shapes of two curved surfaces. The second, the entropy-based particle system, chooses corresponding points by statistically analyzing a number of curved surfaces. With these methods we construct the statistical models, and using these models we perform registration with the medical image. For the estimation we use non-parametric belief propagation, which estimates not only the position of the organ but also the probability density of the organ position. We evaluate how the two corresponding-point methods affect the statistical model through the change in probability density at each point. (author)
Statistical properties of superimposed stationary spike trains.
Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan
2012-06-01
The Poisson process is an often-employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
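A minimal sketch of a PPD spike-train generator and its superposition (not the authors' optimized algorithms, which are described in the paper): the underlying exponential intensity is chosen so that the effective firing rate equals the target rate after the dead-time is added.

```python
import numpy as np

def ppd_train(rate, dead_time, t_max, rng):
    """Poisson process with dead-time (PPD): exponential waiting times plus
    an absolute refractory period; the underlying intensity is chosen so
    the effective firing rate equals `rate`."""
    lam = rate / (1.0 - rate * dead_time)   # mean ISI = dead_time + 1/lam = 1/rate
    t, spikes = 0.0, []
    while True:
        t += dead_time + rng.exponential(1.0 / lam)
        if t > t_max:
            return np.array(spikes)
        spikes.append(t)

def superposition(n_trains, rate, dead_time, t_max, seed=1):
    """Pool n independent PPD trains into one superimposed spike train."""
    rng = np.random.default_rng(seed)
    return np.sort(np.concatenate([ppd_train(rate, dead_time, t_max, rng)
                                   for _ in range(n_trains)]))

pooled = superposition(n_trains=100, rate=5.0, dead_time=0.002, t_max=100.0)
```

As the abstract notes, the pooled train is not Poisson: its ISI statistics retain the regularity induced by each component's refractoriness.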
Luo, Li; Zhu, Yun; Xiong, Momiao
2012-06-01
Genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the variant-by-variant analysis paradigm of GWAS for common variants to the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information-content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole-genome low-coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information-content-based statistic, the generalized T(2), the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ(2) test, the weighted-sum statistic, and the variable-threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information-content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
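The collapsing step that group tests share can be sketched in a few lines. This is a generic CMC-style carrier test on simulated genotypes, not the paper's proposed statistic; the allele frequencies below are illustrative:

```python
import numpy as np
from scipy.stats import chi2_contingency

def collapsing_test(case_geno, control_geno):
    """Collapse rare variants into a single carrier indicator per subject and
    compare carrier frequencies between cases and controls (the collapsing
    step of CMC-type group tests)."""
    case_carrier = case_geno.sum(axis=1) > 0        # carries any rare allele
    ctrl_carrier = control_geno.sum(axis=1) > 0
    table = [[case_carrier.sum(), (~case_carrier).sum()],
             [ctrl_carrier.sum(), (~ctrl_carrier).sum()]]
    chi2, p, dof, expected = chi2_contingency(table)
    return p

rng = np.random.default_rng(0)
cases = rng.binomial(1, 0.02, size=(1000, 20))      # rare alleles enriched in cases
controls = rng.binomial(1, 0.01, size=(1000, 20))
print(collapsing_test(cases, controls))
```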
Resolution of climate model outputs are too coarse to be used as direct inputs to impact models for assessing climate change impacts on agricultural production, water resources, and eco-system services at local or site-specific scales. Statistical downscaling approaches are usually used to bridge th...
Exophobic Quasi-Realistic Heterotic String Vacua
Assel, Benjamin; Faraggi, Alon E; Kounnas, Costas; Rizos, John
2009-01-01
We demonstrate the existence of heterotic-string vacua that are free of massless exotic fields. The need to break the non-Abelian GUT symmetries in k=1 heterotic-string models by Wilson lines, while preserving the GUT embedding of the weak-hypercharge and the GUT prediction sin^2\\theta_w(M(GUT))=3/8, necessarily implies that the models contain states with fractional electric charge. Such states are severely restricted by observations, and must be confined or sufficiently massive and diluted. We construct the first quasi-realistic heterotic-string models in which the exotic states do not appear in the massless spectrum, and only exist, as they must, in the massive spectrum. The SO(10) GUT symmetry is broken to the Pati-Salam subgroup. Our PS heterotic-string models contain adequate Higgs representations to break the GUT and electroweak symmetry, as well as colour Higgs triplets that can be used for the missing partner mechanism. By statistically sampling the space of Pati-Salam vacua we demonstrate the abundan...
Simulation of microarray data with realistic characteristics
Directory of Open Access Journals (Sweden)
Lehmussola Antti
2006-07-01
Full Text Available Abstract

Background: Microarray technologies have become common tools in biological research. As a result, a need for effective computational methods for data analysis has emerged. Numerous different algorithms have been proposed for analyzing the data. However, an objective evaluation of the proposed algorithms is not possible due to the lack of biological ground truth information. To overcome this fundamental problem, the use of simulated microarray data for algorithm validation has been proposed.

Results: We present a microarray simulation model which can be used to validate different kinds of data analysis algorithms. The proposed model is unique in the sense that it includes all the steps that affect the quality of real microarray data. These steps include the simulation of biological ground truth data, applying biological and measurement technology specific error models, and finally simulating the microarray slide manufacturing and hybridization. After all these steps are taken into account, the simulated data has realistic biological and statistical characteristics. The applicability of the proposed model is demonstrated by several examples.

Conclusion: The proposed microarray simulation model is modular and can be used in different kinds of applications. It includes several error models that have been proposed earlier and it can be used with different types of input data. The model can be used to simulate both spotted two-channel and oligonucleotide based single-channel microarrays. All this makes the model a valuable tool for example in validation of data analysis algorithms.
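The layered structure described (ground truth first, then error models) can be illustrated with a toy simulator. The parameters and error forms below are illustrative assumptions, not those of the published model:

```python
import numpy as np

def simulate_two_channel(n_genes=1000, n_diff=100, sigma_bio=0.3,
                         sigma_meas=0.2, seed=0):
    """Toy two-channel microarray simulation: a known set of differentially
    expressed genes (ground truth), then additive biological and measurement
    error models applied to the log-ratios."""
    rng = np.random.default_rng(seed)
    true_logratio = np.zeros(n_genes)
    diff_idx = rng.choice(n_genes, n_diff, replace=False)
    true_logratio[diff_idx] = rng.choice([-1.0, 1.0], n_diff)   # 2-fold changes
    observed = (true_logratio
                + rng.normal(0, sigma_bio, n_genes)    # biological error model
                + rng.normal(0, sigma_meas, n_genes))  # measurement error model
    return observed, diff_idx

observed, truth = simulate_two_channel()
```

Because the ground truth indices are returned alongside the data, any detection algorithm run on `observed` can be scored exactly, which is the point the abstract makes.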
Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano
2015-04-01
Two statistical approaches are analyzed for two different types of data sets: the seismicity generated by the subduction processes that occurred at the south Pacific coast of Mexico between 2005 and 2012, and the synthetic seismic data generated by a stick-slip experimental model. The statistical methods used for the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior observed in the statistical approaches used to analyse the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
Zhang, Yun; Baheti, Saurabh; Sun, Zhifu
2018-05-01
High-throughput bisulfite methylation sequencing, such as reduced representation bisulfite sequencing (RRBS), Agilent SureSelect Human Methyl-Seq (Methyl-seq), or whole-genome bisulfite sequencing, is commonly used for base-resolution methylome research. These data are represented either by the ratio of methylated cytosines versus total coverage at a CpG site or by the numbers of methylated and unmethylated cytosines. Multiple statistical methods can be used to detect differentially methylated CpGs (DMCs) between conditions, and these methods are often the basis for the next step of differentially methylated region identification. The ratio data have the flexibility of fitting many linear models, but the raw count data take the coverage information into consideration. There is an array of options in each datatype for DMC detection; however, it is not clear which is the optimal statistical method. In this study, we systematically evaluated four statistical methods on methylation ratio data and four methods on count-based data and compared their performances with regard to type I error control, sensitivity and specificity of DMC detection, and computational resource demands, using real RRBS data along with simulation. Our results show that the ratio-based tests are generally more conservative (less sensitive) than the count-based tests. However, some count-based methods have high false-positive rates and should be avoided. The beta-binomial model gives a good balance between sensitivity and specificity and is the preferred method. Selection of methods in different settings, signal versus noise, and sample size estimation are also discussed.
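Why the count-based beta-binomial model is attractive can be seen from the variance it allows. A small illustration (generic parameter values, not from the study): with between-sample overdispersion ρ, the beta-binomial variance exceeds the plain binomial variance by the factor 1 + (n−1)ρ.

```python
from scipy.stats import betabinom, binom

# Methylation counts at one CpG: coverage n, mean methylation ratio mu.
n, mu, rho = 30, 0.3, 0.1                 # coverage, mean ratio, overdispersion
a = mu * (1 - rho) / rho                  # matching beta-binomial shape parameters
b = (1 - mu) * (1 - rho) / rho
# Beta-binomial variance exceeds binomial variance by the factor 1 + (n-1)*rho
print(betabinom.var(n, a, b), binom.var(n, mu))
```

A test that assumes binomial variance will therefore be anti-conservative on overdispersed methylation counts, which is consistent with the high false-positive rates the study reports for some count-based methods.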
An Overview of Westinghouse Realistic Large Break LOCA Evaluation Model
Directory of Open Access Journals (Sweden)
Cesare Frepoli
2008-01-01
Full Text Available Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal-hydraulic code is the engine of the methodology, but a procedure is developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are also affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.
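The nonparametric element mentioned at the end typically rests on first-order statistics tolerance limits: with n code runs, the largest observed result bounds the 95th percentile of the output with 95% confidence once 1 − 0.95^n ≥ 0.95. A quick check of the familiar sample-size result:

```python
def min_runs(p=0.95, conf=0.95):
    """Smallest number of code runs n such that the largest observed value
    bounds the p-quantile with the given confidence
    (first-order nonparametric tolerance limit: 1 - p**n >= conf)."""
    n = 1
    while 1.0 - p ** n < conf:
        n += 1
    return n

print(min_runs())   # the well-known 95/95 first-order answer is 59 runs
```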
Directory of Open Access Journals (Sweden)
Konchada Pavan Kumar
2016-06-01
Full Text Available The presence of nanoparticles in heat exchangers has been shown to increase heat transfer. The present work focuses on heat transfer in a longitudinal finned tube heat exchanger. Experimentation is done on a longitudinal finned tube heat exchanger with pure water as the working fluid, and the outcome is compared numerically using a computational fluid dynamics (CFD) package based on the finite volume method for different flow rates. Further, a 0.8% volume fraction of aluminum oxide (Al2O3) nanofluid is considered on the shell side. The simulated nanofluid analysis has been carried out using a single-phase approach in CFD by updating the user-defined functions and expressions with the thermophysical properties of the selected nanofluid. These results are thereafter compared against the results obtained for pure water as the shell-side fluid. The entropy generated due to heat transfer and fluid flow is calculated for the nanofluid, and the analysis of entropy generation is carried out using the Taguchi technique. Analysis of variance (ANOVA) results show that the inlet temperature on the shell side has a more pronounced effect on entropy generation.
Realistically Rendering SoC Traffic Patterns with Interrupt Awareness
DEFF Research Database (Denmark)
Angiolini, Frederico; Mahadevan, Sharkar; Madsen, Jan
2005-01-01
to generate realistic test traffic. This paper presents a selection of applications using interrupt-based synchronization; a reference methodology to split such applications in execution subflows and to adjust the overall execution stream based upon hardware events; a reactive simulation device capable...... of correctly replicating such software behaviours in the MPSoC design phase. Additionally, we validate the proposed concept by showing cycle-accurate reproduction of a previously traced application flow....
A scan for models with realistic fermion mass patterns
International Nuclear Information System (INIS)
Bijnens, J.; Wetterich, C.
1986-03-01
We consider models which have no small Yukawa couplings unrelated to symmetry. This situation is generic in higher dimensional unification where Yukawa couplings are predicted to have strength similar to the gauge couplings. Generations have then to be differentiated by symmetry properties and the structure of fermion mass matrices is given in terms of quantum numbers alone. We scan possible symmetries leading to realistic mass matrices. (orig.)
Statistical Change Detection for Diagnosis of Buoyancy Element Defects on Moored Floating Vessels
DEFF Research Database (Denmark)
Blanke, Mogens; Fang, Shaoji; Galeazzi, Roberto
2012-01-01
... After residual generation, a statistical change detection scheme is derived from mathematical models supported by experimental data. To experimentally verify loss of an underwater buoyancy element, an underwater line breaker is designed to create realistic replication of abrupt faults. The paper analyses...... the properties of residuals and suggests a dedicated GLRT change detector based on a vector residual. Special attention is paid to threshold selection for non-ideal (non-IID) test statistics....
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten
2014-03-01
Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
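Fully developed speckle is commonly simulated by summing many random-phase scatterer contributions per resolution cell, which gives a Rayleigh-distributed envelope. A sketch of that standard approach (not necessarily the phantom's exact speckle models), applied to a simple echogenicity map:

```python
import numpy as np

def add_speckle(echogenicity, scatterers_per_cell=10, seed=0):
    """Fully developed speckle: sum random-phase complex scatterer
    contributions per resolution cell; the normalized envelope is
    Rayleigh-distributed, scaled by the tissue echogenicity map."""
    rng = np.random.default_rng(seed)
    shape = echogenicity.shape + (scatterers_per_cell,)
    phasors = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    envelope = np.abs(phasors.sum(axis=-1)) / np.sqrt(scatterers_per_cell)
    return echogenicity * envelope

phantom = np.ones((64, 64))
phantom[16:48, 16:48] = 2.0          # a bright inclusion, e.g. a lesion
img = add_speckle(phantom)
```

Because the noise is multiplicative, the speckle statistics scale with the underlying echogenicity, one of the properties a validation phantom needs to reproduce.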
Results of recent calculations using realistic potentials
International Nuclear Information System (INIS)
Friar, J.L.
1987-01-01
Results of recent calculations for the triton using realistic potentials with strong tensor forces are reviewed, with an emphasis on progress made using the many different calculational schemes. Several test problems are suggested. 49 refs., 5 figs
Sotsialistlik realist Keskküla
1998-01-01
The monograph "Socialist Realist Painting" by the English art critic Matthew Cullerne Bown, published in London in 1998, covers the Estonian artists Enn Põldroos, Nikolai Kormashov, and Ando Keskküla, and includes reproductions of paintings by Kormashov and Keskküla.
Derkach, Andriy; Chiang, Theodore; Gong, Jiafen; Addis, Laura; Dobbins, Sara; Tomlinson, Ian; Houlston, Richard; Pal, Deb K; Strug, Lisa J
2014-08-01
Sufficiently powered case-control studies with next-generation sequence (NGS) data remain prohibitively expensive for many investigators. If feasible, a more efficient strategy would be to include publicly available sequenced controls. However, these studies can be confounded by differences in sequencing platform; alignment, single nucleotide polymorphism and variant calling algorithms; read depth; and selection thresholds. Assuming one can match cases and controls on the basis of ethnicity and other potential confounding factors, and one has access to the aligned reads in both groups, we investigate the effect of systematic differences in read depth and selection threshold when comparing allele frequencies between cases and controls. We propose a novel likelihood-based method, the robust variance score (RVS), that substitutes genotype calls by their expected values given observed sequence data. We show theoretically that the RVS eliminates read depth bias in the estimation of minor allele frequency. We also demonstrate that, using simulated and real NGS data, the RVS method controls Type I error and has comparable power to the 'gold standard' analysis with the true underlying genotypes for both common and rare variants. An RVS R script and instructions can be found at strug.research.sickkids.ca, and at https://github.com/strug-lab/RVS. Supplementary data are available at Bioinformatics online.
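The substitution at the heart of the RVS, replacing a hard genotype call by its conditional expectation given the reads, can be sketched as a generic posterior-mean computation. The likelihood vector below is a hypothetical deep-coverage example, not output of the authors' pipeline:

```python
import numpy as np

def expected_genotype(geno_likelihoods, maf):
    """E[G | reads]: posterior mean genotype (0, 1, or 2 alternate alleles)
    from per-genotype sequencing likelihoods P(reads | G=g) and a
    Hardy-Weinberg population prior. The expected value replaces the hard
    genotype call, as in the RVS approach."""
    prior = np.array([(1 - maf) ** 2, 2 * maf * (1 - maf), maf ** 2])
    post = np.asarray(geno_likelihoods) * prior
    post = post / post.sum(axis=-1, keepdims=True)
    return post @ np.array([0.0, 1.0, 2.0])

# Deep coverage: likelihoods sharply favour the heterozygote, so E[G] is near 1
print(expected_genotype(np.array([1e-6, 1.0, 1e-6]), maf=0.1))
```

At low read depth the likelihoods flatten and E[G] shrinks toward the prior mean 2·maf, which is how the expected value accounts for depth, rather than committing to a possibly wrong call.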
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems during the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shift robustly if proper window size has been selected. The window size is the only one parameter needed to adjust and optimize. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2.
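The core per-window operation, estimating an integer shift by FFT cross-correlation, can be sketched in a few lines; the full method's shifts matrix and mode-filtering steps are omitted, and the Gaussian test peak is an illustrative stand-in for a chromatographic peak:

```python
import numpy as np

def fft_shift_estimate(reference, segment):
    """Signed integer shift (number of points to roll `segment` forward to
    align it with `reference`) via circular FFT cross-correlation, the core
    operation applied per window in moving-window alignment."""
    n = len(reference)
    xcorr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(segment))).real
    lag = int(np.argmax(xcorr))
    return lag if lag <= n // 2 else lag - n   # map to a signed shift

t = np.linspace(0, 1, 512)
ref = np.exp(-((t - 0.5) / 0.02) ** 2)     # a single Gaussian peak
shifted = np.roll(ref, -7)                 # peak elutes 7 points earlier
print(fft_shift_estimate(ref, shifted))    # -> 7
```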
Generating Realistic Dynamic Prices and Services for the Smart Grid
Pagani, G. A.; Aiello, M.
2014-01-01
The smart grid promises to change the way people manage their energy needs, to facilitate the inclusion of small-scale renewable sources, and to open the energy market to all. One of the enabling instruments is the real-time pricing of energy at the retail level: dynamic and flexible tariffs will
Realistic costs of carbon capture
Energy Technology Data Exchange (ETDEWEB)
Al Juaied, Mohammed (Harvard Univ., Cambridge, MA (US). Belfer Center for Science and International Affairs); Whitmore, Adam (Hydrogen Energy International Ltd., Weybridge (GB))
2009-07-01
There is a growing interest in carbon capture and storage (CCS) as a means of reducing carbon dioxide (CO2) emissions. However there are substantial uncertainties about the costs of CCS. Costs for pre-combustion capture with compression (i.e. excluding costs of transport and storage and any revenue from EOR associated with storage) are examined in this discussion paper for First-of-a-Kind (FOAK) plant and for more mature technologies, or Nth-of-a-Kind plant (NOAK). For FOAK plant using solid fuels the levelised cost of electricity on a 2008 basis is approximately 10 cents/kWh higher with capture than for conventional plants (with a range of 8-12 cents/kWh). Costs of abatement are found typically to be approximately US$150/tCO2 avoided (with a range of US$120-180/tCO2 avoided). For NOAK plants the additional cost of electricity with capture is approximately 2-5 cents/kWh, with costs of the range of US$35-70/tCO2 avoided. Costs of abatement with carbon capture for other fuels and technologies are also estimated for NOAK plants. The costs of abatement are calculated with reference to conventional SCPC plant for both emissions and costs of electricity. Estimates for both FOAK and NOAK are mainly based on cost data from 2008, which was at the end of a period of sustained escalation in the costs of power generation plant and other large capital projects. There are now indications of costs falling from these levels. This may reduce the costs of abatement and costs presented here may be 'peak of the market' estimates. If general cost levels return, for example, to those prevailing in 2005 to 2006 (by which time significant cost escalation had already occurred from previous levels), then costs of capture and compression for FOAK plants are expected to be US$110/tCO2 avoided (with a range of US$90-135/tCO2 avoided). For NOAK plants costs are expected to be US$25-50/tCO2. Based on these considerations a likely representative range of costs of abatement from CCS
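The cost-of-abatement figures relate to the electricity-cost premium through a simple identity: abatement cost = ΔLCOE / ΔtCO2 avoided. The emission intensities below are illustrative assumptions, chosen only to reproduce the abstract's roughly US$150/tCO2 FOAK figure from its 10 cents/kWh premium:

```python
def abatement_cost(lcoe_ref, lcoe_ccs, emis_ref, emis_ccs):
    """Cost of CO2 abatement in $/tCO2 avoided, from levelised costs of
    electricity ($/MWh) and emission intensities (tCO2/MWh) with and
    without capture, relative to the same reference plant."""
    return (lcoe_ccs - lcoe_ref) / (emis_ref - emis_ccs)

# Illustrative FOAK coal figures: a $100/MWh (10 c/kWh) premium and
# ~0.67 tCO2/MWh avoided give roughly $150/tCO2
print(abatement_cost(lcoe_ref=80.0, lcoe_ccs=180.0,
                     emis_ref=0.80, emis_ccs=0.133))
```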
Realist synthesis: illustrating the method for implementation research
Directory of Open Access Journals (Sweden)
Rycroft-Malone Jo
2012-04-01
Full Text Available Abstract Background Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question ‘what interventions and strategies are effective in enabling evidence-informed healthcare?’ The strengths and challenges of conducting realist review are also considered. Methods The realist approach involves identifying underlying causal mechanisms and exploring how they work under what conditions. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses. Results Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed and the screening procedure resulted in inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of extracted data into evidence tables, theming and formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area. Conclusions Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as
International Nuclear Information System (INIS)
Yan, Z.; Yu, J. H.; Holland, C.; Xu, M.; Mueller, S. H.; Tynan, G. R.
2008-01-01
The statistical properties of the turbulent Reynolds stress arising from collisional drift turbulence in a magnetized plasma column are studied and a physical picture of turbulence-driven shear flow generation is discussed. The Reynolds stress peaks near the maximal density gradient region, and is governed by the turbulence amplitude and cross-phase between the turbulent radial and azimuthal velocity fields. The amplitude probability distribution function (PDF) of the turbulent Reynolds stress is non-Gaussian and positively skewed at the density gradient maximum. The turbulent ion-saturation (Isat) current PDF shows that the region where the bursty Isat events are born coincides with the positively skewed non-Gaussian Reynolds stress PDF, which suggests that the bursts of particle transport appear to be associated with bursts of momentum transport as well. At the shear layer the density fluctuation radial correlation length has a strong minimum (∼4-6 mm, ∼0.5 C_s/Ω_ci, where C_s is the ion acoustic speed and Ω_ci is the ion gyrofrequency), while the azimuthal turbulence correlation length is nearly constant across the shear layer. The results link the behavior of the Reynolds stress, its statistical properties, generation of bursty radially going azimuthal momentum transport events, and the formation of the large-scale shear layer.
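The statistical quantities in this abstract can be illustrated with a short numerical sketch. The correlated Gaussian series below are purely synthetic stand-ins for measured radial and azimuthal velocity fluctuations (the 0.4 coupling coefficient is an arbitrary assumption, not a plasma parameter), yet they reproduce the qualitative result that a positive radial-azimuthal cross-correlation yields a positive mean Reynolds stress with a positively skewed amplitude PDF.

```python
import numpy as np

# Synthetic velocity fluctuations: v_theta is partly correlated with v_r,
# standing in for the finite cross-phase between the two turbulent fields.
rng = np.random.default_rng(0)
n = 100_000
v_r = rng.normal(size=n)                      # radial velocity fluctuations
v_theta = 0.4 * v_r + rng.normal(size=n)      # azimuthal, partly correlated

stress = v_r * v_theta                        # instantaneous Reynolds stress
mean_stress = stress.mean()                   # net turbulent momentum flux

# Skewness of the stress amplitude PDF; a product of positively correlated
# Gaussians is positively skewed, mirroring the non-Gaussian PDF reported
# at the density-gradient maximum.
skew = ((stress - mean_stress) ** 3).mean() / stress.std() ** 3
```

With zero cross-correlation the mean stress vanishes and the PDF is symmetric; the skewness here is entirely a signature of the correlated transport events.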
Realistic Visualization of Virtual Views and Virtual Cinema
DEFF Research Database (Denmark)
Livatino, Salvatore
2005-01-01
Realistic Virtual View Visualization is a new field of research which has received increasing attention in recent years. It is strictly related to the increased popularity of virtual reality and the spread of its applications, among which are virtual photography and cinematography. The use of computer...... generated characters, "virtual actors", in motion picture production increases every day. While the best-known computer graphics techniques have largely been adopted successfully in today's fiction films, it still remains very challenging to implement virtual actors which would resemble, visually, human...... beings. Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept...
Photo-Realistic Image Synthesis and Virtual Cinematography
DEFF Research Database (Denmark)
Livatino, Salvatore
2005-01-01
Realistic Virtual View Synthesis is a new field of research that has received increasing attention in recent years. It is strictly related to the growing popularity of virtual reality and the spread of its applications, among which are virtual photography and cinematography. The use of computer generated...... characters, "virtual actors", in motion picture production increases every day. While the best-known computer graphics techniques have largely been adopted successfully in today's fiction films, it still remains very challenging to implement virtual actors which would resemble, visually, human beings....... Interestingly, film directors have been looking at the recent progress achieved by the research community in the field of realistic visualization of virtual views, and they have successfully implemented state of the art research approaches in their productions. An innovative concept is then gaining consensus...
Realistic roofs over a rectilinear polygon
Ahn, Heekap
2013-11-01
Given a simple rectilinear polygon P in the xy-plane, a roof over P is a terrain over P whose faces are supported by planes through edges of P that make a dihedral angle π/4 with the xy-plane. According to this definition, some roofs may have faces isolated from the boundary of P or even local minima, which are undesirable for several practical reasons. In this paper, we introduce realistic roofs by imposing a few additional constraints. We investigate the geometric and combinatorial properties of realistic roofs and show that the straight skeleton induces a realistic roof with maximum height and volume. We also show that the maximum possible number of distinct realistic roofs over P is the binomial coefficient C((n-4)/2, ⌊(n-4)/4⌋) when P has n vertices. We present an algorithm that enumerates a combinatorial representation of each such roof in O(1) time per roof without repetition, after O(n^4) preprocessing time. We also present an O(n^5)-time algorithm for computing a realistic roof with minimum height or volume. © 2013 Elsevier B.V.
Development of a realistic human airway model.
Lizal, Frantisek; Elcner, Jakub; Hopke, Philip K; Jedelsky, Jan; Jicha, Miroslav
2012-03-01
Numerous models of human lungs with various levels of idealization have been reported in the literature; consequently, results acquired using these models are difficult to compare to in vivo measurements. We have developed a set of model components based on realistic geometries, which permits the analysis of the effects of subsequent model simplification. A realistic digital upper airway geometry, except for the lack of an oral cavity, has been created which proved suitable both for computational fluid dynamics (CFD) simulations and for the fabrication of physical models. Subsequently, an oral cavity was added to the tracheobronchial geometry. The airway geometry including the oral cavity was adjusted to enable fabrication of a semi-realistic model. Five physical models were created based on these three digital geometries. Two optically transparent models, one with and one without the oral cavity, were constructed for flow velocity measurements; two realistic segmented models, one with and one without the oral cavity, were constructed for particle deposition measurements; and a semi-realistic model with glass cylindrical airways was developed for optical measurements of flow velocity and in situ particle size measurements. One-dimensional phase Doppler anemometry measurements were made and compared to the CFD calculations for this model, and good agreement was obtained.
International Nuclear Information System (INIS)
Riccardella, P.C.; Staples, J.F.; Kandra, J.T.
2009-01-01
Inspections of steam generator tubing are performed in U.S. PWRs as part of the Steam Generator Management Program. Westinghouse has recently completed a technical justification demonstrating that in steam generators with thermally treated Ni-Cr Alloy (Alloy 600TT) tubes that are hydraulically expanded into low alloy steel (SA-508) tubesheets, flaws in the region of the tubes below a certain distance from the top of the tubesheet, denoted H*, will not result in reactor coolant pressure boundary breach nor unacceptable primary-to-secondary leakage. This is because, even if a flaw in this region were to result in complete tube severance, if the length of undegraded tube in the tubesheet exceeds H*, neither operating nor accident loadings create sufficient pull-out forces to overcome the frictional forces between the tube and tubesheet. One key component of this technical justification is the differential thermal expansion between the tube and tubesheet, since a significant portion of the pullout strength of the hydraulically expanded tube-to-tubesheet joint is due to mechanical interference resulting from the larger expansion of the tubing relative to the tubesheet at a given temperature. To address this phenomenon, a detailed statistical evaluation of coefficient of thermal expansion (CTE) data for the tubesheet material (SA-508) and the tube material (thermally treated Alloy-600) was performed. Data used in the evaluation included existing test results obtained from a number of sources as well as extensive new laboratory data developed specifically for this purpose. The evaluation resulted in recommended statistical distributions of this property for the two materials including their means and probabilistic variability. In addition, it was determined that the CTE values reported in the ASME Code (Section II) represent reasonably conservative mean values for both the tubesheet and tubing material. (author)
Jathar, S. H.; Cappa, C. D.; Wexler, A. S.; Seinfeld, J. H.; Kleeman, M. J.
2016-02-01
Multi-generational oxidation of volatile organic compound (VOC) oxidation products can significantly alter the mass, chemical composition and properties of secondary organic aerosol (SOA) compared to calculations that consider only the first few generations of oxidation reactions. However, the most commonly used state-of-the-science schemes in 3-D regional or global models that account for multi-generational oxidation (1) consider only functionalization reactions but do not consider fragmentation reactions, (2) have not been constrained to experimental data and (3) are added on top of existing parameterizations. The incomplete description of multi-generational oxidation in these models has the potential to bias source apportionment and control calculations for SOA. In this work, we used the statistical oxidation model (SOM) of Cappa and Wilson (2012), constrained by experimental laboratory chamber data, to evaluate the regional implications of multi-generational oxidation considering both functionalization and fragmentation reactions. SOM was implemented into the regional University of California at Davis / California Institute of Technology (UCD/CIT) air quality model and applied to air quality episodes in California and the eastern USA. The mass, composition and properties of SOA predicted using SOM were compared to SOA predictions generated by a traditional two-product model to fully investigate the impact of explicit and self-consistent accounting of multi-generational oxidation. Results show that SOA mass concentrations predicted by the UCD/CIT-SOM model are very similar to those predicted by a two-product model when both models use parameters that are derived from the same chamber data. Since the two-product model does not explicitly resolve multi-generational oxidation reactions, this finding suggests that the chamber data used to parameterize the models captures the majority of the SOA mass formation from multi-generational oxidation under the conditions
Iterated interactions method. Realistic NN potential
International Nuclear Information System (INIS)
Gorbatov, A.M.; Skopich, V.L.; Kolganova, E.A.
1991-01-01
The method of the iterated potential is tested in the case of realistic fermionic systems. As a basis for comparison, calculations of the 16O system (using various versions of realistic NN potentials) by means of the angular potential-function method, as well as operators of pairing correlation, were used. The convergence of the genealogical series is studied for the central Malfliet-Tjon potential. In addition, the mathematical technique of microscopic calculations is improved: new equations for correlators in odd states are suggested and the technique of leading terms was applied for the first time to calculations of heavy p-shell nuclei in the basis of angular potential functions
Are there realistically interpretable local theories?
International Nuclear Information System (INIS)
d'Espagnat, B.
1989-01-01
Although it rests on strongly established proofs, the statement that no realistically interpretable local theory is compatible with some experimentally testable predictions of quantum mechanics seems at first sight to be incompatible with a few general ideas and clear-cut statements occurring in recent theoretical work by Griffiths, Omnes, and Ballentine and Jarrett. It is shown here that in fact none of the developments due to these authors can be considered as a realistically interpretable local theory, so that there is no valid reason for suspecting that the existing proofs of the statement in question are all flawed
A Radiosity Approach to Realistic Image Synthesis
1992-12-01
AFIT/GCE/ENG/92D-09 (AD-A259 082). A Radiosity Approach to Realistic Image Synthesis. Thesis, Richard L. Remington, Captain, USAF. Approved for public release; distribution unlimited. ...assistance in creating the input geometry file for the AWACS aircraft interior. Without his assistance, a good model for the diffuse radiosity implementation
Ghose, Soumya; Greer, Peter B.; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A.
2017-11-01
In MR only radiation therapy planning, generation of the tissue specific HU map directly from the MRI would eliminate the need of CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans generated from standard T2 weighted MR pelvic scans in prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole pelvis CT and T2w MRI pairs were used as training images. Advanced tissue specific non-linear regression models to predict HU for the fat, muscle, bladder and air were created from co-registered CT-MRI image pairs. On a test case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization based clustering model. The CT bone in the training database that was most ‘similar’ to the segmented bone was then transformed with deformable registration to create the sCT component of the test case T2w MRI bone tissue. Predictions for the bone, air and soft tissue from the separate regression models were successively combined to generate a whole pelvis sCT. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same IMRT dose plan was found to be 0.3%+/-0.9% (mean ± standard deviation) for 39 patients. The 3D Gamma pass rate was 99.8+/-0.00 (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.
Realistic terrain visualization based on 3D virtual world technology
Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai
2010-11-01
The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing impacts of potential future climate change scenarios in precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested within four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathways (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution derived transformation and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide better approximation to the basic
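Two of the transformation techniques named in this abstract, first-moment delta change and empirical quantile mapping, can be sketched on synthetic gamma-distributed daily precipitation. All series and parameters below are invented for illustration; they are not CORDEX or Spain02 data.

```python
import numpy as np

# Synthetic daily precipitation (mm/day): observed historical record, model
# control run, and model scenario run for the future horizon.
rng = np.random.default_rng(1)
obs_hist = rng.gamma(0.8, 5.0, size=3000)    # observed, 1971-2000 analogue
mod_hist = rng.gamma(0.7, 6.0, size=3000)    # model, control period
mod_fut = rng.gamma(0.7, 7.0, size=3000)     # model, 2071-2100 analogue

# (1) Delta change, first-moment correction: perturb the observed series by
# the model's simulated relative change in the mean.
delta_series = obs_hist * (mod_fut.mean() / mod_hist.mean())

# (2) Empirical quantile mapping: pass each future model value through the
# model-control empirical CDF, then read off the observed quantile.
def quantile_map(x, model_ref, obs_ref):
    probs = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    return np.quantile(obs_ref, np.clip(probs, 0.0, 1.0))

qm_series = quantile_map(mod_fut, mod_hist, obs_hist)
```

The delta-change series inherits the observed day-to-day structure but a shifted mean, while the quantile-mapped series inherits the model's temporal structure with the observed distribution, which is the essential trade-off between the two families of techniques.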
Directory of Open Access Journals (Sweden)
Dongkyun Kim
2014-01-01
Full Text Available A novel approach for a Poisson cluster stochastic rainfall generator was validated in its ability to reproduce important rainfall and watershed response characteristics at 104 locations in the United States. The suggested novel approach, The Hybrid Model (THM), as compared to the traditional Poisson cluster rainfall modeling approaches, has an additional capability to account for the interannual variability of rainfall statistics. THM and a traditional Poisson cluster rainfall model (the modified Bartlett-Lewis rectangular pulse model) were compared in their ability to reproduce the characteristics of extreme rainfall and watershed response variables such as runoff and peak flow. The results of the comparison indicate that THM generally outperforms the traditional approach in reproducing the distributions of peak rainfall, peak flow, and runoff volume. In addition, THM significantly outperformed the traditional approach in reproducing extreme rainfall by 2.3% to 66% and extreme flow values by 32% to 71%.
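A heavily simplified Poisson-cluster rectangular-pulse generator in the spirit of the modified Bartlett-Lewis model can be sketched as follows. The parameter values are illustrative assumptions, not fitted to any of the 104 study locations, and the hourly discretisation of each pulse is deliberately coarse.

```python
import numpy as np

rng = np.random.default_rng(2)

def generate_rainfall(hours=24 * 30, storm_rate=0.02, cell_rate=0.5,
                      cell_life=3.0, mean_duration=2.0, mean_intensity=1.5):
    """Hourly rainfall series (mm/h) as a sum of rectangular pulses.

    Storms arrive as a Poisson process (rate storm_rate per hour); within a
    storm's exponentially distributed lifetime, cells arrive as a second
    Poisson process, each depositing a rectangular pulse with exponential
    duration and intensity.
    """
    rain = np.zeros(hours)
    t = rng.exponential(1 / storm_rate)          # first storm origin
    while t < hours:
        cell_t = t
        storm_end = t + rng.exponential(cell_life)
        while cell_t < storm_end:
            dur = rng.exponential(mean_duration)
            inten = rng.exponential(mean_intensity)
            lo, hi = int(cell_t), min(int(cell_t + dur) + 1, hours)
            rain[lo:hi] += inten                 # deposit the pulse
            cell_t += rng.exponential(1 / cell_rate)
        t += rng.exponential(1 / storm_rate)     # next storm origin
    return rain

series = generate_rainfall()
```

THM's distinguishing feature, resampling or conditioning these parameters year by year to capture interannual variability, would sit on top of this core loop; the traditional model keeps one parameter set for all years.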
MetAssimulo:Simulation of Realistic NMR Metabolic Profiles
Directory of Open Access Journals (Sweden)
De Iorio Maria
2010-10-01
Full Text Available Abstract Background Probing the complex fusion of genetic and environmental interactions, metabolic profiling (or metabolomics/metabonomics), the study of small molecules involved in metabolic reactions, is a rapidly expanding 'omics' field. A major technique for capturing metabolite data is 1H-NMR spectroscopy and this yields highly complex profiles that require sophisticated statistical analysis methods. However, experimental data is difficult to control and expensive to obtain. Thus data simulation is a productive route to aid algorithm development. Results MetAssimulo is a MATLAB-based package that has been developed to simulate 1H-NMR spectra of complex mixtures such as metabolic profiles. Drawing data from a metabolite standard spectral database in conjunction with concentration information input by the user or constructed automatically from the Human Metabolome Database, MetAssimulo is able to create realistic metabolic profiles containing large numbers of metabolites with a range of user-defined properties. Current features include the simulation of two groups ('case' and 'control'), specified by means and standard deviations of concentrations for each metabolite. The software enables addition of spectral noise with a realistic autocorrelation structure at user controllable levels. A crucial feature of the algorithm is its ability to simulate both intra- and inter-metabolite correlations, the analysis of which is fundamental to many techniques in the field. Further, MetAssimulo is able to simulate shifts in NMR peak positions that result from matrix effects such as pH differences which are often observed in metabolic NMR spectra and pose serious challenges for statistical algorithms. Conclusions No other software is currently able to simulate NMR metabolic profiles with such complexity and flexibility. This paper describes the algorithm behind MetAssimulo and demonstrates how it can be used to simulate realistic NMR metabolic profiles with
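The core simulation step, a spectrum assembled from Lorentzian peaks with subject-level concentration variability plus noise, can be caricatured in a few lines. The peak positions and concentrations below are invented; MetAssimulo itself is a MATLAB package that draws them from a spectral database and the HMDB, and adds autocorrelated rather than white noise.

```python
import numpy as np

rng = np.random.default_rng(5)
ppm = np.linspace(0.5, 9.5, 2000)              # chemical-shift axis

def lorentzian(ppm, center, width=0.01):
    """Unit-height Lorentzian line shape centred at `center` ppm."""
    return width ** 2 / ((ppm - center) ** 2 + width ** 2)

# Toy "metabolites": center ppm -> mean concentration (arbitrary units).
peaks = {1.33: 1.0, 3.05: 0.6, 7.42: 0.3}

def simulate_spectrum(noise_sd=0.01):
    """One subject's spectrum: concentration-weighted peaks plus noise."""
    spec = np.zeros_like(ppm)
    for center, mean_conc in peaks.items():
        conc = rng.normal(mean_conc, 0.1 * mean_conc)  # inter-subject variation
        spec += conc * lorentzian(ppm, center)
    return spec + rng.normal(scale=noise_sd, size=ppm.size)

case = simulate_spectrum()
```

Simulating a 'case' versus 'control' contrast amounts to drawing the concentrations from two different mean vectors, which is how the two-group feature described above operates conceptually.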
Acharya, N.; Frei, A.; Owens, E. M.; Chen, J.
2015-12-01
Watersheds located in the Catskill Mountains area, part of the eastern plateau climate region of New York, contribute about 90% of New York City's municipal water supply, serving 9 million New Yorkers with about 1.2 billion gallons of clean drinking water each day. The New York City Department of Environmental Protection has an ongoing series of studies to assess the potential impacts of climate change on the availability of high quality water in this water supply system. Recent studies identify increasing trends in total precipitation and in the frequency of extreme precipitation events in this region. The objectives of the present study are: to analyze the probabilistic structure of extreme precipitation based on historical observations; and to evaluate the abilities of stochastic weather generators (WGs), statistical models that produce synthetic weather time series based on observed statistical properties at a particular location, to simulate the statistical properties of extreme precipitation events over this region. The generalized extreme value (GEV) distribution has been applied to the annual block maxima of precipitation for 60 years (1950 to 2009) of observed data in order to estimate the events with return periods of 50, 75, and 100 years. These results were then used to evaluate a total of 13 WGs: 12 parametric WGs including all combinations of three different orders of Markov chain (MC) models (1st, 2nd, and 3rd) and four different probability distributions (exponential, gamma, skewed normal and mixed exponential); and one semi-parametric WG based on k-nearest neighbor bootstrapping. Preliminary results suggest that the three-parameter (skewed normal and mixed exponential distribution) and semi-parametric (k-nearest neighbor bootstrapping) WGs are more consistent with observations. It is also found that first order MC models perform as well as second or third order MC models.
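The block-maxima return-level calculation described above can be sketched as follows. For brevity this fits the Gumbel distribution (the shape-zero special case of the GEV) by the method of moments to a synthetic 60-year record, whereas the study fits the full three-parameter GEV to the Catskills observations.

```python
import numpy as np

# Synthetic annual precipitation maxima (mm/day) standing in for a
# 60-year observed record.
rng = np.random.default_rng(3)
annual_max = rng.gumbel(loc=60.0, scale=15.0, size=60)

# Method-of-moments Gumbel fit: scale from the sample std, location from
# the sample mean and the Euler-Mascheroni constant.
euler_gamma = 0.5772156649
scale = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
loc = annual_max.mean() - euler_gamma * scale

def return_level(T):
    """Value exceeded on average once every T years (Gumbel quantile)."""
    return loc - scale * np.log(-np.log(1 - 1 / T))

levels = {T: return_level(T) for T in (50, 75, 100)}
```

Evaluating a weather generator then amounts to simulating many synthetic 60-year records with it and checking whether their fitted return levels bracket the observed ones.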
On Realistically Attacking Tor with Website Fingerprinting
Directory of Open Access Journals (Sweden)
Wang Tao
2016-10-01
Full Text Available Website fingerprinting allows a local, passive observer monitoring a web-browsing client’s encrypted channel to determine her web activity. Previous attacks have shown that website fingerprinting could be a threat to anonymity networks such as Tor under laboratory conditions. However, there are significant differences between laboratory conditions and realistic conditions. First, in laboratory tests we collect the training data set together with the testing data set, so the training data set is fresh, but an attacker may not be able to maintain a fresh data set. Second, laboratory packet sequences correspond to a single page each, but for realistic packet sequences the split between pages is not obvious. Third, packet sequences may include background noise from other types of web traffic. These differences adversely affect website fingerprinting under realistic conditions. In this paper, we tackle these three problems to bridge the gap between laboratory and realistic conditions for website fingerprinting. We show that we can maintain a fresh training set with minimal resources. We demonstrate several classification-based techniques that allow us to split full packet sequences effectively into sequences corresponding to a single page each. We describe several new algorithms for tackling background noise. With our techniques, we are able to build the first website fingerprinting system that can operate directly on packet sequences collected in the wild.
Satellite Maps Deliver More Realistic Gaming
2013-01-01
When Redwood City, California-based Electronic Arts (EA) decided to make SSX, its latest snowboarding video game, it faced challenges in creating realistic-looking mountains. The solution was NASA's ASTER Global Digital Elevation Map, made available by the Jet Propulsion Laboratory, which EA used to create 28 real-life mountains from 9 different ranges for its award-winning game.
Realistic searches on stretched exponential networks
Indian Academy of Sciences (India)
We consider navigation or search schemes on networks which have a degree distribution of the form P(k) ∝ exp(−k^γ). In addition, the linking probability is taken to be dependent on social distances and is governed by a tunable parameter. The searches are realistic in the sense that not all search chains can be completed.
Blend Shape Interpolation and FACS for Realistic Avatar
Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila
2015-03-01
The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus to the rapid advancement of complex virtual human facial models. Face-to-face communication being the most natural form of human interaction, facial animation systems have become more attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors remains a challenging issue. A facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can be identified immediately via very subtle changes in facial expressions. Facial expressions, being a very complex as well as important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face, while the FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, namely anger, happiness, sadness and fear, with high fidelity. The results in perceiving realistic facial expressions for virtual human emotions based on facial skin color and texture may contribute towards the development of virtual reality and game environments in computer-aided graphics animation systems.
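The BSI step admits a compact sketch: a new expression is the neutral face plus a weighted sum of per-vertex offsets toward target shapes. The three-vertex "mesh", target shapes, and weights below are toy assumptions; in the paper, FACS-derived muscle activations drive the weights over a full face mesh.

```python
import numpy as np

# Toy neutral face: three vertices in 3D (a real mesh has thousands).
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

# Target blend shapes as vertex-wise offsets from the neutral face.
targets = {
    "happy": neutral + [[0.0, 0.1, 0.0], [0.0, 0.2, 0.0], [0.0, 0.0, 0.1]],
    "sad":   neutral + [[0.0, -0.1, 0.0], [0.0, -0.2, 0.0], [0.0, 0.0, -0.1]],
}

def blend(neutral, targets, weights):
    """Blend shape interpolation: neutral + sum_k w_k * (target_k - neutral)."""
    out = neutral.copy()
    for name, w in weights.items():
        out += w * (targets[name] - neutral)
    return out

face = blend(neutral, targets, {"happy": 0.7, "sad": 0.0})
```

Weight 0 reproduces the neutral face and weight 1 reproduces a target exactly; intermediate weights interpolate linearly, which is what makes the scheme cheap enough for real-time animation.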
Statistical Compression for Climate Model Output
Hammerling, D.; Guinness, J.; Soh, Y. J.
2017-12-01
Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
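A toy version of the compress/decompress idea stores only block means and a residual scale, then decompresses either to the conditional expectation (smooth) or to a conditional simulation that restores small-scale noise. This is a deliberate caricature: the actual method models space-time dependence and nonstationarity, rather than adding independent noise as done here.

```python
import numpy as np

# Toy "climate" series: a random walk standing in for one year of daily
# mean temperature anomalies at a single location.
rng = np.random.default_rng(4)
data = np.cumsum(rng.normal(size=365))
block = 5

# Compress: keep 73 block means plus a single residual scale parameter.
means = data.reshape(-1, block).mean(axis=1)
resid_sd = (data - np.repeat(means, block)).std()

# Decompress.
cond_exp = np.repeat(means, block)          # best estimate, oversmoothed
cond_sim = cond_exp + rng.normal(scale=resid_sd, size=data.size)
```

The conditional expectation minimises squared error but flattens the small-scale variability; the conditional simulation trades a little pointwise accuracy for fields whose roughness matches the original, which is the property the abstract emphasises.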
Separable expansion for realistic multichannel scattering problems
International Nuclear Information System (INIS)
Canton, L.; Cattapan, G.; Pisent, G.
1987-01-01
A new approach to the multichannel scattering problem with realistic local or nonlocal interactions is developed. By employing the negative-energy solutions of uncoupled Sturmian eigenvalue problems referring to simple auxiliary potentials, the coupling interactions appearing in the original multichannel problem are approximated by finite-rank potentials. By resorting to integral-equation techniques, the coupled-channel equations are then reduced to linear algebraic equations which can be straightforwardly solved. Compact algebraic expressions for the relevant scattering matrix elements are thus obtained. The convergence of the method is tested in the single-channel case with realistic optical potentials. Excellent agreement is obtained with a few terms in the separable expansion for both real and absorptive interactions
Realistic Approach for Phasor Measurement Unit Placement
DEFF Research Database (Denmark)
Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul
2015-01-01
This paper presents a realistic cost-effective model for optimal placement of phasor measurement units (PMUs) for complete observability of a power system considering practical cost implications. The proposed model considers hidden or otherwise unaccounted practical costs involved in PMU...... installation. Consideration of these hidden but significant and integral part of total PMU installation costs was inspired from practical experience on a real-life project. The proposed model focuses on the minimization of total realistic costs instead of a widely used theoretical concept of a minimal number...... of PMUs. The proposed model has been applied to IEEE 14-bus, IEEE 24-bus, IEEE 30-bus, New England 39-bus, and a large power system of 300 buses, as well as the real-life Danish grid. A comparison of the presented results with those reported by traditional methods has also been shown to justify the effectiveness...
Realistic molecular model of kerogen's nanostructure.
Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit
2016-05-01
Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp²/sp³ hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.
Non realist tendencies in new Turkish cinema
Can, İclal
2016-01-01
http://hdl.handle.net/11693/29111 Thesis (M.S.): Bilkent University, Department of Communication and Design, İhsan Doğramacı Bilkent University, 2016. Includes bibliographical references (leaves 113-123). The realist tendency which had been dominant in cinema became more apparent with Italian neorealism, affecting other national cinemas to a large extent. With changing and developing socio-economic and cultural dynamics, realism has gradually stopped being a natural const...
Security of quantum cryptography with realistic sources
International Nuclear Information System (INIS)
Lutkenhaus, N.
1999-01-01
The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need for a precise security statement that adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. This illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)
Quantifying introgression risk with realistic population genetics
Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy
2012-01-01
Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, rep...
Security of quantum cryptography with realistic sources
Energy Technology Data Exchange (ETDEWEB)
Lutkenhaus, N [Helsinki Institute of Physics, P.O. Box 9, 00014 Helsingin yliopisto (Finland)
1999-08-01
The interest in practical implementations of quantum key distribution is steadily growing. However, there is still a need for a precise security statement that adapts to realistic implementations. In this paper I give the effective key rate we can obtain in a practical setting within the scenario of security against individual attacks by an eavesdropper. This illustrates previous results that high losses together with detector dark counts can make secure quantum key distribution impossible. (Author)
Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan
2018-06-01
The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.
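The two headline effects reported in this abstract, lower noise magnitude and a noise power spectrum (NPS) shifted toward lower frequencies, can be reproduced on synthetic noise. The box blur below is only a crude stand-in for ASIR-V's actual reconstruction behaviour, and every number (ROI size, pixel pitch, noise level) is an invented assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

def noise_power_spectrum(rois, pixel_mm=0.5):
    # Ensemble NPS estimate from noise-only ROIs:
    # NPS = pixel_area / N_pixels * <|DFT(roi - roi_mean)|^2>
    n = rois.shape[-1]
    dft = np.fft.fft2(rois - rois.mean(axis=(-2, -1), keepdims=True))
    return pixel_mm**2 / (n * n) * (np.abs(dft) ** 2).mean(axis=0)

def box3(stack):
    # 3x3 moving-average blur (periodic edges): a crude stand-in for the
    # smoother, low-frequency noise texture of an iterative reconstruction.
    return sum(np.roll(np.roll(stack, i, axis=-2), j, axis=-1)
               for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0

rois_fbp = rng.normal(0.0, 10.0, size=(64, 32, 32))  # white "FBP-like" noise
rois_ir = box3(rois_fbp)                             # correlated "IR-like" noise

nps_fbp = noise_power_spectrum(rois_fbp)
nps_ir = noise_power_spectrum(rois_ir)

# Radial spatial-frequency grid (cycles/mm) matching the unshifted DFT layout.
f = np.fft.fftfreq(32, d=0.5)
fr = np.sqrt(f[:, None] ** 2 + f[None, :] ** 2)

def mean_freq(nps):
    # NPS-weighted mean spatial frequency: lower value = coarser noise texture.
    return (fr * nps).sum() / nps.sum()
```

Comparing `rois_fbp.std()` against `rois_ir.std()` shows the noise-magnitude reduction, and `mean_freq(nps_ir) < mean_freq(nps_fbp)` is exactly the "NPS shifted toward lower frequencies" finding of the study, reproduced on synthetic data.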
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuation, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, the theory of solutions, statistical thermodynamics of interfaces, statistical thermodynamics of high-molecule systems, and quantum statistics.
Novel high-fidelity realistic explosion damage simulation for urban environments
Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya
2010-04-01
Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where these systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems realizes the degree of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity, and runtime-efficient explosion simulation system that realistically simulates destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also accounts for rubble pile formation and applies a generic and scalable multi-component object representation to describe scene entities, together with a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system can realistically simulate rubble generation, rubble flyout, and their primary and secondary impacts on surrounding objects, including buildings, constructions, vehicles, and pedestrians, in clusters of sequential and parallel damage events.
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Electron percolation in realistic models of carbon nanotube networks
International Nuclear Information System (INIS)
Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain
2015-01-01
The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects but where the overlap between their electron cloud can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy to displace CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
Electron percolation in realistic models of carbon nanotube networks
Simoneau, Louis-Philippe; Villeneuve, Jérémie; Rochefort, Alain
2015-09-01
The influence of penetrable and curved carbon nanotubes (CNT) on the charge percolation in three-dimensional disordered CNT networks has been studied with Monte-Carlo simulations. By considering carbon nanotubes as solid objects but where the overlap between their electron cloud can be controlled, we observed that the structural characteristics of networks containing lower aspect ratio CNT are highly sensitive to the degree of penetration between crossed nanotubes. Following our efficient strategy to displace CNT to different positions to create more realistic statistical models, we conclude that the connectivity between objects increases with the hard-core/soft-shell radii ratio. In contrast, the presence of curved CNT in the random networks leads to an increasing percolation threshold and to a decreasing electrical conductivity at saturation. The waviness of CNT decreases the effective distance between the nanotube extremities, hence reducing their connectivity and degrading their electrical properties. We present the results of our simulation in terms of thickness of the CNT network, from which simple structural parameters such as the volume fraction or the carbon nanotube density can be accurately evaluated with our more realistic models.
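The qualitative trend in these two records (connectivity grows as the soft-shell overlap tolerance grows) can be checked with a drastically simplified model in which nanotubes are replaced by point-like objects with a tunable soft-shell contact radius. This sketches only the percolation bookkeeping (union-find clustering of overlapping objects), not the paper's actual CNT geometry:

```python
import numpy as np

rng = np.random.default_rng(2)

class DisjointSet:
    """Union-find to group objects into electrically connected clusters."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, a):
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]  # path halving
            a = self.parent[a]
        return a
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def largest_cluster_fraction(centres, soft_shell_radius):
    # Two objects are "connected" when their soft shells overlap, i.e.
    # their centres lie closer than twice the soft-shell radius.
    n = len(centres)
    dsu = DisjointSet(n)
    d2 = ((centres[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    for i in range(n):
        for j in range(i + 1, n):
            if d2[i, j] < (2.0 * soft_shell_radius) ** 2:
                dsu.union(i, j)
    sizes = np.bincount([dsu.find(i) for i in range(n)])
    return sizes.max() / n

centres = rng.random((300, 3))  # object centres in a unit cube
fractions = [largest_cluster_fraction(centres, r) for r in (0.02, 0.06, 0.12)]
```

Because a larger contact radius only ever adds edges, the largest-cluster fraction is monotonically non-decreasing in the soft-shell radius, which is the simplified analogue of connectivity increasing with the hard-core/soft-shell radii ratio.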
Moran, John L; Solomon, Patricia J
2013-05-24
Statistical process control (SPC), an initiative originating in industry, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false-alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual Intensive Care Unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial-)autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random-coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40, and 35% had autocorrelation through to lag40; of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals.
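In its simplest non-risk-adjusted form, the EWMA monitoring described above reduces to the standard EWMA recursion with time-varying L-sigma control limits. The sketch below uses invented numbers and omits the paper's random-coefficient risk adjustment, seasonality, and GARCH modelling:

```python
import numpy as np

def ewma_chart(x, target, sd, lam=0.2, L=3.0):
    """EWMA statistic z_t and time-varying L-sigma control limits
    (plain EWMA, not the paper's risk-adjusted variant)."""
    z = np.empty(len(x))
    prev = target
    for t, xt in enumerate(x):
        z[t] = lam * xt + (1.0 - lam) * prev   # EWMA recursion
        prev = z[t]
    t = np.arange(1, len(x) + 1)
    # Var(z_t) = sd^2 * lam/(2-lam) * (1 - (1-lam)^(2t))
    half = L * sd * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
    return z, target - half, target + half

rng = np.random.default_rng(3)
mortality = rng.normal(0.14, 0.02, 120)  # 120 "monthly" in-control values
mortality[100:] += 0.08                  # simulated deterioration after month 100
z, lo, hi = ewma_chart(mortality, target=0.14, sd=0.02)
signals = np.flatnonzero((z < lo) | (z > hi))
```

The simulated 4-sigma upward shift after month 100 pushes the EWMA statistic above the upper control limit within a few months, illustrating the out-of-control signalling the study reports for raw mortality series.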
Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books
Kelley, Jane E.; Darragh, Janine J.
2011-01-01
Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…
Realistic training scenario simulations and simulation techniques
Energy Technology Data Exchange (ETDEWEB)
Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.
2017-12-05
In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.
Detection and statistics of gusts
DEFF Research Database (Denmark)
Hannesdóttir, Ásta; Kelly, Mark C.; Mann, Jakob
In this project, a more realistic representation of gusts, based on statistical analysis, will account for the variability observed in real-world gusts. The gust representation will focus on temporal, spatial, and velocity scales that are relevant for modern wind turbines and which possibly affect...
Towards an agential realist concept of learning
DEFF Research Database (Denmark)
Plauborg, Helle
2018-01-01
Drawing on agential realism, this article explores how learning can be understood. An agential realist way of thinking about learning is sensitive to the complexity that characterises learning as a phenomenon. Thus, learning is seen as a dynamic and emergent phenomenon, constantly undergoing...... processes of becoming and expanding the range of components involved in such constitutive processes. With inspiration from Barad’s theorisation of spatiality, temporality and the interdependence of discourse and materiality, this article focuses on timespacemattering and material-discursivity. Concepts...
MANAJEMEN LABA: PERILAKU MANAJEMEN OPPORTUNISTIC ATAU REALISTIC ?
Directory of Open Access Journals (Sweden)
I Nyoman Wijana Asmara Putra
2011-01-01
Earnings management is still an attractive issue. It is often associated with negative behavior conducted by management for its own interest. In fact, it also has a different side to be examined: there is another motivation to do so, such as to improve the company's operation. This literature study aims to review management's motivation for earnings management, whether opportunistic or realistic: what conflict earnings management brings, what the pros and cons about it are, and what would happen if earnings were not managed, whether the company would be better off or worse off.
Evaluating impact of clinical guidelines using a realist evaluation framework.
Reddy, Sandeep; Wakerman, John; Westhorp, Gill; Herring, Sally
2015-12-01
The Remote Primary Health Care Manuals (RPHCM) project team manages the development and publication of clinical protocols and procedures for primary care clinicians practicing in remote Australia. The Central Australian Rural Practitioners Association Standard Treatment Manual, the flagship manual of the RPHCM suite, has been evaluated for accessibility and acceptability in remote clinics three times in its 20-year history. These evaluations did not consider a theory-based framework or a programme theory, resulting in some limitations with the evaluation findings. With the RPHCM having an aim of enabling evidence-based practice in remote clinics and anecdotally reported to do so, testing this empirically for the full suite is vital for both stakeholders and future editions of the RPHCM. The project team utilized a realist evaluation framework to assess how, why and for what the RPHCM were being used by remote practitioners. A theory regarding the circumstances in which the manuals have and have not enabled evidence-based practice in the remote clinical context was tested. The project assessed this theory for all the manuals in the RPHCM suite, across government and aboriginal community-controlled clinics, in three regions of Australia. Implementing a realist evaluation framework to generate robust findings in this context has required innovation in the evaluation design and adaptation by researchers. This article captures the RPHCM team's experience in designing this evaluation. © 2015 John Wiley & Sons, Ltd.
Realistic Affective Forecasting: The Role of Personality
Hoerger, Michael; Chapman, Ben; Duberstein, Paul
2016-01-01
Affective forecasting often drives decision making. Although affective forecasting research has often focused on identifying sources of error at the event level, the present investigation draws upon the ‘realistic paradigm’ in seeking to identify factors that similarly influence predicted and actual emotions, explaining their concordance across individuals. We hypothesized that the personality traits neuroticism and extraversion would account for variation in both predicted and actual emotional reactions to a wide array of stimuli and events (football games, an election, Valentine’s Day, birthdays, happy/sad film clips, and an intrusive interview). As hypothesized, individuals who were more introverted and neurotic anticipated, correctly, that they would experience relatively more unpleasant emotional reactions, and those who were more extraverted and less neurotic anticipated, correctly, that they would experience relatively more pleasant emotional reactions. Personality explained 30% of the concordance between predicted and actual emotional reactions. Findings suggest three purported personality processes implicated in affective forecasting, highlight the importance of individual-differences research in this domain, and call for more research on realistic affective forecasts. PMID:26212463
From Minimal to Realistic Supersymmetric SU(5) Grand Unification
Altarelli, Guido; Masina, I; Altarelli, Guido; Feruglio, Ferruccio; Masina, Isabella
2000-01-01
We construct and discuss a "realistic" example of SUSY SU(5) GUT model, with an additional U(1) flavour symmetry, that is not plagued by the need of large fine tunings, like those associated with doublet-triplet splitting in the minimal model, and that leads to an acceptable phenomenology. This includes coupling unification with a value of alpha_s(m_Z) in much better agreement with the data than in the minimal version, an acceptable hierarchical pattern for fermion masses and mixing angles, also including neutrino masses and mixings, and a proton decay rate compatible with present limits (but the discovery of proton decay should be within reach of the next generation of experiments). In the neutrino sector the preferred solution is one with nearly maximal mixing both for atmospheric and solar neutrinos.
International Nuclear Information System (INIS)
Chugunkov, I.V.
2014-01-01
The report describes an approach based on calculating the number of missing sets, which reduces the memory usage needed to implement statistical tests. Information about the estimation procedure for test statistics derived using this approach is also provided.
Realistic page-turning of electronic books
Fan, Chaoran; Li, Haisheng; Bai, Yannan
2014-01-01
Booming electronic books (e-books), as an extension of the paper book, are popular with readers. Recently, many efforts have been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.
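The cylindrical-surface idea is easy to sketch: a single frame of such an animation maps each flat-page point onto one cylinder, with the x-distance past the bend line becoming arc length so the paper bends without stretching. The paper uses piecewise, time-dependent cylinders with smooth transitions; the single fixed radius and axis below are illustrative assumptions:

```python
import numpy as np

def bend_page(points, radius, axis_x):
    """Map flat-page points (x, y) onto a cylinder of the given radius whose
    axis is the vertical line x = axis_x; the x-distance past the axis becomes
    arc length along the cylinder, so page lengths are preserved."""
    x, y = points[:, 0], points[:, 1]
    out = np.zeros((len(points), 3))
    flat = x <= axis_x
    out[flat, 0], out[flat, 1] = x[flat], y[flat]   # untouched part of the page
    theta = (x[~flat] - axis_x) / radius            # arc length -> bend angle
    out[~flat, 0] = axis_x + radius * np.sin(theta)
    out[~flat, 1] = y[~flat]
    out[~flat, 2] = radius * (1.0 - np.cos(theta))  # lift off the table plane
    return out

# A 21 x 30 unit "page" sampled on a grid, bent near its right edge.
xs, ys = np.meshgrid(np.linspace(0.0, 21.0, 22), np.linspace(0.0, 30.0, 31))
page = np.c_[xs.ravel(), ys.ravel()]
bent = bend_page(page, radius=4.0, axis_x=15.0)
```

Animating a turn then amounts to varying the radius and axis position over time and blending between successive cylinders, which is the role of the paper's smooth transition method.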
Realistic Simulations of Coronagraphic Observations with WFIRST
Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)
2018-01-01
We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.
Operator representation for effective realistic interactions
Energy Technology Data Exchange (ETDEWEB)
Weber, Dennis; Feldmeier, Hans; Neff, Thomas [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany)
2013-07-01
We present a method to derive an operator representation from the partial-wave matrix elements of effective realistic nucleon-nucleon potentials. This method makes it possible to employ modern effective interactions, which are mostly given in matrix-element representation, also in nuclear many-body methods that explicitly require the operator representation, for example ''Fermionic Molecular Dynamics'' (FMD). We present results for the operator representation of effective interactions obtained from the Argonne V18 potential with the ''Unitary Correlation Operator Method'' (UCOM) and the ''Similarity Renormalization Group'' (SRG). Moreover, the operator representation gives better insight into the nonlocal structure of the potential: while the UCOM-transformed potential only shows a quadratic momentum dependence, the momentum dependence of SRG-transformed potentials is beyond such a simple polynomial form.
Level density from realistic nuclear potentials
International Nuclear Information System (INIS)
Calboreanu, A.
2006-01-01
The nuclear level density of some nuclei is calculated using a realistic set of single-particle states (sps). These states are derived from a parameterization of nuclear potentials that describes the observed sps over a large number of nuclei. This approach has the advantage that one can infer the level density of nuclei that are inaccessible to direct study but are very important in astrophysical processes, such as those close to the drip lines. Level densities at high excitation energies are very sensitive to the actual set of sps. The fact that the sps spectrum is finite has extraordinary consequences for nuclear reaction yields, due to the leveling-off of the level density at extremely high excitation energies, wrongly attributed so far to other nuclear effects. The single-particle level-density parameter a is extracted by fitting the calculated densities to the standard Bethe formula.
Realistic microscopic level densities for spherical nuclei
International Nuclear Information System (INIS)
Cerf, N.
1994-01-01
Nuclear level densities play an important role in nuclear reactions such as the formation of the compound nucleus. We develop a microscopic calculation of the level density based on a combinatorial evaluation from a realistic single-particle level scheme. This calculation makes use of a fast Monte Carlo algorithm, allowing us to consider large shell-model spaces which could not be treated previously in combinatorial approaches. Since our model relies on a microscopic basis, it can be applied to exotic nuclei with more confidence than the commonly used semiphenomenological formulas. An exhaustive comparison of our predicted neutron s-wave resonance spacings with experimental data for a wide range of nuclei is presented.
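The combinatorial evaluation both level-density records describe, counting many-particle configurations by total energy from a single-particle level scheme, can be sketched exactly (without the Monte Carlo acceleration) for a tiny space. The equidistant spectrum and particle number below are illustrative assumptions, not a realistic potential:

```python
from collections import defaultdict

def state_counts(sp_energies, n_particles):
    """Count many-fermion configurations by total energy for n identical
    particles on a single-particle spectrum, by dynamic programming over
    levels (Pauli principle: each level holds at most one particle here)."""
    # counts[k][E] = number of ways to place k particles with total energy E
    counts = [defaultdict(int) for _ in range(n_particles + 1)]
    counts[0][0] = 1
    for e in sp_energies:
        # descending k so each level is used at most once per configuration
        for k in range(n_particles - 1, -1, -1):
            for E, c in list(counts[k].items()):
                counts[k + 1][E + e] += c
    return dict(counts[n_particles])

# Toy equidistant spectrum: single-particle energies 0, 1, ..., 11.
levels = list(range(12))
rho = state_counts(levels, 4)   # configurations of 4 particles, by energy
```

Binning these counts in excitation energy gives the combinatorial level density; the leveling-off at high excitation follows immediately from the spectrum being finite, since the total number of configurations is bounded (here by the binomial coefficient C(12, 4)).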
HELIOSEISMOLOGY OF A REALISTIC MAGNETOCONVECTIVE SUNSPOT SIMULATION
International Nuclear Information System (INIS)
Braun, D. C.; Birch, A. C.; Rempel, M.; Duvall, T. L. Jr.
2012-01-01
We compare helioseismic travel-time shifts measured from a realistic magnetoconvective sunspot simulation using both helioseismic holography and time-distance helioseismology, and measured from real sunspots observed with the Helioseismic and Magnetic Imager instrument on board the Solar Dynamics Observatory and the Michelson Doppler Imager instrument on board the Solar and Heliospheric Observatory. We find remarkable similarities in the travel-time shifts measured between the methodologies applied and between the simulated and real sunspots. Forward modeling of the travel-time shifts using either Born or ray approximation kernels and the sound-speed perturbations present in the simulation indicates major disagreements with the measured travel-time shifts. These findings do not substantially change with the application of a correction for the reduction of wave amplitudes in the simulated and real sunspots. Overall, our findings demonstrate the need for new methods for inferring the subsurface structure of sunspots through helioseismic inversions.
Realistic tissue visualization using photoacoustic image
Cho, Seonghee; Managuli, Ravi; Jeon, Seungwan; Kim, Jeesu; Kim, Chulhong
2018-02-01
Visualization methods are very important in biomedical imaging. As a technology for understanding life, biomedical imaging has the unique advantage of providing the most intuitive information in the image, and this advantage can be greatly improved by choosing a suitable visualization method. This is more complicated for volumetric data. Volume data has the advantage of containing 3D spatial information; unfortunately, the data itself cannot directly represent this potential value. Because images are always displayed in 2D space, visualization is the key and creates the real value of volume data. However, image processing of 3D data requires complicated algorithms for visualization and a high computational burden, so specialized algorithms and computing optimization are important issues for volume data. Photoacoustic imaging is a unique imaging modality that can visualize the optical properties of deep tissue. Because the color of an organism is mainly determined by its light-absorbing components, photoacoustic data can provide color information of tissue that is closer to the real tissue color. In this research, we developed realistic tissue visualization using acoustic-resolution photoacoustic volume data. To achieve realistic visualization, we designed a specialized color transfer function that depends on the depth of the tissue from the skin. We used a direct ray-casting method and processed color while computing shader parameters. In the rendering results, we succeeded in obtaining realistic texture from photoacoustic data: rays reflected at the surface were visualized in white, and the color reflected from deep tissue was visualized red, like skin tissue. We also implemented the CUDA algorithm in an OpenGL environment for real-time interactive imaging.
Tang, Hui; Yu, Nan; Jia, Yongjun; Yu, Yong; Duan, Haifeng; Han, Dong; Ma, Guangming; Ren, Chenglong; He, Taiping
2018-01-01
To evaluate the image quality improvement and noise reduction in routine-dose, non-enhanced chest CT imaging by using a new-generation adaptive statistical iterative reconstruction algorithm (ASIR-V) in comparison with the ASIR algorithm. 30 patients who underwent routine-dose, non-enhanced chest CT using a GE Discovery CT750 HD (GE Healthcare, Waukesha, WI) were included. The scan parameters included a tube voltage of 120 kVp, automatic tube current modulation to obtain a noise index of 14 HU, a rotation time of 0.6 s, a pitch of 1.375:1 and a slice thickness of 5 mm. After scanning, all scans were reconstructed with the recommended level of 40% ASIR for comparison purposes and with percentages of ASIR-V from 10% to 100% in 10% increments. The CT attenuation values and SD of the subcutaneous fat, back muscle and descending aorta were measured at the level of the tracheal carina in all reconstructed images. The signal-to-noise ratio (SNR) was calculated with SD representing image noise. The subjective image quality was independently evaluated by two experienced radiologists. For all ASIR-V images, the objective image noise (SD) of fat, muscle and aorta decreased and SNR increased with increasing ASIR-V percentage. The SD of 30% ASIR-V to 100% ASIR-V was significantly lower than that of 40% ASIR (p < 0.05); all ASIR-V reconstructions had good diagnostic acceptability. However, the 50% ASIR-V to 70% ASIR-V series showed significantly superior visibility of small structures when compared with 40% ASIR and the other ASIR-V percentages (p < 0.05), and the series in this range also received the highest subjective image quality scores. Image sharpness was significantly decreased in images reconstructed at 80% ASIR-V and higher. In routine-dose, non-enhanced chest CT, ASIR-V shows greater potential in reducing image noise and artefacts and maintaining image sharpness when compared to the recommended level of 40% ASIR. Combining both the objective and subjective evaluation of images, non
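The SNR convention used in the abstract (mean CT number of a region of interest divided by its standard deviation, with the SD taken as image noise) can be sketched as follows; the ROI values are made up for illustration:

```python
import numpy as np

def roi_snr(hu_values):
    """SNR of a region of interest, with the standard deviation of the
    CT numbers (HU) taken as the image noise, as in the abstract above."""
    hu = np.asarray(hu_values, dtype=float)
    mean, sd = hu.mean(), hu.std(ddof=1)
    return mean / sd, sd

# Hypothetical ROI samples (HU) from the descending aorta on two reconstructions
aorta_40_asir   = [38.0, 52.0, 45.0, 60.0, 41.0]
aorta_60_asir_v = [44.0, 49.0, 47.0, 52.0, 46.0]

snr_a, noise_a = roi_snr(aorta_40_asir)
snr_b, noise_b = roi_snr(aorta_60_asir_v)
print(f"40% ASIR  : SNR={snr_a:.2f}, SD={noise_a:.1f} HU")
print(f"60% ASIR-V: SNR={snr_b:.2f}, SD={noise_b:.1f} HU")
```

With similar mean attenuation, the lower-SD reconstruction yields the higher SNR, which is the pattern the study reports for increasing ASIR-V percentages.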
Plasticity-modulated seizure dynamics for seizure termination in realistic neuronal models
Koppert, M.M.J.; Kalitzin, S.; Lopes da Silva, F.H.; Viergever, M.A.
2011-01-01
In previous studies we showed that autonomous absence seizure generation and termination can be explained by realistic neuronal models exhibiting bi-stable dynamics. In these models epileptic seizures are triggered either by external stimuli (reflex epilepsies) or by internal fluctuations.
Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R; Young, Kenneth C; Cooke, Victoria; Wilkinson, Louise; Given-Wilson, Rosalind M; Wallis, Matthew G; Wells, Kevin
2017-04-07
A novel method has been developed for generating quasi-realistic voxel phantoms which simulate the compressed breast in mammography and digital breast tomosynthesis (DBT). The models are suitable for use in virtual clinical trials that require realistic anatomy and that use the multiple-alternative forced-choice (AFC) paradigm with patches from the complete breast image. The breast models are produced by extracting features of breast tissue components from DBT clinical images, including skin, adipose and fibro-glandular tissue, blood vessels and Cooper's ligaments. A range of different breast models can then be generated by combining these components. Visual realism was validated using a receiver operating characteristic (ROC) study of patches from simulated images calculated using the breast models and from real patient images. Quantitative analysis was undertaken using fractal dimension and power spectrum analysis. The average areas under the ROC curves for 2D and DBT images were 0.51 ± 0.06 and 0.54 ± 0.09, demonstrating that simulated and real images were statistically indistinguishable by expert breast readers (7 observers); errors are one standard error of the mean, here and below. The average fractal dimensions (2D, DBT) for real and simulated images were (2.72 ± 0.01, 2.75 ± 0.01) and (2.77 ± 0.03, 2.82 ± 0.04), respectively. Excellent agreement was found between the power spectrum curves of real and simulated images, with average β values (2D, DBT) of (3.10 ± 0.17, 3.21 ± 0.11) and (3.01 ± 0.32, 3.19 ± 0.07), respectively. These results demonstrate that radiological images of these breast models realistically represent the complexity of real breast structures and can be used to simulate patches from mammograms and DBT images that are indistinguishable from
Determination of Realistic Fire Scenarios in Spacecraft
Dietrich, Daniel L.; Ruff, Gary A.; Urban, David
2013-01-01
This paper expands on previous work that examined how large a fire a crew member could successfully survive and extinguish in the confines of a spacecraft. The hazards to the crew and equipment during an accidental fire include excessive pressure rise resulting in a catastrophic rupture of the vehicle skin, excessive temperatures that burn or incapacitate the crew (due to hyperthermia), carbon dioxide build-up or accumulation of other combustion products (e.g. carbon monoxide). The previous work introduced a simplified model that treated the fire primarily as a source of heat and combustion products and sink for oxygen prescribed (input to the model) based on terrestrial standards. The model further treated the spacecraft as a closed system with no capability to vent to the vacuum of space. The model in the present work extends this analysis to more realistically treat the pressure relief system(s) of the spacecraft, include more combustion products (e.g. HF) in the analysis and attempt to predict the fire spread and limiting fire size (based on knowledge of terrestrial fires and the known characteristics of microgravity fires) rather than prescribe them in the analysis. Including the characteristics of vehicle pressure relief systems has a dramatic mitigating effect by eliminating vehicle overpressure for all but very large fires and reducing average gas-phase temperatures.
Cerebral blood flow simulations in realistic geometries
Directory of Open Access Journals (Sweden)
Szopos Marcela
2012-04-01
The aim of this work is to compute the blood flow in the entire cerebral network (arterial and venous), obtained from medical images such as 3D cerebral angiographies. We use free finite element codes such as FreeFEM++. We first test the code on analytical solutions in simplified geometries, then study the influence of the boundary conditions imposed on the flow, and finally perform first computations on realistic meshes.
Quantifying introgression risk with realistic population genetics.
Ghosh, Atiyo; Meirmans, Patrick G; Haccou, Patsy
2012-12-07
Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as the extinction of endemic species or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetic mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk-mitigation strategy has not yet been properly studied with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes.
Challenges and solutions for realistic room simulation
Begault, Durand R.
2002-05-01
Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and reverberation thresholds as a function of reverberation time and level within 250-Hz-2-kHz octave bands. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing computational requirements of real-time auralization systems.
Realistic Scheduling Mechanism for Smart Homes
Directory of Open Access Journals (Sweden)
Danish Mahmood
2016-03-01
In this work, we propose a Realistic Scheduling Mechanism (RSM) to reduce user frustration and enhance appliance utility by effectively classifying appliances together with their constraints and times of use. Algorithms are proposed for the functioning of home appliances. A 24 hour period is divided into four logical sub-time slots, each of 360 min (6 h). In these sub-time slots, only the desired appliances (with respect to the appliance classification) are scheduled, to raise appliance utility, with power consumption restricted by a dynamically modelled power usage limiter that takes into account not only the electricity consumer but also the electricity supplier. Once the appliance, time and power usage limiter modelling is done, we use a nature-inspired heuristic, Binary Particle Swarm Optimization (BPSO), to form optimal schedules under the given constraints for each sub-time slot. These schedules aim to achieve an equilibrium between appliance utility and cost effectiveness. To validate the proposed RSM, we provide a comparative analysis of unscheduled electrical load usage, load scheduled directly by BPSO, and RSM, reflecting user comfort, which is based upon cost effectiveness and appliance utility.
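A minimal sketch of binary PSO applied to a toy appliance-scheduling problem is given below. It only illustrates the technique named in the abstract: the appliance powers, tariffs, power limit, penalty weights and BPSO coefficients are all assumptions, not the paper's RSM.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy problem: 6 appliances x 4 sub-slots; a bit means "appliance runs in that slot".
POWER = np.array([2.0, 1.5, 0.5, 3.0, 1.0, 0.8])   # kW per appliance (assumed)
PRICE = np.array([0.10, 0.25, 0.30, 0.15])         # $/kWh per sub-slot (assumed)
LIMIT = 5.0                                        # power usage limit per slot (kW)

def fitness(bits):
    """Energy cost plus penalties for exceeding the slot power limit
    and for appliances that never run."""
    x = bits.reshape(6, 4)
    slot_load = (x * POWER[:, None]).sum(axis=0)
    cost = (x * POWER[:, None] * PRICE).sum()
    penalty = np.maximum(slot_load - LIMIT, 0.0).sum() * 10.0
    unmet = np.maximum(1 - x.sum(axis=1), 0).sum() * 5.0
    return cost + penalty + unmet

def bpso(n_particles=20, iters=60, dim=24):
    pos = rng.integers(0, 2, (n_particles, dim)).astype(float)
    vel = rng.normal(0, 1, (n_particles, dim))
    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        # Binary PSO: velocity is squashed to a bit-flip probability
        pos = (rng.random((n_particles, dim)) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

schedule, best_cost = bpso()
print(schedule.reshape(6, 4), best_cost)
```

The sigmoid-of-velocity update is the standard Kennedy-Eberhart binary PSO; the RSM additionally applies this per sub-slot with its appliance classification.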
Comparing Realistic Subthalamic Nucleus Neuron Models
Njap, Felix; Claussen, Jens C.; Moser, Andreas; Hofmann, Ulrich G.
2011-06-01
The mechanism of action of clinically effective electrical high-frequency stimulation is still under debate. However, recent evidence points at the specific activation of GABA-ergic ion channels. Using a computational approach, we analyze the temporal properties of the spike trains emitted by biologically realistic neurons of the subthalamic nucleus (STN) as a function of the GABA-ergic synaptic input conductance. Our contribution is based on a model proposed by Rubin and Terman that exhibits a wide variety of firing patterns: silent, low-spiking, moderate-spiking and intense-spiking activity. We observed that most of the cells in our network turn silent when we increase the GABA-A input conductance above a threshold of 3.75 mS/cm². On the other hand, insignificant changes in firing activity are observed when the input conductance is low or close to zero; we thus reproduce Rubin's model with vanishing synaptic conductances. To quantitatively compare spike trains from the original model with those of the modified model at different conductance levels, we apply four different (dis)similarity measures. We observe that the Mahalanobis distance, the Victor-Purpura metric, and the interspike-interval distribution are sensitive to the different firing regimes, whereas mutual information appears insensitive to these functional changes.
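Of the (dis)similarity measures named above, the Victor-Purpura metric has a compact dynamic-programming form. The sketch below is a standard textbook implementation with made-up spike times, not code from the study:

```python
import numpy as np

def victor_purpura(t1, t2, q=1.0):
    """Victor-Purpura spike-train distance: minimal cost of turning one
    train into the other, where inserting or deleting a spike costs 1
    and shifting a spike by dt costs q*|dt|."""
    n, m = len(t1), len(t2)
    D = np.zeros((n + 1, m + 1))
    D[:, 0] = np.arange(n + 1)        # delete all spikes of t1
    D[0, :] = np.arange(m + 1)        # insert all spikes of t2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(D[i - 1, j] + 1,                              # delete t1[i-1]
                          D[i, j - 1] + 1,                              # insert t2[j-1]
                          D[i - 1, j - 1] + q * abs(t1[i - 1] - t2[j - 1]))  # shift
    return D[n, m]

a = [0.1, 0.5, 1.2]   # spike times in seconds (made up)
b = [0.12, 0.53, 1.9]
print(victor_purpura(a, b, q=1.0))   # small shifts are cheap at low q
print(victor_purpura(a, b, q=10.0))  # at high q, delete+insert beats a long shift
```

The cost parameter q sets the time scale on which the metric distinguishes firing regimes: q → 0 compares only spike counts, large q compares precise timing.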
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Whole Frog Project and Virtual Frog Dissection Statistics
Adapting realist synthesis methodology: The case of workplace harassment interventions.
Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie
2017-12-01
Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.
The construction of ``realistic'' four-dimensional strings through orbifolds
Font, A.; Ibáñez, L. E.; Quevedo, F.; Sierra, A.
1990-02-01
We discuss the construction of "realistic" lower-rank 4-dimensional strings through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3) × SU(2) × U(1) models as well as models incorporating a left-right SU(2)_L × SU(2)_R × U(1)_(B-L) symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3 × Z_3 orbifold, including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3 × Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory.
The construction of 'realistic' four-dimensional strings through orbifolds
International Nuclear Information System (INIS)
Font, A.; Quevedo, F.; Sierra, A.
1990-01-01
We discuss the construction of "realistic" lower-rank 4-dimensional strings through symmetric orbifolds with background fields. We present Z_3 three-generation SU(3) × SU(2) × U(1) models as well as models incorporating a left-right SU(2)_L × SU(2)_R × U(1)_(B-L) symmetry in which proton stability is automatically guaranteed. Conformal field theory selection rules are used to find the flat directions to all orders which lead to these low-rank models and to study the relevant Yukawa couplings. A hierarchical structure of quark-lepton masses appears naturally in some models. We also present a detailed study of the structure of the Z_3 × Z_3 orbifold, including the generalized GSO projection, the effect of discrete torsion and the conformal field theory Yukawa coupling selection rules. All these points are illustrated with a three-generation Z_3 × Z_3 model. We have made an effort to write a self-contained presentation in order to make this material available to non-string experts interested in the phenomenological aspects of this theory. (orig.)
A linear evolution for non-linear dynamics and correlations in realistic nuclei
International Nuclear Information System (INIS)
Levin, E.; Lublinsky, M.
2004-01-01
A new approach to high-energy evolution, based on a linear equation for the QCD generating functional, is developed. This approach opens the possibility of a systematic study of correlations inside targets and, in particular, inside realistic nuclei. Our results are presented as three new equations. The first one is a linear equation for the QCD generating functional (and for the scattering amplitude) that sums the 'fan' diagrams; for the amplitude this equation is equivalent to the non-linear Balitsky-Kovchegov equation. The second equation is a generalization of the non-linear Balitsky-Kovchegov equation to interactions with realistic nuclei. It includes a new correlation parameter which incorporates, in a model-dependent way, correlations inside the nuclei. The third equation is a non-linear equation for the QCD generating functional (and for the scattering amplitude) that, in addition to the 'fan' diagrams, sums the Glauber-Mueller multiple rescatterings.
Mueller, Amy V; Hemond, Harold F
2016-05-18
Knowledge of ionic concentrations in natural waters is essential to understand watershed processes. Inorganic nitrogen, in the form of nitrate and ammonium ions, is a key nutrient as well as a participant in redox, acid-base, and photochemical processes of natural waters, leading to spatiotemporal patterns of ion concentrations at scales as small as meters or hours. Current options for measurement in situ are costly, relying primarily on instruments adapted from laboratory methods (e.g., colorimetric, UV absorption); free-standing and inexpensive ISE sensors for NO3(-) and NH4(+) could be attractive alternatives if interferences from other constituents were overcome. Multi-sensor arrays, coupled with appropriate non-linear signal processing, offer promise in this capacity but have not yet successfully achieved signal separation for NO3(-) and NH4(+) in situ at naturally occurring levels in unprocessed water samples. A novel signal processor, underpinned by an appropriate sensor array, is proposed that overcomes previous limitations by explicitly integrating basic chemical constraints (e.g., charge balance). This work further presents a rationalized process for the development of such in situ instrumentation for NO3(-) and NH4(+), including a statistical-modeling strategy for instrument design, training/calibration, and validation. Statistical analysis reveals that historical concentrations of major ionic constituents in natural waters across New England strongly covary and are multi-modal. This informs the design of a statistically appropriate training set, suggesting that the strong covariance of constituents across environmental samples can be exploited through appropriate signal processing mechanisms to further improve estimates of minor constituents. Two artificial neural network architectures, one expanded to incorporate knowledge of basic chemical constraints, were tested to process outputs of a multi-sensor array, trained using datasets of varying degrees of
Institute of Scientific and Technical Information of China (English)
李朝光
2014-01-01
To address the problem of generating statistical tables and reports in industrial control systems, this paper introduces an OPC client solution for automatic report generation, developed on a platform of VB, an ACCESS database and EXCEL spreadsheets.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.
2017-12-01
Remotely sensed data has transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data has allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in
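The conditional multivariate-extremes model used in the abstract is beyond a short snippet, but the core idea, correlated per-gauge return periods forming an event footprint, can be illustrated with a simpler Gaussian-copula sketch (the correlation matrix and marginals are invented):

```python
from math import erf
import numpy as np

rng = np.random.default_rng(0)

# Assumed inter-gauge correlation of annual-maximum flows at 3 gauges
R = np.array([[1.0, 0.7, 0.4],
              [0.7, 1.0, 0.6],
              [0.4, 0.6, 1.0]])

def simulate_footprints(n_events, corr):
    """Draw correlated uniforms via a Gaussian copula and read each
    gauge's marginal return period off its own uniform."""
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n_events, corr.shape[0])) @ L.T
    u = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))  # normal CDF
    return 1.0 / (1.0 - u)          # return period (years) at every gauge

T = simulate_footprints(10000, R)
# Within one event the return period differs between gauges:
event = T[T.max(axis=1) > 100][0]   # first event that is extreme somewhere
print(np.round(event, 1))
```

Unlike a 'constant in space' hazard layer, each simulated event is extreme at some gauges and ordinary at others, which is exactly the footprint behaviour the abstract describes; the cited work uses a conditional extreme-value model rather than this copula shortcut.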
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. It covers both theory and applications, including the necessary background in statistics, the statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradation of images, and noise limitations in the detection of light. New topics have been introduced.
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
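A harmonic Poisson process is easy to simulate on a finite interval [a, b], where the intensity c/x integrates to c·ln(b/a); the sketch below (parameters invented) also illustrates the scale invariance mentioned above, since every decade carries the same expected number of points:

```python
import numpy as np

rng = np.random.default_rng(42)

def harmonic_poisson(c, a, b):
    """One realisation of a Poisson process on [a, b] with harmonic
    intensity lambda(x) = c/x, sampled by inverse transform."""
    mean_count = c * np.log(b / a)       # integral of c/x over [a, b]
    n = rng.poisson(mean_count)
    u = rng.random(n)
    return a * (b / a) ** u              # points are log-uniform on [a, b]

pts = harmonic_poisson(c=100.0, a=1.0, b=1e6)

# Scale invariance: every decade carries c*ln(10) ~ 230 points on average
counts = [int(np.sum((pts >= 10.0**k) & (pts < 10.0**(k + 1)))) for k in range(6)]
print(len(pts), counts)
```

The log-uniform placement of points is the scale-invariance property: stretching the axis by any factor maps the process to itself, which is also what connects harmonic statistics to Benford's law and 1/f noise in the abstract.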
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives of a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies.
Percolation dans des reseaux realistes de nanostructures de carbone
Simoneau, Louis-Philippe
versatility in the choice of network components that can be simulated. The tools we have developed, grouped together in the RPH-HPN software (Réseaux percolatifs hybrides / Hybrid Percolation Networks), construct random networks, detect contacts between the tubes, translate the systems into equivalent electrical circuits and calculate global properties. Infinitely many networks can have the same basic characteristics (size, diameter, etc.), so the properties of a particular random network are not necessarily representative of the average properties of all such networks. To obtain those general properties, we simulate a large number of random networks with the same basic characteristics and average the quantities of interest. The network elements can be spheres, rods or snakes. The use of such geometries makes contact detection simple and quick and reproduces the form of carbon nanotubes more faithfully. We closely control the geometrical and electrical properties of these elements through stochastic distributions of our choice: the length, diameter, orientation, chirality, tortuosity and impenetrable nature of the elements can all be chosen so as to reproduce the characteristics of real networks. We have considered rectangular, Gaussian and Lorentzian statistical distribution functions, but any other distribution that can be expressed mathematically can also be envisioned. During the creation of a particular network, we generate the elements one by one, sampling each property from a preselected distribution. Efficient algorithms used in various fields were adapted to our needs to manage the detection of contacts, clusters and percolation. In addition, we model more realistic contacts between rigid nanotubes using an original network-construction method that does not require a relaxation phase. Finally, we use Kirchhoff's laws to solve the equivalent electrical circuit conventionally.
First, we evaluated
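The pipeline described above (generate random elements, detect contacts, find clusters, test percolation) can be illustrated with a much simpler 2D stick-percolation sketch; union-find stands in for the thesis's algorithms, and all sizes and densities are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def sticks(n, length):
    """n random sticks: centres uniform in the unit square, angles uniform."""
    c = rng.random((n, 2))
    t = rng.random(n) * np.pi
    d = 0.5 * length * np.c_[np.cos(t), np.sin(t)]
    return c - d, c + d                          # the two endpoints of each stick

def crosses(p1, q1, p2, q2):
    """Segment intersection via orientation (counter-clockwise) tests."""
    def ccw(a, b, c):
        return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])
    return ccw(p1, p2, q2) != ccw(q1, p2, q2) and ccw(p1, q1, p2) != ccw(p1, q1, q2)

def percolates(n=400, length=0.25):
    """True if a left-to-right spanning cluster of touching sticks exists."""
    p, q = sticks(n, length)
    parent = list(range(n + 2))                  # sticks plus the two side walls
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]        # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    LEFT, RIGHT = n, n + 1
    for i in range(n):
        if min(p[i, 0], q[i, 0]) <= 0.0:
            union(i, LEFT)
        if max(p[i, 0], q[i, 0]) >= 1.0:
            union(i, RIGHT)
        for j in range(i + 1, n):
            if crosses(p[i], q[i], p[j], q[j]):
                union(i, j)
    return find(LEFT) == find(RIGHT)

dense, sparse = percolates(n=400), percolates(n=40)
print(dense, sparse)   # a dense network spans; a sparse one usually does not
```

Averaging such runs over many random networks, as the thesis does, estimates the percolation probability as a function of density; spatial hashing would replace the O(n²) pair loop for large systems.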
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Contents: Prefaces; Introduction (What Is Computational Statistics?, An Overview of the Book); Probability Concepts (Probability, Conditional Probability and Independence, Expectation, Common Distributions); Sampling Concepts (Sampling Terminology and Concepts, Sampling Distributions, Parameter Estimation, Empirical Distribution Function); Generating Random Variables (General Techniques for Generating Random Variables, Generating Continuous Random Variables, Generating Discrete Random Variables); Exploratory Data Analysis (Exploring Univariate Data, Exploring Bivariate and Trivariate Data, Exploring Multidimensional Data); Finding Structure (Projecting Data, Principal Component Analysis, Projection Pursuit EDA, Independent Component Analysis, Grand Tour, Nonlinear Dimensionality Reduction); Monte Carlo Methods for Inferential Statistics (Classical Inferential Statistics, Monte Carlo Methods for Inferential Statist...
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical anal- ysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
International Nuclear Information System (INIS)
Molina Lizcano Alicia; Bernal Suarez, Nestor Ricardo; Martinez Collantes, Jorge; Pabon Jose Daniel; Vega Rodriguez, Emel
2000-01-01
The technique of statistical downscaling is applied to find the relations between the variables simulated by the Community Climate Model, version 3 (CCM3), available at the grid points nearest to three stations in the Guajira region in north-eastern Colombia, and the surface temperature at those stations. As training (or calibration) period we chose the years 1969 to 1990, while the assessment phase covered 1991 to 1998. The method used was canonical correlation analysis (CCA). The results were good insofar as the relations obtained approximate satisfactorily the real behaviour of the surface air temperature at the three stations
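The calibrate-then-validate structure of such a downscaling study can be sketched with a simple least-squares stand-in for the CCA step (all data here are synthetic; the weights, sizes, and noise level are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: model grid-point temperatures (predictors) and
# observed surface temperature at one station (predictand), monthly values.
n_train, n_test, n_grid = 264, 96, 4          # 1969-1990 and 1991-1998, 4 grid points
X = rng.normal(size=(n_train + n_test, n_grid))
true_w = np.array([0.6, 0.2, -0.1, 0.3])      # hypothetical link to the station
y = X @ true_w + 0.1 * rng.normal(size=n_train + n_test)

# Calibrate on the training period only (least-squares stand-in for CCA).
Xtr, ytr = X[:n_train], y[:n_train]
w, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)

# Assess on the held-out period, as the study does for 1991-1998.
pred = X[n_train:] @ w
corr = np.corrcoef(pred, y[n_train:])[0, 1]
print(f"validation correlation: {corr:.3f}")
```

A full CCA would replace the single regression with paired canonical variates, but the train/assess split is the part that guards against overfitting the downscaling relation.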
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
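The measures of location and spread discussed above can be computed with the standard library alone (the data values here are made up for illustration):

```python
import statistics as st

data = [4.2, 3.9, 5.1, 4.8, 4.4, 6.0, 3.7, 4.9]   # hypothetical measurements

mean = st.mean(data)      # location: arithmetic mean
median = st.median(data)  # location: robust to outliers
sd = st.stdev(data)       # spread: sample standard deviation
print(mean, median, sd)
```

For skewed quantitative variables, the chapter's transformations (e.g. taking logarithms before summarizing) are applied to `data` before computing the same measures.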
International Nuclear Information System (INIS)
Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin
2015-01-01
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites
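The Weibull analysis of failure-site formation times can be illustrated by fitting the shape parameter to simulated failure data (the scale, sample size, and true shape below are assumptions for the sketch, not the paper's measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
shape_true = 0.8                      # sub-unity shape: early-failure behaviour
t_fail = stats.weibull_min.rvs(shape_true, scale=100.0, size=2000,
                               random_state=rng)

# Maximum-likelihood fit with the location fixed at zero, as for failure times.
shape_hat, loc, scale_hat = stats.weibull_min.fit(t_fail, floc=0)
print(f"fitted shape parameter: {shape_hat:.2f}")
```

A fitted shape in the 0.7-0.9 range, as reported in the abstract, indicates a decreasing hazard rate, i.e. weak sites failing early.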
Implementing enhanced recovery pathways: a literature review with realist synthesis.
Coxon, Astrid; Nielsen, Karina; Cross, Jane; Fox, Chris
2017-10-01
Enhanced Recovery Pathways (ERPs) are an increasingly popular, evidence-based approach to surgery, designed to improve patient outcomes and reduce costs. Despite evidence demonstrating the benefits of these pathways, implementation and adherence have been inconsistent. Using realist synthesis, this review explored the current literature surrounding the implementation of ERPs in the UK. Knowledge consolidation between authors and consulting with field experts helped to guide the search strategy. Relevant medical and social science databases were searched from 2000 to 2016, as well as a general web search. A total of 17 papers were identified, including original research, reviews, case studies and guideline documents. Full texts were analysed, cross-examined, and data extracted and synthesised. Several implementation strategies were identified, including the contexts in which these operated, the subsequent mechanisms of action that were triggered, and the outcome patterns they produced. Context-Mechanism-Outcome (CMO) configurations were generated, tested, and refined. These were grouped to develop two programme theories concerning ERP implementation, one related to the strategy of consulting with staff, the other with appointing a change agent to coordinate and drive the implementation process. These theories highlight instances in which implementation could be improved. Current literature in ERP research is primarily focussed on measuring patient outcomes and cost effectiveness, and as a result, important detail regarding the implementation process is often not reported or described robustly. This review not only provides recommendations for future improvements in ERP implementation, but also highlights specific areas of focus for furthering ERP implementation research.
Nonstandard Farey sequences in a realistic diode map
International Nuclear Information System (INIS)
Perez, G.; Sinha, S.; Cerdeira, H.
1991-06-01
We study a realistic coupled map system, modelling a p-i-n diode structure. As we vary the parameter corresponding to the (scaled) external potential in the model, the dynamics goes through a flip bifurcation and then a Hopf bifurcation, and as the parameter is increased further, we find evidence of a sequence of mode-locked windows embedded in the quasiperiodic motion, with periodic attractors whose winding numbers σ = p/q are given by a Farey series. The interesting thing about this Farey sequence is that it is generated between two parent attractors with σ = 2/7 and σ = 2/8, where σ = 2/8 implies two distinct coexisting attractors with σ = 1/4, and the correct series is obtained only when we use parent winding number 2/8 and not 1/4. So unlike a regular Farey tree, p and q need not be relatively prime here: σ = 2p/2q is permissible, where such attractors actually comprise two coexisting attractors with σ = p/q. We also checked that the positions and widths of these windows exhibit well-defined power-law scaling. When the potential is increased further, the Farey windows still provide a ''skeleton'' for the dynamics, and within each window there is a host of other interesting dynamical features, including multiple forward and reverse Feigenbaum trees. (author). 15 refs, 7 figs
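The role of the unreduced parent 2/8 can be seen directly from the Farey mediant rule, which generates each daughter window from its two parents (a minimal sketch; winding numbers are kept as unreduced (p, q) pairs on purpose):

```python
def mediant(a, b):
    """Farey mediant of two winding numbers kept as unreduced (p, q) pairs."""
    return (a[0] + b[0], a[1] + b[1])

left, right = (2, 7), (2, 8)          # parent attractors from the diode map
m_correct = mediant(left, right)      # daughter window from the unreduced parent
m_reduced = mediant(left, (1, 4))     # what a reduced parent 1/4 would predict
print(m_correct, m_reduced)
```

The two predictions differ, (4, 15) versus (3, 11), which is exactly why the series only comes out right when the parent is taken as 2/8 rather than 1/4.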
Modelling of synchrotron radiation losses in realistic tokamak plasmas
International Nuclear Information System (INIS)
Albajar, F.; Johner, J.; Granata, G.
2000-08-01
Synchrotron radiation losses become significant in the power balance of high-temperature plasmas envisaged for next step tokamaks. Due to the complexity of the exact calculation, these losses are usually roughly estimated with expressions derived from a plasma description using simplifying assumptions on the geometry, radiation absorption, and density and temperature profiles. In the present article, the complete formulation of the transport of synchrotron radiation is performed for realistic conditions of toroidal plasma geometry with elongated cross-section, using an exact method for the calculation of the absorption coefficient, and for arbitrary shapes of density and temperature profiles. The effects of toroidicity and temperature profile on synchrotron radiation losses are analyzed in detail. In particular, when the electron temperature profile is almost flat in the plasma center, as for example in ITB confinement regimes, synchrotron losses are found to be much stronger than in the case where the profile is represented by its best generalized parabolic approximation, though both cases give approximately the same thermal energy contents. Such an effect is not included in present approximate expressions. Finally, we propose a seven-variable fit for the fast calculation of synchrotron radiation losses. This fit is derived from a large database, which has been generated using a code implementing the complete formulation and optimized for massively parallel computing. (author)
Improved transcranial magnetic stimulation coil design with realistic head modeling
Crowther, Lawrence; Hadimani, Ravi; Jiles, David
2013-03-01
We are investigating transcranial magnetic stimulation (TMS) as a noninvasive technique based on electromagnetic induction which causes stimulation of the neurons in the brain. TMS can be used as a pain-free alternative to conventional electroconvulsive therapy (ECT), which is still widely implemented for treatment of major depression. Development of improved TMS coils capable of stimulating subcortical regions could also allow TMS to replace invasive deep brain stimulation (DBS), which requires surgical implantation of electrodes in the brain. Our new designs allow the technique to be established for a variety of diagnostic and therapeutic applications in psychiatric disorders and neurological diseases. Calculation of the fields generated inside the head is vital for the use of this method for treatment. In prior work we have implemented a realistic head model, incorporating inhomogeneous tissue structures and electrical conductivities, allowing the site of neuronal activation to be accurately calculated. We will show how we utilize this model in the development of novel TMS coil designs to improve the depth of penetration and localization of stimulation produced by stimulator coils.
Towards realistic string vacua from branes at singularities
Conlon, Joseph P.; Maharana, Anshuman; Quevedo, Fernando
2009-05-01
We report on progress towards constructing string models incorporating both realistic D-brane matter content and moduli stabilisation with dynamical low-scale supersymmetry breaking. The general framework is that of local D-brane models embedded into the LARGE volume approach to moduli stabilisation. We review quiver theories on del Pezzo n (dPn) singularities including both D3 and D7 branes. We provide supersymmetric examples with three quark/lepton families and the gauge symmetries of the Standard, Left-Right Symmetric, Pati-Salam and Trinification models, without unwanted chiral exotics. We describe how the singularity structure leads to family symmetries governing the Yukawa couplings which may give mass hierarchies among the different generations. We outline how these models can be embedded into compact Calabi-Yau compactifications with LARGE volume moduli stabilisation, and state the minimal conditions for this to be possible. We study the general structure of soft supersymmetry breaking. At the singularity all leading order contributions to the soft terms (both gravity- and anomaly-mediation) vanish. We enumerate subleading contributions and estimate their magnitude. We also describe model-independent physical implications of this scenario. These include the masses of anomalous and non-anomalous U(1)'s and the generic existence of a new hyperweak force under which leptons and/or quarks could be charged. We propose that such a gauge boson could be responsible for the ghost muon anomaly recently found at the Tevatron's CDF detector.
Compiling quantum circuits to realistic hardware architectures using temporal planners
Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy
2018-04-01
To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes to a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
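The nearest-neighbor constraint that drives this compilation problem can be illustrated with a toy greedy router on a linear qubit array (this is a deliberately naive stand-in, not the temporal planners evaluated in the paper; the gate names and architecture are hypothetical):

```python
def compile_linear(gates, n_qubits):
    """Greedily route two-qubit gates on a linear array: insert SWAPs until
    each gate's logical qubits sit on adjacent physical sites."""
    pos = list(range(n_qubits))          # pos[logical] = physical site
    schedule = []
    for a, b in gates:
        while abs(pos[a] - pos[b]) > 1:
            # move qubit a one physical site toward qubit b
            step = 1 if pos[b] > pos[a] else -1
            other = pos.index(pos[a] + step)
            schedule.append(("SWAP", a, other))
            pos[a], pos[other] = pos[other], pos[a]
        schedule.append(("CZ", a, b))
    return schedule

sched = compile_linear([(0, 2), (1, 3)], 4)
print(sched)
```

A temporal planner improves on this greedy baseline by choosing the gate order (exploiting the commuting QAOA gates) and overlapping SWAPs in time to minimize total circuit duration.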
A task-related and resting state realistic fMRI simulator for fMRI data validation
Hill, Jason E.; Liu, Xiangyu; Nutter, Brian; Mitra, Sunanda
2017-02-01
After more than 25 years of published functional magnetic resonance imaging (fMRI) studies, careful scrutiny reveals that most of the reported results lack fully decisive validation. The complex nature of fMRI data generation and acquisition results in unavoidable uncertainties in the true estimation and interpretation of both task-related activation maps and resting state functional connectivity networks, despite the use of various statistical data analysis methodologies. The goal of developing the proposed STANCE (Spontaneous and Task-related Activation of Neuronally Correlated Events) simulator is to generate realistic task-related and/or resting-state 4D blood oxygenation level dependent (BOLD) signals, given the experimental paradigm and scan protocol, by using digital phantoms of twenty normal brains available from BrainWeb (http://brainweb.bic.mni.mcgill.ca/brainweb/). The proposed simulator will include estimated system and modelled physiological noise as well as motion to serve as a reference to measured brain activities. In its current form, STANCE is a MATLAB toolbox with command line functions serving as an open-source add-on to SPM8 (http://www.fil.ion.ucl.ac.uk/spm/software/spm8/). The STANCE simulator has been designed in a modular framework so that the hemodynamic response (HR) and various noise models can be iteratively improved to include evolving knowledge about such models.
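The core of any task-related BOLD simulation of this kind is the convolution of an experimental paradigm with a hemodynamic response function; a minimal sketch (SPM-style double-gamma parameters and the block timing are illustrative assumptions, not STANCE's actual defaults):

```python
import numpy as np
from scipy import stats

tr, n_vols = 2.0, 100                       # repetition time (s), volumes
t = np.arange(n_vols) * tr

# Canonical double-gamma hemodynamic response (peak ~6 s, undershoot ~16 s).
hrf = stats.gamma.pdf(t, 6) - stats.gamma.pdf(t, 16) / 6.0
hrf /= hrf.sum()

# Block-design paradigm: 20 s task on / 20 s rest off.
stim = (np.floor(t / 20.0) % 2 == 0).astype(float)

# Task-related BOLD = paradigm convolved with HRF, plus system noise.
bold = np.convolve(stim, hrf)[:n_vols]
bold += 0.01 * np.random.default_rng(2).normal(size=n_vols)
print(bold[:5])
```

A simulator such as the one proposed layers physiological noise, motion, and tissue-specific baselines from the digital phantoms on top of this basic signal model.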
Reaming process improvement and control: An application of statistical engineering
DEFF Research Database (Denmark)
Müller, Pavel; Genta, G.; Barbato, G.
2012-01-01
A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co...
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for...
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications
Mueller, Matthias; Casser, Vincent; Lahoud, Jean; Smith, Neil; Ghanem, Bernard
2017-01-01
We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full featured physics based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.
UE4Sim: A Photo-Realistic Simulator for Computer Vision Applications
Mueller, Matthias
2017-08-19
We present a photo-realistic training and evaluation simulator (UE4Sim) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full featured physics based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network (DNN) architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.
Sim4CV: A Photo-Realistic Simulator for Computer Vision Applications
Müller, Matthias
2018-03-24
We present a photo-realistic training and evaluation simulator (Sim4CV) (http://www.sim4cv.org) with extensive applications across various fields of computer vision. Built on top of the Unreal Engine, the simulator integrates full featured physics based cars, unmanned aerial vehicles (UAVs), and animated human actors in diverse urban and suburban 3D environments. We demonstrate the versatility of the simulator with two case studies: autonomous UAV-based tracking of moving objects and autonomous driving using supervised learning. The simulator fully integrates both several state-of-the-art tracking algorithms with a benchmark evaluation tool and a deep neural network architecture for training vehicles to drive autonomously. It generates synthetic photo-realistic datasets with automatic ground truth annotations to easily extend existing real-world datasets and provides extensive synthetic data variety through its ability to reconfigure synthetic worlds on the fly using an automatic world generation tool.
International Nuclear Information System (INIS)
Kuliev, I.G.
2000-01-01
The effects of the mutual drag of electrons and phonons on thermomagnetic and thermoelectric phenomena in semiconductors with degenerate statistics of current carriers are studied. The conduction current in a nonequilibrium electron-phonon system is estimated in the linear approximation in the degeneracy parameter. For isothermal conductors the mutual drag is shown to affect essentially the values of the Nernst-Ettingshausen effects. The heat flow is estimated and the dependence of the thermal conductivity and of the Maggi-Righi-Leduc (MRL) effect on the magnetic field is analyzed. The contribution of the mutual drag to the isothermal MRL effect is found to be proportional to the degeneracy parameter. Thermomagnetic and thermoelectric effects in degenerate conductors are studied with regard to the mutual drag of electrons and phonons under both isothermal and adiabatic conditions [ru
Wei, Xiangyin; Hindle, Michael; Delvadia, Renishkumar R; Byron, Peter R
2017-10-01
The dose and aerodynamic particle size distribution (APSD) of drug aerosols exiting models of the mouth and throat (MT) during a realistic inhalation profile (IP) may be estimated in vitro and designated total lung dose, TLD(in vitro), and APSD(TLD, in vitro), respectively. These aerosol characteristics likely define the drug's regional distribution in the lung. A general method was evaluated to enable the simultaneous determination of TLD(in vitro) and APSD(TLD, in vitro) for budesonide aerosols exiting small, medium and large VCU-MT models. Following calibration of the modified next generation pharmaceutical impactor (NGI) at 140 L/min, variations in aerosol dose and size exiting MT were determined from Budelin® Novolizer® across the IPs reported by Newman et al., who assessed drug deposition from this inhaler by scintigraphy. Values for TLD(in vitro) from the test inhaler determined by the general method were found to be statistically comparable to those using a filter capture method. Using new stage cutoffs determined by calibration of the modified NGI at 140 L/min, APSD(TLD, in vitro) profiles and mass median aerodynamic diameters at the MT exit (MMAD(TLD, in vitro)) were determined as functions of MT geometric size across Newman's IPs. The range of mean values (n ≥ 5) for TLD(in vitro) and MMAD(TLD, in vitro) for this inhaler extended from 6.2 to 103.0 μg (3.1%-51.5% of label claim) and from 1.7 to 3.6 μm, respectively. The method enables reliable determination of TLD(in vitro) and APSD(TLD, in vitro) for aerosols likely to enter the trachea of test subjects in the clinic. By simulating realistic IPs and testing in different MT models, the effects of major variables on TLD(in vitro) and APSD(TLD, in vitro) may be studied using the general method described in this study.
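The MMAD determination underlying such impactor measurements is a log-scale interpolation of cumulative undersize mass at 50%; a minimal sketch (the stage cutoffs and per-stage masses below are hypothetical, not the calibrated 140 L/min values from the study):

```python
import numpy as np

# Hypothetical stage cutoff diameters (um) and per-stage drug mass (ug).
cutoffs = np.array([8.06, 4.46, 2.82, 1.66, 0.94, 0.55])
mass    = np.array([10.0, 18.0, 25.0, 22.0, 15.0, 10.0])

# Percent of total mass finer than each cutoff (cumulative undersize).
frac_under = 100.0 * (mass.sum() - mass.cumsum()) / mass.sum()

# Locate the cutoff interval bracketing 50% and interpolate on a log scale.
i = np.searchsorted(-frac_under, -50.0)
lo, hi = frac_under[i], frac_under[i - 1]
mmad = np.exp(np.log(cutoffs[i])
              + (50.0 - lo) / (hi - lo)
              * (np.log(cutoffs[i - 1]) - np.log(cutoffs[i])))
print(f"MMAD ~ {mmad:.2f} um")
```

In practice a log-probit fit over all stages is often preferred to two-point interpolation, but the 50%-crossing idea is the same.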
Toward developing more realistic groundwater models using big data
Vahdat Aboueshagh, H.; Tsai, F. T. C.; Bhatta, D.; Paudel, K.
2017-12-01
Rich geological data is the backbone of developing realistic groundwater models for groundwater resources management. However, constructing realistic groundwater models can be challenging due to inconsistency between different sources of geological, hydrogeological and geophysical data and difficulty in processing big data to characterize the subsurface environment. This study develops a framework to utilize a big geological dataset to create a groundwater model for the Chicot Aquifer in southwestern Louisiana, which borders the Gulf of Mexico to the south. The Chicot Aquifer is the principal source of fresh water in southwest Louisiana, underlying an area of about 9,000 square miles. Agriculture is the largest groundwater consumer in this region, and overpumping has caused significant groundwater head decline and saltwater intrusion from the Gulf and deep formations. A hydrostratigraphy model was constructed using around 29,000 electrical logs and drillers' logs as well as screen lengths of pumping wells through a natural neighbor interpolation method. These sources of information carry different weights in terms of accuracy and trustworthiness. A data prioritization procedure was developed to filter untrustworthy log information, eliminate redundant data, and establish consensus among various lithological information. The constructed hydrostratigraphy model shows 40% sand facies, which is consistent with the well log data. The hydrostratigraphy model confirms outcrop areas of the Chicot Aquifer in the north of the study region. The aquifer sand formation thins eastward to merge into the Atchafalaya River alluvial aquifer and coalesces with the underlying Evangeline aquifer. A grid generator was used to convert the hydrostratigraphy model into a MODFLOW grid with 57 layers. A Chicot groundwater model was constructed using the available hydrologic and hydrogeological data for 2004-2015. Pumping rates for irrigation wells were estimated using the crop type and acreage
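The step of interpolating scattered well-log facies onto a model grid can be sketched as follows (scipy offers no natural neighbor method, so nearest-neighbor interpolation stands in for it here; all locations and facies values are synthetic):

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(4)
# Hypothetical well-log picks: (x, y) locations with 1 = sand, 0 = clay facies.
pts = rng.uniform(0, 100, size=(300, 2))
facies = (rng.random(300) < 0.4).astype(float)   # ~40% sand, as in the model

# Interpolate the categorical facies onto a regular model grid.
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = griddata(pts, facies, (gx, gy), method="nearest")
print(f"sand fraction on grid: {grid.mean():.2f}")
```

The data-prioritization step described in the abstract would weight or filter the input picks before this interpolation, rather than treating every log equally as this sketch does.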
Directory of Open Access Journals (Sweden)
Kyriakos Mikelis
2015-10-01
Full Text Available Given the integration of the discipline of International Relations in Greece into the global discipline over the last few decades, the article addresses the reflection of the 'realism in and for the globe' question in this specific case. Although the argument does not go as far as to 'recover' forgotten IR theorists or self-proclaimed realists, a geopolitical dimension of socio-economic thought during the interwar period addressed concerns which can be related to the intricacies of realpolitik. At present, certain scholars have been eager to maintain a firm stance in favour of realism, focusing on the work of ancient figures, especially Thucydides or Homer, and on questions of the offensive-defensive realism debate as well as on the connection with the English School, while others have offered fruitful insights matching the broad constructivist agenda. Overall, certain genuine arguments have appeared, reflecting diversified views about sovereignty and its function or mitigation.
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
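The breakdown of classical estimators when the parameter count is comparable with the sample size is easy to demonstrate numerically: with identity true covariance, the sample covariance eigenvalues spread far from 1 (a standard Marchenko-Pastur effect; the dimensions below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
p, n = 200, 250                      # dimension comparable to sample size
X = rng.normal(size=(n, p))          # true covariance = identity

S = X.T @ X / n                      # sample covariance matrix
eig = np.linalg.eigvalsh(S)
print(eig.min(), eig.max())          # spread far from 1 despite identity truth
```

Regularized (e.g. shrinkage) estimators of the kind motivated by the Kolmogorov asymptotic regime pull these eigenvalues back toward their true values.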
Sexually Transmitted Diseases (STDs): Gonorrhea Statistics (CDC web page).
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).
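The Bayesian/frequentist choice mentioned above can be made concrete on the simplest possible problem, estimating a proportion (the counts below are invented for illustration):

```python
import math
from scipy import stats

k, n = 8, 20                        # hypothetical detections out of trials
p_hat = k / n

# Frequentist: normal-approximation 95% confidence interval.
se = math.sqrt(p_hat * (1 - p_hat) / n)
freq = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: flat Beta(1,1) prior gives a Beta(k+1, n-k+1) posterior;
# take the central 95% credible interval.
post = stats.beta(k + 1, n - k + 1)
bayes = (post.ppf(0.025), post.ppf(0.975))
print(freq, bayes)
```

The two intervals are numerically similar here but answer different questions: the frequentist interval is a statement about the procedure over repeated sampling, the credible interval a probability statement about the parameter itself.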
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
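The moment-generating-function machinery the text reviews can be worked symbolically: differentiating the MGF at t = 0 yields the moments (a standard exercise, here for an Exponential(λ) variable; not an example from the book itself):

```python
import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)

# MGF of an Exponential(lambda) random variable: M(t) = lambda / (lambda - t).
M = lam / (lam - t)

# Moments come from derivatives of the MGF evaluated at t = 0.
m1 = sp.diff(M, t).subs(t, 0)        # E[X]   = 1/lambda
m2 = sp.diff(M, t, 2).subs(t, 0)     # E[X^2] = 2/lambda^2
var = sp.simplify(m2 - m1**2)        # Var[X] = 1/lambda^2
print(m1, m2, var)
```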
De Leonardis, Francesco; Soref, Richard A; Soltani, Mohammad; Passaro, Vittorio M N
2017-09-12
We present a physical investigation of the generation of correlated photon pairs that are broadly spaced in the ultraviolet (UV) and visible spectrum on an AlGaN/AlN integrated photonic platform which is optically transparent at these wavelengths. Using spontaneous four-wave mixing (SFWM) in an AlGaN microring resonator, we show design techniques to satisfy the phase-matching condition between the optical pump, the signal, and idler photon pairs, a condition which is essential and is a key hurdle when operating at short wavelengths due to the strong normal dispersion of the material. Such UV-visible photon pairs are quite beneficial for interaction with qubit ions that are mostly in this wavelength range, and will enable heralding the photon-ion interaction. As a target application example, we present the systematic AlGaN microresonator design for generating signal and idler photon pairs using a blue-wavelength pump, with the signal at the transition of the ytterbium ion (171Yb+, 369.5 nm) and the idler in the far blue or green range. The photon pairs have minimal crosstalk to the pump power due to their broad spacing in spectral wavelength, thereby relaxing the design of on-chip integrated filters for separating pump, signal and idler.
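Energy conservation in SFWM pins down the idler wavelength once pump and signal are chosen: two pump photons convert to one signal plus one idler photon, so 2/λ_pump = 1/λ_signal + 1/λ_idler. A quick check (the pump wavelength is a hypothetical example value; only the signal is the 171Yb+ transition quoted in the abstract):

```python
# Energy conservation in SFWM: 2/lambda_pump = 1/lambda_signal + 1/lambda_idler.
lam_pump = 400.0     # nm, hypothetical blue pump wavelength
lam_signal = 369.5   # nm, 171Yb+ transition from the abstract

lam_idler = 1.0 / (2.0 / lam_pump - 1.0 / lam_signal)
print(f"idler at {lam_idler:.1f} nm")
```

This fixes only the wavelengths; the dispersion-engineering work in the paper is about additionally satisfying momentum (phase) matching in the microring at those wavelengths.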
'Semi-realistic' F-term inflation model building in supergravity
International Nuclear Information System (INIS)
Kain, Ben
2008-01-01
We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements and tools and methods for building inflation models
Hewitt, Gillian; Sims, Sarah; Harris, Ruth
2014-11-01
Realist synthesis offers a novel and innovative way to interrogate the large literature on interprofessional teamwork in health and social care teams. This article introduces realist synthesis and its approach to identifying and testing the underpinning processes (or "mechanisms") that make an intervention work, the contexts that trigger those mechanisms and their subsequent outcomes. A realist synthesis of the evidence on interprofessional teamwork is described. Thirteen mechanisms were identified in the synthesis and findings for one mechanism, called "Support and value" are presented in this paper. The evidence for the other twelve mechanisms ("collaboration and coordination", "pooling of resources", "individual learning", "role blurring", "efficient, open and equitable communication", "tactical communication", "shared responsibility and influence", "team behavioural norms", "shared responsibility and influence", "critically reviewing performance and decisions", "generating and implementing new ideas" and "leadership") are reported in a further three papers in this series. The "support and value" mechanism referred to the ways in which team members supported one another, respected each other's skills and abilities and valued each other's contributions. "Support and value" was present in some, but far from all, teams and a number of contexts that explained this variation were identified. The article concludes with a discussion of the challenges and benefits of undertaking this realist synthesis.
Biochemical transport modeling, estimation, and detection in realistic environments
Ortner, Mathias; Nehorai, Arye
2006-05-01
Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and detection of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g. release time, intensity and location). We compute a bound on the expected delay before false detection in order to decide the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.
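The flavor of sequential change detection described above can be illustrated with the classical CUSUM test for a Gaussian mean shift. This is a minimal sketch only: the paper's detector additionally copes with unknown release time, intensity and location, which the known-parameter CUSUM below does not.

```python
def cusum_detect(samples, mu0, mu1, sigma, threshold):
    """Classical CUSUM sequential test for a shift in mean from mu0 to
    mu1 in Gaussian observations. Returns the first index at which the
    detection statistic crosses the threshold, or None if it never does.
    """
    s = 0.0
    for k, x in enumerate(samples):
        # log-likelihood ratio increment for a Gaussian mean shift
        llr = (mu1 - mu0) / sigma**2 * (x - 0.5 * (mu0 + mu1))
        s = max(0.0, s + llr)  # reset at zero: ignore pre-change drift
        if s > threshold:
            return k
    return None

# Noiseless illustration: background level 0, a release raises it to 2
readings = [0.0] * 50 + [2.0] * 50
print(cusum_detect(readings, mu0=0.0, mu1=2.0, sigma=1.0, threshold=5.0))  # -> 52
```

The threshold trades expected detection delay against the false-alarm rate, which is exactly the trade-off the paper's bound on the expected delay before false detection formalizes.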
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Realistic modeling of radiation transmission inspection systems
International Nuclear Information System (INIS)
Sale, K.E.
1993-01-01
We have applied Monte Carlo particle transport methods to assess a proposed neutron transmission inspection system for checked luggage. The geometry of the system and the time, energy and angle dependence of the source have been modeled in detail. A pulsed deuteron beam incident on a thick Be target generates a neutron pulse with a very broad energy spectrum which is detected after passage through the luggage item by a plastic scintillator detector operating in current mode (as opposed to pulse counting mode). The neutron transmission as a function of time information is used to infer the densities of hydrogen, carbon, oxygen and nitrogen in the volume sampled. The measured elemental densities can be compared to signatures for explosives or other contraband. By using such computational modeling it is possible to optimize many aspects of the design of an inspection system without costly and time consuming prototyping experiments or to determine that a proposed scheme will not work. The methods applied here can be used to evaluate neutron or photon schemes based on transmission, scattering or reaction techniques
Entrepreneurial Education: A Realistic Alternative for Women and Minorities.
Steward, James F.; Boyd, Daniel R.
1989-01-01
Entrepreneurial education is a valid, realistic occupational training alternative for minorities and women in business. Entrepreneurship requires that one become involved with those educational programs that contribute significantly to one's success. (Author)
Student Work Experience: A Realistic Approach to Merchandising Education.
Horridge, Patricia; And Others
1980-01-01
Relevant and realistic experiences are needed to prepare the student for a future career. Addresses the results of a survey of colleges and universities in the United States in regard to their student work experience (SWE) in fashion merchandising. (Author)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Development and application of KEPRI realistic evaluation methodology (KREM) for LB-LOCA
International Nuclear Information System (INIS)
Ban, Chang-Hwan; Lee, Sang-Yong; Sung, Chang-Kyung
2004-01-01
A realistic evaluation method for LB-LOCA of a PWR, KREM, is developed and its applicability is confirmed for a 3-loop Westinghouse plant in Korea. The method uses a combined code of CONTEMPT4/MOD5 and a modified RELAP5/MOD3.1. RELAP5 code calculates system thermal hydraulics with the containment backpressure calculated by CONTEMPT4, exchanging the mass/energy release and backpressure in every time step of RELAP5. The method is developed strictly following the philosophy of CSAU with a few improvements and differences. Elements and steps of KREM are shown in a figure in this paper. Three elements of CSAU are maintained and the first element has no differences. An additional step of 'Check of Experimental Data Covering (EDC)' is embedded in element 2 in order to confirm the validity of code uncertainty parameters before applying them to plant calculations. The main idea to develop the EDC is to extrapolate the code accuracy which is determined in step 8 to the uncertainties of plant calculations. EDC is described in detail elsewhere and the basic concepts are explained in a later section of this paper. KREM adopts nonparametric statistics to quantify the overall uncertainty of a LB-LOCA at 95% probability and 95% confidence level from 59 plant calculations according to Wilks' formula. These 59 calculations are performed in step 12 using code parameters determined in steps 8 and 9 and operation parameters from step 11. Scale biases are also evaluated in this step using the information of step 10. Uncertainties of code models and operation conditions are reflected in the 59 plant calculations as multipliers to relevant parameters in the code or simply as input values. This paper gives the explanation on the overall structure of KREM and emphasizes its unique features. In addition, its applicability is confirmed for a 3-loop plant in Korea. KREM is developed for the realistic evaluation of LB-LOCA and its applicability is successfully demonstrated for the 3-loop power plants in Korea.
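The figure of 59 runs quoted above comes from the first-order Wilks formula: the largest outcome of n random code runs bounds the 95th percentile with 95% confidence once 1 − 0.95ⁿ ≥ 0.95. A short sketch of that calculation:

```python
def wilks_sample_size(probability=0.95, confidence=0.95):
    """Smallest number of runs n such that the maximum of n independent
    outcomes bounds the `probability` quantile with the given one-sided
    confidence (first-order Wilks formula): 1 - probability**n >= confidence.
    """
    n = 1
    while 1.0 - probability**n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # -> 59  (95% / 95%, as in KREM)
print(wilks_sample_size(0.95, 0.99))  # -> 90  (95% probability, 99% confidence)
```

This is why exactly 59 plant calculations suffice regardless of how many uncertain input parameters are varied, which is the main attraction of the nonparametric approach.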
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.
ObamaNet: Photo-realistic lip-sync from text
Kumar, Rithesh; Sotelo, Jose; Kumar, Kundan; de Brebisson, Alexandre; Bengio, Yoshua
2017-01-01
We present ObamaNet, the first architecture that generates both audio and synchronized photo-realistic lip-sync videos from any new text. Contrary to other published lip-sync approaches, ours is only composed of fully trainable neural modules and does not rely on any traditional computer graphics methods. More precisely, we use three main modules: a text-to-speech network based on Char2Wav, a time-delayed LSTM to generate mouth-keypoints synced to the audio, and a network based on Pix2Pix to ...
International Nuclear Information System (INIS)
2013-01-01
The contribution of each fuel type to electricity generation in percentage, as of December 2011 is shown in a pie chart and the total number of reactors worldwide, as of March 2013, is shown in a table
Magnetic resonance fingerprinting based on realistic vasculature in mice.
Pouliot, Philippe; Gagnon, Louis; Lam, Tina; Avti, Pramod K; Bowen, Chris; Desjardins, Michèle; Kakkar, Ashok K; Thorin, Eric; Sakadzic, Sava; Boas, David A; Lesage, Frédéric
2017-04-01
Magnetic resonance fingerprinting (MRF) was recently proposed as a novel strategy for MR data acquisition and analysis. A variant of MRF called vascular MRF (vMRF) followed, that extracted maps of three parameters of physiological importance: cerebral oxygen saturation (SatO2), mean vessel radius and cerebral blood volume (CBV). However, this estimation was based on idealized 2-dimensional simulations of vascular networks using random cylinders and the empirical Bloch equations convolved with a diffusion kernel. Here we focus on studying the vascular MR fingerprint using real mouse angiograms and physiological values as the substrate for the MR simulations. The MR signal is calculated ab initio with a Monte Carlo approximation, by tracking the accumulated phase from a large number of protons diffusing within the angiogram. We first study the identifiability of parameters in simulations, showing that parameters are fully estimable at realistically high signal-to-noise ratios (SNR) when the same angiogram is used for dictionary generation and parameter estimation, but that large biases in the estimates persist when the angiograms are different. Despite these biases, simulations show that differences in parameters remain estimable. We then applied this methodology to data acquired using the GESFIDE sequence with SPIONs injected into 9 young wild type and 9 old atherosclerotic mice. Both the pre injection signal and the ratio of post-to-pre injection signals were modeled, using 5-dimensional dictionaries. The vMRF methodology extracted significant differences in SatO2, mean vessel radius and CBV between the two groups, consistent across brain regions and dictionaries. Further validation work is essential before vMRF can gain wider application. Copyright © 2017 Elsevier Inc. All rights reserved.
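The ab initio Monte Carlo signal calculation mentioned above (protons diffusing and accumulating phase from local field offsets) can be sketched in a toy 2D form. Everything here is illustrative: the grid size, step statistics and the uniform field map are stand-ins, not the paper's angiogram-based setup.

```python
import random, cmath

def mc_signal(field, n_protons=2000, n_steps=50, step=1.0,
              gamma_dt=1.0, size=64, seed=1):
    """Toy Monte Carlo MR signal: protons random-walk over a 2D field
    offset map `field[y][x]` (radians per step) and accumulate phase;
    the signal magnitude is |mean of exp(i * phase)| over the ensemble.
    """
    rng = random.Random(seed)
    total = 0j
    for _ in range(n_protons):
        x = rng.uniform(0, size)
        y = rng.uniform(0, size)
        phase = 0.0
        for _ in range(n_steps):
            x = (x + rng.gauss(0, step)) % size  # periodic boundaries
            y = (y + rng.gauss(0, step)) % size
            phase += gamma_dt * field[int(y)][int(x)]
        total += cmath.exp(1j * phase)
    return abs(total) / n_protons

# Sanity check: a uniform field map causes no dephasing, |signal| = 1
uniform = [[0.0] * 64 for _ in range(64)]
print(round(mc_signal(uniform), 3))  # -> 1.0
```

A heterogeneous field map (e.g. dipole fields around vessels filled with deoxyhemoglobin or SPIONs) would instead dephase the ensemble, and the decay pattern is what the vMRF dictionary entries encode.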
A statistical model for porous structure of rocks
Institute of Scientific and Technical Information of China (English)
JU Yang; YANG YongMing; SONG ZhenDuo; XU WenJing
2008-01-01
The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores, the statistical characteristics of pore distance, quantity, size and their probability density functions were formulated in this paper. The Monte Carlo method and the random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distribution of pore position, distance and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information of the pore position and distribution that the series of random numbers defined. On the basis of modelling, the Brazil split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, failure mode of material elements and the inosculation of failed elements.
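The pore-generation step of the porous-structure model above can be sketched as follows. The distribution choices here (uniform centroids, truncated-normal radii) and all numeric values are illustrative assumptions; the paper fits the actual probability density functions to CT-scan statistics.

```python
import random

def generate_pores(n, box=100.0, mean_r=1.2, sd_r=0.3, seed=42):
    """Draw pore centroids uniformly inside a cubic sample and pore
    radii from a truncated normal law, mimicking the Monte Carlo
    generation of pore positions and sizes with prescribed statistics.
    Returns a list of (x, y, z, radius) tuples.
    """
    rng = random.Random(seed)
    pores = []
    for _ in range(n):
        x, y, z = (rng.uniform(0.0, box) for _ in range(3))
        r = max(0.1, rng.gauss(mean_r, sd_r))  # exclude non-positive radii
        pores.append((x, y, z, r))
    return pores

pores = generate_pores(500)
mean_radius = sum(p[3] for p in pores) / len(pores)
print(0.9 < mean_radius < 1.5)  # sample mean stays close to the target 1.2
```

A list like this is exactly the kind of input that can then be mapped onto a FLAC3D-style element grid, voiding elements whose centers fall inside a pore.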
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
A Realistic Human Exposure Assessment of Indoor Radon released from Groundwater
International Nuclear Information System (INIS)
Yu, Dong Han; Han, Moon Hee
2002-01-01
The work presents a realistic human exposure assessment of indoor radon released from groundwater in a house. First, a two-compartment model is developed to describe the generation and transfer of radon from groundwater into indoor air. The model is used to estimate the indoor-air radon concentration profile in a house resulting from showering, washing clothes, and flushing toilets. The study then performs an uncertainty analysis of the model input parameters to quantify the uncertainty in the radon concentration profile. To estimate the daily internal dose to a specific tissue group in an adult through inhalation of such indoor radon, a PBPK (Physiologically-Based Pharmaco-Kinetic) model is developed. The indoor radon profile is then combined with the PBPK model to produce a realistic human assessment of such exposure. The results obtained from this study can be used in the evaluation of the inhalation risk to humans associated with indoor radon released from groundwater
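A two-compartment balance of the kind described above can be sketched with a simple explicit integration. All parameter values below (volumes, air-exchange rates, source strength) are hypothetical, chosen only to show the mechanics of radon transfer from a water-use room to the rest of the house.

```python
import math

def simulate(hours=8.0, dt_h=0.001):
    """Toy two-compartment indoor radon model: a bathroom compartment
    receives radon released during a 15-minute shower; air exchange
    couples it to the rest of the house; both lose radon to whole-house
    ventilation and radioactive decay. Returns final (c_bath, c_house)
    concentrations in Bq/m3.
    """
    c_bath, c_house = 0.0, 0.0          # Bq/m3
    v_bath, v_house = 10.0, 240.0       # m3 (assumed volumes)
    q = 20.0                            # m3/h bath <-> house air exchange
    vent = 0.5                          # 1/h whole-house ventilation rate
    lam = math.log(2.0) / (3.82 * 24)   # 1/h radon-222 decay constant
    t = 0.0
    while t < hours:
        source = 500.0 if t < 0.25 else 0.0  # Bq/h released during shower
        dc_bath = (source + q * (c_house - c_bath)) / v_bath - (vent + lam) * c_bath
        dc_house = q * (c_bath - c_house) / v_house - (vent + lam) * c_house
        c_bath += dc_bath * dt_h
        c_house += dc_house * dt_h
        t += dt_h
    return c_bath, c_house

c_bath, c_house = simulate()
print(c_bath < 1.0 and c_house < 1.0)  # concentrations decay well after the event
```

The time series produced this way is the kind of concentration profile that would then feed an inhalation-dose (PBPK) stage.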
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Problem Posing with Realistic Mathematics Education Approach in Geometry Learning
Mahendra, R.; Slamet, I.; Budiyono
2017-09-01
One of the difficulties students face in learning geometry is the subject of the plane, which requires them to understand abstract matter. The aim of this research is to determine the effect of the Problem Posing learning model with the Realistic Mathematics Education Approach in geometry learning. This quasi-experimental research was conducted in one of the junior high schools in Karanganyar, Indonesia. The sample was taken using a stratified cluster random sampling technique. The results of this research indicate that the Problem Posing learning model with the Realistic Mathematics Education Approach can significantly improve students' conceptual understanding in geometry learning, especially on plane topics. This is because, under Problem Posing with the Realistic Mathematics Education Approach, students become active in constructing their knowledge and in posing and solving problems in realistic contexts, which makes it easier for them to understand concepts and solve problems. Therefore, the Problem Posing learning model with the Realistic Mathematics Education Approach is appropriate for mathematics learning, especially for geometry material. Furthermore, it can improve student achievement.
EGG: Empirical Galaxy Generator
Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.
2018-04-01
The Empirical Galaxy Generator (EGG) generates fake galaxy catalogs and images with realistic positions, morphologies and fluxes from the far-ultraviolet to the far-infrared. The catalogs are generated by egg-gencat and stored in binary FITS tables (column oriented). Another program, egg-2skymaker, is used to convert the generated catalog into ASCII tables suitable for ingestion by SkyMaker (ascl:1010.066) to produce realistic high resolution images (e.g., Hubble-like), while egg-gennoise and egg-genmap can be used to generate the low resolution images (e.g., Herschel-like). These tools can be used to test source extraction codes, or to evaluate the reliability of any map-based science (stacking, dropout identification, etc.).
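The catalog-generation idea behind egg-gencat can be illustrated with a minimal mock: uniform sky positions in a small field and fluxes drawn from a power-law number-count distribution via inverse-transform sampling. The field center, slope and flux floor below are illustrative stand-ins for EGG's empirically calibrated recipes.

```python
import random

def fake_catalog(n, ra0=150.0, dec0=2.0, size_deg=0.2, seed=7):
    """Minimal mock galaxy catalog: uniform positions in a square field
    and fluxes sampled from dN/dS ~ S^-2.5 above S_min = 1 uJy using
    inverse-transform sampling of the survival function
    P(S > s) = (s / S_min)^-1.5.
    """
    rng = random.Random(seed)
    catalog = []
    for i in range(n):
        ra = ra0 + rng.uniform(-size_deg / 2, size_deg / 2)
        dec = dec0 + rng.uniform(-size_deg / 2, size_deg / 2)
        flux = 1.0 * (1.0 - rng.random()) ** (-1.0 / 1.5)  # uJy
        catalog.append({"id": i, "ra": ra, "dec": dec, "flux_uJy": flux})
    return catalog

cat = fake_catalog(1000)
print(len(cat), min(g["flux_uJy"] for g in cat) >= 1.0)  # -> 1000 True
```

A catalog in this shape is what image-synthesis tools such as SkyMaker then consume to paint realistic sources onto a simulated map.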
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Directory of Open Access Journals (Sweden)
Maria Isabel Suero
2011-10-01
This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output. This new virtual environment concept, which we call hyper-realistic, transcends basic schematic simulation; it provides the user with a more realistic perception of a physical phenomenon being simulated. We compared the learning achievements of three equivalent, homogeneous groups of undergraduates: an experimental group who used only the hyper-realistic virtual laboratory, a first control group who used a schematic simulation, and a second control group who used the traditional laboratory. The three groups received the same theoretical preparation and carried out equivalent practicals in their respective learning environments. The topic chosen for the experiment was optical aberrations. An analysis of variance applied to the data of the study demonstrated a statistically significant difference (p value <0.05) between the three groups. The learning achievements attained by the group using the hyper-realistic virtual environment were 6.1 percentage points higher than those for the group using the traditional schematic simulations and 9.5 percentage points higher than those for the group using the traditional laboratory.
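The analysis of variance used above boils down to comparing between-group and within-group variability via the F statistic. The scores below are hypothetical illustrative data, not the study's measurements.

```python
def one_way_anova_F(*groups):
    """F statistic of a one-way ANOVA: ratio of the between-group mean
    square to the within-group mean square.
    """
    all_x = [x for g in groups for x in g]
    grand = sum(all_x) / len(all_x)
    k, n = len(groups), len(all_x)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical scores for hyper-realistic, schematic, and lab groups:
hyper = [78, 82, 75, 80, 85]
schem = [70, 74, 69, 73, 71]
lab   = [66, 70, 68, 64, 72]
print(round(one_way_anova_F(hyper, schem, lab), 2))  # -> 19.92
```

A large F (here far above the 5% critical value of about 3.9 for 2 and 12 degrees of freedom) corresponds to the study's p < 0.05 conclusion that the three groups differ.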
Dai, Quanqi; Harne, Ryan L.
2017-04-01
Effective development of vibration energy harvesters is required to convert ambient kinetic energy into useful electrical energy as a power supply for sensors, for example in structural health monitoring applications. Energy harvesting structures exhibiting bistable nonlinearities have previously been shown to generate large alternating current (AC) power when excited so as to undergo snap-through responses between stable equilibria. Yet, most microelectronics in sensors require rectified voltages and hence direct current (DC) power. While researchers have studied DC power generation from bistable energy harvesters subjected to harmonic excitations, there remain important questions as to the promise of such harvester platforms when the excitations are more realistic and include both harmonic and random components. To close this knowledge gap, this research computationally and experimentally studies the DC power delivery from bistable energy harvesters subjected to such realistic excitation combinations as those found in practice. Based on the results, it is found that the ability of bistable energy harvesters to generate peak DC power is significantly reduced by introducing a sufficient amount of stochastic excitation into an otherwise harmonic input. On the other hand, the elimination of a low amplitude, coexistent response regime by way of the additive noise promotes power delivery if the device was not originally excited to snap-through. The outcomes of this research indicate the necessity for comprehensive studies about the sensitivities of DC power generation from bistable energy harvesters to practical excitation scenarios prior to their optimal deployment in applications.
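A bistable harvester of the type studied above is often modeled as an oscillator with negative linear and positive cubic stiffness, driven by a harmonic force plus broadband noise. The sketch below uses illustrative, nondimensional parameters (not the paper's experimental values) and mean |velocity| as a crude stand-in for rectified power.

```python
import math, random

def simulate_bistable(noise_std, steps=50000, dt=1e-4, seed=3):
    """Explicit-Euler simulation of a bistable (Duffing-type) oscillator
        x'' = -c x' + x - x^3 + F cos(w t) + noise,
    with stable equilibria at x = +/-1. Returns the mean of |x'| as a
    rough proxy for the rectifiable response level.
    """
    rng = random.Random(seed)
    x, v = 1.0, 0.0                      # start at rest in one well
    total = 0.0
    for i in range(steps):
        t = i * dt
        force = 0.3 * math.cos(2 * math.pi * 8.0 * t)
        force += noise_std * rng.gauss(0, 1) / math.sqrt(dt)  # white-noise term
        a = -0.1 * v + x - x**3 + force   # damping + bistable restoring force
        v += a * dt
        x += v * dt
        total += abs(v)
    return total / steps

avg = simulate_bistable(0.0)  # harmonic-only baseline
print(0.0 <= avg < 10.0)      # bounded intrawell response
```

Sweeping `noise_std` upward in such a model is the computational experiment analogous to the paper's: it reveals whether the added stochastic component destroys an existing snap-through response or instead kicks the device out of a low-amplitude intrawell regime.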
Evaluation of photovoltaic panel temperature in realistic scenarios
International Nuclear Information System (INIS)
Du, Yanping; Fell, Christopher J.; Duck, Benjamin; Chen, Dong; Liffman, Kurt; Zhang, Yinan; Gu, Min; Zhu, Yonggang
2016-01-01
Highlights: • The developed realistic model captures the thermal response and hysteresis effects more reasonably. • The predicted panel temperature is as high as 60 °C under a solar irradiance of 1000 W/m² in no-wind weather. • In realistic scenarios, the thermal response normally takes 50–250 s. • The actual heating effect may cause a photoelectric efficiency drop of 2.9–9.0%. - Abstract: Photovoltaic (PV) panel temperature was evaluated by developing theoretical models that are suitable for use in realistic scenarios. Effects of solar irradiance, wind speed and ambient temperature on the PV panel temperature were studied. The parametric study shows a significant influence of solar irradiance and wind speed on the PV panel temperature. With an increase of ambient temperature, the temperature rise of solar cells is reduced. The characteristics of panel temperature in realistic scenarios were analyzed. In steady weather conditions, the thermal response time of a solar cell with a Si thickness of 100–500 μm is around 50–250 s. In realistic scenarios, by contrast, the panel temperature variation over a day differs from that in steady weather conditions due to the effect of thermal hysteresis. The heating effect on the photovoltaic efficiency was assessed based on real-time temperature measurement of solar cells in realistic weather conditions. For solar cells with a temperature coefficient in the range of −0.21% to −0.50%, the current field tests indicated an approximate efficiency loss between 2.9% and 9.0%.
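The qualitative trends above (panel temperature rising with irradiance and falling with wind speed) can be reproduced with a crude steady-state energy balance. The sketch below is an illustrative simplification, not the paper's model: it omits thermal hysteresis and electrical power extraction, and the convection correlation and optical coefficients are assumed values.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def panel_temperature(G, T_amb, wind, alpha=0.9, eps=0.85):
    """Steady-state panel temperature (K) from an energy balance:
    absorbed irradiance = convective loss + radiative loss.
    h = 5.7 + 3.8*wind is an assumed linear wind-convection correlation."""
    h = 5.7 + 3.8 * wind
    lo, hi = T_amb, T_amb + 100.0
    for _ in range(60):  # bisection on the net heat flux
        T = 0.5 * (lo + hi)
        net = alpha * G - h * (T - T_amb) - eps * SIGMA * (T**4 - T_amb**4)
        if net > 0.0:
            lo = T
        else:
            hi = T
    return T

# 1000 W/m^2, 25 °C ambient: still air vs. a 5 m/s wind
T_still = panel_temperature(G=1000.0, T_amb=298.15, wind=0.0)
T_windy = panel_temperature(G=1000.0, T_amb=298.15, wind=5.0)
```

With these assumed coefficients the no-wind case overestimates the paper's 60 °C figure, which is expected for a model that ignores rear-side cooling and electrical conversion; the wind-speed sensitivity, however, matches the reported trend.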
Directory of Open Access Journals (Sweden)
S. H. Jathar
2016-02-01
Full Text Available Multi-generational oxidation of volatile organic compound (VOC) oxidation products can significantly alter the mass, chemical composition and properties of secondary organic aerosol (SOA) compared to calculations that consider only the first few generations of oxidation reactions. However, the most commonly used state-of-the-science schemes in 3-D regional or global models that account for multi-generational oxidation (1) consider only functionalization reactions but do not consider fragmentation reactions, (2) have not been constrained to experimental data and (3) are added on top of existing parameterizations. The incomplete description of multi-generational oxidation in these models has the potential to bias source apportionment and control calculations for SOA. In this work, we used the statistical oxidation model (SOM) of Cappa and Wilson (2012), constrained by experimental laboratory chamber data, to evaluate the regional implications of multi-generational oxidation considering both functionalization and fragmentation reactions. SOM was implemented into the regional University of California at Davis / California Institute of Technology (UCD/CIT) air quality model and applied to air quality episodes in California and the eastern USA. The mass, composition and properties of SOA predicted using SOM were compared to SOA predictions generated by a traditional two-product model to fully investigate the impact of explicit and self-consistent accounting of multi-generational oxidation. Results show that SOA mass concentrations predicted by the UCD/CIT-SOM model are very similar to those predicted by a two-product model when both models use parameters that are derived from the same chamber data. Since the two-product model does not explicitly resolve multi-generational oxidation reactions, this finding suggests that the chamber data used to parameterize the models captures the majority of the SOA mass formation from multi-generational oxidation under
Fatigue - determination of a more realistic usage factor
International Nuclear Information System (INIS)
Lang, H.
2001-01-01
The ability to use a suitable counting method for determining the stress range spectrum in elastic and simplified elastic-plastic fatigue analyses is of crucial importance for enabling determination of a realistic usage factor. Determination of the elastic-plastic strain range using the Ke factor from fictitiously elastically calculated loads is also important in the event of elastic behaviour being exceeded. This paper thus examines both points in detail. A fatigue module with additional options, which functions on this basis, is presented. The much more realistic determination of the usage factor presented here offers various economic benefits depending on the application.
Putting a Realistic Theory of Mind into Agency Theory
DEFF Research Database (Denmark)
Foss, Nicolai Juul; Stea, Diego
2014-01-01
Agency theory is one of the most important foundational theories in management research, but it rests on contestable cognitive assumptions. Specifically, the principal is assumed to hold a perfect (correct) theory regarding some of the content of the agent's mind, while he is entirely ignorant...... concerning other such content. More realistically, individuals have some limited access to the minds of others. We explore the implications for classical agency theory of realistic assumptions regarding the human potential for interpersonal sensemaking. We discuss implications for the design and management...
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
Fluctuations of offshore wind generation: Statistical modelling
DEFF Research Database (Denmark)
Pinson, Pierre; Christensen, Lasse E.A.; Madsen, Henrik
2007-01-01
The magnitude of power fluctuations at large offshore wind farms has a significant impact on the control and management strategies of their power output. If focusing on the minute scale, one observes successive periods with smaller and larger power fluctuations. It seems that different regimes yi...
Maccone, C.
In this paper the statistical generalization of the Fermi paradox is provided. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable Planets for Man (1964). The statistical generalization of the original, and by now too simplistic, Dole equation is obtained by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for "Statistical Equation for Habitables". The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, stating that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (both of which do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
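The core argument above (a product of independent positive factors tends to a log-normal variable, because its logarithm is a sum to which the CLT applies) can be checked numerically. In this sketch the ten factors are uniform on [0.5, 2.0]; that distribution and the sample size are arbitrary choices for illustration, not taken from the paper.

```python
import math
import random

random.seed(42)

def product_sample(n_factors=10):
    # product of n_factors independent positive random variables,
    # each arbitrarily distributed (here: uniform on [0.5, 2.0])
    p = 1.0
    for _ in range(n_factors):
        p *= random.uniform(0.5, 2.0)
    return p

# log of the product = sum of the logs -> approximately Gaussian by the
# CLT, i.e. the product itself is approximately log-normal
logs = [math.log(product_sample()) for _ in range(20000)]
n = len(logs)
mean = sum(logs) / n
var = sum((x - mean) ** 2 for x in logs) / n
skew = sum((x - mean) ** 3 for x in logs) / n / var ** 1.5
```

For U uniform on [0.5, 2], E[ln U] = (5/3) ln 2 − 1 ≈ 0.1552, so the sum of ten logs should average about 1.55, and its skewness should be close to zero if the Gaussian limit is taking hold.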
A Statistical Primer: Understanding Descriptive and Inferential Statistics
Gillian Byrne
2007-01-01
As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...
Rescorla, Leslie A; Ghassabian, Akhgar; Ivanova, Masha Y; Jaddoe, Vincent Wv; Verhulst, Frank C; Tiemeier, Henning
2017-11-01
Although the Child Behavior Checklist 1½-5's 12-item Diagnostic and Statistical Manual of Mental Disorders-Autism Spectrum Problems Scale (formerly called the Pervasive Developmental Problems scale) has been used in several studies as an autism spectrum disorder screener, the base rate and stability of its items and its measurement model have not been previously studied. We therefore examined the structure, longitudinal invariance, and stability of the Child Behavior Checklist 1½-5's Diagnostic and Statistical Manual of Mental Disorders-Autism Spectrum Problems Scale in the diverse Generation R (Rotterdam) sample based on mothers' ratings at 18 months (n = 4695), 3 years (n = 4571), and 5 years (n = 5752). Five items that seemed especially characteristic of autism spectrum disorder had low base rates at all three ages. The rank order of base rates for the 12 items was highly correlated over time (Qs ⩾ 0.86), but the longitudinal stability of individual items was modest (phi coefficients = 0.15-0.34). Confirmatory factor analyses indicated that the autism spectrum disorder scale model manifested configural, metric, and scalar longitudinal invariance over the time period from 18 months to 5 years, with large factor loadings. Correlations over time for observed autism spectrum disorder scale scores (0.25-0.50) were generally lower than the correlations across time of the latent factors (0.45-0.68). Results indicated significant associations of the autism spectrum disorder scale with later autism spectrum disorder diagnoses.
Statistical characterization of wave propagation in mine environments
Bakir, Onur
2012-07-01
A computational framework for statistically characterizing electromagnetic (EM) wave propagation through mine tunnels and galleries is presented. The framework combines a multi-element probabilistic collocation (ME-PC) method with a novel domain-decomposition (DD) integral equation-based EM simulator to obtain statistics of electric fields due to wireless transmitters in realistic mine environments. © 2012 IEEE.
A possible definition of a {\\it Realistic} Physics Theory
Gisin, Nicolas
2014-01-01
A definition of a {\\it Realistic} Physics Theory is proposed based on the idea that, at all time, the set of physical properties possessed (at that time) by a system should unequivocally determine the probabilities of outcomes of all possible measurements.
Evaluation of Highly Realistic Training for Independent Duty Corpsmen Students
2015-05-21
that he or she can perform desired actions or behaviors (Bandura, 1977). In the present study, three types of self-efficacy were assessed: general...such as resilience. References: Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral
Using a Realist Research Methodology in Policy Analysis
Lourie, Megan; Rata, Elizabeth
2017-01-01
The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…
Automated Finger Spelling by Highly Realistic 3D Animation
Adamo-Villani, Nicoletta; Beni, Gerardo
2004-01-01
We present the design of a new 3D animation tool for self-teaching (signing and reading) finger spelling, the first basic component in learning any sign language. We have designed a highly realistic hand with natural animation of the finger motions. Smoothness of motion (in real time) is achieved via programmable blending of animation segments. The…
Creating a Realistic Context for Team Projects in HCI
Koppelman, Herman; van Dijk, Betsy
2006-01-01
Team projects are nowadays common practice in HCI education. This paper focuses on the role of clients and users in team projects in introductory HCI courses. In order to provide projects with a realistic context we invite people from industry to serve as clients for the student teams. Some of them
Numerical computation of aeroacoustic transfer functions for realistic airfoils
De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto
2017-01-01
Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,
Empirical Evidence for Niss' "Implemented Anticipation" in Mathematising Realistic Situations
Stillman, Gloria; Brown, Jill P.
2012-01-01
Mathematisation of realistic situations is an on-going focus of research. Classroom data from a Year 9 class participating in a program of structured modelling of real situations was analysed for evidence of Niss's theoretical construct, implemented anticipation, during mathematisation. Evidence was found for two of three proposed aspects. In…
Nuclear properties with realistic Hamiltonians through spectral distribution theory
International Nuclear Information System (INIS)
Vary, J.P.; Belehrad, R.; Dalton, B.J.
1979-01-01
Motivated by the need for non-perturbative methods for utilizing realistic nuclear Hamiltonians H, the authors use spectral distribution theory, based on calculated moments of H, to obtain specific bulk and valence properties of finite nuclei. The primary emphasis here is to present results for the binding energies of nuclei obtained with and without an assumed core. (Auth.)
Two-Capacitor Problem: A More Realistic View.
Powell, R. A.
1979-01-01
Discusses the two-capacitor problem by considering the self-inductance of the circuit used and by determining how well the usual series RC circuit approximates the two-capacitor problem when realistic values of L, C, and R are chosen. (GA)
Rethinking Mathematics Teaching in Liberia: Realistic Mathematics Education
Stemn, Blidi S.
2017-01-01
In some African cultures, the concept of division does not necessarily mean sharing money or an item equally. How an item is shared might depend on the ages of the individuals involved. This article describes the use of the Realistic Mathematics Education (RME) approach to teach division word problems involving money in a 3rd-grade class in…
Improving Mathematics Teaching in Kindergarten with Realistic Mathematical Education
Papadakis, Stamatios; Kalogiannakis, Michail; Zaranis, Nicholas
2017-01-01
The present study investigates and compares the influence of teaching Realistic Mathematics on the development of mathematical competence in kindergarten. The sample consisted of 231 Greek kindergarten students. For the implementation of the survey, we conducted an intervention, which included one experimental and one control group. Children in…
Towards a Realist Sociology of Education: A Polyphonic Review Essay
Grenfell, Michael; Hood, Susan; Barrett, Brian D.; Schubert, Dan
2017-01-01
This review essay evaluates Karl Maton's "Knowledge and Knowers: Towards a Realist Sociology of Education" as a recent examination of the sociological causes and effects of education in the tradition of the French social theorist Pierre Bourdieu and the British educational sociologist Basil Bernstein. Maton's book synthesizes the…
Principles of maximally classical and maximally realistic quantum ...
Indian Academy of Sciences (India)
Principles of maximally classical and maximally realistic quantum mechanics. S M ROY. Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India. Abstract. Recently Auberson, Mahoux, Roy and Singh have proved a long standing conjecture of Roy and Singh: In 2N-dimensional phase space, ...
Place of a Realistic Teacher Education Pedagogy in an ICT ...
African Journals Online (AJOL)
This article is based on a study undertaken to examine the impact of introducing a realistic teacher education pedagogy (RTEP) oriented learning environment supported by ICT on distance teacher education in Uganda. It gives an overview of the quality, quantity and training of teachers in primary and secondary schools
Elements of a realistic 17 GHz FEL/TBA design
International Nuclear Information System (INIS)
Hopkins, D.B.; Halbach, K.; Hoyer, E.H.; Sessler, A.M.; Sternbach, E.J.
1989-01-01
Recently, renewed interest in an FEL version of a two-beam accelerator (TBA) has prompted a study of practical system and structure designs for achieving the specified physics goals. This paper presents elements of a realistic design for an FEL/TBA suitable for a 1 TeV, 17 GHz linear collider. 13 refs., 8 figs., 2 tabs
International Management: Creating a More Realistic Global Planning Environment.
Waldron, Darryl G.
2000-01-01
Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…
Predicting perceptual quality of images in realistic scenario using deep filter banks
Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang
2018-03-01
Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation onto images' subjective perceptual quality scores. The experimental results on benchmark databases present the effectiveness and generalizability of the proposed model.
Statistical Engine Knock Control
DEFF Research Database (Denmark)
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...... Control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. Confidence interval method is used as the basis for adaptation. A simple statistical model...... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept....
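The count-up/count-down regulation mentioned above is a standard knock-control pattern: retard the spark quickly when knock is detected, then advance it back slowly. The sketch below illustrates only that logic; the parameter names, step sizes, and knock probability are assumed for illustration and are not taken from the paper.

```python
import random

class KnockController:
    """Count-up/count-down spark-advance regulation (illustrative sketch)."""

    def __init__(self, advance=20.0, step=0.5, retard=2.0):
        self.advance = advance  # current spark advance, degrees (assumed cap)
        self.step = step        # slow count-up increment per no-knock cycle
        self.retard = retard    # fast count-down decrement on knock

    def update(self, knock):
        if knock:
            self.advance -= self.retard   # back off quickly on knock
        else:
            self.advance += self.step     # creep back toward the optimum
        self.advance = min(self.advance, 20.0)  # never exceed base advance
        return self.advance

# Simulate 500 engine cycles with a 10% knock probability per cycle
random.seed(1)
ctrl = KnockController()
history = [ctrl.update(random.random() < 0.1) for _ in range(500)]
```

A threshold-adaptation layer (the paper's confidence-interval method) would sit on top of this loop, adjusting the detection threshold that decides the `knock` flag.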
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Energy Technology Data Exchange (ETDEWEB)
Dohet-Eraly, Jeremy [F.R.S.-FNRS (Belgium); Sparenberg, Jean-Marc; Baye, Daniel, E-mail: jdoheter@ulb.ac.be, E-mail: jmspar@ulb.ac.be, E-mail: dbaye@ulb.ac.be [Physique Nucleaire et Physique Quantique, CP229, Universite Libre de Bruxelles (ULB), B-1050 Brussels (Belgium)
2011-09-16
The elastic phase shifts for the α + α and α + ³He collisions are calculated in a cluster approach by the Generator Coordinate Method coupled with the Microscopic R-matrix Method. Two interactions are derived from the realistic Argonne potentials AV8' and AV18 with the Unitary Correlation Operator Method. With a specific adjustment of correlations on the α + α collision, the phase shifts for the α + α and α + ³He collisions agree rather well with experimental data.
Energy statistics: Fourth quarter, 1989
International Nuclear Information System (INIS)
Anon.
1989-01-01
This volume contains 100 tables compiling data into the following broad categories: energy, drilling, natural gas, gas liquids, oil, coal, peat, electricity, uranium, and business indicators. The types of data that are given include production and consumption statistics, reserves, imports and exports, prices, fossil fuel and nuclear power generation statistics, and price indices
Radiative neutron capture: Hauser Feshbach vs. statistical resonances
Energy Technology Data Exchange (ETDEWEB)
Rochman, D., E-mail: dimitri-alexandre.rochman@psi.ch [Reactor Physics and Systems Behavior Laboratory, Paul Scherrer Institute, Villigen (Switzerland); Goriely, S. [Institut d'Astronomie et d'Astrophysique, CP-226, Université Libre de Bruxelles, 1050 Brussels (Belgium); Koning, A.J. [Nuclear Data Section, IAEA, Vienna (Austria); Uppsala University, Uppsala (Sweden); Ferroukhi, H. [Reactor Physics and Systems Behavior Laboratory, Paul Scherrer Institute, Villigen (Switzerland)
2017-01-10
The radiative neutron capture rates for isotopes of astrophysical interest are commonly calculated on the basis of the statistical Hauser Feshbach (HF) reaction model, leading to smooth and monotonically varying temperature-dependent Maxwellian-averaged cross sections (MACS). The HF approximation is known to be valid if the number of resonances in the compound system is relatively high. However, such a condition is hardly fulfilled for keV neutrons captured on light or exotic neutron-rich nuclei. For this reason, a different procedure is proposed here, based on the generation of statistical resonances. This novel technique, called the “High Fidelity Resonance” (HFR) method is shown to provide similar results as the HF approach for nuclei with a high level density but to deviate and be more realistic than HF predictions for light and neutron-rich nuclei or at relatively low sub-keV energies. The MACS derived with the HFR method are systematically compared with the traditional HF calculations for some 3300 neutron-rich nuclei and shown to give rise to significantly larger predictions with respect to the HF approach at energies of astrophysical relevance. For this reason, the HF approach should not be applied to light or neutron-rich nuclei. The Doppler broadening of the generated resonances is also studied and found to have a negligible impact on the calculated MACS.
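The Maxwellian-averaged cross sections (MACS) discussed above are defined by MACS(kT) = (2/√π) (kT)⁻² ∫ σ(E) E exp(−E/kT) dE. The sketch below evaluates this integral numerically for two textbook cases with known answers (a constant cross section gives 2/√π times σ; a 1/v cross section normalized at E = kT gives exactly 1); it illustrates only the averaging, not the HF or HFR reaction models.

```python
import math

def macs(sigma, kT, emax_factor=30.0, n=20000):
    """Maxwellian-averaged cross section:
    MACS = (2/sqrt(pi)) * (kT)**-2 * integral of sigma(E)*E*exp(-E/kT) dE,
    evaluated by the composite trapezoid rule on [0, emax_factor*kT]."""
    emax = emax_factor * kT
    h = emax / n
    total = 0.0
    for i in range(n + 1):
        E = i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
        total += w * sigma(E) * E * math.exp(-E / kT)
    total *= h
    return (2.0 / math.sqrt(math.pi)) * total / kT**2

kT = 0.03  # MeV, i.e. ~30 keV, a typical s-process thermal energy
flat = macs(lambda E: 1.0, kT)                               # constant sigma
one_over_v = macs(lambda E: math.sqrt(kT / max(E, 1e-12)), kT)  # 1/v law
```

The smooth HF MACS and the resonance-based HFR MACS differ precisely in the σ(E) fed into this average: a smooth statistical cross section versus one carrying generated resonances.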
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
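The matrix-notation linear least squares mentioned above reduces, for a straight-line fit, to the normal equations β = (XᵀX)⁻¹Xᵀy with parameter variance-covariance matrix s²(XᵀX)⁻¹. The sketch below works the 2×2 case by hand on invented data (the numbers are illustrative, not from the article).

```python
# Straight-line fit y = intercept + slope*x via the normal equations
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]  # roughly y = 1 + 2x, illustrative data

n = len(xs)
Sx = sum(xs)
Sy = sum(ys)
Sxx = sum(x * x for x in xs)
Sxy = sum(x * y for x, y in zip(xs, ys))

det = n * Sxx - Sx * Sx                      # det(X^T X) for the 2x2 case
intercept = (Sxx * Sy - Sx * Sxy) / det
slope = (n * Sxy - Sx * Sy) / det

residuals = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
s2 = sum(r * r for r in residuals) / (n - 2)  # residual variance, nu = n - p
var_slope = s2 * n / det                      # diagonal entry of s^2 (X^T X)^-1
```

The square root of `var_slope` is the standard error of the slope; the off-diagonal entry of s²(XᵀX)⁻¹, not computed here, carries the intercept-slope correlation used in propagation of error.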
Directory of Open Access Journals (Sweden)
Kastriot Dallaku
2016-12-01
Full Text Available Background. Postpartum haemorrhage (PPH) is a potentially life-threatening complication for women, and the leading cause of maternal mortality. Tranexamic acid (TXA) is an antifibrinolytic used worldwide to treat uterine haemorrhage and to reduce blood loss in general surgery. TXA may have effects on thrombin generation, platelet function and coagulation factors as a result of its inhibition of plasmin. Methods. WOMAN ETAPlaT is a sub-study of the World Maternal Antifibrinolytic trial (WOMAN trial). All adult women clinically diagnosed with PPH after a vaginal delivery or caesarean section are eligible for inclusion in the study. Blood samples will be collected at baseline and 30 minutes after the first dose of study treatment is given. Platelet function will be evaluated in whole blood immediately after sampling with Multiplate® tests (ADPtest and TRAPtest). Thrombin generation, fibrinogen, D-dimer, and coagulation factors vW, V and VIII will be analysed using platelet poor plasma. Results. Recruitment to WOMAN ETAPlaT started on 04 November 2013 and closed on 13 January 2015; during this time 188 patients were recruited. The final participant follow-up was completed on 04 March 2015. This article introduces the statistical analysis plan for the study, without reference to unblinded data. Conclusion. The data from this study will provide evidence for the effect of TXA on thrombin generation, platelet function and coagulation factors in women with PPH. Trial registration: ClinicalTrials.gov Identifier: NCT00872469; ISRCTN76912190
Directory of Open Access Journals (Sweden)
Wenzhi Wang
2016-07-01
Full Text Available Modeling the random fiber distribution of a fiber-reinforced composite is of great importance for studying the progressive failure behavior of the material on the micro scale. In this paper, we develop a new algorithm for generating random representative volume elements (RVEs) with a statistically equivalent fiber distribution against the actual material microstructure. Realistic statistical data are utilized as inputs of the new method, which is achieved through implementation of the probability equations. Extensive statistical analysis is conducted to examine the capability of the proposed method and to compare it with existing methods. It is found that the proposed method presents a good match with experimental results in all aspects including the nearest neighbor distance, nearest neighbor orientation, Ripley's K function, and the radial distribution function. Finite element analysis is presented to predict the effective elastic properties of a carbon/epoxy composite, to validate the generated random representative volume elements, and to provide insights into the effect of fiber distribution on the elastic properties. The present algorithm is shown to be highly accurate and can be used to generate statistically equivalent RVEs for not only fiber-reinforced composites but also other materials such as foam materials and particle-reinforced composites.
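A baseline against which such statistically equivalent generators are compared is plain random sequential adsorption: drop fiber centres uniformly at random, rejecting any that overlap an existing fiber, then measure descriptors such as the nearest-neighbor distance. The sketch below implements that baseline only (the paper's algorithm additionally matches measured distribution statistics); fiber count, radius, and RVE size are assumed values.

```python
import math
import random

def generate_rve(n_fibers=50, size=1.0, radius=0.03, seed=0, max_tries=100_000):
    """Random sequential adsorption: uniform fiber centres with hard-disc
    rejection of overlaps, kept away from the RVE edges for simplicity."""
    rng = random.Random(seed)
    centres = []
    tries = 0
    while len(centres) < n_fibers and tries < max_tries:
        tries += 1
        p = (rng.uniform(radius, size - radius), rng.uniform(radius, size - radius))
        if all(math.dist(p, q) >= 2 * radius for q in centres):
            centres.append(p)
    return centres

def nearest_neighbour_distances(centres):
    # one of the descriptors used to judge statistical equivalence
    return [min(math.dist(p, q) for q in centres if q is not p) for p in centres]

centres = generate_rve()
nnd = nearest_neighbour_distances(centres)
```

Comparing the histogram of `nnd` (and Ripley's K, orientations, and the radial distribution function) between generated RVEs and micrographs is exactly the kind of check the paper reports.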
Swiss electricity statistics 2008
International Nuclear Information System (INIS)
2009-06-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics for 2008 on electricity production and usage in Switzerland for the year 2008. First of all, an overview of Switzerland's electricity supply in 2008 is presented. Details are noted of the proportions generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2008. For the summer and winter periods, figures from 1995 to 2008 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2008, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1984 to 2008 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2015
Swiss electricity statistics 2005
International Nuclear Information System (INIS)
2006-01-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics on electricity production and usage in Switzerland for the year 2005. First of all, an overview of Switzerland's electricity supply in 2005 is presented. Details are noted of the proportions generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2005. For the summer and winter periods, figures from 1995 to 2005 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2005, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1983 to 2005 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2012.
Swiss electricity statistics 2006
International Nuclear Information System (INIS)
2007-01-01
This comprehensive report made by the Swiss Federal Office of Energy (SFOE) presents the statistics on electricity production and usage in Switzerland for the year 2006. First of all, an overview of Switzerland's electricity supply in 2006 is presented. Details are noted of the amounts generated by different sources including nuclear, hydro-power, storage schemes and thermal power stations as well as energy transfer with neighbouring countries. A second chapter takes a look at the balance of imports and exports with illustrative flow diagrams along with tables for total figures from 1950 through to 2006. For the summer and winter periods, figures from 1995 to 2006 are presented. The third chapter examines the production of electricity in the various types of power stations and the developments over the years 1950 to 2006, whereby, for example, statistics on regional generation and power station type are looked at. The fourth chapter looks at electricity consumption in various sectors from 1983 to 2006 and compares the figures with international data. The fifth chapter looks at generation, consumption and loading on particular, selected days and chapter six considers energy exchange with Switzerland's neighbours. Chapter seven takes a look at possibilities for extending generation facilities in the period up to 2013.
Role-playing for more realistic technical skills training.
Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J
2005-03-01
Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.
Realistic minimum accident source terms - Evaluation, application, and risk acceptance
International Nuclear Information System (INIS)
Angelo, P. L.
2009-01-01
The evaluation, application, and risk acceptance for realistic minimum accident source terms can represent a complex and arduous undertaking. This effort can have a very high impact on design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tennessee, two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)
Realistic electricity market simulator for energy and economic studies
International Nuclear Information System (INIS)
Bernal-Agustin, Jose L.; Contreras, Javier; Conejo, Antonio J.; Martin-Flores, Raul
2007-01-01
Electricity market simulators have become a useful tool to train engineers in the power industry. With the maturing of electricity markets throughout the world, there is a need for sophisticated software tools that can replicate the actual behavior of power markets. In most of these markets, power producers/consumers submit production/demand bids and the Market Operator clears the market producing a single price per hour. What makes markets different from each other are the bidding rules and the clearing algorithms to balance the market. This paper presents a realistic simulator of the day-ahead electricity market of mainland Spain. All the rules that govern this market are modeled. This simulator can be used either to train employees by power companies or to teach electricity markets courses in universities. To illustrate the tool, several realistic case studies are presented and discussed. (author)
Facilities upgrade for natural forces: traditional vs. realistic approach
International Nuclear Information System (INIS)
Terkun, V.
1985-01-01
The traditional method for upgrading existing buildings and equipment involves the following steps: perform a structural study using finite element analysis and some in situ testing; compare predicted member forces/stresses to material code allowables; determine strengthening schemes for those structural members judged to be weak; and estimate the cost of the required upgrades. This approach results in structural modifications that are not only conservative but also very expensive. The realistic structural evaluation approach uses traditional data to predict structural weaknesses as a final step. Next, using the considerable information now available for buildings and equipment exposed to natural hazards, engineering judgments about the structures being evaluated can be made with a great deal of confidence. This approach does not eliminate conservatism entirely, but it does reduce it to a reasonable and realistic level. As a result, the upgrade cost goes down without compromising the low risk necessary for vital facilities.
Realistic full wave modeling of focal plane array pixels.
Energy Technology Data Exchange (ETDEWEB)
Campione, Salvatore [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Warne, Larry K. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Jorgenson, Roy E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Electromagnetic Theory Dept.; Davids, Paul [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.; Peters, David W. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States). Applied Photonic Microsystems Dept.
2017-11-01
Here, we investigate full-wave simulations of realistic implementations of multifunctional nanoantenna enabled detectors (NEDs). We focus on a 2x2 pixelated array structure that supports two wavelengths of operation. We design each resonating structure independently using full-wave simulations with periodic boundary conditions mimicking the whole infinite array. We then construct a supercell made of a 2x2 pixelated array with periodic boundary conditions mimicking the full NED; in this case, however, each pixel comprises 10-20 antennas per side. In this way, the cross-talk between contiguous pixels is accounted for in our simulations. We observe that, even though there are finite extent effects, the pixels work as designed, each responding at the respective wavelength of operation. This allows us to stress that realistic simulations of multifunctional NEDs need to be performed to verify the design functionality by taking into account finite extent and cross-talk effects.
Tube problems: worldwide statistics reviewed
International Nuclear Information System (INIS)
Anon.
1994-01-01
EPRI's Steam Generator Strategic Management Project issues an annual report on the progress being made in tackling steam generator problems worldwide, containing a wealth of detailed statistics on the status of operating units and degradation mechanisms encountered. A few highlights are presented from the latest report, issued in October 1993, which covers the period to 31 December 1992. (Author)
Fully Realistic Multi-Criteria Multi-Modal Routing
Gündling, Felix; Keyhani, Mohammad Hossein; Schnee, Mathias; Weihe, Karsten
2014-01-01
We report on a multi-criteria search system, in which the German long- and short-distance trains, local public transport, walking, private car, private bike, and taxi are incorporated. The system is fully realistic. Three optimization criteria are addressed: travel time, travel cost, and convenience. Our algorithmic approach computes a complete Pareto set of reasonable connections. The computational study demonstrates that, even in such a large-scale, highly complex scenario, approp...
Realistic modeling of chamber transport for heavy-ion fusion
International Nuclear Information System (INIS)
Sharp, W.M.; Grote, D.P.; Callahan, D.A.; Tabak, M.; Henestroza, E.; Yu, S.S.; Peterson, P.F.; Welch, D.R.; Rose, D.V.
2003-01-01
Transport of intense heavy-ion beams to an inertial-fusion target after final focus is simulated here using a realistic computer model. It is found that passing the beam through a rarefied plasma layer before it enters the fusion chamber can largely neutralize the beam space charge and lead to a usable focal spot for a range of ion species and input conditions
Bell Operator Method to Classify Local Realistic Theories
International Nuclear Information System (INIS)
Nagata, Koji
2010-01-01
We review the historical fact of multipartite Bell inequalities with an arbitrary number of settings. An explicit local realistic model for the values of a correlation function, given in a two-setting Bell experiment (two-setting model), works only for the specific set of settings in the given experiment, but cannot construct a local realistic model for the values of a correlation function, given in a continuous-infinite settings Bell experiment (infinite-setting model), even though there exist two-setting models for all directions in space. Hence, the two-setting model does not have the property that the infinite-setting model has. Here, we show that an explicit two-setting model cannot construct a local realistic model for the values of a correlation function, given in an M-setting Bell experiment (M-setting model), even though there exist two-setting models for the M measurement directions chosen in the given M-setting experiment. Hence, the two-setting model does not have the property that the M-setting model has. (general)
I-Love relations for incompressible stars and realistic stars
Chan, T. K.; Chan, AtMa P. O.; Leung, P. T.
2015-02-01
In spite of the diversity in the equations of state of nuclear matter, the recently discovered I-Love-Q relations [Yagi and Yunes, Science 341, 365 (2013), 10.1126/science.1236462], which relate the moment of inertia, tidal Love number (deformability), and the spin-induced quadrupole moment of compact stars, hold for various kinds of realistic neutron stars and quark stars. While the physical origin of such universality is still a current issue, the observation that the I-Love-Q relations of incompressible stars can well approximate those of realistic compact stars hints at a new direction to approach the problem. In this paper, by establishing recursive post-Minkowskian expansion for the moment of inertia and the tidal deformability of incompressible stars, we analytically derive the I-Love relation for incompressible stars and show that the so-obtained formula can be used to accurately predict the behavior of realistic compact stars from the Newtonian limit to the maximum mass limit.
Realistic ion optical transfer maps for Super-FRS magnets from numerical field data
Energy Technology Data Exchange (ETDEWEB)
Kazantseva, Erika; Boine-Frankenheim, Oliver [Technische Universitaet Darmstadt (Germany)
2016-07-01
In large-aperture accelerators such as the Super-FRS, the non-linearity of the magnetic field in bending elements leads to non-linear beam dynamics, which cannot be described by means of linear ion optics. The existing non-linear approach is based on the Fourier harmonics formalism and does not work if the horizontal aperture is larger than the vertical one or vice versa. In the Super-FRS dipole the horizontal aperture is much larger than the vertical. Hence, it is necessary to find a way to create a higher-order transfer map for this dipole to accurately predict the particle dynamics in the realistic magnetic fields over the whole aperture. The aim of this work is to generate an accurate high-order transfer map of magnetic elements from measured or simulated 3D magnetic field data. Using the differential algebraic formalism allows transfer maps to be generated automatically via numerical integration of the ODEs of motion in beam-physics coordinates along the reference path. To make the transfer map accurate for all particles in the beam, the magnetic field along the integration path should be represented by an analytical function matching the real field distribution in the volume of interest. Within this work, the steps of producing a high-order realistic transfer map, starting from the field values on a closed box covering the volume of interest, are analyzed in detail.
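The core idea of extracting a transfer map by numerically integrating the equations of motion can be sketched without the full differential algebraic machinery. The toy below tracks a particle through a hard-edge focusing quadrupole with RK4 and recovers the first-order transfer matrix by finite differences; the field model and the values of the strength K and length L are illustrative, not Super-FRS parameters:

```python
import numpy as np

def rk4_step(f, s, y, h):
    """One classical Runge-Kutta step for y' = f(s, y)."""
    k1 = f(s, y)
    k2 = f(s + h / 2, y + h / 2 * k1)
    k3 = f(s + h / 2, y + h / 2 * k2)
    k4 = f(s + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

K, L = 1.2, 0.5  # illustrative quadrupole strength [1/m^2] and length [m]

def track(y0, n=1000):
    """Integrate x'' = -K x (hard-edge focusing quadrupole) over length L."""
    f = lambda s, y: np.array([y[1], -K * y[0]])
    y, h = np.array(y0, dtype=float), L / n
    for i in range(n):
        y = rk4_step(f, i * h, y, h)
    return y

# First-order transfer matrix from numerical tracking (finite differences)
eps = 1e-6
M = np.column_stack([track([eps, 0]) / eps, track([0, eps]) / eps])

# Analytic focusing-quadrupole matrix for comparison
w = np.sqrt(K)
M_exact = np.array([[np.cos(w * L), np.sin(w * L) / w],
                    [-w * np.sin(w * L), np.cos(w * L)]])
print(np.allclose(M, M_exact, atol=1e-8))  # → True
```

For a realistic field map, the analytic `f` would be replaced by an interpolated/analytic representation of the measured field, and the differential algebraic approach would carry the higher-order terms automatically instead of by finite differences.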
Design principles and optimal performance for molecular motors under realistic constraints
Tu, Yuhai; Cao, Yuansheng
2018-02-01
The performance of a molecular motor, characterized by its power output and energy efficiency, is investigated in the motor design space spanned by the stepping rate function and the motor-track interaction potential. Analytic results and simulations show that a gating mechanism that restricts forward stepping in a narrow window in configuration space is needed for generating high power at physiologically relevant loads. By deriving general thermodynamics laws for nonequilibrium motors, we find that the maximum torque (force) at stall is less than its theoretical limit for any realistic motor-track interactions due to speed fluctuations. Our study reveals a tradeoff for the motor-track interaction: while a strong interaction generates a high power output for forward steps, it also leads to a higher probability of wasteful spontaneous back steps. Our analysis and simulations show that this tradeoff sets a fundamental limit to the maximum motor efficiency in the presence of spontaneous back steps, i.e., loose-coupling. Balancing this tradeoff leads to an optimal design of the motor-track interaction for achieving a maximum efficiency close to 1 for realistic motors that are not perfectly coupled with the energy source. Comparison with existing data and suggestions for future experiments are discussed.
Development of a realistic, dynamic digital brain phantom for CT perfusion validation
Divel, Sarah E.; Segars, W. Paul; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.
2016-03-01
Physicians rely on CT Perfusion (CTP) images and quantitative image data, including cerebral blood flow, cerebral blood volume, and bolus arrival delay, to diagnose and treat stroke patients. However, the quantification of these metrics may vary depending on the computational method used. Therefore, we have developed a dynamic and realistic digital brain phantom upon which CTP scans can be simulated based on a set of ground truth scenarios. Building upon the previously developed 4D extended cardiac-torso (XCAT) phantom containing a highly detailed brain model, this work consisted of expanding the intricate vasculature by semi-automatically segmenting existing MRA data and fitting nonuniform rational B-spline surfaces to the new vessels. Using time attenuation curves input by the user as reference, the contrast enhancement in the vessels changes dynamically. At each time point, the iodine concentration in the arteries and veins is calculated from the curves and the material composition of the blood changes to reflect the expected values. CatSim, a CT system simulator, generates simulated data sets of this dynamic digital phantom which can be further analyzed to validate CTP studies and post-processing methods. The development of this dynamic and realistic digital phantom provides a valuable resource with which current uncertainties and controversies surrounding the quantitative computations generated from CTP data can be examined and resolved.
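The step of converting a user-supplied time-attenuation curve into iodine concentrations can be sketched with a simple linear enhancement model; the calibration factor and the TAC values below are hypothetical, not those of the XCAT phantom:

```python
import numpy as np

def iodine_concentration(tac_hu, hu_per_mg_ml=25.0):
    """Map a time-attenuation curve (HU enhancement above baseline) to iodine
    concentration, assuming enhancement is linear in concentration.
    hu_per_mg_ml is an illustrative calibration factor, not a measured one."""
    return np.asarray(tac_hu, dtype=float) / hu_per_mg_ml

# Hypothetical arterial TAC sampled at 1 s intervals (HU above baseline)
tac = [0, 40, 150, 300, 220, 90, 30]
conc = iodine_concentration(tac)
print(conc.max())  # → 12.0 (mg/ml at peak enhancement)
```

In the phantom workflow described above, such concentrations would then drive the time-varying material composition of the blood in each vessel before CT simulation.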
Effects of realistic force feedback in a robotic assisted minimally invasive surgery system.
Moradi Dalvand, Mohsen; Shirinzadeh, Bijan; Nahavandi, Saeid; Smith, Julian
2014-06-01
Robotic assisted minimally invasive surgery systems not only have the advantages of traditional laparoscopic procedures but also restore the surgeon's hand-eye coordination and improve the surgeon's precision by filtering hand tremors. Unfortunately, these benefits have come at the expense of the surgeon's ability to feel. Several research efforts have already attempted to restore this feature and study the effects of force feedback in robotic systems. The proposed methods and studies have some shortcomings. The main focus of this research is to overcome some of these limitations and to study the effects of force feedback in palpation in a more realistic fashion. A parallel robot assisted minimally invasive surgery system (PRAMiSS) with force feedback capabilities was employed to study the effects of realistic force feedback in palpation of artificial tissue samples. PRAMiSS is capable of actually measuring the tip/tissue interaction forces directly from the surgery site. Four sets of experiments using only vision feedback, only force feedback, simultaneous force and vision feedback and direct manipulation were conducted to evaluate the role of sensory feedback from sideways tip/tissue interaction forces with a scale factor of 100% in characterising tissues of varying stiffness. Twenty human subjects were involved in the experiments for at least 1440 trials. Friedman and Wilcoxon signed-rank tests were employed to statistically analyse the experimental results. Providing realistic force feedback in robotic assisted surgery systems improves the quality of tissue characterization procedures. Force feedback capability also increases the certainty of characterizing soft tissues compared with direct palpation using the lateral sides of index fingers. The force feedback capability can improve the quality of palpation and characterization of soft tissues of varying stiffness by restoring sense of touch in robotic assisted minimally invasive surgery operations.
Characteristics of 454 pyrosequencing data--enabling realistic simulation with flowsim.
Balzer, Susanne; Malde, Ketil; Lanzén, Anders; Sharma, Animesh; Jonassen, Inge
2010-09-15
The commercial launch of 454 pyrosequencing in 2005 was a milestone in genome sequencing in terms of performance and cost. Throughout the three available releases, average read lengths have increased to approximately 500 base pairs and are thus approaching read lengths obtained from traditional Sanger sequencing. Study design of sequencing projects would benefit from being able to simulate experiments. We explore 454 raw data to investigate its characteristics and derive empirical distributions for the flow values generated by pyrosequencing. Based on our findings, we implement Flowsim, a simulator that generates realistic pyrosequencing data files of arbitrary size from a given set of input DNA sequences. We finally use our simulator to examine the impact of sequence lengths on the results of concrete whole-genome assemblies, and we suggest its use in planning of sequencing projects, benchmarking of assembly methods and other fields. Flowsim is freely available under the General Public License from http://blog.malde.org/index.php/flowsim/.
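The essence of a flowgram simulator is to emit, for each flow in the cycle, a noisy value centered on the homopolymer run length of that flow's nucleotide. The sketch below is a toy stand-in, not Flowsim itself: it uses a Gaussian perturbation where Flowsim samples from empirically derived flow-value distributions, and all parameters are illustrative:

```python
import random

FLOW_ORDER = "TACG"  # one 454 flow cycle

def simulate_flowgram(seq, cycles=4, sigma=0.12, seed=42):
    """Toy flowgram: each flow reports the homopolymer run length of its
    nucleotide plus Gaussian noise (a stand-in for the empirical flow-value
    distributions a real simulator would sample from)."""
    rng = random.Random(seed)
    flows, i = [], 0
    for f in range(cycles * len(FLOW_ORDER)):
        base = FLOW_ORDER[f % len(FLOW_ORDER)]
        run = 0
        while i < len(seq) and seq[i] == base:
            run += 1
            i += 1
        flows.append(max(0.0, rng.gauss(run, sigma)))
    return flows

flows = simulate_flowgram("TTACCG")
print(len(flows))  # → 16
```

Base calling would then round each flow value to the nearest integer, which is where over- and under-calls of homopolymers, the dominant 454 error mode, arise.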
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
International Nuclear Information System (INIS)
Won Kim, Chang; Kim, Jong Hyo
2014-01-01
Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than ±3.2% in
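The dose-to-noise relationship that underlies such simulations can be illustrated with a much cruder image-domain sketch (the method above works in the sinogram domain with a full NEQ/MTF model; this is not that method). Quantum noise variance scales inversely with dose, so simulating a scan at dose fraction f from a full-dose reference requires adding noise with standard deviation sigma_ref*sqrt(1/f - 1). All parameter values are illustrative:

```python
import numpy as np

def simulate_low_dose(img_hu, sigma_ref=10.0, dose_fraction=0.25, seed=1):
    """Image-domain simplification: quantum-noise std scales as 1/sqrt(dose),
    so a full-dose reference image needs extra zero-mean noise with
    std = sigma_ref * sqrt(1/f - 1) to mimic a scan at dose fraction f.
    sigma_ref (the reference image's noise level) is an assumed value."""
    rng = np.random.default_rng(seed)
    extra_sigma = sigma_ref * np.sqrt(1.0 / dose_fraction - 1.0)
    return img_hu + rng.normal(0.0, extra_sigma, img_hu.shape)

ref = np.zeros((64, 64))      # uniform water phantom, 0 HU, noise ignored
low = simulate_low_dose(ref)  # quarter-dose simulation
print(low.shape)              # noise std lands near 10*sqrt(3) ≈ 17.3 HU
```

This ignores object attenuation, the bow-tie filter, and the correlated (non-white) texture that the sinogram-domain method reproduces, which is precisely why the abstract's approach models NEQ and MTF explicitly.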
Diffeomorphic Statistical Deformation Models
DEFF Research Database (Denmark)
Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus
2007-01-01
In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al....... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...... with ground truth in form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes....
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
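A standard entry point to the spatial statistics mentioned here is Moran's I, which quantifies how strongly neighboring observations (e.g., nematode counts at grid-sampled field locations) resemble each other. The sketch below uses binary distance-based weights on a hypothetical 4 x 4 sampling grid; it illustrates the statistic, not any specific model from the paper:

```python
import numpy as np

def morans_i(values, coords, cutoff=1.5):
    """Moran's I with binary contiguity weights: w_ij = 1 when two sample
    points lie closer than `cutoff`, else 0 (and w_ii = 0)."""
    z = values - values.mean()
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1))
    W = ((d > 0) & (d < cutoff)).astype(float)
    return len(values) / W.sum() * (W * np.outer(z, z)).sum() / (z @ z)

# Hypothetical 4x4 sampling grid with counts rising along one axis,
# i.e. spatially structured nematode pressure
xy = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
counts = xy[:, 0] * 10.0
print(morans_i(counts, xy) > 0)  # → True (positive spatial autocorrelation)
```

A significantly positive I is what justifies interpolating counts between samples and hence delivering nematicide site-specifically rather than uniformly.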
Realistic modeling of seismic input for megacities and large urban areas
International Nuclear Information System (INIS)
Panza, Giuliano F.; Alvarez, Leonardo; Aoudia, Abdelkrim
2002-06-01
The project addressed the problem of pre-disaster orientation: hazard prediction, risk assessment, and hazard mapping, in connection with seismic activity and man-induced vibrations. The definition of realistic seismic input has been obtained from the computation of a wide set of time histories and spectral information, corresponding to possible seismotectonic scenarios for different source and structural models. The innovative modeling technique, which constitutes the common tool of the entire project, takes into account source, propagation and local site effects. This is done using first principles of physics about wave generation and propagation in complex media, and does not require resorting to convolutive approaches, which have been proven to be quite unreliable, mainly when dealing with complex geological structures, the most interesting from the practical point of view. In fact, several techniques that have been proposed to empirically estimate the site effects using observations convolved with theoretically computed signals corresponding to simplified models supply reliable information about the site response to non-interfering seismic phases. They are not adequate in most real cases, when the seismic signal is formed by several interfering waves. The availability of realistic numerical simulations enables us to reliably estimate the amplification effects even in complex geological structures, exploiting the available geotechnical, lithological, geophysical parameters, topography of the medium, tectonic, historical, palaeoseismological data, and seismotectonic models. The realistic modeling of the ground motion is a very important base of knowledge for the preparation of ground-shaking scenarios that represent a valid and economic tool for seismic microzonation. This knowledge can be very fruitfully used by civil engineers in the design of new seismo-resistant constructions and in the reinforcement of the existing built environment, and, therefore
Generation of realistic scene using illuminant estimation and mixed chromatic adaptation
Kim, Jae-Chul; Hong, Sang-Gi; Kim, Dong-Ho; Park, Jong-Hyun
2003-12-01
An algorithm for combining a real image with a virtual model is proposed to increase the realism of synthesized images. Current methods for synthesizing a real image with a virtual model rely on the surface reflection model and various geometric techniques. In these methods, the characteristics of the various illuminants in the real image are not sufficiently considered. In addition, although chromatic adaptation plays a vital role in accommodating different illuminants across the two media viewing conditions, it is not taken into account in existing methods. Thus, it is hard to obtain high-quality synthesized images. In this paper, we propose a two-phase image synthesis algorithm. First, the surface reflectance of the maximum high-light region (MHR) was estimated using the three eigenvectors obtained from principal component analysis (PCA) applied to the surface reflectances of 1269 Munsell samples. The combined spectral value of MHR, i.e., the product of surface reflectance and the spectral power distribution (SPD) of an illuminant, was then estimated using the three eigenvectors obtained from PCA applied to the products of the surface reflectances of the 1269 Munsell samples and the SPDs of four CIE Standard Illuminants (A, C, D50, D65). By dividing the average combined spectral values of MHR by the average surface reflectances of MHR, we could estimate the illuminant of a real image. Second, mixed chromatic adaptation (S-LMS) using an estimated and an external illuminant was applied to the virtual-model image. To evaluate the proposed algorithm, experiments with synthetic and real scenes were performed. It was shown that the proposed method was effective in synthesizing real and virtual scenes under various illuminants.
Evaluation of realistic layouts for next generation on-scalp MEG: spatial information density maps.
Riaz, Bushra; Pfeiffer, Christoph; Schneiderman, Justin F
2017-08-01
While commercial magnetoencephalography (MEG) systems are the functional neuroimaging state-of-the-art in terms of spatio-temporal resolution, MEG sensors have not changed significantly since the 1990s. Interest in newer sensors that operate at less extreme temperatures, e.g., high critical temperature (high-Tc) SQUIDs, optically-pumped magnetometers, etc., is growing because they enable significant reductions in head-to-sensor standoff (on-scalp MEG). Various metrics quantify the advantages of on-scalp MEG, but a single straightforward one is lacking. Previous works have furthermore been limited to arbitrary and/or unrealistic sensor layouts. We introduce spatial information density (SID) maps for quantitative and qualitative evaluations of sensor arrays. SID-maps present the spatial distribution of information a sensor array extracts from a source space while accounting for relevant source and sensor parameters. We use it in a systematic comparison of three practical on-scalp MEG sensor array layouts (based on high-Tc SQUIDs) and the standard Elekta Neuromag TRIUX magnetometer array. Results strengthen the case for on-scalp and specifically high-Tc SQUID-based MEG while providing a path for the practical design of future MEG systems. SID-maps are furthermore general to arbitrary magnetic sensor technologies and source spaces and can thus be used for design and evaluation of sensor arrays for magnetocardiography, magnetic particle imaging, etc.
Gauge coupling unification in realistic free-fermionic string models
International Nuclear Information System (INIS)
Dienes, K.R.; Faraggi, A.E.
1995-01-01
We discuss the unification of gauge couplings within the framework of a wide class of realistic free-fermionic string models which have appeared in the literature, including the flipped SU(5), SO(6)xSO(4), and various SU(3)xSU(2)xU(1) models. If the matter spectrum below the string scale is that of the Minimal Supersymmetric Standard Model (MSSM), then string unification is in disagreement with experiment. We therefore examine several effects that may modify the minimal string predictions. First, we develop a systematic procedure for evaluating the one-loop heavy string threshold corrections in free-fermionic string models, and we explicitly evaluate these corrections for each of the realistic models. We find that these string threshold corrections are small, and we provide general arguments explaining why such threshold corrections are suppressed in string theory. Thus heavy thresholds cannot resolve the disagreement with experiment. We also study the effect of non-standard hypercharge normalizations, light SUSY thresholds, and intermediate-scale gauge structure, and similarly conclude that these effects cannot resolve the disagreement with low-energy data. Finally, we examine the effects of additional color triplets and electroweak doublets beyond the MSSM. Although not required in ordinary grand unification scenarios, such states generically appear within the context of certain realistic free-fermionic string models. We show that if these states exist at the appropriate thresholds, then the gauge couplings will indeed unify at the string scale. Thus, within these string models, string unification can be in agreement with low-energy data. (orig.)
Performance Analysis of Relays in LTE for a Realistic Suburban Deployment Scenario
DEFF Research Database (Denmark)
Coletti, Claudio; Mogensen, Preben; Irmer, Ralf
2011-01-01
Relays are likely to play an important role in the deployment of Beyond 3G networks, such as LTE-Advanced, thanks to the possibility of effectively extending Macro network coverage and fulfilling the expected high data-rate requirements. Up until now, the relay technology potential and its cost-effectiveness have been widely investigated in the literature, considering mainly statistical deployment scenarios, like regular networks with uniform traffic distribution. This paper is envisaged to illustrate the performance of different relay technologies (In-band/Out-band) in a realistic suburban network scenario with real Macro site positions, user density map and spectrum band availability. Based on a proposed heuristic deployment algorithm, results show that deploying In-band relays can significantly reduce the user outage if high backhaul link quality is ensured, whereas Out-band relaying and the usage ...
Ultra-Reliable Communications in Failure-Prone Realistic Networks
DEFF Research Database (Denmark)
Gerardino, Guillermo Andrés Pocovi; Lauridsen, Mads; Alvarez, Beatriz Soret
2016-01-01
We investigate the potential of different diversity and interference management techniques to achieve the required downlink SINR outage probability for ultra-reliable communications. The evaluation is performed in a realistic network deployment based on site-specific data from a European capital. Micro and macroscopic diversity techniques are proved to be important enablers of ultra-reliable communications. Particularly, it is shown how a 4x4 MIMO scheme with three orders of macroscopic diversity can achieve the required SINR outage performance. Smaller gains are obtained from interference ...
Capturing and reproducing realistic acoustic scenes for hearing research
DEFF Research Database (Denmark)
Marschall, Marton; Buchholz, Jörg
Accurate spatial audio recordings are important for a range of applications, from the creation of realistic virtual sound environments to the evaluation of communication devices, such as hearing instruments and mobile phones. Spherical microphone arrays are particularly well-suited for capturing ... The properties of MOA microphone layouts and processing were investigated further by considering several order combinations. It was shown that the performance for horizontal vs. elevated sources can be adjusted by varying the order combination, but that a benefit of the higher horizontal orders can only be seen ...
Building Realistic Mobility Models for Mobile Ad Hoc Networks
Directory of Open Access Journals (Sweden)
Adrian Pullin
2018-04-01
Full Text Available A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node can act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs, with the aim of producing more realistic simulation models and, thereby, more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied. To ...
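The random waypoint model criticised in the abstract is simple to state: each node repeatedly picks a uniformly random destination, moves toward it at a uniformly drawn speed, pauses, and repeats. A minimal sketch (world size, speed range, and pause length are illustrative defaults, not values from the paper):

```python
import random

def random_waypoint(n_steps, world=(1000.0, 1000.0), speed=(1.0, 10.0),
                    pause_steps=5, seed=42):
    """Generate a random waypoint (RWP) mobility trace, one (x, y) per step."""
    rng = random.Random(seed)
    x, y = rng.uniform(0.0, world[0]), rng.uniform(0.0, world[1])
    dest, v, wait = None, 0.0, 0
    trace = [(x, y)]
    for _ in range(n_steps):
        if wait > 0:                       # pausing at a waypoint
            wait -= 1
        elif dest is None:                 # choose the next waypoint and speed
            dest = (rng.uniform(0.0, world[0]), rng.uniform(0.0, world[1]))
            v = rng.uniform(*speed)
        else:                              # move toward the waypoint
            dx, dy = dest[0] - x, dest[1] - y
            dist = (dx * dx + dy * dy) ** 0.5
            if dist <= v:                  # arrive this step, then pause
                x, y = dest
                dest, wait = None, pause_steps
            else:
                x, y = x + v * dx / dist, y + v * dy / dist
        trace.append((x, y))
    return trace
```

The arbitrariness the paper objects to is visible in the signature: world size, node speed, and pause time are free parameters with no necessary connection to any real scenario.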
Scaling up complex interventions: insights from a realist synthesis.
Willis, Cameron D; Riley, Barbara L; Stockton, Lisa; Abramowicz, Aneta; Zummach, Dana; Wong, Geoff; Robinson, Kerry L; Best, Allan
2016-12-19
Preventing chronic diseases, such as cancer, cardiovascular disease and diabetes, requires complex interventions, involving multi-component and multi-level efforts that are tailored to the contexts in which they are delivered. Despite an increasing number of complex interventions in public health, many fail to be 'scaled up'. This study aimed to increase understanding of how and under what conditions complex public health interventions may be scaled up to benefit more people and populations. A realist synthesis was conducted and discussed at an in-person workshop involving practitioners responsible for scaling up activities. Realist approaches view causality through the linkages between changes in contexts (C) that activate mechanisms (M), leading to specific outcomes (O) (CMO configurations). To focus this review, three cases of complex interventions that had been successfully scaled up were included: Vibrant Communities, Youth Build USA and Pathways to Education. A search strategy of published and grey literature related to each case was developed, involving searches of relevant databases and nominations from experts. Data extracted from included documents were classified according to CMO configurations within strategic themes. Findings were compared and contrasted with guidance from diffusion theory, and interpreted with knowledge users to identify practical implications and potential directions for future research. Four core mechanisms were identified, namely awareness, commitment, confidence and trust. These mechanisms were activated within two broad scaling up strategies, those of renewing and regenerating, and documenting success. Within each strategy, specific actions to change contexts included building partnerships, conducting evaluations, engaging political support and adapting funding models. These modified contexts triggered the identified mechanisms, leading to a range of scaling up outcomes, such as commitment of new communities, changes in relevant ...
Dynamic aperture in damping rings with realistic wigglers
Energy Technology Data Exchange (ETDEWEB)
Cai, Yunhai; /SLAC
2005-05-04
The International Linear Collider based on superconducting RF cavities requires the damping rings to have extremely small equilibrium emittance, huge circumference, fast damping time, and large acceptance. To achieve all of these requirements is a very challenging task. In this paper, we present a systematic approach to designing the damping rings using simple cells and non-interlaced sextupoles. The designs of damping rings with various circumferences and shapes, including the dogbone, are presented. To model realistic wigglers, we have developed a new hybrid symplectic integrator for fast and accurate evaluation of the dynamic aperture of the lattices.
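The paper's hybrid integrator is specific to wiggler field maps, but the reason symplectic schemes are preferred for long dynamic-aperture tracking can be shown with the simplest member of the family, the leapfrog (velocity Verlet) scheme, whose energy error stays bounded over millions of turns instead of drifting. A generic sketch, not the authors' integrator:

```python
def leapfrog(q, p, grad_v, h, n_steps):
    """Leapfrog / velocity-Verlet stepping for H = p^2/2 + V(q) (unit mass).

    Each step is kick-drift-kick; the map is symplectic, so the energy
    error oscillates within an O(h^2) band instead of accumulating.
    """
    for _ in range(n_steps):
        p -= 0.5 * h * grad_v(q)   # half kick
        q += h * p                 # full drift
        p -= 0.5 * h * grad_v(q)   # half kick
    return q, p
```

For a harmonic oscillator (grad_v = lambda x: x) started at (q, p) = (1, 0), the energy 0.5*(q² + p²) stays near 0.5 even after tens of thousands of steps, which is the property lattice-tracking codes rely on.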
Dynamic Enhanced Inter-Cell Interference Coordination for Realistic Networks
DEFF Research Database (Denmark)
Pedersen, Klaus I.; Alvarez, Beatriz Soret; Barcos, Sonia
2016-01-01
Enhanced Inter-Cell Interference Coordination (eICIC) is a key ingredient to boost the performance of co-channel Heterogeneous Networks (HetNets). eICIC encompasses two main techniques: Almost Blank Subframes (ABS), during which the macro cell remains silent to reduce the interference, and biased ... and an opportunistic approach exploiting the varying cell conditions. Moreover, an autonomous fast distributed muting algorithm is presented, which is simple, robust, and well suited for irregular network deployments. Performance results for realistic network deployments show that the traditional semi-static e...
Realistic shell-model calculations for Sn isotopes
International Nuclear Information System (INIS)
Covello, A.; Andreozzi, F.; Coraggio, L.; Gargano, A.; Porrino, A.
1997-01-01
We report on a shell-model study of the Sn isotopes in which a realistic effective interaction derived from the Paris free nucleon-nucleon potential is employed. The calculations are performed within the framework of the seniority scheme by making use of the chain-calculation method. This provides practically exact solutions while cutting down the amount of computational work required by a standard seniority-truncated calculation. The behavior of the energy of several low-lying states in the isotopes with A ranging from 122 to 130 is presented and compared with the experimental one. (orig.)
Turbulence studies in tokamak boundary plasmas with realistic divertor geometry
International Nuclear Information System (INIS)
Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.
2001-01-01
Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the ExB drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)
Turbulence studies in tokamak boundary plasmas with realistic divertor geometry
International Nuclear Information System (INIS)
Xu, X.Q.; Cohen, R.H.; Porter, G.D.; Rognlien, T.D.; Ryutov, D.D.; Myra, J.R.; D'Ippolito, D.A.; Moyer, R.; Groebner, R.J.
1999-01-01
Results are presented from the 3D nonlocal electromagnetic turbulence code BOUT and the linearized shooting code BAL for studies of turbulence in tokamak boundary plasmas and its relationship to the L-H transition, in a realistic divertor plasma geometry. The key results include: (1) the identification of the dominant resistive X-point mode in divertor geometry and (2) turbulence suppression in the L-H transition by shear in the E x B drift speed, ion diamagnetism and finite polarization. Based on the simulation results, a parameterization of the transport is given that includes the dependence on the relevant physical parameters. (author)
On Small Antenna Measurements in a Realistic MIMO Scenario
DEFF Research Database (Denmark)
Yanakiev, Boyan; Nielsen, Jesper Ødum; Pedersen, Gert Frølund
2010-01-01
This paper deals with the challenges related to evaluating the performance of multiple, small terminal antennas within a natural MIMO environment. The focus is on the antenna measurement accuracy. First, a method is presented for measuring small phone mock-ups with the use of optical fibers. The problem of using coaxial cables is explained and a solution suitable for long-distance channel sounding is presented. A large-scale measurement campaign is then described. Special attention is paid to bringing the measurement setup as close as possible to a realistic LTE network of the future, with attention ...
A continuous family of realistic SUSY SU(5) GUTs
Energy Technology Data Exchange (ETDEWEB)
Bajc, Borut, E-mail: borut.bajc@ijs.si [J. Stefan Institute, Jamova cesta 39, 1000, Ljubljana (Slovenia)
2016-06-21
It is shown that the minimal renormalizable supersymmetric SU(5) is still realistic provided the supersymmetric scale is at least a few tens of TeV or large R-parity violating terms are considered. In the first case the vacuum is metastable, and different consistency constraints can give a bounded allowed region in the tan β − m_susy plane. In the second case the mass-eigenstate electron (down quark) is a linear combination of the original electron (down quark) and Higgsino (heavy colour triplet), and the mass ratio of bino and wino is determined. Both limits lead to light gravitino dark matter.
International Nuclear Information System (INIS)
Gupta, Ankit; Chauhan, Yogesh K.
2016-01-01
In recent years, solar energy has been considered one of the principal renewable energy sources for electric power generation. In this paper, a single-diode photovoltaic (PV) system and a double/bypass-diode based PV system are designed in the MATLAB/Simulink environment based on their mathematical modeling and are validated against a commercially available solar panel. The novelty of the paper is the inclusion of the effects of climatic conditions, i.e., variable irradiation level, wind speed, temperature, humidity level and dust accumulation, in the modeling of both PV systems to represent a realistic PV system. Comprehensive investigations are made on both modeled PV systems. The obtained results show satisfactory performance for the realistic models of the PV system. Furthermore, an in-depth comparative analysis is carried out for both PV systems. - Highlights: • Modeling of single-diode and double-diode PV systems in MATLAB/Simulink software. • Validation of the designed PV systems with a commercially available PV panel. • Acquisition and employment of key climatic factors in modeling of the PV systems. • Evaluation of main model parameters of both PV systems. • Detailed comparative assessment of both modeled PV systems' parameters.
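The single-diode model underlying such simulations is standard: the cell current satisfies the implicit relation I = Iph − I0·(exp((V + I·Rs)/(n·Ns·Vt)) − 1) − (V + I·Rs)/Rsh. A minimal sketch solving it by damped fixed-point iteration (all parameter values are illustrative, not those of the validated commercial panel, and climatic effects are not modeled here):

```python
import math

def single_diode_current(v, i_ph=8.0, i_0=1e-9, n=1.3, t=298.15,
                         r_s=0.2, r_sh=300.0, n_cells=36, iters=200):
    """Current (A) of the standard single-diode PV model at terminal voltage v.

    Solves I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
    by damped fixed-point iteration. Parameter values are illustrative.
    """
    k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, electron charge
    vt = k * t / q                          # thermal voltage of one cell
    a = n * n_cells * vt                    # modified ideality factor
    i = i_ph                                # short-circuit current as initial guess
    for _ in range(iters):
        vd = v + i * r_s
        i_new = i_ph - i_0 * (math.exp(vd / a) - 1.0) - vd / r_sh
        i = 0.5 * i + 0.5 * i_new           # damping keeps the iteration stable
    return i
```

At V = 0 this returns the short-circuit current (slightly below Iph because of the series/shunt resistances), and the current falls off as the voltage approaches the open-circuit point, reproducing the familiar I-V curve shape.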
Thermohydraulic simulation of HTR-10 nuclear reactor core using realistic CFD approach
International Nuclear Information System (INIS)
Silva, Alexandro S.; Dominguez, Dany S.; Mazaira, Leorlen Y. Rojas; Hernandez, Carlos R.G.; Lira, Carlos Alberto Brayner de Oliveira
2015-01-01
High-temperature gas-cooled reactors (HTGRs) have the potential to be used as energy generation sources in the near future, owing to their inherently safe performance achieved by using a large amount of graphite, a low power density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry, and it is very important for the safe design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, the thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR-10 (High Temperature Reactor) was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled realistically, considering a graphite layer and a sphere of fuel. Owing to the high computational cost, it is impossible to simulate the full core; therefore, the geometry used is a column of FCC (Face-Centered Cubic) cells, with 41 layers and 82 pebbles. The input data used were taken from the thermohydraulic IAEA Benchmark (TECDOC-1694). The results show the profiles of velocity and temperature of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)
Energy Technology Data Exchange (ETDEWEB)
Silva, Alexandro S., E-mail: alexandrossilva@ifba.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia da Bahia (IFBA), Vitoria da Conquista, BA (Brazil); Mazaira, Leorlen Y.R., E-mail: leored1984@gmail.com, E-mail: cgh@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas (INSTEC), La Habana (Cuba); Dominguez, Dany S.; Hernandez, Carlos R.G., E-mail: alexandrossilva@gmail.com, E-mail: dsdominguez@gmail.com [Universidade Estadual de Santa Cruz (UESC), Ilheus, BA (Brazil). Programa de Pos-Graduacao em Modelagem Computacional; Lira, Carlos A.B.O., E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)
2015-07-01
High-temperature gas-cooled reactors (HTGRs) have the potential to be used as energy generation sources in the near future, owing to their inherently safe performance achieved by using a large amount of graphite, a low power density design, and high conversion efficiency. However, safety is the most important issue for their commercialization in the nuclear energy industry, and it is very important for the safe design and operation of an HTGR to investigate its thermal-hydraulic characteristics. In this article, the thermal-hydraulic simulation of compressible flow inside the core of the pebble bed reactor HTR-10 (High Temperature Reactor) was performed using Computational Fluid Dynamics (CFD). A realistic approach was used, in which every closely packed pebble is modelled realistically, considering a graphite layer and a sphere of fuel. Owing to the high computational cost, it is impossible to simulate the full core; therefore, the geometry used is an FCC (Face-Centered Cubic) cell with half the height of the core, with 21 layers and 95 pebbles. The input data used were taken from the thermal-hydraulic IAEA Benchmark. The results show the profiles of velocity and temperature of the coolant in the core, and the temperature distribution inside the pebbles. The maximum temperatures in the pebbles do not exceed the allowable limit for this type of nuclear fuel. (author)
Development of realistic thermal hydraulic system analysis code
Energy Technology Data Exchange (ETDEWEB)
Lee, Won Jae; Chung, B. D.; Kim, K. D. [and others]
2002-05-01
A realistic safety analysis system is essential for nuclear safety research, advanced reactor development, safety analysis in the nuclear industry, and 'in-house' plant design capability development. In this project, we have developed a best-estimate multi-dimensional thermal-hydraulic system code, MARS, which is based on the integrated version of the RELAP5 and COBRA-TF codes. To improve the realistic analysis capability, we have improved the models for multi-dimensional two-phase flow phenomena and for advanced two-phase flow modeling. In addition, a GUI (Graphical User Interface) was developed to enhance the user's convenience. To develop a coupled analysis capability, the MARS code was linked with the three-dimensional reactor kinetics code (MASTER), the core thermal analysis code (COBRA-III/CP), and the best-estimate containment analysis code (CONTEMPT), resulting in MARS/MASTER/COBRA/CONTEMPT. Currently, the MARS code system has been distributed to 18 domestic organizations, including research, industrial, and regulatory organizations and universities. MARS is being widely used for the safety research of existing PWRs, advanced PWRs, CANDU and research reactors, the pre-test analysis of TH experiments, and other applications.
Music therapy for palliative care: A realist review.
McConnell, Tracey; Porter, Sam
2017-08-01
Music therapy has experienced a rising demand as an adjunct therapy for symptom management among palliative care patients. We conducted a realist review of the literature to develop a greater understanding of how music therapy might benefit palliative care patients and the contextual mechanisms that promote or inhibit its successful implementation. We searched electronic databases (CINAHL, Embase, Medline, and PsychINFO) for literature containing information on music therapy for palliative care. In keeping with the realist approach, we examined all relevant literature to develop theories that could explain how music therapy works. A total of 51 articles were included in the review. Music therapy was found to have a therapeutic effect on the physical, psychological, emotional, and spiritual suffering of palliative care patients. We also identified program mechanisms that help explain music therapy's therapeutic effects, along with facilitating contexts for implementation. Music therapy may be an effective nonpharmacological approach to managing distressing symptoms in palliative care patients. The findings also suggest that group music therapy may be a cost-efficient and effective way to support staff caring for palliative care patients. We encourage others to continue developing the evidence base in order to expand our understanding of how music therapy works, with the aim of informing and improving the provision of music therapy for palliative care patients.
Report of the workshop on realistic SSC lattices
International Nuclear Information System (INIS)
1985-10-01
A workshop was held at the SSC Central Design Group from May 29 to June 4, 1985, on topics relating to the lattice of the SSC. The workshop marked a shift of emphasis from the investigation of simplified test lattices to the development of a realistic lattice suitable for the conceptual design report. The first day of the workshop was taken up by reviews of accelerator system requirements, of the reference design solutions for these requirements, of lattice work following the reference design, and of plans for the workshop. The work was divided among four working groups. The first, chaired by David Douglas, concerned the arcs of regular cells. The second group, which studied the utility insertions, was chaired by Beat Leemann. The third group, under David E. Johnson, concerned itself with the experimental insertions, dispersion suppressors, and phase trombones. The fourth group, responsible for global lattice considerations and the design of a new realistic lattice example, was led by Ernest Courant. The papers resulting from this workshop are roughly divided into three sets: those relating to specific lattice components, to complete lattices, and to other topics. Among the salient accomplishments of the workshop were additions to and optimization of lattice components, especially those relating to lattices using 1-in-1 magnets, either horizontally or vertically separated, and the design of complete lattice examples. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database
Ultra-realistic 3-D imaging based on colour holography
International Nuclear Information System (INIS)
Bjelkhagen, H I
2013-01-01
A review of recent progress in colour holography is provided, with new applications. Colour holography recording techniques in silver-halide emulsions are discussed. Both analogue colour holograms, mainly of the Denisyuk type, and digitally-printed colour holograms are described, along with their recent improvements. An alternative to silver-halide materials is the panchromatic photopolymer materials, such as the DuPont and Bayer photopolymers, which are also covered. The light sources used to illuminate the recorded holograms are very important for obtaining ultra-realistic 3-D images. In particular, the new light sources based on RGB LEDs are described; they show improved image quality over today's commonly used halogen lights. Recent work in colour holography by holographers and companies in different countries around the world is included. Recording and displaying ultra-realistic 3-D images with perfect colour rendering depend highly on the correct recording technique using the optimal recording laser wavelengths, on the availability of improved panchromatic recording materials, and on new display light sources.
Spectroscopy of light nuclei with realistic NN interaction JISP
International Nuclear Information System (INIS)
Shirokov, A. M.; Vary, J. P.; Mazur, A. I.; Weber, T. A.
2008-01-01
Recent results of our systematic ab initio studies of the spectroscopy of s- and p-shell nuclei in fully microscopic large-scale (up to a few hundred million basis functions) no-core shell-model calculations are presented. A new high-quality realistic nonlocal NN interaction, JISP, is used. This interaction is obtained in the J-matrix inverse-scattering approach (JISP stands for the J-matrix inverse-scattering potential) and takes the form of a small-rank matrix in the oscillator basis in each of the NN partial waves, providing very fast convergence in shell-model studies. The current purely two-body JISP model of the nucleon-nucleon interaction, JISP16, provides not only an excellent description of two-nucleon data (deuteron properties and np scattering) with χ²/datum = 1.05 but also a better description of a wide range of observables (binding energies, spectra, rms radii, quadrupole moments, electromagnetic-transition probabilities, etc.) in all s- and p-shell nuclei than the best modern interaction models combining realistic nucleon-nucleon and three-nucleon interactions.
Development of vortex model with realistic axial velocity distribution
International Nuclear Information System (INIS)
Ito, Kei; Ezure, Toshiki; Ohshima, Hiroyuki
2014-01-01
A vortex is considered one of the significant phenomena which may cause gas entrainment (GE) and/or vortex cavitation in sodium-cooled fast reactors. In our past studies, the vortex was assumed to be approximated by the well-known Burgers vortex model. However, the Burgers vortex model makes the simple but unrealistic assumption that the axial velocity component is horizontally constant, while in reality the free-surface vortex has an axial velocity distribution with a large radial gradient near the vortex center. In this study, a new vortex model with a realistic axial velocity distribution is proposed. This model is derived from the steady axisymmetric Navier-Stokes equation, as is the Burgers vortex model, but it accounts for the realistic radial distribution of axial velocity, which is defined to be zero at the vortex center and to approach zero asymptotically at infinity. As verification, the new vortex model is applied to the evaluation of a simple vortex experiment, and it shows good agreement with the experimental data in terms of the circumferential velocity distribution and the free-surface shape. In addition, it is confirmed that the Burgers vortex model fails to calculate an accurate velocity distribution under the assumption of uniform axial velocity. However, the calculation accuracy of the Burgers vortex model can be brought close to that of the new vortex model by considering the effective axial velocity, calculated as the average value only in the vicinity of the vortex center. (author)
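For reference, the classical Burgers vortex that the new model generalizes has a closed-form circumferential velocity: solid-body rotation in the core, 1/r decay in the far field. A short sketch (parameter values are illustrative, not from the paper):

```python
import math

def burgers_v_theta(r, gamma=1.0e-2, alpha=0.1, nu=1.0e-6):
    """Circumferential velocity of the classical Burgers vortex,

        v_theta(r) = Gamma/(2*pi*r) * (1 - exp(-alpha*r^2/(2*nu))),

    with circulation gamma [m^2/s], strain rate alpha [1/s], and kinematic
    viscosity nu [m^2/s]. Its axial component, u_z = 2*alpha*z, does not
    depend on r -- the 'horizontally constant' assumption the new model removes.
    """
    if r == 0.0:
        return 0.0
    return gamma / (2.0 * math.pi * r) * (1.0 - math.exp(-alpha * r * r / (2.0 * nu)))
```

The velocity rises roughly linearly inside the core radius ~sqrt(2·nu/alpha), peaks near it, and decays as 1/r outside; only the radially varying axial component distinguishes the paper's new model from this profile.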
Neural Correlates of Realistic and Unrealistic Auditory Space Perception
Directory of Open Access Journals (Sweden)
Akiko Callan
2011-10-01
Full Text Available Binaural recordings can simulate externalized auditory space perception over headphones. However, if the orientation of the recorder's head and the orientation of the listener's head are incongruent, the simulated auditory space is not realistic. For example, if a person lying flat on a bed listens to an environmental sound that was recorded by microphones inserted in the ears of a person in an upright position, the sound simulates an auditory space rotated 90 degrees from the real-world horizontal axis. Our question is whether brain activation patterns differ between the unrealistic auditory space (i.e., the orientation of the listener's head and the orientation of the recorder's head are incongruent) and the realistic auditory space (i.e., the orientations are congruent). River sounds that were binaurally recorded either in a supine position or in an upright body position served as auditory stimuli. During fMRI experiments, participants listened to the stimuli and pressed one of two buttons indicating the direction of the water flow (horizontal/vertical). Behavioral results indicated that participants could not differentiate between the congruent and the incongruent conditions. However, neuroimaging results showed that the congruent condition activated the planum temporale significantly more than the incongruent condition.
An inexpensive yet realistic model for teaching vasectomy
Directory of Open Access Journals (Sweden)
Taylor M. Coe
2015-04-01
Full Text Available Purpose Teaching the no-scalpel vasectomy is important, since vasectomy is a safe, simple, and cost-effective method of contraception. This minimally invasive vasectomy technique involves delivering the vas through the skin with specialized tools. This technique is associated with fewer complications than the traditional incisional vasectomy [1]. One of the most challenging steps is the delivery of the vas through a small puncture in the scrotal skin, and there is a need for a realistic and inexpensive scrotal model for beginning learners to practice this step. Materials and Methods After careful observation using several scrotal models while teaching residents and senior trainees, we developed a simplified scrotal model that uses only three components: a bicycle inner tube, latex tubing, and a Penrose drain. Results This model is remarkably realistic and allows learners to practice a challenging step in the no-scalpel vasectomy. The low cost and simple construction of the model allow wide dissemination of training in this important technique. Conclusions We propose a simple, inexpensive model that will enable learners to master the hand movements involved in delivering the vas through the skin while mitigating the risks of learning on patients.
Measurement of time delays in gated radiotherapy for realistic respiratory motions
International Nuclear Information System (INIS)
Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L.
2014-01-01
Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients
Measurement of time delays in gated radiotherapy for realistic respiratory motions
Energy Technology Data Exchange (ETDEWEB)
Chugh, Brige P.; Quirk, Sarah; Conroy, Leigh; Smith, Wendy L., E-mail: Wendy.Smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)
2014-09-15
Purpose: Gated radiotherapy is used to reduce internal motion margins, escalate target dose, and limit normal tissue dose; however, its temporal accuracy is limited. Beam-on and beam-off time delays can lead to treatment inefficiencies and/or geographic misses; therefore, AAPM Task Group 142 recommends verifying the temporal accuracy of gating systems. Many groups use sinusoidal phantom motion for this, under the tacit assumption that use of sinusoidal motion for determining time delays produces negligible error. The authors test this assumption by measuring gating time delays for several realistic motion shapes with increasing degrees of irregularity. Methods: Time delays were measured on a linear accelerator with a real-time position management system (Varian TrueBeam with RPM system version 1.7.5) for seven motion shapes: regular sinusoidal; regular realistic-shape; large (40%) and small (10%) variations in amplitude; large (40%) variations in period; small (10%) variations in both amplitude and period; and baseline drift (30%). Film streaks of radiation exposure were generated for each motion shape using a programmable motion phantom. Beam-on and beam-off time delays were determined from the difference between the expected and observed streak length. Results: For the system investigated, all sine, regular realistic-shape, and slightly irregular amplitude variation motions had beam-off and beam-on time delays within the AAPM recommended limit of less than 100 ms. In phase-based gating, even small variations in period resulted in some time delays greater than 100 ms. Considerable time delays over 1 s were observed with highly irregular motion. Conclusions: Sinusoidal motion shapes can be considered a reasonable approximation to the more complex and slightly irregular shapes of realistic motion. When using phase-based gating with predictive filters even small variations in period can result in time delays over 100 ms. Clinical use of these systems for patients
Overdispersion in nuclear statistics
International Nuclear Information System (INIS)
Semkow, Thomas M.
1999-01-01
Modern statistical distribution theory is applied to the development of an overdispersion theory in ionizing-radiation statistics for the first time. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as the probability of decay or detection. The probabilities fluctuate in the course of a measurement, and the physical reasons for this are discussed. If the average values of the probabilities change from measurement to measurement, which originates from the random Lexis binomial sampling scheme, then the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to the family of generalized hypergeometric factorial-moment distributions of Kemp and Kemp, and can serve as likelihood functions for statistical estimation. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing counting data for overdispersion. More complex experiments in nuclear physics (such as solar neutrino experiments) can be handled by this model, as can distinguishing between source and background
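The paper's own working formulae are not reproduced here, but a standard check of the same kind is the index-of-dispersion (variance-to-mean) test for Poisson counts, sketched below:

```python
def dispersion_statistic(counts):
    """Index-of-dispersion test statistic for repeated counts:
    D = (n - 1) * s^2 / mean, which is approximately
    chi-square distributed with n - 1 degrees of freedom under
    the Poisson hypothesis.  D much larger than n - 1 suggests
    overdispersion."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return (n - 1) * var / mean
```

For Poisson data the sample variance tracks the mean, so D stays near n - 1; fluctuating decay or detection probabilities inflate the variance and push D upward.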
Toward realistic pursuit-evasion using a roadmap-based approach
Rodriguez, Samuel; Denny, Jory; Burgos, Juan; Mahadevan, Aditya; Manavi, Kasra; Murray, Luke; Kodochygov, Anton; Zourntos, Takis; Amato, Nancy M.
2011-01-01
be applied to more realistic scenarios than are typically studied in most previous work, including agents moving in 3D environments such as terrains, multi-story buildings, and dynamic environments. We also support more realistic three-dimensional visibility
Mechanisms for generating froissaron
International Nuclear Information System (INIS)
Glushko, N.I.; Kobylinski, N.A.; Martynov, E.S.; Shelest, V.P.
1982-01-01
From a common point of view, we consider the mechanisms for generating the froissaron that arise in the quasieikonal approximation, the U-matrix approach, and the method of continued unitarity. A realistic model for the input pomeron is suggested, and the data on high-energy pp scattering are described. Similarities and differences between the asymptotic and preasymptotic regimes for the three variants of the froissaron are discussed
MQSA National Statistics (Mammography Quality Standards Act and Program) ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Generalized Warburg impedance on realistic self-affine fractals ...
Indian Academy of Sciences (India)
2016-08-26
We analyse the problem of impedance for a diffusion-controlled charge transfer process across an irregular interface. These interfacial irregularities are characterized as two classes of random fractals: (i) statistically isotropic self-affine fractals and (ii) statistically corrugated self-affine fractals.
Generalized Warburg impedance on realistic self-affine fractals
Indian Academy of Sciences (India)
We analyse the problem of impedance for a diffusion-controlled charge transfer process across an irregular interface. These interfacial irregularities are characterized as two classes of random fractals: (i) statistically isotropic self-affine fractals and (ii) statistically corrugated self-affine fractals. The information about the ...
Measurable realistic image-based 3D mapping
Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.
2011-12-01
Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides a virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming and requires robust hardware and powerful software to handle the enormous amount of data; this is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualization techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive with users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e., the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
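The equivalence with Boltzmann-Gibbs statistics in the appropriate limit can be illustrated at the level of the entropy itself; a minimal numerical sketch of the q → 1 limit of the Renyi entropy (an illustration only, not the ensemble calculation of the paper):

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy H_q = ln(sum_i p_i**q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """Boltzmann-Gibbs (Shannon) entropy, the q -> 1 limit of H_q."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
```

Evaluating H_q at q close to 1 for any distribution reproduces the Boltzmann-Gibbs value, mirroring the equivalence stated in the abstract.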
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.
Recent developments in the specification and achievement of realistic neutron calibration fields
International Nuclear Information System (INIS)
Chartier, J.L.; Kluges, H.; Wiegel, B.; Schraube, H.
1997-01-01
In order to calibrate neutron dosemeters used in radiation protection more accurately, the concept of 'realistic neutron calibration fields' is considered an appropriate alternative, making new irradiation facilities necessary that generate well-characterised neutron fields whose energy and angular distributions replicate practical workplace conditions more closely. Several experienced laboratories have collaborated on a European project and proposed various approaches, which are reviewed in this paper. A short description of the facilities currently in operation is given, as well as a few characteristics of the available radiation fields. This description of the state of the art is followed by a discussion of the problems to be solved in using such facilities for calibration purposes according to well-specified calibration procedures. (author)
Radioactive waste management in Brazil: a realistic view
International Nuclear Information System (INIS)
Heilbron Filho, Paulo Fernando Lavalle; Perez Guerrero, Jesus Salvador; Xavier, Ana Maria
2014-01-01
The objective of this article is to present a realistic view of the main issues related to the management of radioactive waste in Brazil as well as a comprehensive picture of the regulatory waste management status in the country and internationally. Technical aspects that must be considered to ensure a safe construction of near surface disposal facilities for radioactive waste of low and medium levels of radiation are addressed. Different types of deposits, the basic regulatory issues involving the licensing of these facilities, the development of a financial compensation model for the Brazilian Municipalities where deposits are to be placed, the importance of the participation of the scientific community and society in the process of radioactive waste site selection and disposal, guidance for the application of the basic requirements of safety and radiation protection, the general safety aspects involved and the current actions for the disposal of radioactive waste in Brazil are highlighted. (author)
Finite Time Blowup in a Realistic Food-Chain Model
Parshad, Rana; Ait Abderrahmane, Hamid; Upadhyay, Ranjit Kumar; Kumari, Nitu
2013-01-01
We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates of its fractal dimension as well as numerical simulations to visualise the spatiotemporal chaos.
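A three-species chain of this general type can be integrated numerically; the sketch below uses a schematic Leslie-Gower-style form with illustrative parameters (both the functional form and the numbers are assumptions, not the exact system analysed in the paper):

```python
import math

def food_chain_rhs(state, p):
    """Right-hand side of a schematic three-species food chain with a
    Leslie-Gower-type generalist top predator.  Form and parameters
    are illustrative assumptions, not the paper's exact system."""
    u, v, r = state  # prey, intermediate predator, top predator
    du = p["a1"] * u - p["b1"] * u * u - p["w0"] * u * v / (u + p["d0"])
    dv = (-p["a2"] * v + p["w1"] * u * v / (u + p["d1"])
          - p["w2"] * v * r / (v + p["d2"]))
    dr = p["c"] * r * r - p["w3"] * r * r / (v + p["d3"])
    return (du, dv, dr)

def rk4_step(f, state, p, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state, p)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)), p)
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)), p)
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)), p)
    return tuple(s + h * (a + 2 * b + 2 * c + d) / 6.0
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Illustrative parameters.  Note dr/dt < 0 whenever c < w3 / (v + d3);
# the balance between these terms is what the blowup analysis concerns.
PARAMS = dict(a1=1.75, b1=0.05, w0=1.0, d0=10.0, a2=1.0, w1=2.0,
              d1=10.0, w2=0.405, d2=10.0, c=0.01, w3=1.0, d3=20.0)
```

With the small growth coefficient c chosen here the top-predator equation is dissipative and trajectories remain bounded; pushing c up relative to w3/(v + d3) is the kind of parameter change that makes finite time blowup possible.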
A Local Realistic Reconciliation of the EPR Paradox
Sanctuary, Bryan
2014-03-01
The exact violation of Bell's inequalities is obtained with a local realistic model for spin. The model treats one particle that comprises a quantum ensemble and simulates the EPR data one coincidence at a time as a product state. Such a spin is represented by the operators σx, iσy, σz in its body frame rather than the usual set σX, σY, σZ in the laboratory frame. This model, assumed valid in the absence of a measuring probe, contains both quantum polarizations and coherences. Each carries half the EPR correlation, but only half can be measured using coincidence techniques. The model further predicts the filter angles that maximize the spin correlation in EPR experiments.
How to estimate realistic energy savings in Energy Performance Certificates
DEFF Research Database (Denmark)
Wittchen, Kim Bjarne; Altmann, Nagmeh; Berecová, Monika
Given the fact that most MS use fixed or other kinds of default values as boundary condition input for energy performance calculations, it is not surprising that the calculated energy performance differs from the measured energy consumption. As a consequence, the calculated energy savings due...... stationary calculation tools using monthly average values. The optimum solution for energy performance certificates and calculating realistic energy savings is to have two calculations. One calculation, using default values to calculate the label itself, and one with actual input parameters for calculating...... energy performance before and after implementing energy saving measures. Actual values though, may be difficult to identify, so there is a need to make adaptations to reality easy. Even if actual values are available, there are still issues that cause calculated energy savings to differ from the obtained...
Research of shot noise based on realistic nano-MOSFETs
Directory of Open Access Journals (Sweden)
Xiaofei Jia
2017-05-01
Full Text Available Experimental measurements and simulation results have shown that the dominant source of current noise changes from thermal noise to shot noise with the scaling of MOSFETs, and that shot noise is suppressed by Fermi and Coulomb interactions. In this paper, a shot noise test system is established, and experimental results confirm that shot noise is suppressed. Expressions for shot noise in realistic nano-MOSFETs are derived considering the Fermi effect, the Coulomb interaction, and the combination of the two, respectively. On this basis, the variation of shot noise with voltage, temperature, and source/drain doping is studied. The results we obtain are consistent with those from experiments, and a theoretical explanation is given. The shot noise test system is also suitable for traditional nanoscale electronic components, and the shot noise model is suitable for nanoscale MOSFETs.
Magnetic exchange at realistic CoO/Ni interfaces
Grytsiuk, Sergii
2012-07-30
We study the CoO/Ni interface by first principles calculations. Because the lattice mismatch is large, a realistic description requires a huge supercell. We investigate two interface configurations: in interface 1 the coupling between the Ni and Co atoms is mediated by O, whereas in interface 2 the Ni and Co atoms are in direct contact. We find that the magnetization (including the orbital moment) in interface 1 has a similar value as in bulk Ni but opposite sign, while in interface 2 it grows by 164%. The obtained magnetic moments can be explained by the local atomic environments. In addition, we find effects of charge transfer between the interface atoms. The Co 3d local density of states of interface 2 exhibits surprisingly small deviations from the corresponding bulk result, although the first coordination sphere is no longer octahedral. © Springer-Verlag 2012.
Breaking with fun, educational and realistic learning games
DEFF Research Database (Denmark)
Duus Henriksen, Thomas
2009-01-01
This paper addresses the game conceptions and values that learning games inherit from regular gaming, as well as how they affect the use and development of learning games. Its key points concern the issues of thinking learning games as fun, educative and realistic, which is how learning games are commonly conceived as means for staging learning processes, and that thinking learning games so has an inhibiting effect in regard to creating learning processes. The paper draws upon a qualitative study of participants' experiences with 'the EIS Simulation', which is a computer-based learning game for teaching change management and change implementation. The EIS is played in groups, who share the game on a computer, and played by making change decisions in order to implement an IT system in an organisation. In this study, alternative participatory incentives, means for creating learning processes...
Realistic control considerations for electromagnetically levitated urban transit vehicles
Energy Technology Data Exchange (ETDEWEB)
Billing, J R
1976-04-01
A discussion is given of realistic control considerations of suspension dynamics and vehicle/guideway interaction for electromagnetically-levitated urban transit vehicles in the context of revenue applications. The emphasis is on safety, reliability, and maintainability rather than performance. An example urban transit system is described, and the following considerations of dynamics and control are examined: stability, magnet force requirements, magnet airgap requirements, vehicle ride, and component failures. It is shown that it is a formidable problem to ensure suspension stability under all conditions; that operation on curves is a critical magnet and control system design case; that operation of the magnets in the non-linear regime is unavoidable and that component failures will be a major problem. However, good vehicle ride is to be expected. It is concluded that magnetic levitation suspension technology requires substantial development effort before it can be considered suitable for revenue operation.
Hydrostatic Equilibria of Rotating Stars with Realistic Equation of State
Yasutake, Nobutoshi; Fujisawa, Kotaro; Okawa, Hirotada; Yamada, Shoichi
Stars generally rotate, but it is a non-trivial issue to obtain hydrostatic equilibria for rapidly rotating stars theoretically, especially in baroclinic cases, in which the pressure depends not only on the density but also on the temperature and composition. Stellar structures with a realistic equation of state are clearly baroclinic, but there are not many studies of such equilibria. In this study, we propose two methods to obtain hydrostatic equilibria that account for rotation and baroclinicity, namely the weak-solution method and the strong-solution method. The former is based on the variational principle, which is also applied to the calculation of the inhomogeneous phases, known as pasta structures, in the crusts of neutron stars. We found that this method might break the balance equation locally, and we therefore introduce the strong-solution method. Note that our method is formulated in the mass coordinate and is hence appropriate for stellar evolution calculations.
Using Concrete and Realistic Data in Evaluating Initial Visualization Designs
DEFF Research Database (Denmark)
Knudsen, Søren; Pedersen, Jeppe Gerner; Herdal, Thor
2016-01-01
We explore means of designing and evaluating initial visualization ideas with concrete and realistic data in cases where data is not readily available. Our approach is useful in exploring new domains and avenues for visualization, and contrasts with other visualization work, which typically operates under the assumption that data has already been collected and is ready to be visualized. We argue that it is sensible to understand data requirements and evaluate the potential value of visualization before devising means of automatic data collection. We base our exploration on three cases selected... the design case and problem, the manner in which we collected data, and the findings obtained from evaluations. Afterwards, we describe four factors of our data collection approach, and discuss potential outcomes from it.
Simulating realistic implementations of spin field effect transistor
Gao, Yunfei; Lundstrom, Mark S.; Nikonov, Dmitri E.
2011-04-01
The spin field effect transistor (spinFET), consisting of two ferromagnetic source/drain contacts and a Si channel, is predicted to have outstanding device and circuit performance. We carry out a rigorous numerical simulation of the spinFET based on the nonequilibrium Green's function formalism self-consistently coupled with a Poisson solver to produce the device I-V characteristics. Good agreement with the recent experiments in terms of spin injection, spin transport, and the magnetoresistance ratio (MR) is obtained. We include factors crucial for realistic devices: tunneling through a dielectric barrier, and spin relaxation at the interface and in the channel. Using these simulations, we suggest ways of optimizing the device. We propose that by choosing the right contact material and inserting tunnel oxide barriers between the source/drain and channel to filter different spins, the MR can be restored to ˜2000%, which would be beneficial to the reconfigurable logic circuit application.
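The magnetoresistance ratio quoted above follows the usual definition; a one-line sketch (standard convention assumed, and the resistance values in the test are hypothetical):

```python
def magnetoresistance_ratio(r_parallel, r_antiparallel):
    """Magnetoresistance ratio in percent:
    MR = (R_AP - R_P) / R_P * 100, where R_P and R_AP are the
    resistances for parallel and antiparallel contact magnetizations."""
    return (r_antiparallel - r_parallel) / r_parallel * 100.0
```

An MR of ~2000%, as in the abstract, corresponds to the antiparallel resistance being 21 times the parallel one.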
Is islet transplantation a realistic approach to curing diabetes?
Jin, Sang-Man; Kim, Kwang-Won
2017-01-01
Since the report of type 1 diabetes reversal in seven consecutive patients by the Edmonton protocol in 2000, pancreatic islet transplantation has been reappraised based on accumulated clinical evidence. Although initially expected to therapeutically target long-term insulin independence, islet transplantation is now indicated for more specific clinical benefits. With the long-awaited report of the first phase 3 clinical trial in 2016, allogeneic islet transplantation is now transitioning from an experimental to a proven therapy for type 1 diabetes with problematic hypoglycemia. Islet autotransplantation has already been therapeutically proven in chronic pancreatitis with severe abdominal pain refractory to conventional treatments, and it holds promise for preventing diabetes after partial pancreatectomy due to benign pancreatic tumors. Based on current evidence, this review focuses on islet transplantation as a realistic approach to treating diabetes.
Realistic limitations of detecting planets around young active stars
Directory of Open Access Journals (Sweden)
Pinfield D.
2013-04-01
Full Text Available Current planet-hunting methods using the radial velocity technique are limited to observing middle-aged main-sequence stars, where the signatures of stellar activity are much weaker than on young stars that have just arrived on the main sequence. In this work we apply our knowledge from the surface imaging of these young stars to place realistic limitations on the possibility of detecting orbiting planets. In general we find that the magnitude of the stellar jitter is directly proportional to the stellar v sin i. For G and K dwarfs, we find that it is possible, for models with high stellar activity and low stellar v sin i, to detect a 1 MJupiter mass planet within 50 epochs of observations, and for the M dwarfs it is possible to detect a habitable-zone Earth-like planet in tens of observational epochs.
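How jitter limits detection can be made concrete with the standard circular-orbit semi-amplitude approximation (a textbook formula with the usual rounded constant, not taken from this work): a planet is roughly detectable when its semi-amplitude K exceeds the stellar jitter.

```python
def rv_semi_amplitude(m_p_jup, period_yr, m_star_sun, sin_i=1.0):
    """Approximate radial-velocity semi-amplitude in m/s for a
    circular orbit:
    K ~ 28.4 * (P / 1 yr)**(-1/3) * (Mp sin i / MJup) * (M* / Msun)**(-2/3).
    The 28.4 m/s prefactor is the commonly used rounded value."""
    return (28.4 * period_yr ** (-1.0 / 3.0) * m_p_jup * sin_i
            * m_star_sun ** (-2.0 / 3.0))
```

A 1 MJupiter planet in a one-year orbit around a solar-mass star gives K of roughly 28 m/s, so activity jitter of comparable size on a rapidly rotating young star can easily swamp the signal.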
Resolving conflict realistically in today's health care environment.
Smith, S B; Tutor, R S; Phillips, M L
2001-11-01
Conflict is a natural part of human interaction, and when properly addressed, results in improved interpersonal relationships and positive organizational culture. Unchecked conflict may escalate to verbal and physical violence. Conflict that is unresolved creates barriers for people, teams, organizational growth, and productivity, leading to cultural disintegration within the establishment. By relying on interdependence and professional collaboration, all parties involved grow and, in turn, benefit the organization and population served. When used in a constructive manner, conflict resolution can help all parties involved see the whole picture, thus allowing freedom for growth and change. Conflict resolution is accomplished best when emotions are controlled before entering into negotiation. Positive confrontation, problem solving, and negotiation are processes used to realistically resolve conflict. Everyone walks away a winner when conflict is resolved in a positive, professional manner (Stone, 1999).
Realistic electrostatic potentials in a neutron star crust
International Nuclear Information System (INIS)
Ebel, Claudio; Mishustin, Igor; Greiner, Walter
2015-01-01
We study the electrostatic properties of inhomogeneous nuclear matter which can be formed in the crusts of neutron stars or in supernova explosions. Such matter is represented by Wigner–Seitz cells of different geometries (spherical, cylindrical, Cartesian), which contain nuclei, free neutrons and electrons under the condition of electrical neutrality. Using the Thomas–Fermi approximation, we have solved the Poisson equation for the electrostatic potential and calculated the corresponding electron density distributions in individual cells. The calculations are done for different shapes and sizes of the cells and different average baryon densities. The electron-to-baryon fraction was fixed at 0.3. Using realistic electron distributions leads to a significant reduction in electrostatic energy and electron chemical potential. (paper)
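The Poisson step for a spherical cell can be sketched in a few lines; the toy solver below (geometry and charge profile are illustrative, not the paper's Thomas-Fermi calculation) integrates Gauss's law outward for the field and then inward for the potential:

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity in F/m

def solve_potential(rho_of_r, r_max, n):
    """Spherically symmetric electrostatic solve on [0, r_max]:
    E(r) = Q_enc(r) / (4*pi*EPS0*r**2) from Gauss's law, then
    phi(r) by integrating E inward with phi(r_max) = 0."""
    h = r_max / n
    rs = [i * h for i in range(n + 1)]
    # cumulative enclosed charge by the trapezoid rule on 4*pi*r^2*rho
    q = [0.0]
    for i in range(1, n + 1):
        f0 = 4.0 * math.pi * rs[i - 1] ** 2 * rho_of_r(rs[i - 1])
        f1 = 4.0 * math.pi * rs[i] ** 2 * rho_of_r(rs[i])
        q.append(q[-1] + 0.5 * h * (f0 + f1))
    e = [0.0] + [q[i] / (4.0 * math.pi * EPS0 * rs[i] ** 2)
                 for i in range(1, n + 1)]
    # integrate the field inward for the potential
    phi = [0.0] * (n + 1)
    for i in range(n - 1, -1, -1):
        phi[i] = phi[i + 1] + 0.5 * h * (e[i] + e[i + 1])
    return rs, phi
```

For a uniformly charged sphere of radius R inside a cell of radius 2R, the numerical phi(0) - phi(2R) reproduces the analytic value rho/(3*EPS0) for R = 1, which serves as a sanity check on the scheme.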
Finite Time Blowup in a Realistic Food-Chain Model
Parshad, Rana
2013-05-19
We investigate a realistic three-species food-chain model with a generalist top predator. The model, based on a modified version of the Leslie-Gower scheme, incorporates mutual interference in all three populations and generalizes several other known models in the ecological literature. We show that the model exhibits finite time blowup in a certain parameter range and for large enough initial data. This result implies that finite time blowup is possible in a large class of such three-species food-chain models. We propose a modification to the model and prove that the modified model has globally existing classical solutions, as well as a global attractor. We reconstruct the attractor using nonlinear time series analysis and show that it possesses rich dynamics, including chaos in a certain parameter regime, whilst avoiding blowup in any parameter regime. We also provide estimates of its fractal dimension as well as numerical simulations to visualise the spatiotemporal chaos.
Electron distribution in polar heterojunctions within a realistic model
Energy Technology Data Exchange (ETDEWEB)
Tien, Nguyen Thanh, E-mail: thanhtienctu@gmail.com [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Thao, Dinh Nhu [Center for Theoretical and Computational Physics, College of Education, Hue University, 34 Le Loi Street, Hue City (Viet Nam); Thao, Pham Thi Bich [College of Natural Science, Can Tho University, 3-2 Road, Can Tho City (Viet Nam); Quang, Doan Nhat [Institute of Physics, Vietnamese Academy of Science and Technology, 10 Dao Tan Street, Hanoi (Viet Nam)
2015-12-15
We present a theoretical study of the electron distribution, i.e., two-dimensional electron gas (2DEG) in polar heterojunctions (HJs) within a realistic model. The 2DEG is confined along the growth direction by a triangular quantum well with a finite potential barrier and a bent band figured by all confinement sources. Therein, interface polarization charges take a double role: they induce a confining potential and, furthermore, they can make some change in other confinements, e.g., in the Hartree potential from ionized impurities and 2DEG. Confinement by positive interface polarization charges is necessary for the ground state of 2DEG existing at a high sheet density. The 2DEG bulk density is found to be increased in the barrier, so that the scattering occurring in this layer (from interface polarization charges and alloy disorder) becomes paramount in a polar modulation-doped HJ.
Magnetic exchange at realistic CoO/Ni interfaces
Grytsyuk, Sergiy; Cossu, Fabrizio; Schwingenschlögl, Udo
2012-01-01
We study the CoO/Ni interface by first principles calculations. Because the lattice mismatch is large, a realistic description requires a huge supercell. We investigate two interface configurations: in interface 1 the coupling between the Ni and Co atoms is mediated by O, whereas in interface 2 the Ni and Co atoms are in direct contact. We find that the magnetization (including the orbital moment) in interface 1 has a similar value as in bulk Ni but opposite sign, while in interface 2 it grows by 164%. The obtained magnetic moments can be explained by the local atomic environments. In addition, we find effects of charge transfer between the interface atoms. The Co 3d local density of states of interface 2 exhibits surprisingly small deviations from the corresponding bulk result, although the first coordination sphere is no longer octahedral. © Springer-Verlag 2012.
Radioactive waste management in Brazil: a realistic view
Energy Technology Data Exchange (ETDEWEB)
Heilbron Filho, Paulo Fernando Lavalle; Perez Guerrero, Jesus Salvador, E-mail: paulo@cnen.gov.br, E-mail: jperez@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Xavier, Ana Maria, E-mail: axavier@cnen.gov.br [Comissao Nacional de Energia Nuclear (ESPOA/CNEN-RS), Porto Alegre, RS (Brazil)
2014-07-01
The objective of this article is to present a realistic view of the main issues related to the management of radioactive waste in Brazil as well as a comprehensive picture of the regulatory waste management status in the country and internationally. Technical aspects that must be considered to ensure a safe construction of near surface disposal facilities for radioactive waste of low and medium levels of radiation are addressed. Different types of deposits, the basic regulatory issues involving the licensing of these facilities, the development of a financial compensation model for the Brazilian Municipalities where deposits are to be placed, the importance of the participation of the scientific community and society in the process of radioactive waste site selection and disposal, guidance for the application of the basic requirements of safety and radiation protection, the general safety aspects involved and the current actions for the disposal of radioactive waste in Brazil are highlighted. (author)
Swiss electricity statistics 2000
International Nuclear Information System (INIS)
2001-01-01
This publication by the Association of Swiss Electricity Enterprises for the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity production, trading and consumption in Switzerland in 2000. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2000, the data being supplied for each hydrological year and the summer and winter seasons respectively. The production of power in Switzerland is examined in detail. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2000 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The final two chapters cover new and future power generation capacities and the economic considerations involved in the supply of electricity
Modeling and Analysis of Realistic Fire Scenarios in Spacecraft
Brooker, J. E.; Dietrich, D. L.; Gokoglu, S. A.; Urban, D. L.; Ruff, G. A.
2015-01-01
An accidental fire inside a spacecraft is an unlikely, but very real emergency situation that can easily have dire consequences. While much has been learned over the past 25+ years of dedicated research on flame behavior in microgravity, a quantitative understanding of the initiation, spread, detection and extinguishment of a realistic fire aboard a spacecraft is lacking. Virtually all combustion experiments in microgravity have been small-scale, by necessity (hardware limitations in ground-based facilities and safety concerns in space-based facilities). Large-scale, realistic fire experiments are unlikely for the foreseeable future (unlike in terrestrial situations). Therefore, NASA will have to rely on scale modeling, extrapolation of small-scale experiments and detailed numerical modeling to provide the data necessary for vehicle and safety system design. This paper presents the results of parallel efforts to better model the initiation, spread, detection and extinguishment of fires aboard spacecraft. The first is a detailed numerical model using the freely available Fire Dynamics Simulator (FDS). FDS is a CFD code that numerically solves a large eddy simulation form of the Navier-Stokes equations. FDS provides a detailed treatment of the smoke and energy transport from a fire. The simulations provide a wealth of information, but are computationally intensive and not suitable for parametric studies where the detailed treatment of the mass and energy transport is unnecessary. The second path extends a model previously documented at ICES meetings that attempted to predict maximum survivable fires aboard spacecraft. This one-dimensional model simplifies the heat and mass transfer as well as toxic species production from a fire. These simplifications result in a code that is faster and more suitable for parametric studies (having already been used to help in the hatch design of the Multi-Purpose Crew Vehicle, MPCV).
Munar, Wolfgang; Wahid, Syed S; Curry, Leslie
2018-01-03
Background. Improving the performance of primary care systems in low- and middle-income countries (LMICs) may be a necessary condition for the achievement of universal health coverage in the age of the Sustainable Development Goals. The Salud Mesoamerica Initiative (SMI) is a large-scale, multi-country program that uses supply-side financial incentives directed at the central level of governments, and continuous external evaluation of public health sector performance, to induce improvements in primary care performance in eight LMICs. This study protocol seeks to explain whether and how these interventions generate program effects in El Salvador and Honduras. Methods. We present the protocol for a study that uses a realist evaluation approach to develop a preliminary program theory hypothesizing the interactions between context, interventions, and the mechanisms that trigger outcomes. The program theory was completed through a scoping review of relevant empirical, peer-reviewed and grey literature; a sense-making workshop with program stakeholders; and content analysis of key SMI documents. The study will use a multiple case-study design with embedded units and contrasting cases. We define as cases the two primary care systems of Honduras and El Salvador, each with different contextual characteristics. Data will be collected through in-depth interviews with program actors and stakeholders, documentary review, and non-participatory observation. Data analysis will use inductive and deductive approaches to identify causal patterns organized as 'context-mechanism-outcome' configurations. The findings will be triangulated with existing secondary qualitative and quantitative data sources, and contrasted against relevant theoretical literature. The study will end with a refined program theory. Findings will be published following the guidelines generated by the Realist and Meta-narrative Evidence Syntheses study (RAMESES II). This study will be performed
A realistic validation study of a new nitrogen multiple-breath washout system.
Directory of Open Access Journals (Sweden)
Florian Singer
Full Text Available BACKGROUND: For reliable assessment of ventilation inhomogeneity, multiple-breath washout (MBW) systems should be realistically validated. We describe a new lung model for in vitro validation under physiological conditions and the assessment of a new nitrogen (N2) MBW system. METHODS: The N2MBW setup indirectly measures the N2 fraction (FN2) from main-stream carbon dioxide (CO2) and side-stream oxygen (O2) signals: FN2 = 1 - FO2 - FCO2 - FArgon. For in vitro N2MBW, a double-chamber plastic lung model was filled with water, heated to 37°C, and ventilated at various lung volumes, respiratory rates, and FCO2. In vivo N2MBW was undertaken in triplets on two occasions in 30 healthy adults. The primary N2MBW outcome was functional residual capacity (FRC). We assessed the in vitro error (√[difference²]) between measured and model FRC (100-4174 mL), and the error between tests of in vivo FRC, lung clearance index (LCI), and normalized phase III slope indices (Sacin and Scond). RESULTS: The model generated 145 FRCs under BTPS conditions and various breathing patterns. The mean (SD) error was 2.3 (1.7)%. For FRCs of 500 to 4174 mL, 121 (98%) were within 5%. For FRCs of 100 to 400 mL, the error was better than 7%. The in vivo FRC error between tests was 10.1 (8.2)%. LCI was the most reproducible ventilation inhomogeneity index. CONCLUSION: The lung model generates lung volumes under the conditions encountered during clinical MBW testing and enables realistic validation of MBW systems. The new N2MBW system reliably measures lung volumes and delivers reproducible LCI values.
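The mass-balance principle behind FRC and LCI estimation from an N2 washout can be sketched with a toy single-compartment model. The compartment model, tidal volume, and 1/40th stopping criterion below are generic illustrative choices, not the validated system's algorithm:

```python
# Toy single-compartment N2 multiple-breath washout (illustrative only).
# Each breath of pure O2 dilutes lung N2 by FRC / (FRC + Vt); FRC is then
# recovered from cumulative expired N2 via mass balance:
#   FRC = (cumulative expired N2 volume) / (F_start - F_end)

def washout(frc_ml=3000.0, vt_ml=500.0, f_start=0.79, stop_frac=1 / 40):
    f = f_start
    expired_n2 = 0.0
    breaths = 0
    while f > stop_frac * f_start:       # washout ends at 1/40th of start
        f *= frc_ml / (frc_ml + vt_ml)   # dilution by one O2 breath
        expired_n2 += vt_ml * f          # N2 leaving with the expiration
        breaths += 1
    frc_est = expired_n2 / (f_start - f)
    lci = breaths * vt_ml / frc_est      # lung clearance index (turnovers)
    return frc_est, lci, breaths

frc_est, lci, n = washout()
print(round(frc_est, 1), round(lci, 2), n)
```

In this idealized model the mass balance is exact, so the estimated FRC matches the true 3000 mL; real systems must additionally contend with dead space, sensor delays, and the indirect FN2 computation described above.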
Shifting mindsets: a realist synthesis of evidence from self-management support training.
Davies, Freya; Wood, Fiona; Bullock, Alison; Wallace, Carolyn; Edwards, Adrian
2018-03-01
Accompanying the growing expectation of patient self-management is the need to ensure health care professionals (HCPs) have the required attitudes and skills to provide effective self-management support (SMS). Results from existing training interventions for HCPs in SMS have been mixed and the evidence base is weaker for certain settings, including supporting people with progressive neurological conditions (PNCs). We set out to understand how training operates, and to identify barriers and facilitators to training designed to support shifts in attitudes amongst HCPs. We undertook a realist literature synthesis focused on: (i) the influence of how HCPs, teams and organisations view and adopt self-management; and (ii) how SMS needs to be tailored for people with PNCs. A traditional database search strategy was used alongside citation tracking, grey literature searching and stakeholder recommendations. We supplemented PNC-specific literature with data from other long-term conditions. Key informant interviews and stakeholder advisory group meetings informed the synthesis process. Realist context-mechanism-outcome configurations were generated and mapped onto the stages described in Mezirow's Transformative Learning Theory. Forty-four original articles were included (19 relating to PNCs), from which seven refined theories were developed. The theories identified important training elements (evidence provision, building skills and confidence, facilitating reflection and generating empathy). The significant influence of workplace factors as possible barriers or facilitators was highlighted. Embracing SMS often required challenging traditional professional role boundaries. The integration of SMS into routine care is not an automatic outcome from training. A transformative learning process is often required to trigger the necessary mindset shift. Training should focus on how individual HCPs define and value SMS and how their work context (patient group and organisational
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Statistical properties of the nuclear shell-model Hamiltonian
International Nuclear Information System (INIS)
Dias, H.; Hussein, M.S.; Oliveira, N.A. de
1986-01-01
The statistical properties of the realistic nuclear shell-model Hamiltonian are investigated in sd-shell nuclei. The probability distribution of the basis-vector amplitude is calculated and compared with the Porter-Thomas distribution. The relevance of the results to the calculation of the giant resonance mixing parameter is pointed out. (Author) [pt
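The Porter-Thomas comparison can be illustrated with a toy numerical experiment (a generic random-vector sketch, not the paper's sd-shell calculation): for a chaotic eigenvector modeled as a random unit vector with Gaussian components, the scaled squared amplitudes y = N*c² should follow the Porter-Thomas law, i.e. a chi-squared distribution with one degree of freedom, with mean 1 and variance 2.

```python
import random

random.seed(42)

N = 2000  # dimension of the toy model space

# Components of a random unit vector in N dimensions: i.i.d. Gaussians,
# then normalized. This mimics a GOE-like chaotic eigenvector.
c = [random.gauss(0.0, 1.0) for _ in range(N)]
norm = sum(x * x for x in c) ** 0.5
c = [x / norm for x in c]

# Porter-Thomas variable: y = N * c_i^2 is chi-squared with one degree
# of freedom (mean 1, variance 2) for large N.
y = [N * x * x for x in c]
mean_y = sum(y) / N
var_y = sum((v - mean_y) ** 2 for v in y) / N

print(round(mean_y, 3), round(var_y, 2))  # mean is 1 by normalization
```

The sample mean equals 1 exactly by construction; the sample variance fluctuates around 2, which is the signature against which shell-model amplitude distributions are tested.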
Ilyas, Muhammad; Salwah
2017-02-01
This research was experimental. The purpose of the study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through a problem-solving approach. This was a quasi-experimental study with a non-equivalent experiment group design. The population was all grade VII students in one junior high school in Palopo in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. Experiment group I was taught using the RME approach, whereas experiment group II was taught using the problem-solving approach. Data were collected by administering a pretest and a posttest to the students. The analysis used descriptive statistics and inferential statistics with a t-test. Based on the descriptive statistics, the average mathematics score of students taught using the problem-solving approach was similar to that of students taught using the RME approach, both being in the high category. It can also be concluded that (1) there was no difference in the mathematics learning outcomes of students taught using the RME approach and those taught using the problem-solving approach, and (2) the quality of learning achievement under the two approaches was the same, being in the high category.
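The inferential step above rests on an independent two-sample t-test; a minimal pooled-variance version can be sketched as follows. The posttest scores are made-up illustrative numbers for two groups of similar achievement, not the study's data:

```python
import math

def two_sample_t(a, b):
    """Pooled-variance independent two-sample t statistic and its df."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Unbiased sample variances.
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical posttest scores (illustrative, not the study's data).
group1 = [82, 85, 78, 90, 88, 76, 84, 81]
group2 = [80, 86, 79, 87, 85, 77, 83]
t, df = two_sample_t(group1, group2)
print(round(t, 3), df)  # small |t| -> fail to reject "no difference"
```

With |t| well below the critical value for 13 degrees of freedom, the conclusion mirrors the study's: no significant difference between the two groups' means.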
Schippers, P.
2009-01-01
The acoustic modelling in TNO’s ALMOST (=Acoustic Loss Model for Operational Studies and Tasks) uses a bubble migration model as realistic input for wake modelling. The modelled bubble cloud represents the actual ship wake. Ship hull, propeller and bow wave are the main generators of bubbles in the
Danish electricity supply. Statistics 2003
International Nuclear Information System (INIS)
2004-01-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2003 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Danish electricity supply. Statistics 2000
International Nuclear Information System (INIS)
2001-07-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2000 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Danish electricity supply. Statistics 2002
International Nuclear Information System (INIS)
2003-01-01
The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2002 for consumption, prices of electric power, power generation and transmission, and trade. (ln)
Parsing statistical machine translation output
Carter, S.; Monz, C.; Vetulani, Z.
2009-01-01
Despite increasing research into the use of syntax during statistical machine translation, the incorporation of syntax into language models has seen limited success. We present a study of the discriminative abilities of generative syntax-based language models, over and above standard n-gram models,
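The n-gram baseline against which such syntax-based language models are measured can be sketched as a maximum-likelihood bigram model. The toy corpus and the absence of smoothing are illustrative simplifications, not the paper's setup:

```python
from collections import Counter

def train_bigram(sentences):
    """MLE bigram model with sentence-boundary markers, no smoothing."""
    unigrams, bigrams = Counter(), Counter()
    for s in sentences:
        tokens = ["<s>"] + s.split() + ["</s>"]
        unigrams.update(tokens[:-1])          # history counts
        bigrams.update(zip(tokens, tokens[1:]))
    # P(w2 | w1) = count(w1 w2) / count(w1)
    return lambda w1, w2: bigrams[(w1, w2)] / unigrams[w1] if unigrams[w1] else 0.0

corpus = ["the cat sat", "the cat ran", "a dog sat"]
p = train_bigram(corpus)
print(p("the", "cat"))  # "the" is always followed by "cat" in this corpus
print(p("cat", "sat"))  # "cat" is followed by "sat" in one of two cases
```

Such models capture only local word co-occurrence; the study's question is whether generative syntax-based models discriminate good translations from bad ones beyond what these local statistics provide.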
Swiss electricity statistics 2001
International Nuclear Information System (INIS)
2002-01-01
This publication by the Association of Swiss Electricity Enterprises for the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity production, trading and consumption in Switzerland in 2001. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2001, the data being supplied for each hydrological year and the summer and winter seasons respectively. The production of power in Switzerland is examined in detail. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2001 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The final two chapters cover new and future power generation capacities and the economic considerations involved in the supply of electricity
Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
Energy Technology Data Exchange (ETDEWEB)
Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hale, Elaine T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgindy, Tarek [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rossol, Michael N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Lopez, Anthony J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Vergara, Claudio [MIT; Domingo, Carlos Mateo [IIT Comillas; Postigo, Fernando [IIT Comillas; de Cuadra, Fernando [IIT Comillas; Gomez, Tomas [IIT Comillas; Duenas, Pablo [MIT; Luke, Max [MIT; Li, Vivian [MIT; Vinoth, Mohan [GE Grid Solutions; Kadankodu, Sree [GE Grid Solutions
2017-08-09
The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.
Statistics in action a Canadian outlook
Lawless, Jerald F
2014-01-01
Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.The first two c
International Nuclear Information System (INIS)
2007-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2006. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2006 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
International Nuclear Information System (INIS)
2005-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2004. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2004 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
International Nuclear Information System (INIS)
2006-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2005. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2005 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
Statistical methods for forecasting
Abraham, Bovas
2009-01-01
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This book, it must be said, lives up to the words on its advertising cover: 'Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.' It does just that!" - Journal of the Royal Statistical Society. "A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...
International Nuclear Information System (INIS)
2004-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2003. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2003 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
International Nuclear Information System (INIS)
2003-01-01
This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2002. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2002 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons
Jagosh, Justin; Macaulay, Ann C; Pluye, Pierre; Salsberg, Jon; Bush, Paula L; Henderson, Jim; Sirett, Erin; Wong, Geoff; Cargo, Margaret; Herbert, Carol P; Seifer, Sarena D; Green, Lawrence W; Greenhalgh, Trisha
2012-06-01
Participatory research (PR) is the co-construction of research through partnerships between researchers and people affected by and/or responsible for action on the issues under study. Evaluating the benefits of PR is challenging for a number of reasons: the research topics, methods, and study designs are heterogeneous; the extent of collaborative involvement may vary over the duration of a project and from one project to the next; and partnership activities may generate a complex array of both short- and long-term outcomes. Our review team consisted of a collaboration among researchers and decision makers in public health, research funding, ethics review, and community-engaged scholarship. We identified, selected, and appraised a large-variety sample of primary studies describing PR partnerships, and in each stage, two team members independently reviewed and coded the literature. We used key realist review concepts (middle-range theory, demi-regularity, and context-mechanism-outcome configurations [CMO]) to analyze and synthesize the data, using the PR partnership as the main unit of analysis. From 7,167 abstracts and 591 full-text papers, we distilled for synthesis a final sample of twenty-three PR partnerships described in 276 publications. The link between process and outcome in these partnerships was best explained using the middle-range theory of partnership synergy, which demonstrates how PR can (1) ensure culturally and logistically appropriate research, (2) enhance recruitment capacity, (3) generate professional capacity and competence in stakeholder groups, (4) result in productive conflicts followed by useful negotiation, (5) increase the quality of outputs and outcomes over time, (6) increase the sustainability of project goals beyond funded time frames and during gaps in external funding, and (7) create system changes and new unanticipated projects and activities. Negative examples illustrated why these outcomes were not a guaranteed product of PR
Fundamental quantitative security in quantum key generation
International Nuclear Information System (INIS)
Yuen, Horace P.
2010-01-01
We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
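One of the quantitative criteria discussed above, the statistical distance between the attacker's distribution on the key K and the uniform distribution, is the total variation distance. A minimal sketch (the attacker distribution below is a hypothetical illustration, not taken from the paper):

```python
# Total variation (statistical) distance between an attacker's
# distribution over n-bit keys and the uniform distribution.
# The attacker distribution here is illustrative, not from the paper.

def total_variation_distance(p, q):
    """0.5 * sum |p_i - q_i| over all outcomes."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

n_bits = 3
n_keys = 2 ** n_bits
uniform = [1.0 / n_keys] * n_keys

# A hypothetical attacker distribution slightly biased toward key 0.
attacker = [0.2] + [0.8 / (n_keys - 1)] * (n_keys - 1)

d = total_variation_distance(attacker, uniform)
# The attacker's probability of any event exceeds the uniform
# probability by at most d.
print(round(d, 4))  # → 0.075
```

The operational point of the abstract is that a small d alone does not directly bound the attacker's guessing probability in the way commonly assumed.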
Convective aggregation in realistic convective-scale simulations
Holloway, Christopher E.
2017-06-01
To investigate the real-world relevance of idealized-model convective self-aggregation, five 15 day cases of real organized convection in the tropics are simulated. These include multiple simulations of each case to test sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. These simulations are compared to self-aggregation seen in the same model configured to run in idealized radiative-convective equilibrium. Analysis of the budget of the spatial variance of column-integrated frozen moist static energy shows that control runs have significant positive contributions to organization from radiation and negative contributions from surface fluxes and transport, similar to idealized runs once they become aggregated. Despite identical lateral boundary conditions for all experiments in each case, systematic differences in mean column water vapor (CWV), CWV distribution shape, and CWV autocorrelation length scale are found between the different sensitivity runs, particularly for those without interactive radiation, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations (although the organization of precipitation shows less sensitivity to interactive radiation). The magnitudes and signs of these systematic differences are consistent with a rough equilibrium between (1) equalization due to advection from the lateral boundaries and (2) disaggregation due to the absence of interactive radiation, implying disaggregation rates comparable to those in idealized runs with aggregated initial conditions and noninteractive radiation. This points to a plausible similarity in the way that radiation feedbacks maintain aggregated convection in both idealized simulations and the real world. Plain Language Summary: Understanding the processes that lead to the organization of tropical rainstorms is an important challenge for weather
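The aggregation diagnostic named above is the budget of the spatial variance of column-integrated frozen moist static energy; the variance itself can be sketched as follows (the field values are synthetic, not data from the simulations):

```python
# Spatial variance of a column-integrated field on a 2D grid, the
# quantity whose budget diagnoses convective aggregation: growth of
# this variance over time indicates aggregating convection.
# The field values below are synthetic, not from the study.

def spatial_variance(field):
    """Variance of a 2D list-of-lists field about its domain mean."""
    values = [v for row in field for v in row]
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# A synthetic column moist static energy anomaly field (arbitrary units).
fmse = [
    [1.0, 1.0, 2.0],
    [1.0, 3.0, 2.0],
    [1.0, 1.0, 2.0],
]
print(spatial_variance(fmse))
```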
Effective realistic interactions for low momentum Hilbert spaces
International Nuclear Information System (INIS)
Weber, Dennis
2012-01-01
Realistic nucleon-nucleon potentials are an essential ingredient of modern microscopic many-body calculations. These potentials can be represented in two different ways: operator representation or matrix element representation. In operator representation the potential is represented by a set of quantum mechanical operators, while in matrix element representation it is defined by its matrix elements in a given basis. Many modern potentials are constructed directly in matrix element representation. While the matrix element representation can be calculated from the operator representation, determining the operator representation from the matrix elements is more difficult. Some methods of solving the nuclear many-body problem, such as Fermionic Molecular Dynamics (FMD) or the Green's Function Monte Carlo (GFMC) method, however, explicitly require the operator representation of the potential, as they do not work in a fixed many-body basis. It is therefore desirable to derive an operator representation also for interactions given by matrix elements. In this work a method is presented that allows the derivation of an approximate operator representation starting from the momentum space partial wave matrix elements of the interaction. For that purpose an ansatz for the operator representation is chosen, and the parameters in the ansatz are determined by a fit to the partial wave matrix elements. Since a perfect reproduction of the matrix elements in general cannot be achieved with a finite number of operators, and the quality of the results depends on the choice of the ansatz, the obtained operator representation is tested in nuclear many-body calculations and the results are compared with those from the initial interaction matrix elements. For the calculation of the nucleon-nucleon scattering phase shifts and the deuteron properties a computer code written within this work is used. For larger nuclei the No Core Shell Model (NCSM) and FMD are applied. The described
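The fitting step described above, determining ansatz parameters from partial wave matrix elements, amounts to a linear least-squares problem when the ansatz is linear in its parameters. A toy sketch (the two operators and the target matrix elements are invented for illustration, not a realistic interaction):

```python
# Least-squares fit of coefficients c1, c2 so that c1*O1 + c2*O2
# approximates target matrix elements M, via the 2x2 normal equations.
# Operators and target values are toy placeholders, not a real potential.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fit_two_operators(O1, O2, M):
    """Minimize ||c1*O1 + c2*O2 - M|| over c1, c2."""
    a11, a12, a22 = dot(O1, O1), dot(O1, O2), dot(O2, O2)
    b1, b2 = dot(O1, M), dot(O2, M)
    det = a11 * a22 - a12 * a12
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

# Flattened toy "partial wave matrix elements" of two operators
# and of the target interaction.
O1 = [1.0, 0.0, 0.0, 1.0]   # identity-like operator
O2 = [0.0, 1.0, 1.0, 0.0]   # exchange-like operator
M  = [2.0, 0.5, 0.5, 2.0]   # target matrix elements

print(fit_two_operators(O1, O2, M))  # → (2.0, 0.5)
```

With more operators the same idea generalizes to solving a larger normal-equation system; the residual of the fit is what the abstract's many-body tests probe.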
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Familiarity with calculus, probability, statistics, and Boolean algebra is recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon; exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
Approved by the Cancer.Net Editorial Board. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Lean and leadership practices: development of an initial realist program theory.
Goodridge, Donna; Westhorp, Gill; Rotter, Thomas; Dobson, Roy; Bath, Brenna
2015-09-07
uses data effectively to identify actual and relevant local problems and the root causes of those problems; and g) creates or supports a 'learning organization' culture. This study has generated initial hypotheses and realist program theory that can form the basis for future evaluation of Lean initiatives. Developing leadership capacity and culture is theorized to be a necessary precursor to other systemic and observable changes arising from Lean initiatives.
A Data-Driven Approach to Realistic Shape Morphing
Gao, Lin; Lai, Yu-Kun; Huang, Qi-Xing; Hu, Shi-Min
2013-01-01
Morphing between 3D objects is a fundamental technique in computer graphics. Traditional methods of shape morphing focus on establishing meaningful correspondences and finding smooth interpolation between shapes. Such methods, however, only take geometric information as input and thus cannot in general avoid producing unnatural interpolation, in particular for large-scale deformations. This paper proposes a novel data-driven approach for shape morphing. Given a database with various models belonging to the same category, we treat them as data samples in the plausible deformation space. These models are then clustered to form local shape spaces of plausible deformations. We use a simple metric to reasonably represent the closeness between pairs of models. Given source and target models, the morphing problem is cast as a global optimization problem of finding a minimal distance path within the local shape spaces connecting these models. Under the guidance of intermediate models in the path, an extended as-rigid-as-possible interpolation is used to produce the final morphing. By exploiting the knowledge of plausible models, our approach produces realistic morphing for challenging cases as demonstrated by various examples in the paper. © 2013 The Eurographics Association and Blackwell Publishing Ltd.
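The minimal-distance path search described above can be sketched as a shortest-path problem over a graph of database models. In the sketch below, the graph, the edge weights standing in for the paper's closeness metric, and the model names are all hypothetical placeholders:

```python
# Sketch of the data-driven morphing idea: find a minimal-distance path
# of intermediate models between a source and a target model.
# Graph structure, weights, and names are invented placeholders.
import heapq

def shortest_model_path(dist, source, target):
    """Dijkstra over a dict-of-dicts distance graph; returns the path."""
    best = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > best.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in dist.get(node, {}).items():
            nd = d + w
            if nd < best.get(nbr, float("inf")):
                best[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Reconstruct the chain of intermediate models.
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Toy "shape space" graph: edge weights stand in for the closeness metric.
graph = {
    "source": {"A": 1.0, "B": 4.0},
    "A": {"B": 1.0, "target": 5.0},
    "B": {"target": 1.0},
}
print(shortest_model_path(graph, "source", "target"))
```

The intermediate models on the returned path are what would guide the as-rigid-as-possible interpolation in the final morph.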
Convective aggregation in idealised models and realistic equatorial cases
Holloway, Chris
2015-04-01
Idealised explicit convection simulations of the Met Office Unified Model are shown to exhibit spontaneous self-aggregation in radiative-convective equilibrium, as seen previously in other models in several recent studies. This self-aggregation is linked to feedbacks between radiation, surface fluxes, and convection, and the organization is intimately related to the evolution of the column water vapour (CWV) field. To investigate the relevance of this behaviour to the real world, these idealized simulations are compared with five 15-day cases of real organized convection in the tropics, including multiple simulations of each case testing sensitivities of the convective organization and mean states to interactive radiation, interactive surface fluxes, and evaporation of rain. Despite similar large-scale forcing via lateral boundary conditions, systematic differences in mean CWV, CWV distribution shape, and the length scale of CWV features are found between the different sensitivity runs, showing that there are at least some similarities in sensitivities to these feedbacks in both idealized and realistic simulations.
From Delivery to Adoption of Physical Activity Guidelines: Realist Synthesis
Directory of Open Access Journals (Sweden)
Liliana Leone
2017-10-01
Full Text Available Background: Evidence-based guidelines published by health authorities for the promotion of health-enhancing physical activity (PA) continue to be implemented unsuccessfully and demonstrate a gap between evidence and policies. This review synthesizes evidence on factors influencing delivery, adoption and implementation of PA promotion guidelines within different policy sectors (e.g., health, transport, urban planning, sport, education. Methods: Published literature was initially searched using PubMed, EBSCO, Google Scholar and continued through an iterative snowball technique. The literature review spanned the period 2002–2017. The realist synthesis approach was adopted to review the content of 39 included studies. An initial programme theory with a four-step chain from evidence emersion to implementation of guidelines was tested. Results: The synthesis furthers our understanding of the link between PA guidelines delivery and the actions of professionals responsible for implementation within health services, school departments and municipalities. The main mechanisms identified for guidance implementation were scientific legitimation, enforcement, feasibility, familiarity with concepts and PA habits. Threats emerged to the successful implementation of PA guidelines at national/local jurisdictional levels. Conclusions: The way PA guidelines are developed may influence their adoption by policy-makers and professionals. Useful lessons emerged that may inform synergies between policymaking and professional practices, promoting win-win multisectoral strategies.
Factors influencing intercultural doctor-patient communication: a realist review.
Paternotte, Emma; van Dulmen, Sandra; van der Lee, Nadine; Scherpbier, Albert J J A; Scheele, Fedde
2015-04-01
Due to migration, doctors see patients from different ethnic backgrounds. This causes challenges for the communication. To develop training programs for doctors in intercultural communication (ICC), it is important to know which barriers and facilitators determine the quality of ICC. This study aimed to provide an overview of the literature and to explore how ICC works. A systematic search was performed to find literature published before October 2012. The search terms used were cultural, communication, healthcare worker. A realist synthesis allowed us to use an explanatory focus to understand the interplay of communication. In total, 145 articles met the inclusion criteria. We found ICC challenges due to language, cultural and social differences, and doctors' assumptions. The mechanisms were described as factors influencing the process of ICC and divided into objectives, core skills and specific skills. The results were synthesized in a framework for the development of training. The quality of ICC is influenced by the context and by the mechanisms. These mechanisms translate into practical points for training, which seem to have similarities with patient-centered communication. Training for improving ICC can be developed as an extension of the existing training for patient-centered communication. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Use of clinical guidelines in remote Australia: A realist evaluation.
Reddy, Sandeep; Orpin, Victoria; Herring, Sally; Mackie-Schneider, Stephanie; Struber, Janet
2018-02-01
The aim of this evaluation was to assess the acceptability, accessibility, and compliance with the 2014 editions of the Remote Primary Health Care Manuals (RPHCM) in health care centres across remote areas of Northern and Central Australia. To undertake a comprehensive evaluation that considered context, the evaluation used a realist evaluation framework. The evaluation used a variety of methods including interviews and survey to develop and test a programme theory. Many remote health practitioners have adopted standardized, evidence-based practice because of the use of the RPHCM. The mechanisms that led to the use of the manuals include acceptance of the worth of the protocols to their clinical practice, reliance on manual content to guide their practice, the perception of credibility, the applicability of RPHCM content to the context, and a fear of the consequences of not using the RPHCMs. Some remote health practitioners are less inclined to use the RPHCM regularly because of a perception that the content is less suited to their needs and daily practice or it is hard to navigate or understand. The evaluation concluded that there is work to be done to widen the RPHCM user base, and organizations need to increase support for their staff to use the RPHCM protocols better. These measures are expected to enable standardized clinical practice in the remote context. © 2017 John Wiley & Sons, Ltd.
Atomistic simulations of graphite etching at realistic time scales.
Aussems, D U B; Bal, K M; Morgan, T W; van de Sanden, M C M; Neyts, E C
2017-10-01
Hydrogen-graphite interactions are relevant to a wide variety of applications, ranging from astrophysics to fusion devices and nano-electronics. In order to shed light on these interactions, atomistic simulation using Molecular Dynamics (MD) has been shown to be an invaluable tool. It suffers, however, from severe time-scale limitations. In this work we apply the recently developed Collective Variable-Driven Hyperdynamics (CVHD) method to hydrogen etching of graphite for varying inter-impact times up to a realistic value of 1 ms, which corresponds to a flux of ~10^20 m^-2 s^-1. The results show that the erosion yield, hydrogen surface coverage and species distribution are significantly affected by the time between impacts. This can be explained by the higher probability of C-C bond breaking due to the prolonged exposure to thermal stress and the subsequent transition from ion- to thermal-induced etching. This latter regime of thermal-induced etching - chemical erosion - is here accessed for the first time using atomistic simulations. In conclusion, this study demonstrates that accounting for long time-scales significantly affects ion bombardment simulations and should not be neglected in a wide range of conditions, in contrast to what is typically assumed.
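The quoted correspondence between a 1 ms inter-impact time and a flux of ~10^20 m^-2 s^-1 follows from flux = 1 / (cell area × inter-impact time). A quick check (the implied cell area is inferred here for illustration and is not stated in the abstract):

```python
# Relation between the per-impact interval and the ion flux for a
# simulation cell: flux = 1 / (cell_area * inter_impact_time).
# The cell area is inferred for illustration, not given in the paper.

inter_impact_time = 1e-3   # s (the realistic 1 ms value quoted)
flux = 1e20                # m^-2 s^-1 (the quoted flux)

cell_area = 1.0 / (flux * inter_impact_time)   # implied impact area, m^2
side_nm = cell_area ** 0.5 * 1e9               # cell side length, nm

print(cell_area, round(side_nm, 2))  # → 1e-17 3.16
```

So the quoted numbers are mutually consistent with one impact per millisecond on a cell a few nanometres on a side, a plausible MD simulation scale.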