WorldWideScience

Sample records for source montage analysis

  1. Analysis of infant cortical synchrony is constrained by the number of recording electrodes and the recording montage.

    Science.gov (United States)

    Tokariev, Anton; Vanhatalo, Sampsa; Palva, J Matias

    2016-01-01

To assess how the recording montage in the neonatal EEG influences the detection of cortical source signals and their phase interactions. Scalp EEG was simulated by forward modeling 20-200 simultaneously active sources covering the cortical surface of a realistic neonatal head model. We assessed systematically how the number of scalp electrodes (11-85), analysis montage, or the size of cortical sources affect the detection of cortical phase synchrony. Statistical metrics were developed for quantifying the resolution and reliability of the montages. The findings converge to show that an increase in the number of recording electrodes leads to a systematic improvement in the detection of true cortical phase synchrony. While there is always a ceiling effect with respect to discernible cortical details, we show that the average and Laplacian montages exhibit superior specificity and sensitivity as compared to other conventional montages. Reliability in assessing true neonatal cortical synchrony is directly related to the choice of EEG recording and analysis configurations. Because of the high conductivity of the neonatal skull, conventional neonatal EEG recordings are spatially far too sparse for pertinent studies, and this loss of information cannot be recovered by re-montaging during analysis. Future neonatal EEG studies will need prospective planning of recording configurations to allow analysis of the spatial details required by each study question. Our findings also advise on the level of detail in brain synchrony that can be studied with existing datasets or by using conventional EEG recordings. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
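The montage comparisons above are linear re-references of the same recording; a minimal pure-Python sketch of the average and (small-)Laplacian transforms, using a toy electrode set and neighbour graph rather than the study's 11-85-electrode layouts:

```python
# Sketch: re-montaging scalp EEG as a linear transform (toy example).
# Electrode names and neighbour sets are illustrative, not the study's montages.

def average_montage(samples):
    """Re-reference each channel to the mean of all channels."""
    mean = sum(samples.values()) / len(samples)  # samples: channel -> potential
    return {ch: v - mean for ch, v in samples.items()}

def laplacian_montage(samples, neighbours):
    """Subtract the mean of each channel's neighbours (small-Laplacian)."""
    out = {}
    for ch, v in samples.items():
        nb = neighbours[ch]
        out[ch] = v - sum(samples[m] for m in nb) / len(nb)
    return out

# Toy potentials at four electrodes (arbitrary units, common reference).
x = {"C3": 4.0, "C4": 2.0, "Cz": 3.0, "Pz": 1.0}
nbrs = {"C3": ["Cz"], "C4": ["Cz"], "Cz": ["C3", "C4", "Pz"], "Pz": ["Cz"]}

avg = average_montage(x)
lap = laplacian_montage(x, nbrs)
```

Both montages are fixed matrix multiplications of the recorded channels, which is why no re-montaging step can restore spatial detail that a sparse electrode array never sampled.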

  2. Design as Montage

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Kjærsgaard, Mette

    The paper explores the role of video and (visual) anthropology in design and product development processes from creating insights about users to developing design ideas in interdisciplinary teams. In the paper we suggest montage as a metaphor for understanding how meaning and ideas are created...... in design processes. Rather than viewing montage as a particular style of filmmaking, we see the design process itself as a montage, a process where different heterogeneous materials – sketches, video, prototypes, etc. – as well as different professional and disciplinary perspectives are brought together...

  3. Model and Montage

    DEFF Research Database (Denmark)

    Meldgaard, Morten

    2012-01-01

The contribution seeks to link various model practices with the thinking of montage theory. The cases used were made partly on a laser cutter and partly executed at scale 1:1, both by students at the Royal Danish Academy of Fine Arts School of Architecture. This empirical material meets a more theoretically grounded reflection in the article ...... which discusses what a model is, and what the relationship is between an analogue and a digital practice....

  4. Montage and Image as Paradigm

    Directory of Open Access Journals (Sweden)

    Cesar Huapaya

    2016-01-01

Full Text Available Thought as montage and image has become a revealing method in the practical and theoretical study processes of artists and researchers of the 20th and 21st centuries. This article aims to articulate three ways of thinking through montage in the works of Bertolt Brecht, Sergei Eisenstein and Georges Didi-Huberman. The philosopher and art historian Georges Didi-Huberman re-inaugurates the debate and exercise of thinking the anthropology of image and montage as a metalanguage and a form of knowledge.

  5. 32-Channel banana-avg montage is better than 16-channel double banana montage to detect epileptiform discharges in routine EEGs.

    Science.gov (United States)

    Ochoa, Juan; Gonzalez, Walter; Bautista, Ramon; DeCerce, John

    2008-10-01

We designed a study comparing the yield of the standard 16-channel longitudinal bipolar montage (double banana) versus a combined 32-channel longitudinal bipolar plus average referential montage (banana-plus) to detect epileptic abnormalities. We selected 25 consecutive routine EEG samples with a diagnosis of spike or sharp waves in the temporal regions, 25 consecutive with focal slowing, and 50 normal EEGs. A total of 100 samples were printed in both montages and randomized for reading. Thirty independent EEG readers blinded to the EEG diagnosis were invited to participate. Twenty-two readers successfully completed the test, for a total of 4400 answers collected for analysis. The average sensitivity to detect epileptiform discharges for the 16- and 32-channel montages was 36.5% and 61%, respectively (P < …) […] double banana montage. Residents and EEG fellows could improve EEG-reading accuracy if taught on a combined 32-channel montage.
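The detection rates reported here reduce to standard sensitivity/specificity arithmetic over the pooled reader answers; a small sketch with hypothetical tallies (the study's raw counts are not given in this abstract):

```python
# Pooled sensitivity/specificity from reader answers (illustrative tallies only).

def sensitivity(true_pos, false_neg):
    """Fraction of abnormal samples correctly flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of normal samples correctly passed."""
    return true_neg / (true_neg + false_pos)

# Hypothetical tallies for one montage, pooled across all readers.
tp, fn, tn, fp = 244, 156, 380, 20
sens = sensitivity(tp, fn)  # 244/400 = 0.61, i.e. 61%
spec = specificity(tn, fp)  # 380/400 = 0.95
```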

  6. TOASTing Your Images With Montage

    Science.gov (United States)

    Berriman, G. Bruce; Good, John

    2017-01-01

The Montage image mosaic engine is a scalable toolkit for creating science-grade mosaics of FITS files, according to the user's specifications of coordinates, projection, sampling, and image rotation. It is written in ANSI C and runs on all common *nix-based platforms. The code is freely available and is released with a BSD 3-clause license. Version 5 is a major upgrade to Montage, and provides support for creating images that can be consumed by the World Wide Telescope (WWT). Montage treats the TOAST sky tessellation scheme, used by the WWT, as a spherical projection like those in the WCStools library. Thus images in any projection can be converted to the TOAST projection by Montage's reprojection services. These reprojections can be performed at scale on high-performance platforms and on desktops. WWT consumes PNG or JPEG files, organized according to WWT's tiling and naming scheme. Montage therefore provides a set of dedicated modules to create the required files from FITS images that contain the TOAST projection. There are two other major features of Version 5. It supports processing of HEALPix files to any projection in the WCStools library. And it can be built as a library that can be called from other languages, primarily Python. Web site: http://montage.ipac.caltech.edu. GitHub download page: https://github.com/Caltech-IPAC/Montage. ASCL record: ascl:1010.036. DOI: dx.doi.org/10.5281/zenodo.49418. Montage is funded by the National Science Foundation under Grant Number ACI-1440620.
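The tile sets WWT consumes form a quadtree pyramid (level N is a 2^N × 2^N grid of tiles); a sketch of that bookkeeping, with a hypothetical file-naming layout that is not WWT's or Montage's actual convention:

```python
# Sketch: quadtree bookkeeping for a TOAST-style tile pyramid.
# The tile_name layout is hypothetical; consult the WWT docs for the real scheme.

def tiles_at_level(level):
    """All (level, x, y) tile coordinates at one pyramid level."""
    n = 2 ** level
    return [(level, x, y) for y in range(n) for x in range(n)]

def children(level, x, y):
    """The four tiles at level+1 covering tile (level, x, y)."""
    return [(level + 1, 2 * x + dx, 2 * y + dy) for dy in (0, 1) for dx in (0, 1)]

def tile_name(level, x, y):
    return f"L{level}/X{x}_Y{y}.png"  # hypothetical naming layout
```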

  7. Montage Version 3.0

    Science.gov (United States)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  8. Illness, everyday life and narrative montage

    DEFF Research Database (Denmark)

    Henriksen, Nina; Tjørnhøj-Thomsen, Tine; Hansen, Helle Ploug

    2011-01-01

    -created by the reader. It points to the effect of the aesthetics of disguise and carnival implicit in the visual-verbal montage and argues that these generate a third meaning. This meaning is associated with the breast cancer experience but is not directly discernible in the montage. The article concludes by discussing...

  9. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

The Montage toolkit has been used since 2003 to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image-stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: 1. Creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created. 2. Integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to JavaScript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed. 3. Creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453.
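The adaptive stretch mentioned above maps FITS flux values into PNG's 8-bit range while resisting outliers; a simplified percentile-based stand-in (Montage's actual algorithm is histogram based, so this is only a sketch of the idea):

```python
# Sketch: percentile-clipped linear stretch of image data to 8-bit for PNG export.
# A simplified stand-in for Montage's adaptive stretch, not its real algorithm.

def percentile(values, q):
    s = sorted(values)
    idx = min(int(q / 100.0 * (len(s) - 1)), len(s) - 1)
    return s[idx]

def stretch_to_8bit(pixels, lo_pct=0.5, hi_pct=99.5):
    """Clip to [lo_pct, hi_pct] percentiles, then scale linearly to 0..255."""
    lo = percentile(pixels, lo_pct)
    hi = percentile(pixels, hi_pct)
    span = (hi - lo) or 1.0
    out = []
    for v in pixels:
        t = (v - lo) / span
        out.append(int(round(255 * min(max(t, 0.0), 1.0))))
    return out

# A faint gradient plus one hot pixel: the outlier saturates instead of
# crushing the rest of the dynamic range.
pix = [0.01 * i for i in range(1000)] + [1e6]
img8 = stretch_to_8bit(pix)
```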

  10. SEP Montage Variability Comparison during Intraoperative Neurophysiologic Monitoring.

    Science.gov (United States)

    Hanson, Christine; Lolis, Athena Maria; Beric, Aleksandar

    2016-01-01

Intraoperative monitoring is performed to provide real-time assessment of the neural structures that can be at risk during spinal surgery. Somatosensory evoked potentials (SEPs) are the most commonly used modality for intraoperative monitoring. SEP stability can be affected by many factors during the surgery. This study is a prospective review of SEP recordings obtained during intraoperative monitoring of instrumented spinal surgeries that were performed for chronic underlying neurologic and neuromuscular conditions, such as scoliosis, myelopathy, and spinal stenosis. We analyzed multiple montages at the baseline, and then followed their development throughout the procedure. Our intention was to examine the stability of the SEP recordings throughout the surgical procedure on multiple montages of cortical SEP recordings, with the goal of identifying the combination of the fewest montages that gives the highest yield of monitorable surgeries. Our study shows that it is necessary to have multiple montages for SEP recordings, as this reduces the number of non-monitorable cases, improves IOM reliability, and therefore could reduce false-positive warnings to the surgeons. Of all the typical montages available for use, our study has shown that the recording montage Cz-C4/Cz-C3 (Cz-Cc) is the most reliable and stable throughout the procedure and should be the preferred montage followed throughout the surgery.
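SEP warnings in intraoperative monitoring are conventionally raised on a >50% amplitude drop or a >10% latency increase relative to baseline; a sketch applying those conventional thresholds (the abstract does not state the study's own alarm criteria):

```python
# Sketch: the conventional "50% amplitude / 10% latency" SEP warning criteria,
# applied per montage. Thresholds are the common convention, not study values.

def sep_alert(baseline_amp_uv, current_amp_uv, baseline_lat_ms, current_lat_ms):
    """True if amplitude fell by more than 50% or latency grew by more than 10%."""
    amp_drop = (baseline_amp_uv - current_amp_uv) / baseline_amp_uv
    lat_shift = (current_lat_ms - baseline_lat_ms) / baseline_lat_ms
    return amp_drop > 0.5 or lat_shift > 0.1

# e.g. on a Cz-Cc montage: a 60% amplitude drop should trigger a warning,
# a 25% drop with stable latency should not.
```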

  11. Mystery Montage: A Holistic, Visual, and Kinesthetic Process for Expanding Horizons and Revealing the Core of a Teaching Philosophy

    Science.gov (United States)

    Ennis, Kim; Priebe, Carly; Sharipova, Mayya; West, Kim

    2012-01-01

    Revealing the core of a teaching philosophy is the key to a concise and meaningful philosophy statement, but it can be an elusive goal. This paper offers a visual, kinesthetic, and holistic process for expanding the horizons of self-reflection, self-analysis, and self-knowledge. Mystery montage, a variation of visual mapping, storyboarding, and…

  12. Montage: Improvising in the Land of Action Research

    Science.gov (United States)

    Windle, Sheila; Sefton, Terry

    2011-01-01

    This paper and its appended multi-media production describe the rationale and process of creating and presenting a "digitally saturated" (Lankshear & Knobel, 2003), multi-layered, synchronous "montage" (Denzin & Lincoln, 2003) of educational Action Research findings. The authors contend that this type of presentation, arising from the fusion of…

  13. Transcranial direct current stimulation in obsessive-compulsive disorder: emerging clinical evidence and considerations for optimal montage of electrodes.

    Science.gov (United States)

    Senço, Natasha M; Huang, Yu; D'Urso, Giordano; Parra, Lucas C; Bikson, Marom; Mantovani, Antonio; Shavitt, Roseli G; Hoexter, Marcelo Q; Miguel, Eurípedes C; Brunoni, André R

    2015-07-01

Neuromodulation techniques for obsessive-compulsive disorder (OCD) treatment have expanded with greater understanding of the brain circuits involved. Transcranial direct current stimulation (tDCS) might be a potential new treatment for OCD, although the optimal montage is unclear. To perform a systematic review on meta-analyses of repetitive transcranial magnetic stimulation (rTMS) and deep brain stimulation (DBS) trials for OCD, aiming to identify brain stimulation targets for future tDCS trials and to support the empirical evidence with computer head modeling analysis. Systematic reviews of rTMS and DBS trials on OCD in Pubmed/MEDLINE were searched. For the tDCS computational analysis, we employed head models with the goal of optimally targeting current delivery to structures of interest. Only three references matched our eligibility criteria. We simulated four different electrode montages and analyzed current direction and intensity. Although DBS, rTMS and tDCS are not directly comparable and our theoretical model, based on DBS and rTMS targets, needs empirical validation, we found that the tDCS montage with the cathode over the pre-supplementary motor area and an extra-cephalic anode seems to activate most of the areas related to OCD.

  14. The Next Generation of the Montage Image Mosaic Engine

    Science.gov (United States)

    Berriman, G. Bruce; Good, John; Rusholme, Ben; Robitaille, Thomas

    2016-01-01

We have released a major upgrade of the Montage image mosaic engine (http://montage.ipac.caltech.edu), as part of a program to develop the next generation of the engine in response to the rapid changes in the data processing landscape in Astronomy, which is generating ever larger data sets in ever more complex formats. The new release (version 4) contains modules dedicated to creating and managing mosaics of data stored as multi-dimensional arrays ("data cubes"). The new release inherits the architectural benefits of portability and scalability of the original design. The code is publicly available on GitHub and the Montage web page. The release includes a command line tool that supports visualization of large images, and the beta release of a Python interface to the visualization tool. We will provide examples of how to use these features. We are generating a mosaic of the Galactic Arecibo L-band Feed Array HI (GALFA-HI) Survey maps of neutral hydrogen in and around our Milky Way Galaxy, to assess the performance at scale and to develop tools and methodologies that will enable scientists inexpert in cloud processing to exploit cloud platforms for data processing and product generation at scale. Future releases will include support for an R-tree based mechanism for fast discovery of and access to large data sets, and on-demand access to calibrated SDSS DR9 data that exploits it; support for the Hierarchical Equal Area isoLatitude Pixelization (HEALPix) scheme, now standard for projects investigating cosmic background radiation (Gorski et al 2005); support for the Tessellated Octahedral Adaptive Subdivision Transform (TOAST), the sky partitioning scheme used by the WorldWide Telescope (WWT); and a public applications programming interface (API) in C that can be called from other languages, especially Python.

  15. Pharmaceutical structure montages as catalysts for design and discovery.

    Science.gov (United States)

    Njarðarson, Jon T

    2012-05-01

The majority of pharmaceuticals are small-molecule organic compounds. Their structures are most effectively described and communicated using the graphical language of organic chemistry. A few years ago we decided to harness this powerful language to create new educational tools that could serve well for data mining and as catalysts for discovery. The results were the Top 200 drug posters, which we have posted online for everyone to enjoy and update yearly. This article details the origin and motivation for our design and highlights the value of this graphical format by presenting and analyzing a new pharmaceutical structure montage (poster) focused on US FDA approved drugs in 2011.

  16. Model-based analysis and optimization of the mapping of cortical sources in the spontaneous scalp EEG

    NARCIS (Netherlands)

    Sazonov, A.; Bergmans, J.W.M.; Cluitmans, P.J.M.; Griep, P.A.M.; Arends, J.B.A.M.; Boon, P.A.J.M.

    2007-01-01

    The mapping of brain sources into the scalp electroencephalogram (EEG) depends on volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the

  17. Spatial Montage and Multimedia Ethnography: Using Computers to Visualise Aspects of Migration and Social Division Among a Displaced Community

    Directory of Open Access Journals (Sweden)

    Judith Aston

    2010-05-01

    Full Text Available This paper discusses how computer-based techniques of spatial montage can be used to visualise aspects of migration and social division among a displaced community. It is based on an ongoing collaboration between the author and the anthropologist, Wendy JAMES. The work is based on a substantial archive of ethnographic photographs, audio, cine and video recordings collected by JAMES in the Sudan/Ethiopian borderlands over four decades. Initially recording the way of life of several minority peoples, she was later able to follow their fortunes during the repeated war displacements and separations they suffered from the 1980s onwards. The recordings document work rhythms, dance, song and storytelling, music and other sensory rich performances alongside spoken memories of past events. The research is developing spatial montage techniques to draw comparisons across time, between multiple points of view, and between recordings of events and spoken memories of these events. It is argued that these techniques can be used to facilitate direct engagement with ethnographic recordings, creating multimedia experiences which can flexibly integrate fieldwork data into academic discourse. In so doing it is proposed that these techniques offer new tools to enhance the analysis and understanding of issues relating to migration and social division. URN: urn:nbn:de:0114-fqs1002361

  18. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendation for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  19. Open ends: an ethnographic radio montage about post-diagnosis lives in Denmark and South Africa

    DEFF Research Database (Denmark)

    Houmøller, Kathrin; Steno, Anne Mia

This presentation takes the form of a radio montage and presents stories of post-diagnosis lives in Denmark and urban South Africa. Based on ethnographic fieldworks with young people in psychiatric treatment (Denmark) and among hiv-positive people in anti-retroviral therapy (South Africa), the mo...

  20. How do reference montage and electrodes setup affect the measured scalp EEG potentials?

    Science.gov (United States)

    Hu, Shiang; Lai, Yongxiu; Valdes-Sosa, Pedro A.; Bringas-Vega, Maria L.; Yao, Dezhong

    2018-04-01

Objective. Human scalp electroencephalogram (EEG) is widely applied in cognitive neuroscience and clinical studies due to its non-invasiveness and ultra-high time resolution. However, the representativeness of the measured EEG potentials for the underlying neural activities is still a problem under debate. This study aims to investigate systematically how both the reference montage and the electrode setup affect the accuracy of EEG potentials. Approach. First, the standard EEG potentials are generated by forward calculation with a single dipole in the neural source space, for eleven channel numbers (10, 16, 21, 32, 64, 85, 96, 128, 129, 257, 335). Here, the reference is the ideal infinity implicitly determined by forward theory. Then, the standard EEG potentials are transformed to recordings with different references, including five mono-polar references (left earlobe, Fz, Pz, Oz, Cz) and three re-references (linked mastoids (LM), average reference (AR) and reference electrode standardization technique (REST)). Finally, the relative errors between the standard EEG potentials and the transformed ones are evaluated in terms of channel number, scalp region, electrode layout, dipole source position and orientation, as well as sensor noise and head model. Main results. Mono-polar reference recordings usually suffer large distortions; thus, a re-reference after online mono-polar recording should generally be adopted to mitigate this effect. Among the three re-references, REST is generally superior to AR for all factors compared, and LM performs worst. REST is insensitive to head model perturbation. AR is subject to electrode coverage and dipole orientation but has no close relation with channel number. Significance. These results indicate that REST would be the first choice of re-reference and AR may be an alternative option for high sensor-noise cases. Our findings may provide helpful suggestions on how to obtain the EEG potentials as accurately as possible for...
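The mono-polar to re-reference transformations compared here are simple linear operations on the recorded channels; a toy sketch of the linked-mastoids (LM) and average-reference (AR) transforms at a single time point (channel values are illustrative, and REST is omitted since it needs a head model):

```python
# Sketch: re-referencing mono-polar EEG recordings. Values are illustrative
# single-time-point potentials recorded against one earlobe reference.

def rereference_lm(data, m1="M1", m2="M2"):
    """Linked mastoids: subtract the mean of the two mastoid channels."""
    ref = (data[m1] + data[m2]) / 2.0
    return {ch: v - ref for ch, v in data.items() if ch not in (m1, m2)}

def rereference_ar(data):
    """Average reference: subtract the mean of all retained channels."""
    ref = sum(data.values()) / len(data)
    return {ch: v - ref for ch, v in data.items()}

rec = {"Fz": 3.0, "Cz": 5.0, "Pz": 2.0, "M1": 1.0, "M2": 3.0}
lm = rereference_lm(rec)  # mastoid mean (2.0) subtracted from scalp channels
ar = rereference_ar({k: v for k, v in rec.items() if k not in ("M1", "M2")})
```

After AR the channels sum to zero by construction, which is why AR's accuracy depends on how completely the electrodes cover the scalp.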

  1. Montage, Militancy, Metaphysics: Chris Marker and André Bazin

    Directory of Open Access Journals (Sweden)

    Sarah Cooper

    2010-01-01

    Full Text Available

     

Abstract (English): This article focuses on the relationship between the work of André Bazin and Chris Marker from the late 1940s through to the late 1950s and beyond. The division between Bazin's 'Right Bank' affiliation with Les Cahiers du Cinéma on the one hand, and Marker's 'Left Bank' allegiances on the other, is called into question here as my argument seeks to muddy the waters of their conventional ideological separation across the river Seine. Working alliteratively through Marker's well-known talent for deft montage along with his militancy, I consider Bazin's praise for Marker's editing technique – in spite of famously expressing a preference elsewhere for the long take and deep-focus cinematography – and I address their political differences and convergences. Yet I also explore the rather more unexpected question of metaphysics in order to further emphasize a closer relationship between these two figures. I chart the emergence of an enduring spiritual bond between critic and filmmaker that surfaces first in Marker's writings for the left-wing Catholic journal L'Esprit...

  Producing an Infrared Multiwavelength Galactic Plane Atlas Using Montage, Pegasus, and Amazon Web Services

    Science.gov (United States)

    Rynge, M.; Juve, G.; Kinney, J.; Good, J.; Berriman, B.; Merrihew, A.; Deelman, E.

    2014-05-01

In this paper, we describe how to leverage cloud resources to generate large-scale mosaics of the galactic plane in multiple wavelengths. Our goal is to generate a 16-wavelength infrared Atlas of the Galactic Plane at a common spatial sampling of 1 arcsec, processed so that the images appear to have been measured with a single instrument. This will be achieved by using the Montage image mosaic engine to process observations from the 2MASS, GLIMPSE, MIPSGAL, MSX and WISE datasets, over a wavelength range of 1 μm to 24 μm, and by using the Pegasus Workflow Management System for managing the workload. When complete, the Atlas will be made available to the community as a data product. We are generating images that cover ±180° in Galactic longitude and ±20° in Galactic latitude, to the extent permitted by the spatial coverage of each dataset. Each image will be 5°×5° in size (including an overlap of 1° with neighboring tiles), resulting in an atlas of 1,001 images. The final size will be about 50 TB. This paper will focus on the computational challenges, solutions, and lessons learned in producing the Atlas. To manage the computation we are using the Pegasus Workflow Management System, a mature, highly fault-tolerant system now in release 4.2.2 that has found wide applicability across many science disciplines. A scientific workflow describes the dependencies between the tasks, and in most cases the workflow is described as a directed acyclic graph, where the nodes are tasks and the edges denote the task dependencies. A defining property of a scientific workflow is that it manages data flow between tasks. Applied to the galactic plane project, each 5°×5° mosaic is a Pegasus workflow. Pegasus is used to fetch the source images, execute the image mosaicking steps of Montage, and store the final outputs in a storage system. As these workflows are very I/O intensive, care has to be taken when choosing what infrastructure to execute the workflow on. In our setup, we choose...
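The directed acyclic graph described above determines a valid run order: every task must run after its prerequisites. A sketch of that topological ordering on a toy Montage-style task graph (the task names mirror real Montage modules, but the edge list is a simplified stand-in for a real 5°×5°-tile workflow):

```python
# Sketch: ordering a toy mosaic workflow DAG so every task follows its inputs,
# as a workflow engine like Pegasus must. Edge list is illustrative only.
from collections import deque

def topological_order(deps):
    """deps: task -> list of prerequisite tasks. Returns a valid run order."""
    indeg = {t: len(pre) for t, pre in deps.items()}
    children = {t: [] for t in deps}
    for t, pre in deps.items():
        for p in pre:
            children[p].append(t)
    ready = deque(sorted(t for t, d in indeg.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for c in children[t]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    if len(order) != len(deps):
        raise ValueError("cycle in workflow")
    return order

wf = {
    "mProject_1": [], "mProject_2": [],
    "mDiff": ["mProject_1", "mProject_2"],
    "mBgModel": ["mDiff"],
    "mBackground_1": ["mBgModel", "mProject_1"],
    "mBackground_2": ["mBgModel", "mProject_2"],
    "mAdd": ["mBackground_1", "mBackground_2"],
}
order = topological_order(wf)
```

In practice Pegasus also plans data movement along each edge, which is why the I/O profile of the target infrastructure matters so much here.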

  2. Radial Peripapillary Capillary Network Visualized Using Wide-Field Montage Optical Coherence Tomography Angiography.

    Science.gov (United States)

    Mase, Tomoko; Ishibazawa, Akihiro; Nagaoka, Taiji; Yokota, Harumasa; Yoshida, Akitoshi

    2016-07-01

We quantitatively analyzed the features of the radial peripapillary capillary (RPC) network visualized using wide-field montage optical coherence tomography (OCT) angiography in healthy human eyes. Twenty eyes of 20 healthy subjects were recruited. En face 3 × 3-mm OCT angiograms of multiple locations in the posterior pole were acquired using the RTVue XR Avanti, and wide-field montage images of the RPC were created. To evaluate the RPC density, the montage images were binarized and skeletonized. The correlation between the RPC density and the retinal nerve fiber layer (RNFL) thickness measured by an OCT circle scan was investigated. The RPC at the temporal retina was detected as far as 7.6 ± 0.7 mm from the edge of the optic disc but not around the perifoveal area within 0.9 ± 0.1 mm of the fovea. Capillary-free zones beside the first branches of the arterioles were significantly (P < …) […] optic disc edge were 13.6 ± 0.8, 11.9 ± 0.9, and 10.4 ± 0.9 mm-1. The RPC density also was correlated significantly (r = 0.64, P < …) […] network. The RPC is present in the superficial peripapillary retina in proportion to the RNFL thickness, supporting the idea that the RPC may be the vascular network primarily responsible for RNFL nourishment.

  3. Use of Computational Modeling to Inform tDCS Electrode Montages for the Promotion of Language Recovery in Post-stroke Aphasia.

    Science.gov (United States)

    Galletta, Elizabeth E; Cancelli, Andrea; Cottone, Carlo; Simonelli, Ilaria; Tecchio, Franca; Bikson, Marom; Marangolo, Paola

    2015-01-01

Although pilot trials of transcranial direct current stimulation (tDCS) in aphasia are encouraging, protocol optimization is needed. Notably, it has not yet been clarified which of the varied electrode montages investigated is the most effective in enhancing language recovery. To consider and contrast the predicted brain current flow patterns (electric field distribution) produced by varied 1×1 tDCS (1 anode, 1 cathode, 5 × 7 cm pad electrodes) montages used in aphasia clinical trials. A finite element model of the head of a single left frontal stroke patient was developed in order to study the pattern of the cortical EF magnitude and inward/outward radial EF under five different electrode montages: Anodal tDCS (A-tDCS) over the left Wernicke's area (Montage A) and over the left Broca's area (Montage B); Cathodal tDCS (C-tDCS) over the right homologue of Wernicke's area (Montage C) and of Broca's area (Montage D), where for all montages A-D the "return" electrode was placed over the contralateral supraorbital forehead; bilateral stimulation with A-tDCS over the left Broca's and C-tDCS over the right Broca's homologue (Montage E). In all cases, the "return" electrode over the contralesional supraorbital forehead was not inert and influenced the current path through the entire brain. Montage B, although similar to Montage D in focusing the current in the perilesional area, exerted the greatest effect over the left perilesional cortex, which was even stronger in Montage E. The position and influence of both electrodes must be considered in the design and interpretation of tDCS clinical trials for aphasia. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Soundwalking: Deep Listening and Spatio-Temporal Montage

    Directory of Open Access Journals (Sweden)

    Andrew Brown

    2017-08-01

Full Text Available The bicentenary of the 1817 Pentrich Revolution provided an opportunity for the composition of a series of soundwalks that, in turn, offer themselves up as a case study in an exposition of spatial bricolage, from the perspective of an interdisciplinary artist working with the medium of locative sound. Informed by Doreen Massey's definition of space as 'a simultaneity of stories so far', the author's approach involves extracting sounds from the contemporary soundscape and re-introducing them in the form of multi-layered compositions. This article conducts an analysis of the author's soundwalking practice according to Max van Manen's formulation of four essential categories of experience through which to consider our 'lived world': spatiality, temporality, corporeality, and relationality. Drawing upon theorists whose concerns include cinematic, mobile and environmental sound, such as Chion, Chambers and Schafer, the author proposes the soundwalk as an expanded form of cinema, with the flexibility to provoke states of immersion as well as critical detachment. A case is made for the application of the medium within the artistic investigation of ecological and socio-political issues alongside aesthetic concerns.

  5. Can Film Show the Invisible? The Work of Montage in Ethnographic Filmmaking

    DEFF Research Database (Denmark)

    Suhr, Christian; Willerslev, Rane

    2012-01-01

    This article suggests that film can evoke hidden dimensions of ethnographic reality, not by striving for ever more realistic depictions – a position often associated with observational cinema – but rather by exploiting the artificial means through which human vision can be transcended. Achieved particularly through the use of montage, such disruptions can multiply the perspectives from which filmic subject matter is perceived, thus conveying its invisible and irreducible otherness. This, however, is an argument not for dismissing the realism of much ethnographic filmmaking, but rather to demonstrate...

  6. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions

  7. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book further focuses on the non-negativity based methods, the time-frequency analysis based methods, and the pre-coding based methods, respectively.

  8. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software is often forgotten. This diploma work presents an analysis of the open source GIS software available on the Internet, in the scope of different projects interr...

  9. (58) Indices, Metaphors and Montages. The Heterogeneous Work in Current Latin American Literary Studies

    Directory of Open Access Journals (Sweden)

    Francisco Gelman Constantin

    2017-09-01

    Full Text Available As contemporary literary scholars challenge the ruling exclusionary criteria for the homogenization of their objects, while at the same time the biopolitical turn in literary theory criticizes representational understandings of the bond between language and the body, this paper suggests addressing said relationship with recourse to the Lacanian notion of the ‘montage of the heterogeneous’, which was brought forth toward a redefinition of the psychoanalytical concept of drive. Drawing from the notion of ‘heterogeneous literatures’, I advocate a theoretical genealogy from Bataille to Lacan (while Nancy, Foucault and Butler are also summoned to the discussion) in order to come to terms with the rethinking of the objects of literary scholarship demanded by works such as Emilio García Wehbi’s performance piece 58 indicios sobre el cuerpo, along with his and Nora Lezano’s poetical-photographical essay Communitas.

  10. Evaluation of a Modified High-Definition Electrode Montage for Transcranial Alternating Current Stimulation (tACS) of Pre-Central Areas

    DEFF Research Database (Denmark)

    Heise, Kirstin Friederike; Kortzorg, Nick; Saturnino, Guilherme Bicalho

    2016-01-01

    Objective: To evaluate a modified electrode montage with respect to its effect on tACS-dependent modulation of corticospinal excitability and the discomfort caused by neurosensory side effects accompanying stimulation. Methods: In a double-blind cross-over design, the classical electrode montage for primary motor cortex (M1) stimulation (two patch electrodes over M1 and the contralateral supraorbital area) was compared with an M1 centre-ring montage. Corticospinal excitability was evaluated before, during, immediately after and 15 minutes after tACS (10 min., 20 Hz vs. 30 s low-frequency transcranial...). Conclusions: In comparison to the classic montage, the M1 centre-ring montage enables a more focal stimulation of the target area and, at the same time, significantly reduces neurosensory side effects, essential for placebo-controlled study designs.

  11. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed un-supervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well known clustering and association techniques. The results show...

  12. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here

  13. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  14. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  15. Naked, Deformed, Violated Body. A Montage in the Histoire(s) du cinéma of Jean-Luc Godard

    Directory of Open Access Journals (Sweden)

    Alberto Brodesco

    2013-07-01

    Full Text Available The article analyses Histoire(s) du cinéma (1988-1998), a cinematic essay by Jean-Luc Godard, and in particular focuses on the controversial montage in which the French director aligns extracts from a pornographic film, Tod Browning’s Freaks, and footage from the concentration camps. With this sequence Godard interrogates his own theory of montage: the idea of a productive reconciliation between opposing realities. This shocking sequence (the violence of images) is compared to a similar shock (the violence of asking to witness) produced by a scene of the documentary Shoah by Claude Lanzmann. The trauma of Godard’s editing choice induces the viewer to examine the issues of the degradation of the indexical status of the film, the limits of representation and the ethics of the gaze.

  16. Auditory mismatch negativity in schizophrenia: topographic evaluation with a high-density recording montage.

    Science.gov (United States)

    Hirayasu, Y; Potts, G F; O'Donnell, B F; Kwon, J S; Arakaki, H; Akdag, S J; Levitt, J J; Shenton, M E; McCarley, R W

    1998-09-01

    The mismatch negativity, a negative component in the auditory event-related potential, is thought to index automatic processes involved in sensory or echoic memory. The authors' goal in this study was to examine the topography of auditory mismatch negativity in schizophrenia with a high-density, 64-channel recording montage. Mismatch negativity topography was evaluated in 23 right-handed male patients with schizophrenia who were receiving medication and in 23 nonschizophrenic comparison subjects who were matched in age, handedness, and parental socioeconomic status. The Positive and Negative Syndrome Scale was used to measure psychiatric symptoms. Mismatch negativity amplitude was reduced in the patients with schizophrenia. They showed a greater left-less-than-right asymmetry than comparison subjects at homotopic electrode pairs near the parietotemporal junction. There were correlations between mismatch negativity amplitude and hallucinations at left frontal electrodes and between mismatch negativity amplitude and passive-apathetic social withdrawal at left and right frontal electrodes. Mismatch negativity was reduced in schizophrenia, especially in the left hemisphere. This finding is consistent with abnormalities of primary or adjacent auditory cortex involved in auditory sensory or echoic memory.

  17. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

    Gravitational waves (GWs) are one of the most important predictions of general relativity. Beyond the indirect proof of the existence of GWs, several ground-based detectors (such as LIGO and GEO) and a planned future space mission (LISA) aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the data analysis of EMRI signals.

  18. Aussteigen (getting out) Impossible—Montage and Life Scenarios in Andres Veiel’s Film Black Box BRD

    Directory of Open Access Journals (Sweden)

    Anja Katharina Seiler

    2016-02-01

    Full Text Available Andres Veiel’s 2001 documentary film, Black Box BRD, links the biography of Alfred Herrhausen, an RAF victim, with that of Wolfgang Grams, a third-generation RAF terrorist. In my paper, I trace how the film’s aesthetics introduce an image montage of two life scenarios by establishing both parallels and contrasts and therefore, following Susan Hayward’s definition, “creates a third meaning” (112). I examine how the film establishes an aesthetic concept of Aussteigen (getting out) along the living, visible bodies of the contemporary interviewees and the dead, invisible bodies of Herrhausen and Grams.

  19. Soprano and source: A laryngographic analysis

    Science.gov (United States)

    Bateman, Laura Anne

    2005-04-01

    Popular music in the 21st century uses a particular singing quality for female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data was elicited from a professional soprano and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data was generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG is used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.

  1. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  2. Antioxidants: Characterization, natural sources, extraction and analysis

    OpenAIRE

    OROIAN, MIRCEA; Escriche Roberto, Mª Isabel

    2015-01-01

    [EN] Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However, none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, fo...

  3. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  4. Ion sources for solids isotopic analysis

    International Nuclear Information System (INIS)

    Tyrrell, A.C.

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material. (Auth.)

  5. Analysis of Contract Source Selection Strategy

    Science.gov (United States)

    2015-07-07

    The task of understanding the impact of a source selection strategy on resultant contract outcomes is a topic rich for further research.

  6. Ion sources for solids isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tyrrell, A. C. [Ministry of Defence, Foulness (UK). Atomic Weapons Research Establishment

    1978-12-15

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material.

  7. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates
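    The core idea of the record above, propagating uncertainty in a forward-model parameter (skull conductivity) into uncertainty in what the model predicts, can be sketched with a toy Monte Carlo example. The scalar "forward model" and the lognormal conductivity prior below are hypothetical stand-ins for illustration only, not the paper's actual finite-element formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def forward(conductivity, source_amp=1.0):
        """Toy scalar forward model: predicted scalp potential for a fixed
        source, attenuated by an effective skull conductivity (hypothetical)."""
        return source_amp / (1.0 + 10.0 * conductivity)

    # Uncertain skull conductivity: lognormal prior (assumed, illustrative values)
    sigma = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=10_000)

    # Propagate the parameter uncertainty into the predicted measurement
    v = forward(sigma)
    print(round(float(v.mean()), 3), round(float(v.std()), 4))
    ```

    In the paper's full setting the scalar relation is replaced by a multi-shell head model, and the resulting spread in predicted potentials translates into uncertainty in the estimated source locations rather than in a single number.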

  8. Analysis of primary teacher stress' sources

    Directory of Open Access Journals (Sweden)

    Katja Depolli Steiner

    2011-12-01

    Full Text Available Teachers are subject to many different work stressors. This study focused on differences in the intensity and frequency of potential stressors facing primary schoolteachers, with the goal of identifying the most important sources of teacher stress in primary school. The study included 242 primary schoolteachers from different parts of Slovenia. We used a Stress Inventory designed to identify the intensity and frequency of 49 situations that can act as teachers' work stressors. Findings showed that the major sources of stress facing teachers are factors related to work overload, factors stemming from pupils' behaviour and motivation, and factors related to the school system. Results also showed some small differences in the perception of stressors in different groups of teachers (by gender and by teaching level).

  9. Antioxidants: Characterization, natural sources, extraction and analysis.

    Science.gov (United States)

    Oroian, Mircea; Escriche, Isabel

    2015-08-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, for example preventing cancer and cardiovascular diseases, and lowering the incidence of different diseases. In this paper the main classes of antioxidants are presented: vitamins, carotenoids and polyphenols. Recently, many analytical methodologies involving diverse instrumental techniques have been developed for the extraction, separation, identification and quantification of these compounds. Antioxidants have been quantified by different researchers using one or more of these methods: in vivo, in vitro, electrochemical, chemiluminescent, electron spin resonance, chromatography, capillary electrophoresis, nuclear magnetic resonance, near infrared spectroscopy and mass spectrometry methods. Copyright © 2015. Published by Elsevier Ltd.

  10. Supercontinuum light sources for food analysis

    DEFF Research Database (Denmark)

    Møller, Uffe Visbech; Petersen, Christian Rosenberg; Kubat, Irnis

    2014-01-01

    One track of Light & Food will target the mid-infrared spectral region. To date, the limitations of mid-infrared light sources, such as thermal emitters, low-power laser diodes, quantum cascade lasers and synchrotron radiation, have precluded mid-IR applications where the spatial coherence, broad bandwidth, high brightness and portability of a supercontinuum laser are all required. DTU Fotonik has now demonstrated the first optical fiber based broadband supercontinuum light source, which covers 1.4-13.3 μm and thereby most of the molecular fingerprint region.

  11. An Analysis of Programming Beginners' Source Programs

    Science.gov (United States)

    Matsuyama, Chieko; Nakashima, Toyoshiro; Ishii, Naohiro

    The production of animations was made the subject of a university programming course in order to make students understand the process of program creation, and so that students could tackle programming with interest. In this paper, the formats and composition of the programs which students produced were investigated. As a result, it was found that there were many problems in the format and composition of the source code, related to matters such as how to use indentation and how to apply comments and functions.

  12. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

    Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection, December 2015, Capt Jacques Lamoureux, USAF. On some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett... The study focuses on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA...

  13. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information of complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. - Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • Similarity analysis method can encompass more relevant information of pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required
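    The cosine-similarity comparison between chromatographic fingerprints described above can be sketched as follows. The five-element "relative peak area" vectors are purely illustrative values, not data from the study; in practice the vectors would hold all matched peaks after retention-time alignment:

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        """Cosine similarity between two peak-area vectors whose entries
        have already been matched by retention time."""
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical relative peak areas for matched PBDE congeners
    plastic = [0.40, 0.25, 0.20, 0.10, 0.05]
    sediment = [0.30, 0.20, 0.15, 0.20, 0.15]

    s = cosine_similarity(plastic, sediment)
    print(round(s, 3))  # → 0.936
    ```

    A value near 1 indicates similar peak patterns (a plausible source-sink link); values such as the 0.53-0.68 reported between plastic and soil or sediment indicate a partial, non-exclusive contribution.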

  14. Model-Based Analysis and Optimization of the Mapping of Cortical Sources in the Spontaneous Scalp EEG

    Directory of Open Access Journals (Sweden)

    Andrei V. Sazonov

    2007-01-01

    Full Text Available The mapping of brain sources into the scalp electroencephalogram (EEG) depends on volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the OF-matrix for a generation model for the desynchronized spontaneous EEG. The model involves a four-shell spherical volume conductor containing dipolar sources that are mutually uncorrelated so as to reflect the desynchronized EEG. The reference is optimized in order to minimize the impact in the SM of the sources located distant from the electrodes. The resulting reference is called the localized reference (LR). The OF-matrix is analyzed in terms of the relative power contribution of the sources and the cross-channel correlation coefficient for five existing references as well as for the LR. It is found that the Hjorth Laplacian reference is a fair approximation of the LR, and thus is close to optimum for practical intents and purposes. The other references have a significantly poorer performance. Furthermore, the OF-matrix is analyzed for limits to the spatial resolution of the EEG. These are estimated to be around 2 cm.
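    The observation-function matrix discussed above is simply the linear map from source amplitudes to referenced scalp potentials. A minimal numeric sketch, with a random, hypothetical lead-field standing in for the four-shell model, shows why a change of reference montage is itself a linear transformation of that matrix:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_ch, n_src = 8, 20

    # Hypothetical observation-function (lead-field) matrix: channels x sources
    A = rng.normal(size=(n_ch, n_src))
    s = rng.normal(size=n_src)   # mutually uncorrelated source amplitudes
    x = A @ s                    # scalp potentials w.r.t. the original reference

    # Average-reference montage: subtract the mean over channels (a linear map)
    H = np.eye(n_ch) - np.ones((n_ch, n_ch)) / n_ch
    x_avg = H @ x

    # The re-referenced data obey the same model with a transformed OF-matrix
    A_avg = H @ A
    print(np.allclose(x_avg, A_avg @ s))  # → True
    ```

    This is why the choice of reference can be folded into the OF-matrix itself, as the paper does when comparing the five conventional references against the localized reference.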

  15. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. To address these disadvantages, a new linear instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the uncorrelatedness of source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then transformed into PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both the waveform and amplitude information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only the waveform information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal but not normalized; and the source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or not linear.
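    Under the assumptions stated in this record (uncorrelated sources, column-orthonormal mixing), the separation reduces to PCA of the observations. A small sketch with two synthetic sources (my own illustrative signals, not the paper's) recovers them up to sign and order:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000

    # Two zero-mean, (nearly) uncorrelated sources with distinct variances
    s1 = 3.0 * np.sin(np.linspace(0.0, 20.0 * np.pi, n))
    s2 = rng.normal(scale=0.5, size=n)
    S = np.vstack([s1 - s1.mean(), s2 - s2.mean()])

    # Column-orthonormal (rotation) mixing matrix
    theta = 0.7
    M = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    X = M @ S  # observed signals

    # PCA: eigenvectors of the observation covariance give unmixing directions
    w, V = np.linalg.eigh(np.cov(X))
    order = np.argsort(w)[::-1]        # strongest principal component first
    S_hat = V[:, order].T @ X          # recovered sources, up to sign/order

    corr = abs(np.corrcoef(S_hat[0], S[0])[0, 1])
    print(corr > 0.95)  # → True
    ```

    Because the mixing matrix is a rotation (column-orthonormal and normalized), both waveform and amplitude are recovered, matching the first of the three cases the abstract distinguishes.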

  16. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  17. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  18. Discours en circulation et (dé)montage filmique dans Fahrenheit 9/11 [Circulating discourse and film editing in Fahrenheit 9/11]

    Directory of Open Access Journals (Sweden)

    Andrea Landvogt

    2010-12-01

    Full Text Available Documentary film is a mediatic, social discursive practice essentially based on citation techniques ranging from vague allusion to exact quotation. The present study focuses on a characteristic effect of Michael Moore's filmic rhetoric: the circulation of decontextualized verbal, visual and acoustic discourses through montage techniques. However, since in most cases this is a discordant montage, Moore's excessive use of it eventually transgresses the norms of the documentary genre. It leads to something new we would like to call docu-satire.

  19. Risk analysis of alternative energy sources

    International Nuclear Information System (INIS)

    Kazmer, D.R.

    1982-01-01

    The author explores two points raised by Miller Spangler in a January 1981 issue: public perception of risks involving nuclear power plants relative to those of conventional plants and criteria for evaluating the way risk analyses are made. On the first point, he concludes that translating public attitudes into the experts' language of probability and risk could provide better information and understanding of both the attitudes and the risks. Viewing risk analysis methodologies as filters which help to test historical change, he suggests that the lack of information favors a lay jury approach for energy decisions. Spangler responds that Congress is an example of lay decision making, but that a lay jury, given public disinterest and polarization, would probably not improve social justice on the nuclear issue. 5 references, 4 figures

  20. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  1. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu
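
The mean risk curve averaged over alternative assumptions reduces to a credibility-weighted sum of the hazard curves computed under each alternative. A minimal numpy sketch of that averaging step, with hypothetical exceedance rates and weights (none taken from the study):

```python
import numpy as np

# Hypothetical annual exceedance-rate curves for two alternative source
# geometries, combined into a mean risk curve weighted by their credibilities.
pga = np.array([0.05, 0.10, 0.20, 0.40])              # peak ground acceleration (g)
curve_line_source = np.array([1e-2, 4e-3, 1e-3, 2e-4])
curve_area_source = np.array([2e-2, 6e-3, 1.5e-3, 3e-4])
weights = {"line": 0.4, "area": 0.6}                  # subjective credibilities, sum to 1

mean_curve = weights["line"] * curve_line_source + weights["area"] * curve_area_source
print(dict(zip(pga, mean_curve)))
```

The same weighted average extends to any number of alternative parameter sets; only the weights (which must sum to one) change.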

  2. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron-capture decay of 163 Ho as a method for determining the mass of the electron neutrino. The 163 Ho sources were produced with the 164 Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163 Ho atoms in the source. Proton beams of 3 MeV and an ''external standard'' method were employed for nondestructive analysis of the 163 Ho source, supplemented by an additional ''internal standard'' method. (author)

  3. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10^-4 to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  4. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n{sub e} and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n{sub e} and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods fail. The presented plasma source is currently used in research on hydrogen production from liquids.

  5. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods fail. The presented plasma source is currently used in research on hydrogen production from liquids.

  6. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market in SPSS 23.0, based on inbound tourism data for Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province has developed rapidly and that the diversified mix of inbound source countries indicates a stable inbound tourism market. According to the cluster analysis, the inbound tourist source market in Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key sources of inbound tourism in Fujian Province.
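
The clustering step can be illustrated with a toy k-means run. The market names, arrival figures, and number of clusters below are invented for illustration; the study itself used SPSS 23.0 and found four categories:

```python
import numpy as np

# Hypothetical annual arrivals (10k visitors) for six source markets;
# figures are invented for illustration only.
markets = ["USA", "Japan", "Malaysia", "Singapore", "France", "Brazil"]
arrivals = np.array([[52.0], [48.0], [55.0], [50.0], [8.0], [3.0]])

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means: assign each point to its nearest centre, then recompute centres."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(arrivals, k=2)
for name, lab in zip(markets, labels):
    print(name, int(lab))
```

With these toy figures the four large markets separate from the two small ones, mirroring the kind of grouping the abstract describes.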

  7. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

    Full Text Available Assembly precision optimization of complex products has a huge benefit in improving product quality. Because many deviation sources couple with one another, the target of assembly precision optimization is difficult to identify accurately. In order to optimize assembly precision accurately and rapidly, a sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to assembly constraint relations, assembly sequences, and locating schemes, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created from the deviation transmission paths, and the corresponding scalar equations for each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
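
The definition above, sensitivity as the ratio of assembly dimension variation to deviation source variation, is equivalent to a first-order Taylor coefficient and can be approximated numerically. A sketch under a deliberately simple one-dimensional stack-up assumption (the `assembly_gap` response is hypothetical, not the paper's wing flap rocker model):

```python
import numpy as np

def assembly_gap(dims):
    """Hypothetical assembly response: a simple 1-D dimension stack-up."""
    d0, d1, d2 = dims
    return d0 - d1 - d2

def sensitivities(f, nominal, h=1e-6):
    """First-order sensitivities df/dx_i by central finite differences."""
    nominal = np.asarray(nominal, dtype=float)
    s = np.zeros_like(nominal)
    for i in range(nominal.size):
        step = np.zeros_like(nominal)
        step[i] = h
        s[i] = (f(nominal + step) - f(nominal - step)) / (2.0 * h)
    return s

nominal = [10.0, 4.0, 3.0]
S = sensitivities(assembly_gap, nominal)      # each entry is dGap/d(source_i)
dx = np.array([0.02, -0.01, 0.015])           # deviations of the three sources
d_gap = S @ dx                                # linearized assembly deviation
print(S, d_gap)
```

For the linear stack-up the sensitivities are exactly (1, -1, -1); for a real vector-loop model the same finite-difference scheme would be applied to the scalarized loop equations.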

  8. Dosimetric analysis of radiation sources to use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions or cancers most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams: orthovoltage X-rays, electron beams, and radioactive sources ( 192 Ir, 198 Au, and 90 Sr) arranged on a surface mould or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results obtained with the MCNP4C code showed good agreement with the experimental measurements, validating the calculations and providing a reliable basis for medical application in each clinical case. (author)

  9. Discours en circulation et (dé)montage filmique dans Fahrenheit 9/11 [Circulating discourse and film editing in Fahrenheit 9/11]

    OpenAIRE

    Andrea Landvogt; Kathrin Sartingen

    2010-01-01

    Documentary film is a discursive social and media practice founded essentially on citation practices ranging from vague allusion to exact quotation. The study focuses on an effect that results from Michael Moore's characteristic film rhetoric: the circulation of decontextualized discourses, verbal, visual, and acoustic, by means of montage techniques. Since in most cases the montage is discordant, Moore manages ...

  10. Cognition through montage and mechanisms of individual memory in Bogusław Bachorczyk's art on the example of the artist's apartment-studio

    Directory of Open Access Journals (Sweden)

    Antos, Janusz

    2014-12-01

    Full Text Available The present text discusses Bogusław Bachorczyk's apartment-studio in Krakow. The decorations he has been making there since 2003 have turned into a kind of work-in-progress. These decorations, just like Bachorczyk's art, are related to the issues of memory and identity. In 2013 he began transforming his apartment by "lacing up the wall" with polychrome in the library room, later extending the work to the other rooms. He installed new elements into the existing polychromes according to the rule of montage, which has recently constituted the basic strategy of his work.

  11. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
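
As background, a formal context pairs objects with attributes, and a formal concept is a pair (A, B) where A is exactly the set of objects sharing all attributes in B and B is exactly the set of attributes shared by all objects in A. A brute-force sketch with an invented toy context (the LMS names and features are illustrative, not taken from the paper):

```python
from itertools import combinations

# Toy formal context: LMSs (objects) x features (attributes); hypothetical data.
objects = {
    "Moodle": {"quizzes", "forums", "scorm"},
    "Sakai":  {"forums", "scorm"},
    "Canvas": {"quizzes", "forums"},
}
attributes = set().union(*objects.values())

def common_attrs(objs):
    """Attributes shared by every object in objs (all attributes if objs is empty)."""
    return set.intersection(*(objects[o] for o in objs)) if objs else set(attributes)

def common_objs(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, a in objects.items() if attrs <= a}

# Every concept arises as (B', B) with B = O' for some object set O,
# so enumerating all object subsets and closing them finds them all.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        B = common_attrs(set(objs))
        A = common_objs(B)
        concepts.add((frozenset(A), frozenset(B)))

for A, B in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(A), sorted(B))
```

The resulting concepts, ordered by set inclusion, form the concept lattice that FCA-based comparisons such as the paper's are built on; for contexts of realistic size one would use an incremental algorithm such as NextClosure instead of brute force.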

  12. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  13. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive, alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
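
The abstract's starting point, that standard ICA relies on higher-order statistics which vanish for Gaussian sources, can be illustrated with excess kurtosis, a classic fourth-order measure of non-Gaussianity. A minimal numpy check (not the mixed ICA/PCA algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; zero for a Gaussian."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

gauss   = rng.standard_normal(n)     # excess kurtosis ~ 0: invisible to this statistic
uniform = rng.uniform(-1, 1, n)      # ~ -1.2 (sub-Gaussian)
laplace = rng.laplace(size=n)        # ~ +3.0 (super-Gaussian)

for name, x in [("gauss", gauss), ("uniform", uniform), ("laplace", laplace)]:
    print(name, round(excess_kurtosis(x), 3))
```

Because the Gaussian sample's higher-order statistic is indistinguishable from zero, any number of Gaussian components look alike to kurtosis-style contrasts, which is exactly why the mixed model hands that subspace to PCA.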

  14. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

    Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some isotopes that are not yet in use but look very promising are indicated, and their data are tabulated. A more or less ''perfect'' source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) have an intense gamma transition near the absorption edge of the element(s) of interest, with no high-energy gammas; (3) have a sufficiently long half-life (on the order of years) for both economic and calibration reasons; (4) have a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should be reasonably short-lived or, if not, chemically separable from the desired isotope with a minimum of difficulty. (T.G.)

  15. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Science.gov (United States)

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

    The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduce an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also present the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we show that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed by the optimization community for solving the BSS problem. As examples, we demonstrate that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.

  16. Tectonics of montage

    DEFF Research Database (Denmark)

    Bundgaard, Charlotte

    2013-01-01

    We build in accordance with specific contemporary conditions, defined by production methods, construction and materials as well as ethics, meaning and values. Exactly this relationship between the work as such and the conditions behind its coming into being is a crucial point. The simultaneity of...

  17. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.

  18. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

    Full Text Available The presented thesis, Comparative analysis of traditional and alternative energy resources, includes, on the basis of theoretical information sources, research in the firm, internal data, trends in company and market development, a description of the problem, and its application. The theoretical part is dedicated to traditional and alternative energy resources: their reserves, trends in their use and development, and their balance in the world, the EU, and Slovakia. The analytical part of the thesis reflects the profile of the company and evaluates the thermal pump market using the General Electric method. While the company implements, among other products, thermal pumps based on geothermal energy and ambient energy (air, the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility, and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, and the expected possibilities of support for alternative energy resources (benefits from the government and EU funds.

  19. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
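
The variance-reduction idea behind stratified source-sampling can be demonstrated on a toy integrand: forcing an equal share of samples into each stratum removes the between-strata component of the variance. A numpy sketch (the integrand and stratum count are arbitrary; this is not the paper's eigenvalue configuration):

```python
import numpy as np

rng = np.random.default_rng(42)
f = lambda u: u ** 2          # toy response on [0, 1); true mean = 1/3

def plain_mc(n):
    """Conventional sampling: n independent uniforms over the whole domain."""
    return f(rng.random(n)).mean()

def stratified_mc(n, strata=10):
    """Stratified sampling: an equal slice of samples inside each sub-interval."""
    per = n // strata
    u = (np.arange(strata)[:, None] + rng.random((strata, per))) / strata
    return f(u).mean()

est_plain = np.array([plain_mc(100) for _ in range(2000)])
est_strat = np.array([stratified_mc(100) for _ in range(2000)])
print(est_plain.var(), est_strat.var())   # stratified variance is far smaller
```

Both estimators are unbiased; stratification only suppresses the estimator's variance, which is the reliability gain the abstract quantifies for loosely-coupled arrays.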

  20. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  1. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    Science.gov (United States)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate among, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. Our synthetic data tests indicate that biases in scalar

  2. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on-board nuclear material must be approved by the Office of the President. To be approved, the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material that might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high-velocity impacts resulting from launch aborts and reentries

  3. Analysis of the Structure Ratios of the Funding Sources

    Directory of Open Access Journals (Sweden)

    Maria Daniela Bondoc

    2014-06-01

    Full Text Available The funding sources of the assets and liabilities in the balance sheet include the equity capital and the debts of the entity. The analysis of the structure rates of the funding sources allows assessments of the funding policy, highlighting financial autonomy and how resources are provided. Drawing on the literature on economic and financial analysis, this paper presents these rates, which reflect, on the one hand, the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of on-term financial autonomy) and, on the other hand, the debt structure (the rate of short-term debts, the global indebtedness rate, the on-term indebtedness rate). Based on the financial statements of an entity in Argeş County, I analysed these indicators and drew conclusions and made assessments related to the autonomy, indebtedness, and financial stability of the studied entity.
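
The rates named in the abstract follow directly from balance-sheet aggregates. A sketch using hypothetical figures and the standard textbook definitions (assumed here, since the paper's exact formulas are not reproduced in the abstract):

```python
# Hypothetical balance-sheet figures; the ratio definitions below are the
# common textbook forms of the rates named in the abstract, not the paper's own.
equity            = 600.0
long_term_debt    = 250.0
short_term_debt   = 150.0
total_liabilities = equity + long_term_debt + short_term_debt   # balance-sheet total
permanent_capital = equity + long_term_debt                     # stable funding

rates = {
    "financial_stability":   permanent_capital / total_liabilities,
    "global_autonomy":       equity / total_liabilities,
    "on_term_autonomy":      equity / permanent_capital,
    "global_indebtedness":   (long_term_debt + short_term_debt) / total_liabilities,
    "short_term_debt_share": short_term_debt / (long_term_debt + short_term_debt),
}
for name, value in rates.items():
    print(f"{name}: {value:.3f}")
```

With these figures the entity funds 85% of its balance sheet from stable sources and 60% from its own capital, the kind of reading the abstract draws from the rates.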

  4. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.
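
The attribution logic described, within-flow variation under 40 percent versus between-flow ranges of 1000 percent or more, can be sketched as a nearest-centroid match in log-concentration space. All numbers below are invented placeholders, not measured obsidian data:

```python
import numpy as np

# Hypothetical element concentrations (ppm) for samples from two source flows
# and one artifact; values are illustrative only.
sources = {
    "flow_A": np.array([[480.0, 150.0, 32.0], [500.0, 155.0, 30.0], [490.0, 148.0, 31.0]]),
    "flow_B": np.array([[120.0, 610.0, 75.0], [130.0, 590.0, 80.0], [125.0, 600.0, 78.0]]),
}
artifact = np.array([495.0, 152.0, 31.0])

def attribute(sample):
    """Assign a sample to the flow whose log-concentration centroid is nearest;
    logs damp the large absolute spread between elements."""
    dists = {name: np.linalg.norm(np.log(sample) - np.log(data).mean(axis=0))
             for name, data in sources.items()}
    return min(dists, key=dists.get)

print(attribute(artifact))
```

Because between-flow differences dwarf within-flow scatter, even this crude centroid rule separates the flows cleanly; real provenance work would add more elements and a proper covariance-aware distance.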

  5. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

    PRICE M: Reference Manual; PRICE H: Training Course Workbook. 11. Use in Cost Analysis: important source of cost estimates for electronic and mechanical... Nature of Data: contains many microeconomic time series by month or quarter. 5. Level of Detail: very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  6. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d, xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE) [de

  7. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  8. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    The Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though … might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  9. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs
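The power-spectral step described above can be illustrated with a short sketch (a minimal illustration, not Eilek's actual pipeline; the synthetic image and binning scheme are invented): take the 2-D FFT of an intensity image and average the power in radial spatial-frequency bins.

```python
import numpy as np

def radial_power_spectrum(image):
    """Azimuthally averaged power spectrum of a 2-D intensity image."""
    f = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(f) ** 2
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    # Mean power in each integer radial (spatial-frequency) bin
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return sums / counts

rng = np.random.default_rng(0)
image = rng.normal(loc=10.0, scale=1.0, size=(64, 64))  # stand-in for a total-intensity map
spectrum = radial_power_spectrum(image)
print(spectrum.shape)
```

Applied to real total- and polarized-intensity maps, the shape of this 1-D spectrum is what carries the turbulence information.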

  10. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
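The gap-based burst identification is easy to sketch (a toy illustration, not the authors' analysis code; the threshold and arrival times below are invented): sort the detection times and start a new burst whenever the inter-arrival gap exceeds a threshold.

```python
import numpy as np

def find_bursts(times, max_gap):
    """Group detection times into bursts: a new burst starts whenever the
    gap to the previous detection exceeds max_gap (same time units)."""
    times = np.sort(np.asarray(times, dtype=float))
    # Indices where the inter-arrival gap exceeds the threshold
    breaks = np.where(np.diff(times) > max_gap)[0] + 1
    return np.split(times, breaks)

# Toy data: two fission-chain-like clusters separated by a long quiet period
times = [0.0, 5e-9, 12e-9, 1e-3, 1e-3 + 8e-9]
bursts = find_bursts(times, max_gap=100e-9)
print(len(bursts))               # → 2
print([len(b) for b in bursts])  # → [3, 2]
```

Statistics such as the within-burst arrival-time distribution are then computed per burst.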


  13. Dosimetric analysis of radiation sources for use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy in skin lesions is performed with low penetration beams and orthovoltage X-rays, electron beams and radioactive sources (192Ir, 198Au and 90Sr) arranged on a surface mold or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo Method. The computational results from the MCNP4C code showed good agreement with the experimental measurements, and both were physically consistent, as expected. These comparisons have been used to validate the MCNP4C calculations and to provide a reliable medical application for each clinical case. (author)

  14. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns … However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges, PROTEINCHALLENGE, that directly target and compare data analysis workflows, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  15. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    International Nuclear Information System (INIS)

    Hoenraet, C.

    1999-01-01

    Under the authority of the episcopacy of Brugge in Belgium, an independent working group, Ethics and Nuclear Energy, was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Attention was also paid to economic and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book, one should be able to objectively define one's position in future debates on this subject.

  16. Analysis of the TMI-2 source range detector response

    International Nuclear Information System (INIS)

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

    In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector, and the reduction in the multiplication of the Am-Be startup sources, with the subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented.

  17. Obsidian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  18. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
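The corner-frequency step can be sketched as a Brune-model fit (a minimal illustration, not the actual processing chain of the study; the synthetic spectrum and search grid are invented). Given corner frequency fc and seismic moment M0, a Brune-type stress drop then follows from the standard relation Δσ = 7·M0/(16·r³) with source radius r = 2.34·β/(2π·fc).

```python
import numpy as np

def brune(f, omega0, fc):
    """Brune omega-squared source spectrum."""
    return omega0 / (1.0 + (f / fc) ** 2)

def fit_corner_frequency(freqs, spectrum, fc_grid):
    """Grid-search fit of the Brune model in log-amplitude space."""
    best = None
    for fc in fc_grid:
        shape = 1.0 / (1.0 + (freqs / fc) ** 2)
        # Optimal low-frequency level for this fc (log-domain least squares)
        omega0 = np.exp(np.mean(np.log(spectrum) - np.log(shape)))
        misfit = np.sum((np.log(spectrum) - np.log(brune(freqs, omega0, fc))) ** 2)
        if best is None or misfit < best[0]:
            best = (misfit, omega0, fc)
    return best[1], best[2]

freqs = np.linspace(1.0, 50.0, 200)
synthetic = brune(freqs, omega0=1.0, fc=8.0)   # noise-free test spectrum
omega0, fc = fit_corner_frequency(freqs, synthetic, fc_grid=np.arange(1.0, 20.0, 0.5))
print(round(fc, 1))  # → 8.0
```

In practice the fit is applied to source spectra already corrected for station terms and path effects, as described above.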

  19. Creep analysis of fuel plates for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis of creep deformation and stresses caused by elevated temperature over a given time span has been performed and is reported herein.

  20. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and in the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, along with a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman speaks without bias or prejudice for the public good; technical jargon with unclear definitions exists within the radioactive nomenclature; and the scientific community keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the limited resources of the federal and state health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  1. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82

  2. Review on solving the inverse problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Fabri Simon G

    2008-11-01

    In this primer, we give a review of the inverse problem for EEG source localization. This is intended for researchers new to the field, to get insight into the state-of-the-art techniques used to find approximate solutions of the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these different inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non parametric algorithms and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods which were developed to solve the EEG inverse problem, namely the non parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori or not. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA-FOCUSS (SLF), SSLOFO and ALF for non parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one could conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher resolution algorithms such as MUSIC or FINES are however preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results.
The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
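The minimum norm estimate mentioned in the review has a compact closed form, sketched here with a random, hypothetical leadfield (a real leadfield would come from a forward head model; the sizes and regularization value below are invented):

```python
import numpy as np

def minimum_norm_estimate(L, y, alpha):
    """Regularized minimum norm estimate of source amplitudes:
    x = L^T (L L^T + alpha I)^{-1} y,
    the smallest-norm source distribution consistent with the scalp data."""
    gram = L @ L.T + alpha * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(gram, y)

rng = np.random.default_rng(1)
L = rng.normal(size=(32, 500))    # hypothetical leadfield: 32 electrodes, 500 sources
x_true = np.zeros(500)
x_true[100] = 1.0                 # one active dipole
y = L @ x_true                    # simulated scalp potentials
x_hat = minimum_norm_estimate(L, y, alpha=1e-6)
print(bool(np.linalg.norm(L @ x_hat - y) < 1e-3))  # → True (scalp data reproduced)
```

Because the problem is underdetermined (far more sources than electrodes), many source patterns reproduce the data equally well; methods such as LORETA and its variants differ mainly in the constraint used to pick one.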

  3. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
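A minimal sketch of delay-based extraction follows (an AMUSE-style symmetrized lagged-covariance formulation rather than the authors' exact autoregressive algorithm; the signals, mixing matrix, and lag are synthetic): the component with the strongest autocorrelation at the chosen delay is pulled out of the mixture.

```python
import numpy as np

def extract_by_delay(X, lag):
    """Extract the component whose autocorrelation at `lag` is strongest
    from the mixed channels X (channels x samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]                              # zero-lag covariance
    Clag = X[:, :-lag] @ X[:, lag:].T / (X.shape[1] - lag) # lagged covariance
    Clag = (Clag + Clag.T) / 2.0                           # symmetrize
    # Maximize lagged covariance relative to power (generalized eigenproblem)
    vals, vecs = np.linalg.eig(np.linalg.solve(C0, Clag))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w @ X

t = np.arange(2000)
fetal = np.sin(2 * np.pi * t / 25)      # fast periodic stand-in for the fetal heart
maternal = np.sin(2 * np.pi * t / 180)  # slower stand-in for the maternal signal
noise = np.random.default_rng(2).normal(scale=0.1, size=(2, t.size))
X = np.array([[1.0, 0.8], [0.6, 1.0]]) @ np.vstack([fetal, maternal]) + noise
y = extract_by_delay(X, lag=25)         # lag chosen at the fetal period
print(round(abs(np.corrcoef(y, fetal)[0, 1]), 2))
```

In the paper's approach the appropriate delay is identified automatically from an autocorrelation/autoregressive analysis rather than assumed known.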

  4. Thermal hydraulic analysis of the encapsulated nuclear heat source

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J.; Wade, D.C. [Argonne National Lab., IL (United States)

    2001-07-01

    An analysis has been carried out of the steady state thermal hydraulic performance of the Encapsulated Nuclear Heat Source (ENHS) 125 MWt, heavy liquid metal coolant (HLMC) reactor concept at nominal operating power and shutdown decay heat levels. The analysis includes the development and application of correlation-type analytical solutions based upon first principles modeling of the ENHS concept that encompass both pure as well as gas injection augmented natural circulation conditions, and primary-to-intermediate coolant heat transfer. The results indicate that natural circulation of the primary coolant is effective in removing heat from the core and transferring it to the intermediate coolant without the attainment of excessive coolant temperatures. (authors)

  5. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.
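As a toy illustration of how independent error sources aggregate into an overall product uncertainty (the component values below are hypothetical, and a real CM error model would be far richer than a root-sum-of-squares), independent relative uncertainties combine in quadrature:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative (1-sigma)
    uncertainty components, as in standard uncertainty propagation."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical contributors to a data product: measurement, calibration,
# interpolation, and model error (relative 1-sigma fractions)
components = [0.10, 0.05, 0.08, 0.12]
total = combined_relative_uncertainty(components)
print(round(total, 3))  # → 0.182
```

Note how the largest component dominates: halving a small term barely moves the total, which is why quantifying the dominant uncertainty sources first matters.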

  6. Application of Open Source Technologies for Oceanographic Data Analysis

    Science.gov (United States)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project, which uses NEXUS as the core of an oceanographic anomaly detection service and web portal. We call it OceanXtremes.
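The chunking idea can be sketched in a few lines (a toy stand-in for NEXUS's tiled NoSQL store; the array sizes and tile width are arbitrary): per-tile statistics computed independently, as parallel workers would, recombine exactly into the full-grid answer.

```python
import numpy as np

def tile_chunks(grid, tile):
    """Split a (time, lat, lon) array into fixed-size spatial tiles, the way
    a tiled store lets workers process tiles independently."""
    _, ny, nx = grid.shape
    for y0 in range(0, ny, tile):
        for x0 in range(0, nx, tile):
            yield (y0, x0), grid[:, y0:y0 + tile, x0:x0 + tile]

def area_mean_time_series(grid, tile):
    """On-the-fly time series: sum each tile separately, then combine
    (equivalent to a full-grid spatial mean at each time step)."""
    total = np.zeros(grid.shape[0])
    count = 0
    for _, chunk in tile_chunks(grid, tile):
        total += chunk.sum(axis=(1, 2))
        count += chunk.shape[1] * chunk.shape[2]
    return total / count

rng = np.random.default_rng(3)
sst = rng.normal(size=(12, 90, 180))  # a year of monthly gridded fields
series = area_mean_time_series(sst, tile=30)
print(np.allclose(series, sst.mean(axis=(1, 2))))  # → True
```

Because tile sums are associative, the same pattern scales out across a cluster without changing the result.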

  7. Prospects for accelerator neutron sources for large volume minerals analysis

    International Nuclear Information System (INIS)

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron Linac can be regarded as a practical source of thermal neutrons for activation analysis of large volume mineral samples. With a suitable target and moderator, a neutron flux of about 10^10 n/cm^2/s over 2-3 kg of rock can be generated. The proton Linac gives the possibility of a high neutron yield (>10^12 n/s) of fast neutrons at selected energies. For the electron Linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and for platinum in chromitite. These elements were selected as they are most likely to justify an accelerator installation at the present time. Errors due to self-shielding in gold particles for thermal neutrons are discussed. The proton Linac is considered for neutrons generated from a lithium target through the 7Li(p,n)7Be reaction. The analysis of gold by fast neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton energy selection, avoids potentially dominating interfering reactions. The analysis of 235U in the presence of 238U and 232Th is also considered. (author)

  8. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Over this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because the sensitivity needs of many applications are not severe, they can be met using an ~6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes.

  9. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, performance measures are analyzed under different output service schemes.

  10. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
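
The two avian ratios are simple arithmetic on measured sterol concentrations; a minimal sketch of the decision rule as stated above (the function names and the concentration values are hypothetical):

```python
def avian_ratios(sterols):
    """Avian ratios 1 and 2 from a dict of sterol concentrations
    (any consistent units)."""
    r1 = sterols["24-ethylcholestanol"] / (
        sterols["24-ethylcholestanol"]
        + sterols["24-ethylcoprostanol"]
        + sterols["24-ethylepicoprostanol"])
    r2 = sterols["cholestanol"] / (
        sterols["cholestanol"]
        + sterols["coprostanol"]
        + sterols["epicoprostanol"])
    return r1, r2

def avian_pollution_indicated(sterols):
    """Thresholds from the decision tree: avian ratio 1 >= 0.4 and
    avian ratio 2 >= 0.5 (applied only after human and herbivore
    indicative ratios have been excluded)."""
    r1, r2 = avian_ratios(sterols)
    return r1 >= 0.4 and r2 >= 0.5

# Hypothetical stanol-dominated sample
sample = {"24-ethylcholestanol": 50.0, "24-ethylcoprostanol": 40.0,
          "24-ethylepicoprostanol": 10.0, "cholestanol": 60.0,
          "coprostanol": 30.0, "epicoprostanol": 10.0}
```

As in the study, a positive result would then be followed by targeted PCR markers if greater confidence in the pollution source is required.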

  11. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid search technique and a common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
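
Semblance itself is straightforward to sketch: for each candidate grid point, traces are time-shifted by the predicted propagation delays and the coherence of the stack is measured. A toy example under simplifying assumptions that the dissertation relaxes (homogeneous still atmosphere, no topography; the geometry, sound speed, and signals here are made up):

```python
import numpy as np

def semblance(waveforms, delays_s, fs):
    """Semblance of delay-aligned traces: stack energy divided by the
    summed trace energy, normalized to the range [0, 1]."""
    shifted = [np.roll(w, -int(round(d * fs)))
               for w, d in zip(waveforms, delays_s)]
    s = np.vstack(shifted)
    return np.sum(np.sum(s, axis=0) ** 2) / (len(shifted) * np.sum(s ** 2))

fs, c = 100.0, 340.0                # sampling rate (Hz), sound speed (m/s)
stations = np.array([[0, 0], [500, 0], [0, 500], [500, 500]], float)
true_src = np.array([200.0, 300.0])

# Synthetic Gaussian pulses delayed by the propagation time to each station
t = np.arange(0, 4.0, 1 / fs)
waves = [np.exp(-(t - (1.0 + np.linalg.norm(st - true_src) / c)) ** 2
                / (2 * 0.05 ** 2)) for st in stations]

# Forward grid search: the highest semblance marks the inferred source
grid = [(x, y) for x in range(0, 501, 100) for y in range(0, 501, 100)]
scores = [semblance(waves, [np.linalg.norm(st - np.array(g, float)) / c
                            for st in stations], fs) for g in grid]
best = grid[int(np.argmax(scores))]
```

With topography, the travel paths lengthen asymmetrically, which is exactly how the consistent 360-420 m location offset described above arises.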

  12. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  13. Phase 2 safety analysis report: National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stefan, P.

    1989-06-01

    The Phase II program was established in order to provide additional space for experiments, and also staging and equipment storage areas. It also provides additional office space and new types of advanced instrumentation for users. This document will deal with the new safety issues resulting from this extensive expansion program, and should be used as a supplement to BNL Report No. 51584 ''National Synchrotron Light Source Safety Analysis Report,'' July 1982 (hereafter referred to as the Phase I SAR). The initial NSLS facility is described in the Phase I SAR. It comprises two electron storage rings, an injection system common to both, experimental beam lines and equipment, and office and support areas, all of which are housed in a 74,000 sq. ft. building. The X-ray Ring provides for 28 primary beam ports and the VUV Ring, 16. Each port is capable of division into 2 or 3 separate beam lines. All ports receive their synchrotron light from conventional bending magnet sources, the magnets being part of the storage ring lattice. 4 refs

  14. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents the analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes the comparison of the ratio (Cs-137/Cs-134) between measurements performed by Soviet authorities and countries belonging to the Community and OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets, as presented in their report in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperatures reached, as a way to deduce the mechanisms of cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accident period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to the RBMK type of reactor, and that in the Western world basic research on fuel behaviour under reactivity transients has already been carried out

  15. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments and the development of theoretical methods to predict the experimental observables

  16. Statistical Analysis of the Microvariable AGN Source Mrk 501

    Directory of Open Access Journals (Sweden)

    Alberto C. Sadun

    2018-02-01

    Full Text Available We report on the optical observations and analysis of the high-energy peaked BL Lac object (HBL, Mrk 501, at redshift z = 0.033. We can confirm microvariable behavior over the course of minutes on several occasions per night. As an alternative to the commonly understood dynamical model of random variations in intensity of the AGN, we develop a relativistic beaming model with a minimum of free parameters, which allows us to infer changes in the line of sight angles for the motion of the different relativistic components. We hope our methods can be used in future studies of beamed emission in other active microvariable sources, similar to the one we explored.

  17. Evaluating source separation of plastic waste using conjoint analysis.

    Science.gov (United States)

    Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke

    2008-11-01

    Using conjoint analysis, we estimated households' willingness to pay (WTP) for source separation of plastic waste and for the improvement of related environmental impacts: the residents' loss of life expectancy (LLE), the landfill capacity, and the CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utilities associated with reducing LLE and with landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues.
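
In conjoint studies, the WTP for an attribute is commonly taken as the ratio of its estimated part-worth to the cost coefficient. A deliberately simplified, rating-based sketch on synthetic data (the study itself used stated-preference survey data; all names and numbers here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rating-based conjoint: each respondent rates a waste-management
# plan described by two attributes (values are hypothetical)
n = 200
separation = rng.integers(0, 2, n).astype(float)  # 1 = separation offered
cost = rng.uniform(0.0, 50.0, n)                  # monthly fee
b_sep_true, b_cost_true = 2.0, -0.1
rating = (b_sep_true * separation + b_cost_true * cost
          + rng.normal(0.0, 0.1, n))              # stated utility + noise

# Part-worths by least squares; WTP = -(attribute coeff / cost coeff)
X = np.column_stack([separation, cost])
beta, *_ = np.linalg.lstsq(X, rating, rcond=None)
wtp_separation = -beta[0] / beta[1]
```

Here the recovered WTP is close to the true value of 20 fee units; in the actual study the part-worths for LLE and landfill capacity would additionally be logarithmic in the attribute level.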

  18. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, with the aim of setting a community-driven gold standard for data handling, reporting and sharing, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  19. Surface-Source Downhole Seismic Analysis in R

    Science.gov (United States)

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners that may find it useful. I originally applied an early version of these routines to seismic cone penetration test data (SCPT) to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. 
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
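
The slowness-fitting step can be sketched as a least-squares fit of travel time against depth within each layer. This toy version assumes vertical ray paths and a known interface depth, whereas the method described above also handles the surface-source offset, travel-time pick weighting, and interface selection:

```python
import numpy as np

def interval_slowness(depths, times, interface):
    """Fit slowness (s/m) in each of two layers from vertical travel-time
    picks, assuming vertical ray paths and a known interface depth."""
    out = {}
    for name, mask in (("upper", depths <= interface),
                       ("lower", depths > interface)):
        # slope of travel time vs depth = interval slowness of that layer
        A = np.column_stack([depths[mask], np.ones(mask.sum())])
        slope, _ = np.linalg.lstsq(A, times[mask], rcond=None)[0]
        out[name] = slope
    return out

# Synthetic picks: 200 m/s above 10 m depth, 400 m/s below
depths = np.arange(2.0, 21.0, 2.0)
times = np.where(depths <= 10.0, depths / 200.0,
                 10.0 / 200.0 + (depths - 10.0) / 400.0)
s = interval_slowness(depths, times, interface=10.0)
v_upper, v_lower = 1.0 / s["upper"], 1.0 / s["lower"]
```

A weighted variant, as in the report, would scale each row of the least-squares system by the inverse standard deviation of the corresponding travel-time pick.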

  20. Fire Hazard Analysis for the Cold Neutron Source System

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-15

    As the Cold Neutron Source (CNS) system was being designed for installation in HANARO, a fire hazard analysis of the CNS system became required under MOST Notice No. 2003-20, Technical Standard for Fire Hazard Analysis. The hydrogen system of the CNS is filled with highly flammable hydrogen, used as the moderator. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated so that suitable protection precautions can be reflected in the design of the CNS system. For the fire hazard analysis, the accident scenarios were divided into three: hydrogen leak while charging the system with hydrogen, hydrogen leak during normal operation of the CNS, and explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all hydrogen gas escapes from the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, the buffer tank explosion inflicts no overpressure damage on the reactor; against flying debris, a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are few combustibles on the second floor of the CEI, so a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be built on the second floor to serve as a fire protection barrier against external or internal fire.

  1. Fire Hazard Analysis for the Cold Neutron Source System

    International Nuclear Information System (INIS)

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-01

    As the Cold Neutron Source (CNS) system was being designed for installation in HANARO, a fire hazard analysis of the CNS system became required under MOST Notice No. 2003-20, Technical Standard for Fire Hazard Analysis. The hydrogen system of the CNS is filled with highly flammable hydrogen, used as the moderator. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated so that suitable protection precautions can be reflected in the design of the CNS system. For the fire hazard analysis, the accident scenarios were divided into three: hydrogen leak while charging the system with hydrogen, hydrogen leak during normal operation of the CNS, and explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all hydrogen gas escapes from the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, the buffer tank explosion inflicts no overpressure damage on the reactor; against flying debris, a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are few combustibles on the second floor of the CEI, so a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be built on the second floor to serve as a fire protection barrier against external or internal fire.

  2. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies. 15 June 2016. LCDR Jamal M. Osman, USN. Acquisition Research Program Sponsored Report Series. Among the works cited: Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis.

  3. Comparative Analysis Study of Open Source GIS in Malaysia

    International Nuclear Information System (INIS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Halim, Mohd Khuizham Abd

    2014-01-01

    Open source software might appear to be a major prospective change, capable of delivering in various industries and competing in developing countries. The leading purpose of this research is to discover the degree of adoption of Open Source Software (OSS) connected with Geographic Information System (GIS) applications within Malaysia, where low adoption may derive from inadequate awareness of open source concepts or from technical deficiencies in open source tools. The research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate the awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS. The survey was conducted among three groups of candidates: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to supply a measurable and descriptive basis for the final result. The second stage involved an interview session with a major organization that operates open source web GIS: the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The impact of this preliminary study was to understand the viewpoints of different groups of people on open source GIS, and to assess whether insufficient awareness of open source concepts is a significant root cause of the current level of open source adoption

  4. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  5. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables

  6. Source analysis of spaceborne microwave radiometer interference over land

    Science.gov (United States)

    Guan, Li; Zhang, Sibo

    2016-03-01

    Signals from active sensors and transmitters that mix into the thermal emission measured by satellite microwave radiometers are referred to as radio-frequency interference (RFI). Based on Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) observations from June 1 to 16, 2011, RFI over Europe was identified and analyzed in this paper using the modified principal component analysis algorithm. The X-band AMSR-E measurements in England and Italy are mostly affected by stable, persistent, active microwave transmitters on the surface, while the RFI source in other European countries is the interference of reflected geostationary TV satellite downlink signals with the measurements of spaceborne microwave radiometers. The locations and intensities of the RFI induced by the geostationary TV and communication satellites changed with time within the observed period. The observations of spaceborne microwave radiometers in ascending portions of orbits are usually interfered with over European land, while no RFI was detected in descending passes. The RFI locations and intensities from the reflection of downlink radiation are highly dependent upon the relative geometry between the geostationary satellite and the measuring passive sensor. Only those fields of view of a spaceborne instrument whose scan azimuths are close to the azimuth toward the geostationary satellite are likely to be affected by RFI.

  7. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    A major challenge facing the prospective deployment of radiation detection systems for homeland security applications is the discrimination of radiological or nuclear 'threat sources' from radioactive, but benign, 'nuisance sources'. Common examples of such nuisance sources include naturally occurring radioactive material (NORM), medical patients who have received radioactive drugs for either diagnostics or treatment, and industrial sources. A sensitive detector that cannot distinguish between 'threat' and 'benign' classes will generate false positives which, if sufficiently frequent, will preclude it from being operationally deployed. In this report, we describe a first-principles physics-based modeling approach that is used to approximate the physical properties and corresponding gamma ray spectral signatures of real nuisance sources. Specific models are proposed for the three nuisance source classes - NORM, medical and industrial. The models can be validated against measured data - that is, energy spectra generated with the model can be compared to actual nuisance source data. We show by example how this is done for NORM and medical sources, using data sets obtained from spectroscopic detector deployments for cargo container screening and urban area traffic screening, respectively. In addition to capturing the range of radioactive signatures of individual nuisance sources, a nuisance source population model must generate sources with a frequency of occurrence consistent with that found in actual movement of goods and people. Measured radiation detection data can indicate these frequencies, but, at present, such data are available only for a very limited set of locations and time periods. In this report, we make more general estimates of frequencies for NORM and medical sources using a range of data sources such as shipping manifests and medical treatment statistics. We also identify potential data sources for industrial

  8. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  9. Preliminary thermal analysis of grids for twin source extraction system

    International Nuclear Information System (INIS)

    Pandey, Ravi; Bandyopadhyay, Mainak; Chakraborty, Arun K.

    2017-01-01

    The TWIN (Two driver based Indigenously built Negative ion source) source provides a bridge between ROBIN, the operational single-driver negative ion source test facility at IPR, and an ITER-type multi-driver ion source. The source is designed to operate in CW mode at 180 kW and 1 MHz with a 5 s ON/600 s OFF duty cycle, and also in a 5 Hz modulation mode with a 3 s ON/20 s OFF duty cycle for 3 such cycles. The TWIN source comprises an ion source sub-assembly (consisting of the driver and plasma box) and an extraction system sub-assembly. The extraction system consists of the plasma grid (PG), extraction grid (EG) and ground grid (GG) sub-assemblies. The plasma grid, facing the plasma side of the ion source where the negative ion beams are produced, receives a moderate heat flux, whereas the extraction grid and ground grid receive the majority of the heat flux from the extracted negative ion and co-extracted electron beams. The entire co-extracted electron beam is dumped onto the extraction grid by an electron deflection magnetic field, which makes the thermal and hydraulic design of the extraction grid critical. All three grids are made of OFHC copper and are actively water cooled, keeping the peak temperature rise of the grid surface within the allowable limit with optimum uniformity. All grids are to be fabricated by vacuum brazing, where joint strength becomes crucial at elevated temperature; the hydraulic design must maintain the peak temperature at the brazing joints within the acceptable limit

  10. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Directory of Open Access Journals (Sweden)

    Eman Salih Al-Shamery

    2017-02-01

    Full Text Available In this research, the shingle algorithm with the Jaccard coefficient is employed as a new approach to detect deception in sources in addition to detecting plagiarism. Source deception occurs when a particular text is taken from one source but attributed to another, while plagiarism occurs in documents when part or all of the text is taken from another research work. Shingling is an efficient way to compare the sets of shingles in files containing text; the shingles are used as features to measure the syntactic similarity of documents, and they are combined with the Jaccard coefficient, which measures similarity between sample sets. In the proposed system, text is checked for syntactic plagiarism and a percentage of similarity with other documents is reported. Research sources are also checked for deception by matching them against the available sources from the Turnitin report of the same research, using the shingle algorithm with the Jaccard coefficient. The motivation of this work is the discovery of literary theft occurring in research, especially in student research, and of the deception that occurs in sources.
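
A minimal sketch of the underlying similarity measure, using word 4-grams as shingles (the parameter choices are illustrative; the paper's exact shingle size and preprocessing may differ):

```python
def shingles(text, k=4):
    """Set of word k-grams (shingles) of a text, lower-cased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| of two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog near the river bank"
doc2 = "the quick brown fox jumps over the lazy dog near the old bridge"
similarity = jaccard(shingles(doc1), shingles(doc2))
```

A similarity near 1 flags likely plagiarism; applied to a cited passage and the source it is attributed to, a low score flags possible source deception.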

  11. Seismicity and source spectra analysis in Salton Sea Geothermal Field

    Science.gov (United States)

    Cheng, Y.; Chen, X.

    2016-12-01

    The surge of "man-made" earthquakes in recent years has led to considerable concerns about the associated hazards. Improved monitoring of small earthquakes would significantly help understand such phenomena and the underlying physical mechanisms. In the Salton Sea Geothermal field in southern California, open access of a local borehole network provides a unique opportunity to better understand the seismicity characteristics, the related earthquake hazards, and the relationship with the geothermal system, tectonic faulting and other physical conditions. We obtain high-resolution earthquake locations in the Salton Sea Geothermal Field, analyze characteristics of spatiotemporal isolated earthquake clusters, magnitude-frequency distributions and spatial variation of stress drops. The analysis reveals spatially coherent distributions of different types of clustering, b-value distributions, and stress drop distribution. The mixture type clusters (short-duration rapid bursts with high aftershock productivity) are predominately located within the active geothermal field and correlate with high b-value, low stress drop microearthquake clouds, while regular aftershock sequences and swarms are distributed throughout the study area. The differences between earthquakes inside and outside of the geothermal operation field suggest a possible way to distinguish seismicity directly induced by energy operations from typical seismic slip driven sequences. The spatially coherent b-value distribution enables in-situ estimation of probabilities for M≥3 earthquakes, and shows that the high large-magnitude-event (LME) probability zones with high stress drop are likely associated with tectonic faulting. The high stress drop at shallow (1-3 km) depth indicates the existence of active faults, while low stress drops near injection wells likely correspond to the seismic response to fluid injection. I interpret the spatial variation of seismicity and source characteristics as the result of fluid
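
The b-values mapped in such studies are typically estimated with Aki's maximum-likelihood formula; a sketch on synthetic Gutenberg-Richter magnitudes (the catalog here is simulated, and whether this exact estimator was used in the study is an assumption):

```python
import math
import random

def b_value_mle(mags, m_c, dm=0.0):
    """Aki (1965) maximum-likelihood b-value above completeness Mc;
    dm is Utsu's correction for magnitudes binned to width dm
    (0 for continuous, unbinned magnitudes)."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with b = 1.0 above Mc = 1.0;
# magnitudes above Mc are exponentially distributed with rate b*ln(10)
random.seed(42)
b_true, m_c = 1.0, 1.0
beta = b_true * math.log(10.0)
mags = [m_c + random.expovariate(beta) for _ in range(20000)]
b_hat = b_value_mle(mags, m_c)  # close to b_true for a large catalog
```

Mapping b_hat over spatial cells, as described above, then converts local magnitude-frequency statistics into in-situ probabilities of M≥3 events.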

  12. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  13. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  14. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-01-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  15. Regional Moment Tensor Source-Type Discrimination Analysis

    Science.gov (United States)

    2015-11-16

    unique normalized eigenvalues (black ‘+’ signs) or unique source-types on (a) the fundamental Lune (Tape and Tape, 2012a,b), and (b) on the Hudson source-type plot (Hudson… Solutions color-coded by variance reduction (VR) presented on the Tape and Tape (2012a) and Tape and Tape (2012b) Lune. The white circle…

  16. Visualizing spikes in source-space

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Duez, Lene; Scherg, Michael

    2016-01-01

    OBJECTIVE: Reviewing magnetoencephalography (MEG) recordings is time-consuming: signals from the 306 MEG sensors are typically reviewed divided into six arrays of 51 sensors each, thus browsing each recording six times in order to evaluate all signals. A novel method of reconstructing the MEG signals in source-space was developed using a source-montage of 29 brain regions and two spatial components to remove magnetocardiographic (MKG) artefacts. Our objective was to evaluate the accuracy of reviewing MEG in source-space. METHODS: In 60 consecutive patients with epilepsy, we prospectively evaluated the accuracy of reviewing the MEG signals in source-space as compared to the classical method of reviewing them in sensor-space. RESULTS: All 46 spike-clusters identified in sensor-space were also identified in source-space. Two additional spike-clusters were identified in source-space. As 29…

  17. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.

  18. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. As spatial priors, EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced in the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about the neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  19. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  20. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
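    To give a flavor of the short-term feature extraction that pyAudioAnalysis performs, here is a minimal pure-Python sketch of two classic frame-level features, energy and zero-crossing rate. It deliberately avoids the library's own API (whose exact signatures have changed between versions); window and step sizes are typical but assumed values.

```python
import math

def short_term_features(signal, fs, win=0.050, step=0.025):
    """Frame-level energy and zero-crossing rate over sliding windows
    (win and step in seconds), the kind of short-term features an audio
    analysis library extracts before classification or segmentation."""
    n_win, n_step = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(signal) - n_win + 1, n_step):
        frame = signal[start:start + n_win]
        energy = sum(x * x for x in frame) / len(frame)
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
        ) / (len(frame) - 1)
        feats.append((energy, zcr))
    return feats

# One second of a 440 Hz tone at 16 kHz: its zero-crossing rate per sample
# should sit near 2 * 440 / 16000 = 0.055.
fs = 16000
tone = [math.sin(2 * math.pi * 440 * t / fs) for t in range(fs)]
feats = short_term_features(tone, fs)
print(len(feats), round(feats[0][1], 3))
```

    Stacking such per-frame features into a matrix is the starting point for the supervised classification and segmentation procedures the library provides.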

  1. Renewable energy sources cost benefit analysis and prospects for Italy

    International Nuclear Information System (INIS)

    Ariemma, A.; Montanino, G.

    1992-01-01

    In light of Italy's over-dependency on imported oil, and due to this nation's commitment to the pursuit of the strict environmental protection policies of the European Communities, ENEL (the Italian National Electricity Board) has become actively involved in research efforts aimed at the commercialization of renewable energy sources - photovoltaic, wind, biomass, and mini-hydraulic. Through the use of energy production cost estimates based on current and near-future levels of technological advancement, this paper assesses prospects for the different sources. The advantages and disadvantages of each source in its use as a suitable complementary energy supply satisfying specific sets of constraints regarding siting, weather, capital and operating costs, maintenance, etc., are pointed out. In comparing the various alternatives, the paper also considers environmental benefits and commercialization feasibility in terms of time and outlay.

  2. HFIR cold neutron source moderator vessel design analysis

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-04-01

    A cold neutron source capsule made of aluminum alloy is to be installed at the tip of one of the neutron beam tubes of the High Flux Isotope Reactor. Liquid hydrogen at a temperature of approximately 20 K and a pressure of 15 bar is designed to flow through the aluminum capsule, which serves to chill and moderate the incoming neutrons produced by the reactor core. The cold, low-energy neutrons thus produced will be used as a cold neutron source for diffraction experiments. The structural design calculation for the aluminum capsule is reported in this paper.

  3. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
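    The Brune-type stress drops quoted above follow from the standard relation Δσ = 7·M0/(16·r³), with source radius r = 2.34·β/(2π·fc) estimated from the corner frequency fc. A small sketch, in which the moment-magnitude conversion, corner frequency and shear-wave speed are illustrative assumptions rather than values from the paper:

```python
import math

def brune_stress_drop(m0, fc_hz, beta=3500.0):
    """Brune-model stress drop [Pa] from seismic moment m0 [N*m] and
    corner frequency fc [Hz]; beta is the shear-wave speed [m/s]."""
    r = 2.34 * beta / (2.0 * math.pi * fc_hz)   # source radius [m]
    return 7.0 * m0 / (16.0 * r ** 3)

# Assumed example: an Mw 2.0 event (log10 M0 = 1.5*Mw + 9.1) with fc = 10 Hz
m0 = 10 ** (1.5 * 2.0 + 9.1)
dsig = brune_stress_drop(m0, 10.0)
print(round(dsig / 1e6, 2), "MPa")
```

    The resulting fraction-of-a-MPa value sits inside the 0.2 to 20 MPa range reported in the abstract; raising fc at fixed moment shrinks the inferred radius and drives the stress drop up as fc³.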

  4. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

    This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Unlike the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period. This feature is interesting when PV panels or fuel cells power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden… circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Proceeding further to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of topological configuration and operational principles, showing that they are the superior…

  5. Reactor Core Design and Analysis for a Micronuclear Power Source

    Directory of Open Access Journals (Sweden)

    Hao Sun

    2018-03-01

    Full Text Available Underwater vehicles are designed to secure national maritime boundaries, which places harsh requirements on their power system design. Conventional power sources, such as batteries and Stirling engines, feature low power and short lifetimes. A micronuclear reactor power source, featuring higher power density and longer lifetime, would strongly meet the demands of an unmanned underwater vehicle power system. In this paper, a 2.4 MWt lithium-heat-pipe-cooled reactor core is designed as a micronuclear power source that can be applied to underwater vehicles. The core features small volume, high power density, long lifetime, and a low noise level. Uranium nitride fuel with 70% enrichment and lithium heat pipes are adopted in the core. The reactivity is controlled by six control drums with B4C neutron absorber. The Monte Carlo code MCNP is used for calculating the power distribution, the characteristics of reactivity feedback, and core criticality safety. The code MCORE, coupling MCNP and ORIGEN, is used to analyze the burnup characteristics of the designed core. The results show that the core life is 14 years and that the core parameters satisfy the safety requirements. This work provides a reference for the design and application of micronuclear power sources.

  6. Fecal bacteria source characterization and sensitivity analysis of SWAT 2005

    Science.gov (United States)

    The Soil and Water Assessment Tool (SWAT) version 2005 includes a microbial sub-model to simulate fecal bacteria transport at the watershed scale. The objectives of this study were to demonstrate methods to characterize fecal coliform bacteria (FCB) source loads and to assess the model sensitivity t...

  7. Source term analysis for a RCRA mixed waste disposal facility

    International Nuclear Information System (INIS)

    Jordan, D.L.; Blandford, T.N.; MacKinnon, R.J.

    1996-01-01

    A Monte Carlo transport scheme was used to estimate the source strength resulting from potential releases from a mixed waste disposal facility. Infiltration rates were estimated using the HELP code, and transport through the facility was modeled using the DUST code, linked to a Monte Carlo driver

  8. Stability analysis of direct current control in current source rectifier

    DEFF Research Database (Denmark)

    Lu, Dapeng; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    Current source rectifier with high switching frequency has a great potential for improving the power efficiency and power density in ac-dc power conversion. This paper analyzes the stability of direct current control based on the time delay effect. Small signal model including dynamic behaviors...

  9. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  10. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    NARCIS (Netherlands)

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  11. Vrancea seismic source analysis using a small-aperture array

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Radulian, M.; Placinta, A.O.

    2005-01-01

    A small-aperture seismic array (BURAR) was installed in 1999 in the northern part of the Romanian territory (Bucovina area). Since then, the array has been in operation under a joint cooperation programme between Romania and the USA. The array consists of 10 stations installed in boreholes (nine short-period instruments and one broadband instrument) with sensitivity high enough to properly detect earthquakes generated in the Vrancea subcrustal domain (at about 250 km epicentral distance) with magnitude Mw below 3. Our main purpose is to investigate and calibrate the source parameters of the Vrancea intermediate-depth earthquakes using specific techniques provided by the BURAR array data. Forty earthquakes with magnitudes between 2.9 and 6.0 were selected, including the recent events of September 27, 2004 (45.70°N, 26.45°E, h = 166 km, Mw = 4.7), October 27, 2004 (45.84°N, 26.63°E, h = 105 km, Mw = 6.0) and May 14, 2005 (45.66°N, 26.52°E, h = 146 km, Mw = 5.1), which are the best earthquakes ever recorded on the Romanian territory. Empirical Green's function deconvolution and spectral ratio methods are applied to pairs of collocated events with similar focal mechanisms. Stability tests are performed for the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Our study shows the capability of the BURAR array to determine the source parameters of the Vrancea intermediate-depth earthquakes as a stand-alone station and proves that the recordings of this array alone provide reliable and useful tools to efficiently constrain the source parameters and, consequently, source scaling properties. (authors)

  12. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Science.gov (United States)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.
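    The chemical mass balance step described above can be illustrated, in a deliberately simplified non-Bayesian form, as an ordinary least-squares fit of a measured concentration profile to source fingerprints. The two fingerprints and the measured profile below are hypothetical, and the paper's Bayesian CMB additionally propagates measurement errors and fingerprint variability:

```python
def cmb_two_source(profiles, measured):
    """Ordinary least-squares chemical mass balance for two sources:
    solve measured ~ s1*profiles[0] + s2*profiles[1] via the 2x2
    normal equations (a simplified, non-Bayesian sketch)."""
    f1, f2 = profiles
    a11 = sum(x * x for x in f1)
    a12 = sum(x * y for x, y in zip(f1, f2))
    a22 = sum(y * y for y in f2)
    b1 = sum(x * m for x, m in zip(f1, measured))
    b2 = sum(y * m for y, m in zip(f2, measured))
    det = a11 * a22 - a12 * a12
    s1 = (b1 * a22 - b2 * a12) / det
    s2 = (a11 * b2 - a12 * b1) / det
    return s1, s2

# Hypothetical fingerprints: mass fractions of three PAH compounds per source
traffic = [0.5, 0.3, 0.2]
coal = [0.2, 0.3, 0.5]
measured = [0.38, 0.30, 0.32]   # synthetic mix: 0.6*traffic + 0.4*coal
s1, s2 = cmb_two_source([traffic, coal], measured)
print(round(s1, 2), round(s2, 2))
```

    Because the synthetic profile is an exact mixture, the fit recovers the 0.6/0.4 split; with real data, overlapping fingerprints make the normal equations ill-conditioned, which is exactly the variability problem the Bayesian treatment addresses.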

  13. Detecting Large-Scale Brain Networks Using EEG: Impact of Electrode Density, Head Modeling and Source Localization

    Science.gov (United States)

    Liu, Quanying; Ganzetti, Marco; Wenderoth, Nicole; Mantini, Dante

    2018-01-01

    Resting state networks (RSNs) in the human brain were recently detected using high-density electroencephalography (hdEEG). This was done by using an advanced analysis workflow to estimate neural signals in the cortex and to assess functional connectivity (FC) between distant cortical regions. FC analyses were conducted either using temporal (tICA) or spatial independent component analysis (sICA). Notably, EEG-RSNs obtained with sICA were very similar to RSNs retrieved with sICA from functional magnetic resonance imaging data. It still remains to be clarified, however, what technological aspects of hdEEG acquisition and analysis primarily influence this correspondence. Here we examined to what extent the detection of EEG-RSN maps by sICA depends on the electrode density, the accuracy of the head model, and the source localization algorithm employed. Our analyses revealed that the collection of EEG data using a high-density montage is crucial for RSN detection by sICA, but also the use of appropriate methods for head modeling and source localization have a substantial effect on RSN reconstruction. Overall, our results confirm the potential of hdEEG for mapping the functional architecture of the human brain, and highlight at the same time the interplay between acquisition technology and innovative solutions in data analysis. PMID:29551969

  14. Car indoor air pollution - analysis of potential sources

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-12-01

    Full Text Available Abstract The population of industrialized countries such as the United States or countries of the European Union spends more than one hour each day in vehicles. In this respect, numerous studies have so far addressed outdoor air pollution that arises from traffic. By contrast, only little is known about indoor air quality in vehicles and the influence of non-vehicle sources. The present article therefore aims to summarize recent studies that address, among other topics, particulate matter exposure. It can be stated that although a large amount of data is available for outdoor air pollution, research in the area of indoor air quality in vehicles is still limited. In particular, knowledge on non-vehicular sources is missing. In this respect, an understanding of the effects and interactions of, for example, tobacco smoke under realistic automobile conditions should be achieved in the future.

  15. Sources of political violence, political and psychological analysis

    Directory of Open Access Journals (Sweden)

    O. B. Balatska

    2015-05-01

    We also consider the following approaches to determining the nature and sources of aggression and violence: instinctivism (K. Lorenz) and behaviorism (J. B. Watson, B. F. Skinner, et al.). Special attention is paid to theories of frustration-aggression (J. Dollard, N. E. Miller, L. Berkowitz, et al.), according to which the causes of aggression and violence are hidden in a particular mental state: frustration. The particular importance of the theory of T. R. Gurr, in which the sources of aggression and political violence are defined through the concept of relative deprivation, is underlined. Another approach described in the article is the concept of aggression as a learned reaction (A. Bandura, G. Levin, B. Fleischmann, et al.). Supporters of this approach believe that aggressive behavior is formed in the process of social training.

  16. The Spallation Neutron Source (SNS) conceptual design shielding analysis

    International Nuclear Information System (INIS)

    Johnson, J.O.; Odano, N.; Lillie, R.A.

    1998-03-01

    The shielding design is important for the construction of an intense high-energy accelerator facility like the proposed Spallation Neutron Source (SNS) because of its impact on conventional facility design and maintenance operations, and because radiation shielding constitutes a considerable part of the total facility cost. A calculational strategy utilizing coupled high-energy Monte Carlo calculations and multi-dimensional discrete ordinates calculations, along with semi-empirical calculations, was implemented to perform the conceptual design shielding assessment of the proposed SNS. Biological shields have been designed and assessed for the proton beam transport system and associated beam dumps, the target station, and the target service cell and general remote maintenance cell. Shielding requirements have been assessed with respect to weight, space, and dose-rate constraints for operating, shutdown, and accident conditions. A discussion of the proposed facility design, the conceptual design shielding requirements, the calculational strategy, source terms, preliminary results and conclusions, and recommendations for additional analyses is presented.

  17. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source term analyses) are considered to be important, especially in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of FPs and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  18. Economic analysis of the need for advanced power sources

    International Nuclear Information System (INIS)

    Hardie, R.W.; Omberg, R.P.

    1975-01-01

    The purpose of this paper is to determine the economic need for an advanced power source, be it fusion, solar, or some other concept. However, calculations were also performed assuming abandonment of the LMFBR program, so breeder benefits are a by-product of this study. The model used was the ALPS linear programming system for forecasting optimum power growth patterns. Total power costs were calculated over a planning horizon from 1975 to 2041 and discounted at 7.5 percent. The benefit of a particular advanced power source is simply the reduction in total power cost resulting from its introduction. Since data concerning advanced power sources (APS) are speculative, parametric calculations varying introduction dates and capital costs about a hypothetical APS plant were performed. Calculations were also performed without the LMFBR to determine the effect of the breeder on the benefits of an advanced power source. Other data used in the study, such as the energy demand curve and uranium resource estimates, are given in the Appendix, and a list of the 11 power plants used in this study is given. Calculations were performed for APS introduction dates of 2001 and 2011. Estimates of APS capital costs included cases where it was assumed the costs were $50/kW and $25/kW higher than the LMFBR. In addition, cases where APS and LMFBR capital costs are identical were also considered. It is noted that the APS capital costs used in this study are not estimates of potential advanced power system plant costs, but were chosen to compute potential dollar benefits of advanced power systems under extremely optimistic assumptions. As a further example, all APS fuel cycle costs were assumed to be zero
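    The benefit calculation described above, the reduction in discounted total power cost, amounts to a present-value sum at the study's 7.5 percent rate. A minimal sketch; the savings stream below is hypothetical, not taken from the ALPS results:

```python
def present_value(costs, rate=0.075, base_year=1975):
    """Discount a {year: cost} stream back to base_year at the given rate."""
    return sum(c / (1.0 + rate) ** (y - base_year) for y, c in costs.items())

# Hypothetical: an advanced power source that saves 100 M$/yr over 2001-2005
savings = {y: 100.0 for y in range(2001, 2006)}
benefit = present_value(savings, rate=0.075, base_year=1975)
print(round(benefit, 1))
```

    Discounting over the 26-year lead time shrinks the nominal 500 M$ of savings to roughly 66 M$ in 1975 dollars, which is why a later APS introduction date sharply reduces its computed benefit.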

  19. National Synchrotron Light Source safety-analysis report

    International Nuclear Information System (INIS)

    Batchelor, K.

    1982-07-01

    This document covers all of the safety issues relating to the design and operation of the storage rings and injection system of the National Synchrotron Light Source. The building systems for fire protection, access and egress are described together with air and other gaseous control or venting systems. Details of shielding against prompt bremsstrahlung radiation and synchrotron radiation are described and the administrative requirements to be satisfied for operation of a beam line at the facility are given

  20. Analysis of polymer foil heaters as infrared radiation sources

    International Nuclear Information System (INIS)

    Witek, Krzysztof; Piotrowski, Tadeusz; Skwarek, Agata

    2012-01-01

    Infrared radiation as a heat source is used in many fields. In particular, the positive effect of far-infrared radiation on living organisms has been observed. This paper presents two technological solutions for infrared heater production using polymer-silver and polymer-carbon pastes screen-printed on foil substrates. The purpose of this work was the identification of polymer layers as IR radiation sources in a specific frequency range. The characterization of the heaters was determined mainly by measurement of the surface temperature distribution using a thermovision camera, and the spectral characteristics were determined using a special measuring system. The basic parameters obtained for both the polymer-silver and polymer-carbon heaters were similar and were as follows: power rating of 10–12 W/dm², continuous working surface temperature of 80–90 °C, temperature coefficient of resistance (TCR) of about +900 ppm/K for the polymer-carbon heater and about +2000 ppm/K for the polymer-silver heater, maximum radiation intensity in the wavelength range of 6–14 μm with top intensity at 8.5 μm, and a heating time of about 20 min. For comparison purposes, a commercial panel heater was tested. The results show that the characteristics of infrared polymer heaters are similar to the characteristics of the commercial heater, so they can be considered as alternative infrared radiation sources.
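    The quoted temperature coefficients of resistance translate into resistance changes via the usual linear model R(T) = R0·(1 + α·(T − T0)). A small sketch comparing the two pastes; the 100 Ω nominal resistance is an assumed value, not one reported in the paper:

```python
def resistance(r0, tcr_ppm_per_k, t, t0=25.0):
    """Linear resistance-temperature model R(T) = R0 * (1 + TCR*(T - T0)),
    with the TCR given in ppm/K."""
    return r0 * (1.0 + tcr_ppm_per_k * 1e-6 * (t - t0))

# Polymer-carbon (+900 ppm/K) vs polymer-silver (+2000 ppm/K) heater,
# warming from 25 C to the reported 85 C working surface temperature
r_carbon = resistance(100.0, 900.0, 85.0)
r_silver = resistance(100.0, 2000.0, 85.0)
print(round(r_carbon, 2), round(r_silver, 2))
```

    Over the 60 K rise the carbon layer drifts about 5 %, the silver layer about 12 %, which matters for keeping the heater's power rating stable at its working temperature.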

  1. Analysis of Extended Z-source Inverter for Photovoltaic System

    Science.gov (United States)

    Prakash, G.; Subramani, C.; Dhineshkumar, K.; Rayavel, P.

    2018-04-01

    The Z-source inverter has gained prominence as a single-stage buck-boost inverter topology among many researchers. Its boosting capability can nevertheless be limited, and it may therefore not be suitable for applications requiring high boost, which demand cascading additional dc-dc boost converters. This can lower the efficiency and require more sensing for controlling the additional stages. The Z-source inverter is a recent converter topology that exhibits both voltage-buck and voltage-boost capability. This paper proposes a new family of extended-boost quasi-Z-source inverters (ZSI) to fill the research gap left in the development of the ZSI. These new topologies can be operated with the same modulation strategies that were developed for the original ZSI. Likewise, they have the same number of active switches as the original ZSI, preserving its single-stage nature. The proposed topologies are analyzed in the steady state and their performance is validated using simulation results obtained in MATLAB/Simulink. Furthermore, they are experimentally validated with results obtained from a prototype developed in the laboratory. The trend of fast increase in PV energy use is related to the increasing efficiency of solar cells as well as improvements in the manufacturing technology of solar panels.
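    The voltage-boost capability discussed in these Z-source records is commonly summarized by the ideal boost factor B = 1/(1 − 2·T0/T), where T0/T is the shoot-through duty ratio. A minimal sketch; the 150 V PV string voltage and 25 % duty ratio are assumed illustrative values:

```python
def zsi_boost(d_shoot_through):
    """Ideal Z-source inverter boost factor B = 1/(1 - 2*T0/T) for a
    shoot-through duty ratio T0/T in [0, 0.5)."""
    if not 0.0 <= d_shoot_through < 0.5:
        raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d_shoot_through)

# A hypothetical PV string at 150 V boosted with 25% shoot-through duty
v_in = 150.0
b = zsi_boost(0.25)
print(b, "->", b * v_in, "V peak dc-link")
```

    The boost factor diverges as the duty ratio approaches 0.5, which is the practical ceiling motivating the extended-boost topologies proposed in the record above.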

  2. Finite element analysis of advanced neutron source fuel plates

    International Nuclear Information System (INIS)

    Luttrell, C.R.

    1995-08-01

    The proposed design for the Advanced Neutron Source reactor core consists of closely spaced involute fuel plates. Coolant flows between the plates at high velocities. It is vital that adjacent plates do not come in contact and that the coolant channels between the plates remain open. Several scenarios that could result in problems with the fuel plates are studied. Finite element analyses are performed on fuel plates under pressure from the coolant flowing between the plates at a high velocity, under pressure because of a partial flow blockage in one of the channels, and with different temperature profiles

  3. Review on solving the forward problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Vergult Anneleen

    2007-11-01

    Full Text Available Abstract Background The aim of electroencephalogram (EEG) source localization is to find the brain areas responsible for EEG waves of interest. It consists of solving forward and inverse problems. The forward problem is solved by starting from a given electrical source and calculating the potentials at the electrodes. These evaluations are necessary to solve the inverse problem, which is defined as finding the brain sources that are responsible for the measured potentials at the EEG electrodes. Methods While other reviews give an extensive summary of both the forward and inverse problems, this review article focuses on different aspects of solving the forward problem and is intended for newcomers to this research field. Results It starts by focusing on the generators of the EEG: the post-synaptic potentials in the apical dendrites of pyramidal neurons. These cells generate an extracellular current which can be modeled by Poisson's differential equation with Neumann and Dirichlet boundary conditions. The compartments in which these currents flow can be anisotropic (e.g. skull and white matter). In a three-shell spherical head model an analytical expression exists to solve the forward problem. During the last two decades researchers have tried to solve Poisson's equation in a realistically shaped head model obtained from 3D medical images, which requires numerical methods. The following methods are compared with each other: the boundary element method (BEM), the finite element method (FEM) and the finite difference method (FDM). In the last two methods anisotropic conducting compartments can conveniently be introduced. The focus is then set on the use of reciprocity in EEG source localization. It is introduced to speed up the forward calculations, which are then performed for each electrode position rather than for each dipole position. Solving Poisson's equation utilizing FEM and FDM corresponds to solving a large sparse linear system. 
Iterative
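As a minimal illustration of a forward calculation, the potential of a current dipole has a closed form in the simplest possible volume conductor, an unbounded homogeneous medium: V(r) = p . (r - r0) / (4*pi*sigma*|r - r0|^3). This sketch uses that idealization (not the three-shell or realistic head models the review discusses), with illustrative conductivity and dipole values:

```python
import numpy as np

def dipole_potential(r, r_dip, p, sigma=0.33):
    """Potential (V) of a current dipole p (A*m) located at r_dip, observed
    at r, in an infinite homogeneous conductor of conductivity sigma (S/m):
    V(r) = p . (r - r_dip) / (4*pi*sigma*|r - r_dip|^3)."""
    d = np.asarray(r, dtype=float) - np.asarray(r_dip, dtype=float)
    dist = np.linalg.norm(d)
    return float(np.dot(p, d) / (4.0 * np.pi * sigma * dist**3))

# Potential 5 cm above a 10 nA*m dipole pointing along +z (illustrative):
v = dipole_potential(r=[0.0, 0.0, 0.05], r_dip=[0.0, 0.0, 0.0],
                     p=[0.0, 0.0, 10e-9])
```

Real forward solvers (BEM, FEM, FDM) replace this closed form with a discretized Poisson problem, but the dipolar far-field pattern, positive on one side, negative on the other, falling off with distance squared, is the same.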

  4. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    International Nuclear Information System (INIS)

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

    Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear related activities. A number of these open sources include information that could be loosely categorized as ''news'': international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and help ensure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  5. The analysis of security cost for different energy sources

    International Nuclear Information System (INIS)

    Jun, Eunju; Kim, Wonjoon; Chang, Soon Heung

    2009-01-01

    Global concerns for the security of energy have steadily been on the increase and are expected to become a major issue over the next few decades. Urgent policy response is thus essential. However, little attempt has been made at defining both energy security and energy metrics. In this study, we provide such metrics and apply them to four major energy sources in the Korean electricity market: coal, oil, liquefied natural gas, and nuclear. In our approach, we measure the cost of energy security in terms of supply disruption and price volatility, and we consider the degree of concentration in energy supply and demand using the Hirschman-Herfindahl index (HHI). Due to its balanced fuel supply and demand, relatively stable price, and high abundance, we find nuclear energy to be the most competitive energy source in terms of energy security in the Korean electricity market. LNG, on the other hand, was found to have the highest cost in terms of energy security due to its high concentration in supply and demand, and its high price volatility. In addition, in terms of cost, we find that economic security dominates supply security, and as such, it is the main factor in the total security cost. Within the confines of concern for global energy security, our study both broadens our understanding of energy security and enables a strategic approach in the portfolio management of energy consumption.
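The Hirschman-Herfindahl index used in this record is simply the sum of squared market shares, ranging from 1/n for an even n-way split up to 1 for a monopoly. A small sketch with invented supply shares:

```python
def hhi(shares):
    """Hirschman-Herfindahl index of market shares given as fractions.
    Shares are normalized to sum to 1; the result ranges from 1/n
    (perfectly even split among n suppliers) to 1 (single supplier)."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

# Supply concentrated in two sources vs. spread evenly over four:
print(hhi([0.7, 0.3]))                 # ~0.58: concentrated, less secure
print(hhi([0.25, 0.25, 0.25, 0.25]))   # 0.25: diversified, more secure
```

A higher HHI on either the supply or demand side signals concentration risk, which is why the study penalizes fuels such as LNG whose suppliers are few.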

  6. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    Pteros, an open-source library for molecular modeling and analysis of molecular dynamics trajectories for the C++ programming language, is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. The Pteros programming interface is very simple and intuitive, while the source code is well documented and easily extensible. Pteros is available for free under the open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  7. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work explains the challenges of the fingerprints-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and illustrates a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprints-based PAH source apportionment method.
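At its core, chemical mass balance expresses a measured PAH profile as a mix of source fingerprints and solves for the mixing fractions. The Bayesian treatment in this record additionally models fingerprint variability and measurement error; the sketch below shows only the underlying least-squares idea, with fingerprints and mixing fractions invented for illustration:

```python
import numpy as np

# Columns: hypothetical source fingerprints (fractional PAH profiles).
# Rows: individual PAH compounds. All values are illustrative only.
F = np.array([
    [0.40, 0.10],
    [0.35, 0.30],
    [0.25, 0.60],
])

# Synthetic "measured" sediment profile: a 60/40 mix of the two sources.
true_mix = np.array([0.6, 0.4])
c = F @ true_mix

# CMB as ordinary least squares: recover the source contributions.
contrib, residuals, rank, sv = np.linalg.lstsq(F, c, rcond=None)
print(contrib)   # ~[0.6, 0.4]
```

In practice the fingerprints are uncertain, which is exactly the variability problem the record highlights and the reason a Bayesian formulation, rather than a single deterministic fit, is advocated.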

  8. Study on analysis from sources of error for Airborne LIDAR

    Science.gov (United States)

    Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.

    2016-11-01

    With the advancement of aerial photogrammetry, Airborne LIDAR provides a new technical means of obtaining geo-spatial information of high spatial and temporal resolution, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of Earth-observation technology: mounted on an aviation platform, it emits laser pulses and receives their returns to acquire high-precision, high-density three-dimensional point-cloud coordinates and intensity information. In this paper, we briefly describe airborne laser radar systems and analyze in detail the error sources in Airborne LIDAR data, putting forward corresponding methods to avoid or eliminate them. Taking into account practical engineering applications, recommendations are developed for these designs, which has crucial theoretical and practical significance in Airborne LIDAR data processing.

  9. Development of a hydrogen analysis using a small neutron source

    International Nuclear Information System (INIS)

    Ishikawa, I.; Tachikawa, N.; Tominaga, H.

    1998-01-01

    Most industrial nuclear gauges are based on the use of radiation transmission through matter. This document presents new techniques to measure hydrogen using a small neutron source. A new technique has been developed for measuring the thickness of a thin 30-200 μm plastic layer sandwiched between two sheets of 0.6-4.2 mm in total thickness. Another technique allows monitoring of residual moisture in wet refractory newly coated on the inner surface of a steel vessel, from the outside, through a thick steel plate. To save coke and to strictly control furnace heating in the iron-making process, a new type of moisture gauge was developed using simultaneous measurement of the transmission rates of both fast neutrons and gamma rays from 252 Cf.

  10. Dissolution And Analysis Of Yellowcake Components For Fingerprinting UOC Sources

    International Nuclear Information System (INIS)

    Hexel, Cole R.; Bostick, Debra A.; Kennedy, Angel K.; Begovich, John M.; Carter, Joel A.

    2012-01-01

    There are a number of chemical and physical parameters that might be used to help elucidate the ore body from which uranium ore concentrate (UOC) was derived. It is the variation in the concentration and isotopic composition of these components that can provide information as to the identity of the ore body from which the UOC was mined and the type of subsequent processing that has been undertaken. Oak Ridge National Laboratory (ORNL) in collaboration with Lawrence Livermore and Los Alamos National Laboratories is surveying ore characteristics of yellowcake samples from known geologic origin. The data sets are being incorporated into a national database to help in sourcing interdicted material, as well as aid in safeguards and nonproliferation activities. Geologic age and attributes from chemical processing are site-specific. Isotopic abundances of lead, neodymium, and strontium provide insight into the provenance of geologic location of ore material. Variations in lead isotopes are due to the radioactive decay of uranium in the ore. Likewise, neodymium isotopic abundances are skewed due to the radiogenic decay of samarium. Rubidium decay similarly alters the isotopic signature of strontium isotopic composition in ores. This paper will discuss the chemical processing of yellowcake performed at ORNL. Variations in lead, neodymium, and strontium isotopic abundances are being analyzed in UOC from two geologic sources. Chemical separation and instrumental protocols will be summarized. The data will be correlated with chemical signatures (such as elemental composition, uranium, carbon, and nitrogen isotopic content) to demonstrate the utility of principal component and cluster analyses to aid in the determination of UOC provenance.

  11. Analysis of fuel management in the KIPT neutron source facility

    Energy Technology Data Exchange (ETDEWEB)

    Zhong Zhaopeng, E-mail: zzhong@anl.gov [Nuclear Engineering Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Gohar, Yousry; Talamo, Alberto [Nuclear Engineering Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States)

    2011-05-15

    Research highlights: > Fuel management of the KIPT ADS was analyzed. > The core arrangement was shuffled stage-wise. > New fuel assemblies were added to the core periodically. > A beryllium reflector could also be utilized to increase the fuel lifetime. - Abstract: Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an experimental neutron source facility consisting of an electron accelerator driven sub-critical assembly. The neutron source driving the sub-critical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The sub-critical assembly surrounding the target is fueled with low enriched WWR-M2 type hexagonal fuel assemblies. The U-235 enrichment of the fuel material is <20%. The facility will be utilized for basic and applied research, producing medical isotopes, and training young specialists. With the 100 kW electron beam power, the total thermal power of the facility is ~360 kW including the fission power of ~260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the system reactivity during the operation, decrease the neutron flux level, and consequently impact the facility performance. To preserve the neutron flux level during the operation, the fuel assemblies should be added and shuffled to compensate for the reactivity loss caused by burnup. A beryllium reflector could also be utilized to increase the fuel lifetime in the sub-critical core. This paper studies the fuel cycles and shuffling schemes of the fuel assemblies of the sub-critical assembly to preserve the system reactivity and the neutron flux level during the operation.
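The flux decline described here follows from the standard sub-critical relations: reactivity rho = (k - 1)/k, and a steady-state neutron multiplication of 1/(1 - k) for a source-driven assembly. The k-eff values below are illustrative, not the KIPT design values:

```python
def reactivity(k_eff: float) -> float:
    """Reactivity rho = (k - 1) / k; negative for a sub-critical core."""
    return (k_eff - 1.0) / k_eff

def source_multiplication(k_eff: float) -> float:
    """Steady-state neutron multiplication 1 / (1 - k) of a
    source-driven sub-critical assembly (requires k < 1)."""
    if k_eff >= 1.0:
        raise ValueError("assembly must be sub-critical (k_eff < 1)")
    return 1.0 / (1.0 - k_eff)

# Burnup dropping k from 0.975 to 0.95 halves the multiplication, and
# hence the flux, which is why fuel is added and shuffled periodically:
print(source_multiplication(0.975))  # ~40.0
print(source_multiplication(0.95))   # ~20.0
```

Because multiplication scales as 1/(1 - k), even a small reactivity loss near criticality produces a large flux drop, which makes burnup compensation a first-order operational concern.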

  12. Analysis of fuel management in the KIPT neutron source facility

    International Nuclear Information System (INIS)

    Zhong Zhaopeng; Gohar, Yousry; Talamo, Alberto

    2011-01-01

    Research highlights: → Fuel management of the KIPT ADS was analyzed. → The core arrangement was shuffled stage-wise. → New fuel assemblies were added to the core periodically. → A beryllium reflector could also be utilized to increase the fuel lifetime. - Abstract: Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an experimental neutron source facility consisting of an electron accelerator driven sub-critical assembly. The neutron source driving the sub-critical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The sub-critical assembly surrounding the target is fueled with low enriched WWR-M2 type hexagonal fuel assemblies. The U-235 enrichment of the fuel material is <20%. The facility will be utilized for basic and applied research, producing medical isotopes, and training young specialists. With the 100 kW electron beam power, the total thermal power of the facility is ∼360 kW including the fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the system reactivity during the operation, decrease the neutron flux level, and consequently impact the facility performance. To preserve the neutron flux level during the operation, the fuel assemblies should be added and shuffled to compensate for the reactivity loss caused by burnup. A beryllium reflector could also be utilized to increase the fuel lifetime in the sub-critical core. This paper studies the fuel cycles and shuffling schemes of the fuel assemblies of the sub-critical assembly to preserve the system reactivity and the neutron flux level during the operation.

  13. Imaging spectroscopic analysis at the Advanced Light Source

    International Nuclear Information System (INIS)

    MacDowell, A. A.; Warwick, T.; Anders, S.; Lamble, G.M.; Martin, M.C.; McKinney, W.R.; Padmore, H.A.

    1999-01-01

    One of the major advances at the high brightness third generation synchrotrons is the dramatic improvement of imaging capability. There is a large multi-disciplinary effort underway at the ALS to develop imaging X-ray, UV and infrared spectroscopic analysis on a spatial scale from a few microns to 10 nm. These developments make use of light that varies in energy from 6 meV to 15 keV. Imaging and spectroscopy are finding applications in surface science, bulk materials analysis, semiconductor structures, particulate contaminants, magnetic thin films, biology and environmental science. This article is an overview and status report from the developers of some of these techniques at the ALS. The following table lists all the currently available microscopes at the ALS. This article will describe some of the microscopes and some of the early applications.

  14. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    Science.gov (United States)

    2010-12-01

    ... technical competence for the type of tests and calibrations SCALe undertakes. Testing and calibration laboratories that comply with ISO/IEC 17025 ... [ISO/IEC 2005]. ... of a software system indicates that the SCALe analysis ... by a CERT secure coding standard. Successful conformance ... to be more secure than non-conforming systems. However, no study has yet been performed to ... assessment in accordance with ISO/IEC 17000: "a demonstration ..."

  15. Analysis of glottal source parameters in Parkinsonian speech.

    Science.gov (United States)

    Hanratty, Jane; Deegan, Catherine; Walsh, Mary; Kirkpatrick, Barry

    2016-08-01

    Diagnosis and monitoring of Parkinson's disease has a number of challenges as there is no definitive biomarker despite the broad range of symptoms. Research is ongoing to produce objective measures that can either diagnose Parkinson's or act as an objective decision support tool. Recent research on speech-based measures has demonstrated promising results. This study aims to investigate the characteristics of the glottal source signal in Parkinsonian speech. An experiment is conducted in which a selection of glottal parameters are tested for their ability to discriminate between healthy and Parkinsonian speech. Results for each glottal parameter are presented for a database of 50 healthy speakers and a database of 16 speakers with Parkinsonian speech symptoms. Receiver operating characteristic (ROC) curves were employed to analyse the results and the area under the ROC curve (AUC) values were used to quantify the performance of each glottal parameter. The results indicate that glottal parameters can be used to discriminate between healthy and Parkinsonian speech, although results varied for each parameter tested. For the task of separating healthy and Parkinsonian speech, 2 out of the 7 glottal parameters tested produced AUC values of over 0.9.
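The AUC used in this record to score each glottal parameter equals the Mann-Whitney probability that a value drawn from one group exceeds a value drawn from the other. A direct sketch, with invented parameter values standing in for real glottal measurements:

```python
def auc(healthy, parkinsonian):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic:
    the probability that a parameter value from the Parkinsonian group
    exceeds one from the healthy group, with ties counted as 0.5."""
    wins = 0.0
    for h in healthy:
        for p in parkinsonian:
            if p > h:
                wins += 1.0
            elif p == h:
                wins += 0.5
    return wins / (len(healthy) * len(parkinsonian))

# Hypothetical glottal-parameter values for the two groups:
print(auc([0.1, 0.2, 0.3], [0.25, 0.4, 0.5]))  # ~0.889
```

An AUC of 0.5 means the parameter is uninformative; the record's best parameters, with AUC above 0.9, rank a Parkinsonian value above a healthy one more than 90% of the time.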

  16. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article considers the analysis of methods of financing of transport organizations in conditions of limited investment resources. A comparative analysis of these methods is carried out, and the classification of investments, together with the methods and sources of financial support for projects implemented to date, is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and financial support for the activities of the transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing of transport organizations.

  17. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five-meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
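The coherent-mode decomposition mentioned here has a closed form for a 1-D Gaussian Schell-model source: the eigenvalues fall off geometrically, lambda_n proportional to q**n, with q determined by the rms intensity width sigma and the coherence length xi. A sketch of the standard textbook result, with illustrative source parameters (not the PETRA III or SASE1 values):

```python
import math

def gsm_mode_weights(sigma, xi, n_modes=10):
    """Normalized eigenvalues of a 1-D Gaussian Schell-model source with
    rms intensity width sigma and transverse coherence length xi.
    Standard result: lambda_n ~ q**n with q = b / (a + b + c), where
    a = 1/(4 sigma^2), b = 1/(2 xi^2), c = sqrt(a^2 + 2 a b).
    Weights are normalized by the geometric-series total 1/(1 - q)."""
    a = 1.0 / (4.0 * sigma**2)
    b = 1.0 / (2.0 * xi**2)
    c = math.sqrt(a * a + 2.0 * a * b)
    q = b / (a + b + c)
    return [(1.0 - q) * q**n for n in range(n_modes)]

# Illustrative source: most of the power sits in the first few modes.
w = gsm_mode_weights(sigma=35e-6, xi=10e-6, n_modes=5)
```

The smaller q is (i.e. the more coherent the source), the faster the weights decay, which is the quantitative sense in which "only a few modes contribute significantly" to the field.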

  18. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five-meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  19. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg 252 Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17 O. Detection sensitivities of 239 Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppM level.
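The "absolute" technique described here rests on the standard activation equation, A = N * phi * sigma * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay), which converts known capture rates and timing directly into activity without comparative standards. A sketch with illustrative numbers (not the facility's actual parameters):

```python
import math

def induced_activity(n_atoms, flux, sigma_cm2, half_life_s,
                     t_irr_s, t_decay_s=0.0):
    """Induced activity (decays/s) of n_atoms irradiated in a neutron
    flux (n/cm^2/s) with capture cross-section sigma_cm2 (cm^2):
    A = N*phi*sigma*(1 - exp(-lam*t_irr)) * exp(-lam*t_decay)."""
    lam = math.log(2.0) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)
    return n_atoms * flux * sigma_cm2 * saturation * math.exp(-lam * t_decay_s)

# Irradiating for one half-life reaches 50% of the saturation activity:
a_one = induced_activity(1e20, 1e7, 1e-24, half_life_s=3600.0, t_irr_s=3600.0)
a_sat = induced_activity(1e20, 1e7, 1e-24, half_life_s=3600.0,
                         t_irr_s=50 * 3600.0)
```

Cyclic irradiation-decay-count regimes exploit exactly these terms: the saturation factor rewards irradiating short-lived products repeatedly, and the decay factor sets the optimal counting window.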

  20. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained—and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  1. Modeling and analysis of a transcritical rankine power cycle with a low grade heat source

    DEFF Research Database (Denmark)

    Nguyen, Chan; Veje, Christian

    efficiency, exergetic efficiency and specific net power output. A generic cycle configuration has been used for analysis of a geothermal energy heat source. This model has been validated against similar calculations using industrial waste heat as the energy source. Calculations are done with fixed...

  2. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    Science.gov (United States)

    The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  3. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is placed on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
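The nonlinearity argument can be made concrete with a toy concentration model containing an emission interaction term (the model and coefficients below are invented for illustration): brute-force impacts then fail to sum to the total concentration, so they cannot serve as contributions.

```python
# Toy nonlinear concentration model C(E1, E2) with an interaction term.
def concentration(e1, e2):
    return 2.0 * e1 + 1.0 * e2 + 0.5 * e1 * e2

e1, e2 = 1.0, 1.0
c_total = concentration(e1, e2)              # 3.5

# Brute-force "impact": concentration drop when one source is removed.
impact_1 = c_total - concentration(0.0, e2)  # 2.5
impact_2 = c_total - concentration(e1, 0.0)  # 1.5

# The impacts sum to 4.0, not the 3.5 that a source-apportionment
# method must distribute: the interaction term is double-counted.
print(impact_1 + impact_2, c_total)
```

With the interaction term removed the model is linear and the two impacts sum exactly to the total, recovering the paper's point that impacts and contributions coincide only in the linear case.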

  4. Open Source Parallel Image Analysis and Machine Learning Pipeline, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Continuum Analytics proposes a Python-based open-source data analysis machine learning pipeline toolkit for satellite data processing, weather and climate data...

  5. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  6. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The

  7. Montage in future building practice

    DEFF Research Database (Denmark)

    Bundgaard, Charlotte

    2002-01-01

    In this essay Charlotte Bundgaard describes the vision of Le Corbusier's Dom-ino project as an icon of the modernist dream, strongly connected with its contemporary industrial possibilities. She proceeds to examine a number of other projects from the 1920s to the 1970s that are also based...

  8. Comparative Genomic Analysis of Mannheimia haemolytica from Bovine Sources.

    Science.gov (United States)

    Klima, Cassidy L; Cook, Shaun R; Zaheer, Rahat; Laing, Chad; Gannon, Vick P; Xu, Yong; Rasmussen, Jay; Potter, Andrew; Hendrick, Steve; Alexander, Trevor W; McAllister, Tim A

    2016-01-01

    Bovine respiratory disease is a common health problem in beef production. The primary bacterial agent involved, Mannheimia haemolytica, is a target for antimicrobial therapy and at risk of associated antimicrobial resistance development. The role of M. haemolytica in pathogenesis is linked to serotype, with serotypes 1 (S1) and 6 (S6) isolated from pneumonic lesions and serotype 2 (S2) found in the upper respiratory tract of healthy animals. Here, we sequenced the genomes of 11 strains of M. haemolytica, representing all three serotypes, and performed comparative genomic analysis to identify genetic features that may contribute to pathogenesis. Possible virulence-associated genes were identified within 14 distinct prophages, including a periplasmic chaperone, a lipoprotein, a peptidoglycan glycosyltransferase and a stress response protein. Prophage content ranged from 2 to 8 per genome, but was higher in S1 and S6 strains. A type I-C CRISPR-Cas system was identified in each strain, with spacer diversity and organization conserved among serotypes. The majority of spacers occur in S1 and S6 strains and originate from phage, suggesting that serotypes 1 and 6 may be more resistant to phage predation. However, two spacers complementary to the host chromosome, targeting a UDP-N-acetylglucosamine 2-epimerase and a glycosyl transferase group 1 gene, are present in S1 and S6 strains only, indicating these serotypes may employ CRISPR-Cas to regulate gene expression to avoid host immune responses or enhance adhesion during infection. Integrative conjugative elements are present in nine of the eleven genomes. Three of these harbor extensive multi-drug resistance cassettes encoding resistance against the majority of drugs used to combat infection in beef cattle, including macrolides and tetracyclines used in human medicine. The findings here identify key features that are likely contributing to serotype-related pathogenesis and specific targets for vaccine design intended to reduce the

  9. Comparative Genomic Analysis of Mannheimia haemolytica from Bovine Sources.

    Directory of Open Access Journals (Sweden)

    Cassidy L Klima

    Full Text Available Bovine respiratory disease is a common health problem in beef production. The primary bacterial agent involved, Mannheimia haemolytica, is a target for antimicrobial therapy and at risk of associated antimicrobial resistance development. The role of M. haemolytica in pathogenesis is linked to serotype, with serotypes 1 (S1) and 6 (S6) isolated from pneumonic lesions and serotype 2 (S2) found in the upper respiratory tract of healthy animals. Here, we sequenced the genomes of 11 strains of M. haemolytica, representing all three serotypes, and performed comparative genomic analysis to identify genetic features that may contribute to pathogenesis. Possible virulence-associated genes were identified within 14 distinct prophages, including a periplasmic chaperone, a lipoprotein, a peptidoglycan glycosyltransferase and a stress response protein. Prophage content ranged from 2 to 8 per genome, but was higher in S1 and S6 strains. A type I-C CRISPR-Cas system was identified in each strain, with spacer diversity and organization conserved among serotypes. The majority of spacers occur in S1 and S6 strains and originate from phage, suggesting that serotypes 1 and 6 may be more resistant to phage predation. However, two spacers complementary to the host chromosome, targeting a UDP-N-acetylglucosamine 2-epimerase and a glycosyl transferase group 1 gene, are present in S1 and S6 strains only, indicating these serotypes may employ CRISPR-Cas to regulate gene expression to avoid host immune responses or enhance adhesion during infection. Integrative conjugative elements are present in nine of the eleven genomes. Three of these harbor extensive multi-drug resistance cassettes encoding resistance against the majority of drugs used to combat infection in beef cattle, including macrolides and tetracyclines used in human medicine. The findings here identify key features that are likely contributing to serotype-related pathogenesis and specific targets for vaccine design

  10. Hydrodynamic analysis of potential groundwater extraction capacity increase: case study of 'Nelt' groundwater source at Dobanovci

    Directory of Open Access Journals (Sweden)

    Bajić Dragoljub I.

    2017-01-01

    Full Text Available A comprehensive hydrodynamic analysis of the groundwater regime, undertaken to assess the potential for expanding the 'Nelt' groundwater source at Dobanovci or developing a new groundwater source for a future baby food factory, including quantification of the impact on the production wells of the nearby 'Pepsi' groundwater source, is presented in the paper. The existing Nelt source comprises three active production wells that tap a subartesian aquifer formed in sands and gravelly sands; however, the analysis considers only the two nearest wells. A long-term group pumping test was conducted on production wells N-1 and N-2 (Nelt source) and production wells B-1 and B-2 (Pepsi source), while the piezometric head in the vicinity of these wells was monitored at observation well P-1, which is located in the area considered for Nelt source expansion. Data were collected at the maximum pumping capacity of all the production wells. A hydrodynamic model of groundwater flow in the extended area of the Nelt source was generated for the purposes of the comprehensive hydrodynamic analysis. Hydrodynamic prognostic calculations addressed two solution alternatives for the capacity increase over a period of ten years. The licensed Visual MODFLOW Pro software, widely regarded as a leading tool in this field, was used for the calculations.

  11. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....

  12. Energy and exergy analysis of a double effect absorption refrigeration system based on different heat sources

    International Nuclear Information System (INIS)

    Kaynakli, Omer; Saka, Kenan; Kaynakli, Faruk

    2015-01-01

    Highlights: • Energy and exergy analysis was performed on a double effect series flow absorption refrigeration system. • The refrigeration system runs on various heat sources such as hot water, hot air and steam. • A comparative analysis was carried out on these heat sources in terms of exergy destruction and mass flow rate of the heat source. • The effect of the heat sources on the exergy destruction of the high pressure generator was investigated. - Abstract: Absorption refrigeration systems are environmentally friendly since they can utilize industrial waste heat and/or solar energy, and can thus reduce environmental burden. In terms of the heat source, researchers usually prefer a single type, such as hot water or steam. In this study, energy and exergy analyses are performed on a double effect series flow absorption refrigeration system with water/lithium bromide as the working fluid pair. The refrigeration system runs on various heat sources, such as hot water, hot air and steam, supplied via the High Pressure Generator (HPG), because hot water, steam and hot air are the most commonly available heat sources for absorption applications; the first law of thermodynamics alone, however, may not be sufficient to analyze the absorption refrigeration system and to show the differences between heat source types. On the other hand, the operating temperatures of the overall system and its components have a major effect on their performance and functionality. In this regard, a parametric study is conducted here to investigate this effect on the heat capacity and exergy destruction of the HPG, the coefficient of performance (COP) of the system, and the mass flow rate of the heat sources. Also, a comparative analysis is carried out on several heat sources (e.g. hot water, hot air and steam) in terms of exergy destruction and mass flow rate of the heat source. From the analyses it is observed that the exergy destruction of the HPG increases at higher temperatures of the heat sources, condenser and absorber, and lower

  13. Polarisation analysis of elastic neutron scattering using a filter spectrometer on a pulsed source

    International Nuclear Information System (INIS)

    Mayers, J.; Williams, W.G.

    1981-05-01

    The experimental and theoretical aspects of the polarisation analysis technique in elastic neutron scattering are described. An outline design is presented for a filter polarisation analysis spectrometer on the Rutherford Laboratory Spallation Neutron Source and estimates made of its expected count rates and resolution. (author)

  14. Le discours rapporté comme effet de montage du discours citant et du segment citationnel. Contribution à l’étude du discours journalistique

    Directory of Open Access Journals (Sweden)

    Biardzka Elżbieta

    2012-07-01

    ... rules of syntax (constrained combination). The free and constrained sequences of reported discourse (DR) identified in our corpus are classified according to the four main positions that the quoting discourse (DC) can take relative to the quotation (Cit): DC before the Cit (DC+Cit), DC after the Cit (Cit+DC), DC inserted within the Cit (Cit+DC+Cit), and DC framing the Cit (DC+Cit+DC). In our paper, we describe selected practices of DR falling under the free and constrained montage of the types DR=DC+Cit and DR=Cit+DC.

  15. Application of Abaqus to analysis of the temperature field in elements heated by moving heat sources

    Directory of Open Access Journals (Sweden)

    W. Piekarska

    2010-10-01

    Full Text Available Numerical analysis of thermal phenomena occurring during laser beam heating is presented in this paper. Numerical models of surface and volumetric heat sources are presented, and the influence of different laser beam heat source power distributions on the temperature field is analyzed. The temperature field is obtained by numerically solving the transient heat transfer equation with active inner heat sources, using the finite element method. Temperature distribution analysis in the welded joint was performed in the ABAQUS/Standard solver. The DFLUX subroutine was used to implement the moving welding heat source model. Temperature-dependent thermophysical properties for steel were assumed in the computer simulations. The temperature distribution in laser-beam surface-heated and butt-welded plates was numerically estimated.

  16. SWOT analysis of the renewable energy sources in Romania - case study: solar energy

    Science.gov (United States)

    Lupu, A. G.; Dumencu, A.; Atanasiu, M. V.; Panaite, C. E.; Dumitrașcu, Gh; Popescu, A.

    2016-08-01

    The evolution of the energy sector worldwide has triggered intense interest both in finding alternative renewable energy sources and in environmental issues. Romania is considered to have the technological potential and a geographical location suitable for using renewable energy for electricity generation. This high potential is not fully exploited, however, in the context of globally adopted policies and regulations and, more specifically, European Union (EU) environmental and energy strategies and legislation related to renewable energy sources. This SWOT analysis of the solar energy source presents the state of the art, potential and future prospects for the development of renewable energy in Romania. The analysis concluded that the development of the solar energy sector in Romania depends largely on: the viability of the legislative framework on renewable energy sources, increased subsidies for solar R&D, a simplified methodology for green certificates, and education of the public, investors, developers and decision-makers.

  17. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Science.gov (United States)

    Muthuraman, Muthuraman; Hellriegel, Helge; Hoogenboom, Nienke; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan

    2014-01-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.
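    The coherence measure underlying such analyses can be illustrated with a minimal sketch (not the DICS beamformer itself, which additionally requires a source model): two simulated channels share a ~3 Hz "tapping" rhythm, and magnitude-squared coherence is estimated in the 2-4 Hz band. All signal parameters below are invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 256.0                                  # assumed sampling rate, Hz
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(6)

    # Two channels driven by a common ~3 Hz rhythm plus independent noise
    drive = np.sin(2 * np.pi * 3.0 * t)
    ch1 = drive + rng.normal(size=t.size)
    ch2 = 0.8 * drive + rng.normal(size=t.size)

    # Welch-averaged magnitude-squared coherence
    f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=1024)
    band = (f >= 2) & (f <= 4)
    print(f"peak coherence in 2-4 Hz band: {Cxy[band].max():.2f}")
    ```

    The common drive concentrates its power in one frequency bin, so coherence approaches 1 near 3 Hz while staying low elsewhere, which is the basic signature such studies look for at the tapping frequency.
    
    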

  18. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Directory of Open Access Journals (Sweden)

    Muthuraman Muthuraman

    Full Text Available Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  19. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  20. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Full-text: Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure. (author)
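    For contrast with WTML, the conventional time-of-arrival (TOA) approach mentioned above can be sketched in a few lines: given sensor positions, arrival times and an assumed wave speed, the source position and emission time are recovered by nonlinear least squares. The coordinates and wave speed below are illustrative, not values from the study.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def locate_toa(sensors, times, c, guess):
        """TOA planar source location: fit (x, y, t0) so that
        t_i = t0 + |source - sensor_i| / c matches the measured arrivals."""
        sensors = np.asarray(sensors, float)

        def residuals(p):
            x, y, t0 = p
            dist = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
            return (t0 + dist / c) - times

        sol = least_squares(residuals, guess, x_scale=[100.0, 100.0, 1e-3])
        return sol.x[0], sol.x[1]

    # Synthetic check: four sensors on an unrolled plate-like surface (mm)
    sensors = [(0, 0), (500, 0), (0, 300), (500, 300)]
    src = (320.0, 120.0)
    c = 3.0e6                                   # mm/s, illustrative wave speed
    t = np.array([0.001 + np.hypot(sx - src[0], sy - src[1]) / c
                  for sx, sy in sensors])

    x_hat, y_hat = locate_toa(sensors, t, c, guess=(250.0, 150.0, 0.0))
    print(x_hat, y_hat)                         # ≈ (320.0, 120.0)
    ```

    Note the dependence on the assumed wave speed c: this is the acoustic calibration that the WTML method, per the abstract, does not require.
    
    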

  1. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure

  2. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen' s Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  3. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    Science.gov (United States)

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.
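    The PCA-with-MLR receptor modeling step can be sketched on synthetic data: principal component scores extracted from the sample-by-compound concentration matrix are regressed against total PAH to apportion contributions. The two source profiles below are invented for illustration; they are not the study's coal tar or vehicular profiles.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Synthetic "samples x compounds" matrix built from two hypothetical profiles
    profile_a = np.array([5.0, 1.0, 0.5, 2.0])    # combustion-like pattern
    profile_b = np.array([0.5, 4.0, 3.0, 0.2])    # petrogenic-like pattern
    loads = rng.uniform(0.5, 2.0, size=(26, 2))   # 26 samples, as in the study
    X = loads @ np.vstack([profile_a, profile_b]) + rng.normal(0, 0.05, (26, 4))

    # Step 1: PCA on standardized concentrations to extract factor scores
    Z = (X - X.mean(0)) / X.std(0)
    scores = PCA(n_components=2).fit_transform(Z)

    # Step 2: multiple linear regression of total PAH on the factor scores
    total = X.sum(axis=1)
    mlr = LinearRegression().fit(scores, total)
    r = np.corrcoef(mlr.predict(scores), total)[0, 1]
    print(f"R between predicted and measured totals: {r:.3f}")
    ```

    The regression coefficients, scaled by the mean scores, are what yield percentage contributions per factor; PMF differs in constraining both profiles and contributions to be non-negative, which is why it can split collinear vehicular sources that PCA merges.
    
    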

  4. Acoustic Source Analysis of Magnetoacoustic Tomography With Magnetic Induction for Conductivity Gradual-Varying Tissues.

    Science.gov (United States)

    Wang, Jiawei; Zhou, Yuqi; Sun, Xiaodong; Ma, Qingyu; Zhang, Dong

    2016-04-01

    As a multiphysics imaging approach, magnetoacoustic tomography with magnetic induction (MAT-MI) works on the physical mechanism of magnetic excitation, acoustic vibration, and transmission. Based on a theoretical analysis of source vibration, numerical studies are conducted to simulate the pathological changes of tissues for a single-layer cylindrical conductivity gradual-varying model and to estimate the strengths of the sources inside the model. The results suggest that the inner source is generated by the product of the conductivity and the curl of the induced electric intensity inside a conductivity-homogeneous medium, while the boundary source is produced by the cross product of the gradient of the conductivity and the induced electric intensity at a conductivity boundary. For a biological tissue with low conductivity, the strength of the boundary source is much higher than that of the inner source only when the size of the conductivity transition zone is small. In this case, the tissue can be treated as a conductivity abrupt-varying model, ignoring the influence of the inner source. Otherwise, the contributions of the inner and boundary sources should be evaluated together quantitatively. This study provides a basis for further studies of precise image reconstruction in MAT-MI for pathological tissues.

  5. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of 252Cf neutron sources is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational speed of the spectrum analysis system. Multi-core processor technology and the multi-threaded programming techniques of LabVIEW are employed to construct a frequency spectrum analysis system for a 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and spectral density ratio. The results show that the analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and verify the feasibility of using LabVIEW for spectrum analysis. (authors)
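    The speed advantage of a fast-correlation algorithm comes from the Wiener-Khinchin relation: the autocorrelation of a pulse train equals the inverse FFT of its power spectrum, cutting the cost from O(N^2) to O(N log N). A minimal NumPy sketch (not the paper's LabVIEW implementation), on a synthetic '0'/'1' pulse series:

    ```python
    import numpy as np

    def autocorr_fft(x):
        """Linear autocorrelation via FFT (Wiener-Khinchin), O(N log N)."""
        n = len(x)
        X = np.fft.rfft(x, 2 * n)                 # zero-pad to avoid circular wrap
        return np.fft.irfft(X * np.conj(X))[:n]   # keep non-negative lags

    # A '0'/'1' pulse series resembling a sparse detected neutron train
    rng = np.random.default_rng(1)
    pulses = (rng.random(4096) < 0.05).astype(float)

    acf = autocorr_fft(pulses)
    psd = np.abs(np.fft.rfft(pulses)) ** 2        # auto-power spectrum

    # Direct O(N^2) autocorrelation for comparison
    direct = np.correlate(pulses, pulses, mode="full")[len(pulses) - 1:]
    print(np.allclose(acf, direct))               # True
    ```

    The same padding trick applies to cross-correlation of two channels (multiply the FFT of one by the conjugate FFT of the other), which is what the cross-power spectrum and spectral density ratio are built from.
    
    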

  6. USING THE METHODS OF WAVELET ANALYSIS AND SINGULAR SPECTRUM ANALYSIS IN THE STUDY OF RADIO SOURCE BL LAC

    OpenAIRE

    Donskykh, G. I.; Ryabov, M. I.; Sukharev, A. I.; Aller, M.

    2014-01-01

    We investigated monitoring data of the extragalactic source BL Lac. This monitoring was conducted with the University of Michigan 26-meter radio telescope. To study the flux density of the extragalactic source BL Lac at frequencies of 14.5, 8 and 4.8 GHz, wavelet analysis and singular spectrum analysis were used. Calculating the integral wavelet spectra revealed long-term components (~7-8 years) and short-term components (~1-4 years) in BL Lac. Study of VLBI radio maps (from the MOJAVE program) ...
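    The integral wavelet spectrum used to reveal such periodicities is the wavelet power averaged over time at each scale. A minimal Morlet-CWT sketch on a synthetic flux-like series; the signal, sampling and the w0 = 6 wavelet parameter are illustrative assumptions, not the BL Lac data:

    ```python
    import numpy as np

    def morlet_cwt(x, scales, w0=6.0):
        """Minimal continuous wavelet transform with an analytic Morlet wavelet,
        evaluated per scale in the frequency domain."""
        n = len(x)
        X = np.fft.fft(x)
        freqs = np.fft.fftfreq(n)                 # cycles per sample
        out = np.empty((len(scales), n), dtype=complex)
        for i, s in enumerate(scales):
            psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * 2 * np.pi * freqs - w0) ** 2)
            psi_hat[freqs < 0] = 0.0              # analytic: positive freqs only
            out[i] = np.fft.ifft(X * psi_hat) * np.sqrt(s)
        return out

    # Monthly flux-like series with a ~4-year periodic component plus noise
    t = np.arange(480) / 12.0                     # 40 years, monthly sampling
    x = np.sin(2 * np.pi * t / 4.0) + 0.3 * np.random.default_rng(2).normal(size=480)

    scales = np.arange(4, 200)
    W = morlet_cwt(x, scales)
    integral_spectrum = (np.abs(W) ** 2).mean(axis=1)   # time-averaged power
    best = scales[np.argmax(integral_spectrum)]
    period_years = best * (4 * np.pi / (6 + np.sqrt(2 + 6 ** 2))) / 12
    print(f"dominant period = {period_years:.1f} years")
    ```

    The scale-to-period conversion in the last line is the standard Morlet relation for w0 = 6; the peak of the integral spectrum lands at the injected ~4-year component.
    
    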

  7. System optimization for continuous on-stream elemental analysis using low-output isotopic neutron sources

    International Nuclear Information System (INIS)

    Rizk, R.A.M.

    1989-01-01

    In continuous on-stream neutron activation analysis, the material to be analyzed may be continuously recirculated in a closed-loop system between an activation source and a shielded detector. In this paper an analytical formulation of the detector response for such a system is presented. This formulation should be useful in optimizing the system design parameters for specific applications. A study has been made of all parameters that influence the detector response during on-stream analysis. Feasibility applications of the method to solutions of manganese and vanadium using a 5 μg 252Cf neutron source are demonstrated. (author)

  8. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up, restricts...... the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its...
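    A plain temporal ICA baseline (the fully unsupervised approach the paper contrasts against, not its GP-based model) can be sketched with scikit-learn's FastICA: mixed "voxel" time series are unmixed into task-like and drift-like temporal sources. All signals and the mixing matrix are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    t = np.linspace(0, 8, 2000)

    # Two hypothetical temporal sources: a block "task" signal and a slow drift
    s1 = np.sign(np.sin(2 * np.pi * 0.5 * t))     # on/off task-like time course
    s2 = np.sin(2 * np.pi * 0.05 * t)             # slow physiological drift
    S = np.c_[s1, s2] + 0.05 * rng.normal(size=(2000, 2))

    A = np.array([[1.0, 0.6], [0.4, 1.0], [0.8, 0.2]])  # mixing into 3 "voxels"
    X = S @ A.T

    ica = FastICA(n_components=2, random_state=0)
    S_hat = ica.fit_transform(X)                  # recovered temporal components

    # Match recovered components to true sources by absolute correlation
    corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
    print(corr.max(axis=1))
    ```

    ICA leaves component sign, order and scale arbitrary, hence the absolute-correlation matching; incorporating temporal structure (as a Gaussian process prior does) is one way to stabilize such components on noisy single-subject data.
    
    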

  9. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    Science.gov (United States)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought. It is defined as the optimal solution to a minimal error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is observed, too. Indications of acoustic reflections on the airfoil are also discerned.
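    The penalized inversion idea can be sketched in its simplest (Tikhonov/ridge) form: source strengths q are recovered from microphone pressures p = Gq + noise by minimizing ||Gq - p||^2 + lam*||q||^2. The propagation matrix and source layout below are random stand-ins, not Cepra19 data, and a plain L2 penalty replaces the paper's localization operator.

    ```python
    import numpy as np

    def penalized_inverse(G, p, lam):
        """Ridge-regularized least squares: argmin_q ||G q - p||^2 + lam ||q||^2."""
        A = G.conj().T @ G + lam * np.eye(G.shape[1])
        return np.linalg.solve(A, G.conj().T @ p)

    rng = np.random.default_rng(4)
    n_mics, n_src = 12, 8

    # Random complex stand-in for the propagation (transfer-function) matrix
    G = rng.normal(size=(n_mics, n_src)) + 1j * rng.normal(size=(n_mics, n_src))

    q_true = np.zeros(n_src)
    q_true[[2, 6]] = [1.0, 0.7]                   # two compact sources
    noise = 0.01 * (rng.normal(size=n_mics) + 1j * rng.normal(size=n_mics))
    p = G @ q_true + noise

    q_hat = penalized_inverse(G, p, lam=1e-3)
    print(np.abs(q_hat).round(2))                 # peaks at indices 2 and 6
    ```

    In the real problem there are far more candidate source positions than microphones, so the unregularized fit is ill-posed; the penalty (and, in the paper, its localization weighting) is what selects a physically plausible source distribution.
    
    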

  10. Off-design performance analysis of Kalina cycle for low temperature geothermal source

    International Nuclear Information System (INIS)

    Li, Hang; Hu, Dongshuai; Wang, Mingkun; Dai, Yiping

    2016-01-01

    Highlights: • The off-design performance analysis of the Kalina cycle is conducted. • The off-design models are established. • The genetic algorithm is used in the design phase. • The sliding pressure control strategy is applied. - Abstract: Low temperature geothermal sources with bright prospects have attracted more and more attention. A Kalina cycle system using ammonia-water as the working fluid can exploit geothermal energy effectively. In this paper, a quantitative analysis of the off-design performance of the Kalina cycle for a low temperature geothermal source is conducted. The off-design models, including the turbine, pump and heat exchangers, are established first. A genetic algorithm is used to maximize the net power output and determine the thermodynamic parameters in the design phase. The sliding pressure control strategy, applied widely in existing Rankine cycle power plants, is adopted to respond to variations of the geothermal source mass flow rate ratio (70–120%), geothermal source temperature (116–128 °C) and heat sink temperature (0–35 °C). Within the off-design research scope, guidance for pump rotational speed adjustment is listed to provide some reference for the off-design operation of geothermal power plants. The required adjustment rate of the pump rotational speed is more sensitive to a unit change in geothermal source temperature than to a unit change in heat sink temperature. The influence of heat sink variation on the ranges of net power output and thermal efficiency is greater than that of geothermal source variation.

  11. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in the tank area directly or indirectly lead to the occurrence of large safety accidents. Based on the three kinds of hazard source theory and a consequence-cause analysis of major safety accidents, the dangerous sources of major safety accidents in the tank area are analyzed from four aspects: energy sources, direct accident causes, missing management, and environmental impact. From the analysis of the three kinds of hazard sources and the environment, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four kinds of risk factors and of the factors within each kind are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, direct causes and energy sources. It should be noted that although the direct causes have relatively low overall importance, the direct causes "failure of emergency measures" and "failure of prevention and control facilities" carry greater weight.
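The AHP weighting step described in this abstract can be sketched as a principal-eigenvector computation on a pairwise-comparison matrix. The matrix below is purely illustrative (Saaty-scale judgments invented for the four factor groups), not the paper's data:

```python
import numpy as np

# Hypothetical 4x4 pairwise-comparison matrix for the four factor groups
# (management, environment, direct causes, energy sources); values are
# illustrative Saaty-scale judgments, not taken from the study.
A = np.array([
    [1.0, 3.0, 5.0, 5.0],
    [1/3, 1.0, 3.0, 3.0],
    [1/5, 1/3, 1.0, 2.0],
    [1/5, 1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.90 for n = 4
lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.90
```

With these judgments the first group receives the largest weight, and CR below 0.1 indicates acceptably consistent comparisons.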

  12. Sensitivity analysis of the relationship between disease occurrence and distance from a putative source of pollution

    Directory of Open Access Journals (Sweden)

    Emanuela Dreassi

    2008-05-01

    The relation between disease risk and a point source of pollution is usually investigated using distance from the source as a proxy of exposure. The analysis may be based on case-control data or on aggregated data. The definition of the function relating risk of disease and distance is critical, both in a classical and in a Bayesian framework, because the likelihood is usually very flat, even with large amounts of data. In this paper we investigate how the specification of the function relating risk of disease with distance from the source, and of the prior distributions on the parameters of the function, affects the results when case-control data and Bayesian methods are used. We consider different popular parametric models for the risk-distance function in a Bayesian approach, comparing estimates with those derived by maximum likelihood. As an example, we have analyzed the relationship between a putative source of environmental pollution (an asbestos cement plant) and the occurrence of pleural malignant mesothelioma in the area of Casale Monferrato (Italy) in 1987-1993. Risk of pleural malignant mesothelioma turns out to be strongly related to distance from the asbestos cement plant. However, as the models appeared to be sensitive to modeling choices, we suggest that any analysis of disease risk around a putative source should be integrated with a careful sensitivity analysis and possibly with prior knowledge. The choice of prior distribution is extremely important and should be based on epidemiological considerations.
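One popular parametric form for the risk-distance function in studies of this kind is an excess relative risk that decays exponentially with distance. The function and parameter values below are illustrative, not the paper's fitted model:

```python
import numpy as np

def relative_risk(d_km, alpha=2.0, beta=3.0, gamma=1.0):
    """Illustrative risk-distance function: relative risk falls from
    1 + alpha at the source to 1 at large distance; alpha, beta, gamma
    are hypothetical parameters, not estimates from the study."""
    return 1.0 + alpha * np.exp(-(d_km / beta) ** gamma)

d = np.array([0.0, 1.0, 3.0, 10.0])    # distance from the putative source, km
rr = relative_risk(d)
```

The flat-likelihood problem mentioned in the abstract arises because many (alpha, beta, gamma) combinations produce nearly identical curves over the observed distance range, which is why the prior specification matters so much in the Bayesian fit.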

  13. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium-Beryllium source

    Energy Technology Data Exchange (ETDEWEB)

    Didi, Abdessamad; Dadouch, Ahmed; Tajmouati, Jaouad; Bekkouri, Hassane [Advanced Technology and Integration System, Dept. of Physics, Faculty of Science Dhar Mehraz, University Sidi Mohamed Ben Abdellah, Fez (Morocco); Jai, Otman [Laboratory of Radiation and Nuclear Systems, Dept. of Physics, Faculty of Sciences, Tetouan (Morocco)

    2017-06-15

    Americium-beryllium (Am-Be), an (α, n) reaction source, emits neutrons used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile neutron source (20 Ci) yielding a small, water-moderated thermal neutron flux. The aim of this study is to develop a model to increase the thermal neutron flux of a source such as Am-Be. The study achieved multiple advantageous results: primarily, it helps in performing neutron activation analysis, and it also offers the opportunity to produce radio-elements with short half-lives. Am-Be single-source and multi-source (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models markedly increase the thermal neutron flux compared with the traditional method using a water moderator.

  14. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
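The classification step described here, a two-class Fisher linear discriminant evaluated with leave-one-out cross-validation, can be sketched as follows. The synthetic features stand in for the study's pose/shape/tissue sources; dimensions and class means are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the extracted features (NOT the study's data):
# 20 "MDD" and 20 "control" samples in 5 dimensions with shifted means.
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)),
               rng.normal(1.5, 1.0, (20, 5))])
y = np.array([0] * 20 + [1] * 20)

def fld_predict(Xtr, ytr, x):
    """Two-class Fisher linear discriminant with a pooled within-class scatter."""
    m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    Sw = np.cov(Xtr[ytr == 0].T) + np.cov(Xtr[ytr == 1].T)
    w = np.linalg.solve(Sw, m1 - m0)        # discriminant direction
    thresh = w @ (m0 + m1) / 2              # midpoint between projected means
    return int(w @ x > thresh)

# Leave-one-out cross-validation: train on n-1 samples, test the held-out one
hits = sum(fld_predict(np.delete(X, i, 0), np.delete(y, i), X[i]) == y[i]
           for i in range(len(y)))
accuracy = hits / len(y)
```

On such well-separated synthetic classes the LOO accuracy is high; the 76% reported in the abstract reflects the much harder real neuroimaging problem.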

  15. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State Evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information; and information/results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness and in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing the deterrence value of a safeguards system that is fully information driven and increasing confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  16. IQM: an extensible and portable open source application for image and signal analysis in Java.

    Science.gov (United States)

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications play a substantial role in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. Neither variant, however, can cover all possible use cases, and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is supported by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  17. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. The enormous growth in the volume and accessibility of information sources presents new challenges, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  18. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. The enormous growth in the volume and accessibility of information sources presents new challenges, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  19. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

    The Canadian Fusion Fuels Technology Project (CFFTP) contribution forms part of ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work on establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a loss of coolant accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  20. Analysis of rod drop and pulsed source measurements of reactivity in the Winfrith SGHWR

    International Nuclear Information System (INIS)

    Brittain, I.

    1970-05-01

    Reactivity measurements by the rod-drop and pulsed source methods in the Winfrith SGHWR are seriously affected by spatial harmonics. A method of calculation is described which enables the spatial harmonics to be calculated in non-uniform cores in two or three dimensions, and thus allows a much more rigorous analysis of the experimental results than the usual point model. The method is used to analyse all the rod-drop measurements made during commissioning of the Winfrith SGHWR, and to comment on the results of pulsed source measurements. The reactivity worths of banks of ten and twelve shut-down tubes deduced from rod-drop and pulsed source experiments are in satisfactory agreement with each other and also with AIMAZ calculated values. The ability to calculate higher spatial harmonics in nonuniform cores is thought to be new, and may have a wider application to reactor kinetics through the method of Modal Analysis. (author)

  1. Python Materials Genomics (pymatgen): A robust, open-source python library for materials analysis

    OpenAIRE

    Ong, Shyue Ping; Richards, William Davidson; Jain, Anubhav; Hautier, Geoffroy; Kocher, Michael; Cholia, Shreyas; Gunter, Dan; Chevrier, Vincent L.; Persson, Kristin A.; Ceder, Gerbrand

    2012-01-01

    We present the Python Materials Genomics (pymatgen) library, a robust, open-source Python library for materials analysis. A key enabler in high-throughput computational materials science efforts is a robust set of software tools to perform initial setup for the calculations (e.g., generation of structures and necessary input files) and post-calculation analysis to derive useful material properties from raw calculated data. The pymatgen library aims to meet these needs by (1) defining core Pyt...

  2. Analysis of the monitoring system for the spallation neutron source 'SINQ'

    International Nuclear Information System (INIS)

    Badreddin, E.

    1998-01-01

    Petri Net models (PN) and Fault-Tree Analysis (FTA) are employed for the purpose of reliability analysis of the spallation neutron source SINQ. The monitoring and shut-down system (SDS) structure is investigated using a Petri-Net model. The reliability data are processed using a Fault-Tree model of the dominant part. Finally, suggestions for the improvement of system availability are made. (author)

  3. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  4. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    OpenAIRE

    Yang, Bing; Liu, Yan

    2013-01-01

    A ring-plate-type cycloid speed reducer is one of the most important reducers owing to its small volume, compactness, smooth operation, high performance, and high reliability. The vibration and noise tests of the reducer prototype are completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise are obtained based on coherence analysis and the noise sources are identified. The conclusions provide the bases for further noise research and ...

  5. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  6. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  7. Fiber Based Mid Infrared Supercontinuum Source for Spectroscopic Analysis in Food Production

    DEFF Research Database (Denmark)

    Ramsay, Jacob; Dupont, Sune Vestergaard Lund; Keiding, Søren Rud

    Optimization of sustainable food production is a worldwide challenge that is undergoing continuous development as new technologies emerge. Applying solutions for food analysis with novel bright and broad mid-infrared (MIR) light sources has the potential to meet the increasing demands for food...

  8. Perception and acceptance of technological risk sources. Volume 2. Empirical analysis of risk perception and acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Renn, O

    1981-01-01

    Volume 2 presents a comparative investigation of risk perception and acceptance. It contains the evaluations of the two experiments in social psychology and the analysis of two intensive inquiries concerning risk perception with a view to 12 different risk sources. The data of the two inquiries were acquired from a total of 200 interview partners in two cities in North-Rhine Westphalia.

  9. The adoption of total cost of ownership for sourcing decisions - a structural equations analysis

    NARCIS (Netherlands)

    Wouters, Marc; Anderson, James C.; Wynstra, Finn

    2005-01-01

    This study investigates the adoption of total cost of ownership (TCO) analysis to improve sourcing decisions. TCO can be seen as an application of activity based costing (ABC) that quantifies the costs that are involved in acquiring and using purchased goods or services. TCO supports purchasing

  10. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  11. Dynamic Stability Analysis of Autonomous Medium-Voltage Mixed-Source Microgrid

    DEFF Research Database (Denmark)

    Zhao, Zhuoli; Yang, Ping; Guerrero, Josep M.

    2015-01-01

    A state-space model of the autonomous MV mixed-source microgrid containing a diesel generator set (DGS), grid-supporting battery energy storage system (BESS), squirrel-cage induction generator (SCIG) wind turbine and network is developed. Sensitivity analysis is carried out to reveal the dynamic stability margin...

  12. Analysis of filtration properties of locally sourced base oil for the ...

    African Journals Online (AJOL)

    This study examines the use of locally sourced oil like, groundnut oil, melon oil, vegetable oil, soya oil and palm oil as substitute for diesel oil in formulating oil base drilling fluids relative to filtration properties. The filtrate volumes of each of the oils were obtained for filtration control analysis. With increasing potash and ...

  13. Analysis of the image of pion-emitting sources in the source center-of-mass frame

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Yanyu; Feng, Qichun; Huo, Lei; Zhang, Jingbo; Liu, Jianli; Tang, Guixin [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Zhang, Weining [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Dalian University of Technology, School of Physics and Optoelectronic Technology, Dalian, Liaoning (China)

    2017-08-15

    In this paper, we present a method to extract the image of the pion-emitting source function in the center-of-mass frame of the source (CMFS). We choose identical pion pairs according to the difference of their energies and use these pion pairs to build the correlation function. The purpose is to reduce the effect of ΔEΔt, so that the corresponding imaging result tends toward the real source function. We examine the effectiveness of this method by comparing its results with real source functions extracted directly from models. (orig.)

  14. Dynamic response analysis of the LBL Advanced Light Source synchrotron radiation storage ring

    International Nuclear Information System (INIS)

    Leung, K.

    1993-05-01

    This paper presents the dynamic response analysis of the photon source synchrotron radiation storage ring excited by ground motion measured at the Lawrence Berkeley Laboratory advanced light source building site. The high spectral brilliance requirement for the photon beams of the advanced light source storage ring specifies displacements of the quadrupole focusing magnets on the order of 1 micron in vertical motion. There are 19 magnets supported by a 430-inch steel box-beam girder. The girder and all magnets are supported by the kinematic mount system normally used in optical equipment. The kinematic mount, called a six-strut magnet support system, is now considered as an alternative system for supporting SSC magnets in the Super Collider. This effectively designed six-strut support system is successfully operated for the Advanced Light Source (ALS) accelerator at the Lawrence Berkeley Laboratory. This paper presents the method of analysis and results of the dynamic motion study at the center of the magnets under the most critical excitation source as recorded at the LBL site

  15. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
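The logic-tree step of a PSHA like the one described here amounts to a weighted combination of the hazard curves produced by the individual zoning/ground-motion branches. The sketch below uses made-up curves and weights (none of these numbers come from the study) to show the mechanics, including reading off the PGA for the 475-year return period:

```python
import numpy as np

# Illustrative branch hazard curves: annual exceedance probability vs PGA (g),
# one row per logic-tree branch (zoning x ground-motion model); values invented.
pga = np.array([0.05, 0.1, 0.2, 0.4])
branches = np.array([
    [1e-2, 4e-3, 1.0e-3, 2e-4],
    [2e-2, 6e-3, 1.5e-3, 3e-4],
    [8e-3, 3e-3, 8.0e-4, 1e-4],
])
weights = np.array([0.4, 0.4, 0.2])        # logic-tree branch weights, sum to 1

mean_curve = weights @ branches            # weighted-mean hazard curve

# PGA with a 475-year return period (annual exceedance probability 1/475,
# i.e. ~10% in 50 years), by log-log interpolation along the mean curve.
pga_475 = np.exp(np.interp(np.log(1 / 475.0),
                           np.log(mean_curve[::-1]), np.log(pga[::-1])))
```

The arrays are reversed before `np.interp` because that function requires the sample points to be increasing, while a hazard curve decreases with PGA.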

  16. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals only. This is accomplished by finding the statistical independence of signal mixtures, and it has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order through an iterative process that reorders the extracted mixing matrix to reconstruct the finally converged source signals, referring to the magnitudes of the correlation coefficients between the intermediately separated signals and the signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, to investigate the applicability of the proposed method to a real problem with a complex structure, an experiment has been carried out on a scaled submarine mockup. The results show that the proposed method resolves the inherent problems of a conventional ICA technique.
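The reordering step at the heart of the proposed method, matching each separated signal to the near-source reference with the largest absolute correlation coefficient, can be sketched as follows. The signals are synthetic and the "ICA output" is faked with a permuted, sign-flipped copy of the sources, since only the reordering logic is being illustrated:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 2000)
# Two synthetic "source" signals and noisy near-source reference measurements
s = np.vstack([np.sign(np.sin(2 * np.pi * 5 * t)),   # square-ish wave
               np.sin(2 * np.pi * 13 * t)])
refs = s + 0.1 * rng.normal(size=s.shape)            # measured near the sources

# Pretend ICA returned the sources in the wrong order and with a flipped sign
separated = np.vstack([-s[1], s[0]])

# Reordering step: match each separated signal to the reference with the
# largest |correlation coefficient|, fixing both order and sign.
C = np.corrcoef(np.vstack([separated, refs]))[:2, 2:]   # 2x2 cross-correlations
order = np.argmax(np.abs(C), axis=1)
signs = np.sign(C[np.arange(2), order])
reordered = np.empty_like(separated)
for i, (j, sg) in enumerate(zip(order, signs)):
    reordered[j] = sg * separated[i]
```

After this step `reordered[k]` lines up with source `k` in both position and polarity, which is the "valid ordering" property the abstract claims for the iterative method.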

  17. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review of the availability and quality of health information data sources in countries, drawing on experience, observations, the literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health-care facility-based sources. In almost all countries of the Region there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  18. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses world-wide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into their analysis scheme for the screening of solid dosage forms of drugs of abuse.

  19. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

This project had three major objectives. The first objective was to develop a fossil fuel combustion source inventory (NOx, SOx, and hydrocarbon emissions) that would be relatively easy to use and update for analyzing the impact of combustion emissions on acid deposition in the eastern United States. The second objective of the project was to use the inventory data as a basis for selection of a number of areas that, by virtue of their importance in the acid rain issue, could be further studied to assess the impact of local and intraregional combustion sources. The third objective was to conduct an analysis of wet deposition monitoring data in the areas under study, along with pertinent physical characteristics, meteorological conditions, and emission patterns of these areas, to investigate probable relationships between local and intraregional combustion sources and the deposition of acidic material. The combustion source emissions inventory has been developed for the eastern United States. It characterizes all important area sources and point sources on a county-by-county basis. Its design provides flexibility and simplicity and makes it uniquely useful in overall analysis of emission patterns in the eastern United States. Three regions with basically different emission patterns have been identified and characterized. The statistical analysis of wet deposition monitoring data in conjunction with emission patterns, wind direction, and topography has produced consistent results for each study area and has demonstrated that the wet deposition in each area reflects the characteristics of the localized area around the monitoring sites (typically 50 to 150 miles). 8 references, 28 figures, 39 tables.

  20. Arguments and sources on Italian online forums on childhood vaccinations: Results of a content analysis.

    Science.gov (United States)

    Fadda, Marta; Allam, Ahmed; Schulz, Peter J

    2015-12-16

Despite being committed to the immunization agenda set by the WHO, Italy is currently experiencing decreasing vaccination rates and increasing incidence of vaccine-preventable diseases. Our aim is to analyze Italian online debates on pediatric immunizations through a content-analytic approach in order to quantitatively evaluate and summarize users' arguments and information sources. Threads were extracted from 3 Italian forums. Threads had to include the keyword Vaccin* in the title, focus on childhood vaccination, and include at least 10 posts. They had to have been started between 2008 and June 2014. High inter-coder reliability was achieved. Exploratory analysis using k-means clustering was performed to identify users' posting patterns for arguments about vaccines and sources. The analysis included 6544 posts mentioning 6223 arguments about pediatric vaccinations and citing 4067 sources. The analysis of argument posting patterns included users who published a sufficient number of posts; they generated 85% of all arguments on the forum. Dominating patterns of three groups were identified: (1) an anti-vaccination group (n=280) posted arguments against vaccinations, (2) a general pro-vaccination group (n=222) posted substantially diverse arguments supporting vaccination and (3) a safety-focused pro-vaccination group (n=158) mainly forwarded arguments that questioned the negative side effects of vaccination. The anti-vaccination group was shown to be more active than the others. They cited multiple sources, their own experience and the media as their sources of information. Medical professionals were among the cited sources of all three groups, suggesting that vaccination-adverse professionals are gaining attention. Knowing which information is shared online on the topic of pediatric vaccinations could shed light on why immunization rates have been decreasing and what strategies would be best suited to address parental concerns. This suggests there is a high need for

  1. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for large liquid sample analysis. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. Both the relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained by the ICP method.

  2. Identification of sources of heavy metals in the Dutch atmosphere using air filter and lichen analysis

    International Nuclear Information System (INIS)

    de Bruin, M.; Wolterbeek, H.T.

    1984-01-01

    Aerosol samples collected in an industrialized region were analyzed by instrumental neutron activation analysis. Correlation with wind direction and factor analysis were applied to the concentration data to obtain information on the nature and position of the sources. Epiphytic lichens were sampled over the country and analyzed for heavy metals (As, Cd, Sc, Zn, Sb). The data were interpreted by geographically plotting element concentrations and enrichment factors, and by factor analysis. Some pitfalls are discussed which are associated with the use of aerosol and lichen data in studies of heavy metal air pollution. 14 references, 8 figures, 3 tables

  3. Source location in plates based on the multiple sensors array method and wavelet analysis

    International Nuclear Information System (INIS)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on a plate and the distance between the impact position and the sensor should be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained by using the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is experimentally estimated using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
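The distance step of the scheme above reduces to multiplying the wavelet-estimated time delay by the Lamb-wave group velocity, after which the MUSIC bearing fixes the position. A minimal numerical sketch (all values are illustrative, not taken from the paper):

```python
import numpy as np

# Illustrative inputs: MUSIC gives a direction of arrival, wavelet ridge
# analysis a time delay of arrival; group velocity from dispersion theory.
theta = np.deg2rad(30.0)  # direction of arrival from the sensor array (rad)
dt = 2.5e-4               # time delay of arrival (s)
v_g = 3200.0              # Lamb-wave group velocity at the analysis frequency (m/s)

d = v_g * dt                                 # source-to-array distance (m)
x, y = d * np.cos(theta), d * np.sin(theta)  # impact position relative to the array
print(f"distance = {d:.3f} m, position = ({x:.3f}, {y:.3f}) m")
```

Because the group velocity of a Lamb mode is frequency dependent, the time delay must be read at the same wavelet scale (frequency) at which v_g is evaluated.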

  4. Source location in plates based on the multiple sensors array method and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon [Inha University, Incheon (Korea, Republic of)

    2014-01-15

A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on a plate and the distance between the impact position and the sensor should be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained by using the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is experimentally estimated using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.

  5. Irradiation Pattern Analysis for Designing Light Sources-Based on Light Emitting Diodes

    International Nuclear Information System (INIS)

    Rojas, E.; Stolik, S.; La Rosa, J. de; Valor, A.

    2016-01-01

Nowadays it is possible to design light sources with a specific irradiation pattern for many applications. As a result of their rapid development, light emitting diodes offer features such as high luminous efficiency, durability, reliability and flexibility, among others. In this paper an analysis of the irradiation pattern of light emitting diodes is presented. The approximation of these irradiation patterns by both Lambertian and Gaussian functions for the design of light sources is proposed. Finally, the obtained results and the practicality of fitting the irradiation pattern of light emitting diodes with these functions are discussed. (Author)
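The Lambertian approximation mentioned in this abstract is conventionally written I(θ) = I0·cos^m(θ), with the exponent m fixed by the LED's half-intensity angle. A short sketch under that assumption (the 20-degree part is hypothetical, for contrast):

```python
import numpy as np

def lambertian_order(theta_half_deg):
    """Exponent m of the generalized Lambertian model I(theta) = I0*cos(theta)**m,
    chosen so the intensity drops to half at the LED's half-intensity angle."""
    return np.log(0.5) / np.log(np.cos(np.deg2rad(theta_half_deg)))

# An ideal Lambertian emitter has a 60-degree half-angle, i.e. m = 1:
m_lambertian = lambertian_order(60.0)
# A narrower (hypothetical) 20-degree part needs a much larger exponent:
m_narrow = lambertian_order(20.0)
print(m_lambertian, m_narrow)
```

A Gaussian alternative, I(θ) = I0·exp(−θ²/(2σ²)), is fitted the same way by matching σ to the measured half-intensity angle; which form fits better depends on the LED package.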

  6. Design of a setup for a 252Cf neutron source for storage and analysis purposes

    Energy Technology Data Exchange (ETDEWEB)

    Hei, Daqian [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Zhuang, Haocheng [Xi’an Middle School of Shanxi Province, Xi’an 710000 (China); Jia, Wenbao, E-mail: jiawenbao@163.com [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China); Cheng, Can; Jiang, Zhou; Wang, Hongtao [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Chen, Da [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China)

    2016-11-01

252Cf is a reliable isotopic neutron source widely used in the prompt gamma-ray neutron activation analysis (PGNAA) technique. A cylindrical barrel made of polymethyl methacrylate and filled with boric acid solution was designed for the storage and application of a 5 μg 252Cf neutron source. The size of the setup was optimized with a Monte Carlo code. Experiments showed that with the setup the doses were reduced below the allowable limit. The intensity and collimating radius of the neutron beam could also be adjusted through different collimators.

  7. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

One assumption frequently made is that of ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but the effect of this assumption on the results of consequence analysis needs to be examined quantitatively. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of difference between their results was the effective source release height assumed by each study. This underlines the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and its influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose calculated assuming ground-level release is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of ground-level release fundamentally precludes detailed analysis of the diffusion of the plume from the effective plume height to the ground, even though this influence is relatively smaller at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites with low surface roughness, such as the Barakah site in
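The over-estimation introduced by a ground-level release assumption can be illustrated with the textbook Gaussian plume expression for the ground-level centerline concentration; this is a generic sketch, not the HotSpot implementation, and the dispersion parameters below are assumed values for one downwind distance:

```python
import numpy as np

def ground_conc(Q, u, sigma_y, sigma_z, H):
    """Ground-level centerline concentration of a Gaussian plume with full
    ground reflection; H is the effective release height (m)."""
    return Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2.0 * sigma_z**2))

# Assumed source strength, wind speed and dispersion parameters (illustrative):
Q, u, sy, sz = 1.0, 3.0, 80.0, 40.0
c_ground = ground_conc(Q, u, sy, sz, H=0.0)   # ground-level release assumption
c_stack = ground_conc(Q, u, sy, sz, H=60.0)   # 60 m effective release height
ratio = c_ground / c_stack                    # over-estimation factor exp(H^2/(2*sz^2))
print(ratio)
```

The ratio grows as exp(H²/(2σz²)), so the over-estimation is largest close to the source, where σz is small relative to H, and decays with distance as σz grows — consistent with the distance dependence the abstract describes.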

  8. Analysis of a carbon dioxide transcritical power cycle using a low temperature source

    International Nuclear Information System (INIS)

    Cayer, Emmanuel; Galanis, Nicolas; Desilets, Martin; Nesreddine, Hakim; Roy, Philippe

    2009-01-01

A detailed analysis of a carbon dioxide transcritical power cycle using an industrial low-grade stream of process gases as its heat source is presented. The methodology is divided into four steps: energy analysis, exergy analysis, finite size thermodynamics and calculation of the heat exchangers' surface. The results have been calculated for fixed temperature and mass flow rate of the heat source, fixed maximum and minimum temperatures in the cycle and a fixed sink temperature by varying the high pressure of the cycle and its net power output. The main results show the existence of an optimum high pressure for each of the four steps; in the first two steps, the optimum pressure maximises the thermal or exergetic efficiency while in the last two steps it minimises the product UA or the heat exchangers' surface. These high pressures are very similar for the energy and exergy analyses. The last two steps also have nearly identical optimizing high pressures that are significantly lower than those for the first two steps. In addition, the results show that increasing the net power output produced from the limited energy source has no influence on the results of the energy analysis, decreases the exergetic efficiency and increases the heat exchangers' surface. Changing the net power output has no significant impact on the high pressures optimizing each of the four steps

  9. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and clearly revealed the true cause of failure: impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Welds made with a pulsed-arc technique did not show this impurity buildup in the ramp-down zone

  10. Multi-Criteria Analysis to Prioritize Energy Sources for Ambience in Poultry Production

    Directory of Open Access Journals (Sweden)

    DC Collatto

This paper outlines a multi-criteria analysis model to pinpoint the most suitable energy source for heating aviaries in poultry broiler production, from the farmer's point of view and from an environmental perspective. The criteria were identified through an exploratory study in three poultry broiler production units located in the mountain region of Rio Grande do Sul. In order to select the energy source, the Analytic Hierarchy Process was applied. The criteria determined and validated in the research comprised the cost of the energy source, lead time, investment in equipment, energy efficiency, quality of life and environmental impacts. The result of applying the method revealed firewood as the most appropriate energy source for heating. The decision support model developed could be replicated in order to strengthen the criteria and energy alternatives presented, besides identifying new criteria and alternatives that were not considered in this study.
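The Analytic Hierarchy Process step named above boils down to extracting the principal eigenvector of a pairwise-comparison matrix and checking its consistency. A minimal sketch; the 3x3 matrix below is hypothetical and does not reproduce the study's judgments:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
# heating energy alternatives on one criterion (reciprocal by construction):
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                             # principal eigenvalue lambda_max
w = np.abs(vecs[:, k].real)
w /= w.sum()                                         # AHP priority vector (weights)
CI = (vals.real[k] - A.shape[0]) / (A.shape[0] - 1)  # consistency index
print(w.round(3), round(CI, 4))
```

In a full AHP the same computation runs once per criterion and once for the criteria themselves, and the alternative scores are the weighted sums; a consistency ratio (CI divided by a random index) above about 0.1 signals that the judgments should be revised.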

  11. A THEORETICAL ANALYSIS OF KEY POINTS WHEN CHOOSING OPEN SOURCE ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Fernando Gustavo Dos Santos Gripe

    2011-08-01

The present work presents a theoretical analysis of the main features of Open Source ERP systems, herein identified as technical success factors, in order to contribute to the establishment of parameters to be used in decision-making processes when choosing a system that fulfills the organization's needs. Initially, the life cycle of ERP systems is contextualized, highlighting the features of Open Source ERP systems. It was verified that, when carefully analyzed, these systems need further attention regarding project continuity and maturity, structure, transparency, updating frequency, and support, all of which are inherent to the reality of this type of software. Nevertheless, advantages were observed regarding flexibility, costs, and non-discontinuity. The main goal is to broaden the discussion about the adoption of Open Source ERP systems.

  12. Neutron activation analysis of essential elements in Multani mitti clay using miniature neutron source reactor

    International Nuclear Information System (INIS)

    Waheed, S.; Rahman, S.; Faiz, Y.; Siddique, N.

    2012-01-01

Multani mitti clay was studied for 19 essential and other elements. Four different radio-assay schemes were adopted for instrumental neutron activation analysis (INAA) using a miniature neutron source reactor. The estimated weekly intakes of Cr and Fe are high for men, women, pregnant and lactating women and children, while the intake of Co is higher in adult categories and that of Mn by pregnant women. Comparison of Multani mitti clay with other types of clay shows that it is a good source of essential elements. - Highlights: • Multani mitti clay has been studied for 19 essential elements for human adequacy and safety using INAA and AAS. • Weekly intakes for different consumer categories have been calculated and compared with DRIs. • Comparison with other types of clay shows that Multani mitti clay is a good source of essential elements.

  13. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident

  14. Use of the spectral analysis for estimating the intensity of a weak periodic source

    International Nuclear Information System (INIS)

    Marseguerra, M.

    1989-01-01

This paper deals with the possibility of exploiting spectral methods for the analysis of counting experiments in which one has to estimate the intensity of a weak periodic source of particles buried in a high background. The general theoretical expressions here obtained for the auto- and cross-spectra are applied to three kinds of simulated experiments. In all cases it turns out that the source intensity can actually be estimated with a standard deviation comparable with that obtained in classical experiments in which the source can be moved out. Thus spectral methods represent an interesting technique, nowadays easy to implement on low-cost computers, which could also be used in many research fields by suitably redesigning classical experiments. The convenience of using these methods in the field of nuclear safeguards is presently investigated in our Institute. (orig.)
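The core idea — recovering a weak periodic source from counts dominated by background — can be sketched with a simulated counting experiment and a single Fourier bin; all parameters below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
N, f0 = 2000, 0.05      # number of counting intervals; known modulation frequency
t = np.arange(N)
background = 100.0      # mean background counts per interval
s = 4.0                 # weak periodic source strength (counts per interval)
rate = background + s * (1.0 + np.cos(2.0 * np.pi * f0 * t)) / 2.0
counts = rng.poisson(rate)

# The Fourier amplitude at the known frequency estimates the source strength,
# since the flat background contributes only noise to that bin:
X = np.fft.rfft(counts - counts.mean())
freqs = np.fft.rfftfreq(N)
k = np.argmin(np.abs(freqs - f0))
s_hat = 4.0 * np.abs(X[k]) / N   # cosine amplitude s/2 -> full strength s
print(s_hat)                     # close to 4 despite the ~100-count background
```

The estimate works without ever switching the source off: the background occupies the zero-frequency bin, while the periodic source concentrates its power at f0, which is the essence of the spectral approach the abstract describes.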

  15. Physical activity and social support in adolescents: analysis of different types and sources of social support.

    Science.gov (United States)

    Mendonça, Gerfeson; Júnior, José Cazuza de Farias

    2015-01-01

Little is known about the influence of different types and sources of social support on physical activity in adolescents. The aim of this study was to analyse the association between physical activity and different types and sources of social support in adolescents. The sample consisted of 2,859 adolescents between 14-19 years of age in the city of João Pessoa, in Northeastern Brazil. Physical activity was measured with a questionnaire and social support from parents and friends with a 10-item scale, five items for each group (type of support: encouragement, joint participation, watching, inviting, positive comments and transportation). Multivariable analysis showed that the types of support provided by parents associated with physical activity in adolescents were encouragement for females (P genders (males: P = 0.009; females: P physical activity varies according to its source, as well as the gender and age of the adolescents.

  16. Systems analysis and engineering of the X-1 Advanced Radiation Source

    International Nuclear Information System (INIS)

    Rochau, G.E.; Hands, J.A.; Raglin, P.S.; Ramirez, J.J.

    1998-01-01

    The X-1 Advanced Radiation Source, which will produce ∼ 16 MJ in x-rays, represents the next step in providing US Department of Energy's Stockpile Stewardship program with the high-energy, large volume, laboratory x-ray sources needed for the Radiation Effects Science and Simulation (RES), Inertial Confinement Fusion (ICF), and Weapon Physics (WP) Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator in 1997 provide sufficient basis for pursuing the development of X-1. This paper will introduce the X-1 Advanced Radiation Source Facility Project, describe the systems analysis and engineering approach being used, and identify critical technology areas being researched

  17. Source Apportionment and Influencing Factor Analysis of Residential Indoor PM2.5 in Beijing

    Science.gov (United States)

    Yang, Yibing; Liu, Liu; Xu, Chunyu; Li, Na; Liu, Zhe; Wang, Qin; Xu, Dongqun

    2018-01-01

In order to identify the sources of indoor PM2.5 and to determine which factors influence the concentrations of indoor PM2.5 and its chemical elements, indoor concentrations of PM2.5 and its related elements in residential houses in Beijing were explored. Indoor and outdoor PM2.5 samples that were monitored continuously for one week were collected. Indoor and outdoor concentrations of PM2.5 and 15 elements (Al, As, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, Pb, Se, Tl, V, Zn) were calculated and compared. The median indoor concentration of PM2.5 was 57.64 μg/m3. Among the elements in indoor PM2.5, Cd and As may be sensitive to indoor smoking; Zn, Ca and Al may be related to indoor sources other than smoking; and Pb, V and Se may mainly come from outdoors. Five factors, together explaining 76.8% of the total variance, were extracted for indoor PM2.5 by factor analysis; outdoor sources contributed more than indoor sources. Multiple linear regression analysis for indoor PM2.5, Cd and Pb was performed. Indoor PM2.5 was influenced by factors including outdoor PM2.5, smoking during sampling, outdoor temperature and time of air-conditioner use. Indoor Cd was affected by factors including smoking during sampling, outdoor Cd and building age. Indoor Pb concentration was associated with factors including outdoor Pb, time of window opening per day, building age and relative humidity. In conclusion, indoor PM2.5 mainly comes from outdoor sources, but the contributions of indoor sources cannot be ignored. Factors associated with indoor-outdoor air exchange can influence the concentrations of indoor PM2.5 and its constituents. PMID:29621164
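The multiple linear regression step described above amounts to ordinary least squares on a design matrix of candidate predictors. A sketch on synthetic data — the predictors echo those in the abstract, but every coefficient and unit here is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
# Synthetic predictors of indoor PM2.5 (values and effects invented):
outdoor = rng.uniform(20.0, 150.0, n)   # outdoor PM2.5 (ug/m3)
smoking = rng.integers(0, 2, n)         # smoking during sampling (0/1)
ac_hours = rng.uniform(0.0, 12.0, n)    # air-conditioner use (h/day)
indoor = (10.0 + 0.6 * outdoor + 25.0 * smoking
          - 1.5 * ac_hours + rng.normal(0.0, 5.0, n))

# Ordinary least squares fit of indoor PM2.5 on the candidate factors:
A = np.column_stack([np.ones(n), outdoor, smoking, ac_hours])
coef, *_ = np.linalg.lstsq(A, indoor, rcond=None)
print(coef.round(2))  # recovers roughly [10, 0.6, 25, -1.5]
```

The fitted coefficients carry the interpretation used in the abstract: the sign and size of each one indicate how strongly that factor raises or lowers the indoor concentration, holding the others fixed.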

  18. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo G.M.; Van Der Goot, Erik; Verile, Marco; Wolfart, Erik; Rutan Fowler, Marcy; Feldman, Yana; Hammond, William; Schweighardt, John; Ferguson, Mattew

    2013-01-01

Acquisition and analysis of open source information plays an increasingly important role in the IAEA’s move towards safeguards implementation based on all safeguards-relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA’s Division of Information Management (SGIM) and the EC’s Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA’s workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC’s Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA’s workflow for the production of SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which focused (i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (Nuclear Security Media Monitor, NSMM) and (ii) on evaluating the NSMM/NewsDesk against the IAEA’s needs. Finally, it presents the current use of NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  19. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the analysis of the correlation between the landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, due to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscapes at the global and local level. The results show that the source landscape of atmospheric haze pollution is the building class, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing these models, the fits of the SLM, SEM and GWR models are significantly better than that of the OLS model. The SLM model is superior to the SEM model in this paper. Although the fit of the GWR model is worse than that of the SLM, the degree of influence of the factors affecting atmospheric haze can be expressed more clearly for different geographic areas. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents

  20. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the analysis of the correlation between the landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, due to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscapes at the global and local level. The results show that the source landscape of atmospheric haze pollution is the building class, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing these models, the fits of the SLM, SEM and GWR models are significantly better than that of the OLS model. The SLM model is superior to the SEM model in this paper. Although the fit of the GWR model is worse than that of the SLM, the degree of influence of the factors affecting atmospheric haze can be expressed more clearly for different geographic areas. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze

  1. Identifying sources of emerging organic contaminants in a mixed use watershed using principal components analysis.

    Science.gov (United States)

    Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L

    2014-01-01

    Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.
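
    The grouping step can be sketched with a small synthetic example: standardize the concentration matrix, extract two principal components, and read compound-source associations off the loadings. The data and the two latent "urban" and "agricultural" signals below are invented for illustration; only the PCA recipe mirrors the abstract:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: rows = water samples, columns = six contaminants,
# the first three driven by an "urban" factor, the last three by an
# "agricultural" factor.
rng = np.random.default_rng(0)
urban = rng.normal(size=(40, 1))
agri = rng.normal(size=(40, 1))
X = np.hstack([urban + 0.1 * rng.normal(size=(40, 3)),
               agri + 0.1 * rng.normal(size=(40, 3))])

Z = StandardScaler().fit_transform(X)      # PCA on standardized data
pca = PCA(n_components=2).fit(Z)
loadings = pca.components_                 # (2, 6): loadings per PC
explained = pca.explained_variance_ratio_.sum()
```

    Compounds whose variance is best explained by the same component (a high absolute loading on it) are grouped as sharing a source, which is the logic used above to separate wastewater-derived from agricultural contaminants.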

  2. Time Series Analysis of Monte Carlo Fission Sources - I: Dominance Ratio Computation

    International Nuclear Information System (INIS)

    Ueki, Taro; Brown, Forrest B.; Parsons, D. Kent; Warsa, James S.

    2004-01-01

    In the nuclear engineering community, the error propagation of the Monte Carlo fission source distribution through cycles is known to be a linear Markov process when the number of histories per cycle is sufficiently large. In the statistics community, linear Markov processes with linear observation functions are known to have an autoregressive moving average (ARMA) representation of orders p and p - 1. Therefore, one can perform ARMA fitting of the binned Monte Carlo fission source in order to compute physical and statistical quantities relevant to nuclear criticality analysis. In this work, the ARMA fitting of a binned Monte Carlo fission source has been successfully developed as a method to compute the dominance ratio, i.e., the ratio of the second-largest to the largest eigenvalue. The method is free of binning mesh refinement and does not require the alteration of the basic source iteration cycle algorithm. Numerical results are presented for problems with one-group isotropic, two-group linearly anisotropic, and continuous-energy cross sections. Also, a strategy for the analysis of eigenmodes higher than the second-largest eigenvalue is demonstrated numerically.

  3. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    Science.gov (United States)

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
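
    As a concrete instance of the disproportionality analysis mentioned above, the proportional reporting ratio (PRR) compares how often an adverse event is reported with a drug of interest versus with all other drugs. The counts and the screening threshold below are invented for illustration:

```python
# 2x2 contingency table from a spontaneous reporting database
a = 40      # drug of interest, event of interest
b = 960     # drug of interest, all other events
c = 200     # all other drugs, event of interest
d = 98_800  # all other drugs, all other events

prr = (a / (a + b)) / (c / (c + d))

# An illustrative screening rule: flag a potential signal when the PRR
# exceeds 2 and at least 3 cases were reported.
signal = prr > 2 and a >= 3
```

    Here PRR = 0.04 / 0.00202 ≈ 19.8, so the event is reported roughly twenty times more frequently with the drug of interest than with the rest of the database — a candidate signal that still requires clinical review, since disproportionality alone does not establish causality.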

  4. An Analysis of Air Pollution in Makkah - a View Point of Source Identification

    Directory of Open Access Journals (Sweden)

    Turki M. Habeebullah

    2013-07-01

    Makkah is one of the busiest cities in Saudi Arabia and remains busy all year round, especially during the season of Hajj and the month of Ramadan, when millions of people visit the city. This emphasizes the importance of clean air and of understanding the sources of various air pollutants, which is vital for the management and advanced modeling of air pollution. This study identifies the major sources of air pollutants in Makkah, near the Holy Mosque (Al-Haram), using a graphical approach. The air pollutants considered are nitrogen oxides (NOx), nitrogen dioxide (NO2), nitric oxide (NO), carbon monoxide (CO), sulphur dioxide (SO2), ozone (O3) and particulate matter with an aerodynamic diameter of 10 µm or less (PM10). Polar plots, time-variation plots and correlation analysis are used to analyse the data and identify the major sources of emissions. Most of the pollutants show high concentrations during the morning traffic peak hours, suggesting road traffic as the main source of emission. The main sources of pollutant emissions identified in Makkah were road traffic and re-suspended and windblown dust and sand particles. Further investigation of detailed source apportionment is required, which is part of the ongoing project.

  5. Municipal solid waste source-separated collection in China: A comparative analysis

    International Nuclear Information System (INIS)

    Tai Jun; Zhang Weiqian; Che Yue; Feng Di

    2011-01-01

    A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of eight years of implementation in those cities. This paper provides an overview of the different methods of collection, transportation and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that its characteristics are similar across the cities: low calorific value, high moisture content and a high proportion of organic matter. Differences that exist among the eight cities in municipal solid waste management (MSWM) are also presented. Only Beijing and Shanghai achieved relatively effective results in implementing MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables should be separated at the source. The stakeholders involved play an important role in MSWM, so their responsibilities should be clearly identified. Improvements in legislation, coordination mechanisms and public education are problematic issues that need to be addressed.

  6. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in such studies: the assignment of a factor to a source in factor-analytical models (source identification) and the evaluation of model performance. Source identification is based on the similarity between a given factor and source chemical profiles from public databases. Model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical-profile and time-series similarity. In this study, a sensitivity analysis of the model performance criteria is carried out using the results of a synthetic dataset for which "a priori" references are available. The consensus-modulated standard deviation punc is the best choice for model performance evaluation when a conservative approach is adopted.
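
    The source-identification step can be sketched as ranking candidate chemical profiles by similarity to a factor profile. Pearson correlation is used below as a stand-in similarity metric, and the profiles and source names are hypothetical (DeltaSA's actual metrics and reference databases differ):

```python
import numpy as np

def best_matching_source(factor, source_profiles):
    """Rank candidate source chemical profiles by Pearson correlation
    with a factor profile and return the best match with all scores."""
    scores = {name: np.corrcoef(factor, prof)[0, 1]
              for name, prof in source_profiles.items()}
    return max(scores, key=scores.get), scores

# Hypothetical species mass fractions for one resolved factor and two
# candidate source profiles from a public database.
factor = np.array([0.30, 0.05, 0.40, 0.10, 0.15])
profiles = {
    "traffic":  np.array([0.28, 0.07, 0.38, 0.12, 0.15]),
    "sea_salt": np.array([0.02, 0.60, 0.05, 0.30, 0.03]),
}
best, scores = best_matching_source(factor, profiles)
```

    A factor is assigned to the source whose reference profile it resembles most; benchmarking then asks how far the model output lies from the ensemble reference values.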

  7. Dust Storm over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis

    Science.gov (United States)

    Moridnejad, A.; Karimi, N.; Ariya, P. A.

    2014-12-01

    The Middle East region is considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms characterized on MODIS images during the period 2001-2012, we present a new high-resolution mapping of the major atmospheric dust source points in this region. To assist environmental managers and decision makers in taking proper and prioritized measures, we categorize the identified sources by intensity, based on indices extracted for the Deep Blue algorithm, and use a frequency-of-occurrence approach to find the most sensitive sources. Next, by applying spectral mixture analysis to Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced the dust storm points in the region. Preliminary results of this study indicate for the first time that ca. 39% of all detected source points are located in these newly anthropogenically desertified areas. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate concern at a global scale. During the next six months, further research will be performed to confirm these preliminary results.

  8. Major models and data sources for residential and commercial sector energy conservation analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    Major models and data sources are reviewed that can be used for energy-conservation analysis in the residential and commercial sectors to provide an introduction to the information that can or is available to DOE in order to further its efforts in analyzing and quantifying their policy and program requirements. Models and data sources examined in the residential sector are: ORNL Residential Energy Model; BECOM; NEPOOL; MATH/CHRDS; NIECS; Energy Consumption Data Base: Household Sector; Patterns of Energy Use by Electrical Appliances Data Base; Annual Housing Survey; 1970 Census of Housing; AIA Research Corporation Data Base; RECS; Solar Market Development Model; and ORNL Buildings Energy Use Data Book. Models and data sources examined in the commercial sector are: ORNL Commercial Sector Model of Energy Demand; BECOM; NEPOOL; Energy Consumption Data Base: Commercial Sector; F.W. Dodge Data Base; NFIB Energy Report for Small Businesses; ADL Commercial Sector Energy Use Data Base; AIA Research Corporation Data Base; Nonresidential Buildings Surveys of Energy Consumption; General Electric Co: Commercial Sector Data Base; The BOMA Commercial Sector Data Base; The Tishman-Syska and Hennessy Data Base; The NEMA Commercial Sector Data Base; ORNL Buildings Energy Use Data Book; and Solar Market Development Model. Purpose; basis for model structure; policy variables and parameters; level of regional, sectoral, and fuels detail; outputs; input requirements; sources of data; computer accessibility and requirements; and a bibliography are provided for each model and data source.

  9. Bulk-Sample Gamma-Ray Activation Analysis (PGNAA) with Isotopic Neutron Sources

    International Nuclear Information System (INIS)

    HASSAN, A.M.

    2009-01-01

    An overview is given of research on the prompt gamma-ray neutron activation analysis (PGNAA) of bulk samples. Some aspects of bulk-sample PGNAA are discussed, where irradiation by isotopic neutron sources is used mostly for in-situ or on-line analysis. The research was carried out in a comparative and/or qualitative way, or by using prior knowledge about the sample material. Sometimes the assumption that the mass fractions of all determined elements add up to 1 must be used. Sensitivity curves are also used for some elements in such complex samples to estimate exact percentage concentration values. The uses of 252Cf, 241Am/Be and 239Pu/Be isotopic neutron sources for the elemental investigation of hematite, ilmenite, coal, petroleum, edible oils, phosphates and polluted lake water samples are described.

  10. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  11. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
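
    A minimal plug-in estimator for the mutual information between a discrete stimulus and a discrete response — the core quantity such toolboxes compute — written here without the small-sample bias corrections those toolboxes implement:

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in estimate of I(S;R) in bits from paired discrete samples.
    No bias correction; real toolboxes add small-sample corrections."""
    stim, resp = np.asarray(stim), np.asarray(resp)
    mi = 0.0
    for s in np.unique(stim):
        p_s = np.mean(stim == s)
        for r in np.unique(resp):
            p_r = np.mean(resp == r)
            p_sr = np.mean((stim == s) & (resp == r))
            if p_sr > 0:
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

# A response that copies a balanced binary stimulus carries exactly 1 bit;
# a constant response carries none.
stim = np.repeat([0, 1], 200)
mi_copy = mutual_information(stim, stim)
mi_const = mutual_information(stim, stim * 0)
```

    Questions like the role of spike-timing precision are then asked by recomputing this quantity for response codes of increasing temporal resolution and seeing how much information each refinement adds.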

  12. PIXE Analysis and source identification of airborne particulate matter collected in Downtown Havana City

    International Nuclear Information System (INIS)

    Perez, G.; Pinnera, I; Ramos, M; Guibert, R; Molina, E.; Martinez, M.; Fernandez, A.; Aldape, F.; Flores, M.

    2009-01-01

    A set of samples containing airborne particulate matter (in two particle size fractions, PM10 and PM2.5) collected over five months, from November 2006 to April 2007, in an urban area of Havana City was analyzed by the Particle-Induced X-ray Emission (PIXE) technique, and the concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were determined consistently in both particle size fractions, with minimum detection limits in the range of ng/m3. A Gent air sampler was used to collect the PM10 and PM2.5 aerosol fractions simultaneously, and the PIXE elemental analysis was performed using a 2.5 MeV proton beam from the 2 MV Van de Graaff Tandetron accelerator at the ININ PIXE Laboratory in Mexico. The analytical database provided by PIXE was statistically analyzed in order to determine the likely local pollution sources. Multivariate factor analysis in combination with principal component analysis was applied to these data and allowed the identification of five main pollution sources of the airborne particulate matter (PM10 and PM2.5) collected in this area. The main (local) identified sources were: soil dust, sea spray, industry, fossil fuel combustion from motor vehicles, and burning or incineration of diverse materials. A general discussion of these results is presented in this work. (Author)

  13. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R2 …), providing insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
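
    The sampling-based step can be sketched as follows: draw Monte Carlo samples of the uncertain inputs, run them through the model (a toy linear model here, not a WWTP simulator), and regress the standardized output on the standardized inputs. The squared standardized regression coefficients (SRCs) then apportion the output variance, which is valid when the R² of the regression is high:

```python
import numpy as np

# Monte Carlo sample of three hypothetical uncertain inputs
rng = np.random.default_rng(2)
n = 2000
X = rng.normal(size=(n, 3))
# Toy "plant model": input 0 dominates, input 2 barely matters
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.5, n)

# Standardize, then ordinary least squares: the coefficients are the SRCs
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
r2 = 1 - np.sum((ys - Xs @ src) ** 2) / np.sum(ys ** 2)
```

    Ranking inputs by SRC² is exactly the prioritization of uncertainty sources described above: here input 0 accounts for most of the output variance, so reducing its uncertainty pays off most.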

  14. AtomicJ: An open source software for analysis of force curves

    Science.gov (United States)

    Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina

    2014-06-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.
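
    The basic force-curve analysis step can be illustrated with the Hertz contact model for a spherical tip, F = (4/3)·E/(1-ν²)·√R·δ^(3/2). AtomicJ supports many more contact models and corrections, so this is only the simplest case, with synthetic data:

```python
import numpy as np

def hertz_modulus(indentation, force, radius, poisson=0.5):
    """Estimate Young's modulus from a force curve via the Hertz model
    for a spherical tip: least-squares fit of F against d^(3/2).
    All quantities in SI units."""
    x = indentation ** 1.5
    slope = np.sum(x * force) / np.sum(x * x)   # through-origin fit
    return 0.75 * slope * (1 - poisson ** 2) / np.sqrt(radius)

# Synthetic noise-free curve: E = 10 kPa, tip radius 1 µm, nu = 0.5
E_true, R = 10e3, 1e-6
d = np.linspace(0, 200e-9, 50)
F = (4 / 3) * E_true / (1 - 0.25) * np.sqrt(R) * d ** 1.5
E_est = hertz_modulus(d, F, R)
```

    Mapping repeats this fit over a grid of force curves, yielding the per-pixel Young's modulus maps described in the abstract; contact-point detection and deviations from Hertzian contact are the hard parts the software actually handles.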

  15. AtomicJ: An open source software for analysis of force curves

    International Nuclear Information System (INIS)

    Hermanowicz, Paweł; Gabryś, Halina; Sarna, Michał; Burda, Kvetoslava

    2014-01-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh

  16. Spectrum analysis of a voltage source converter due to semiconductor voltage drops

    DEFF Research Database (Denmark)

    Rasmussen, Tonny Wederberg; Eltouki, Mustafa

    2017-01-01

    It is known that power electronic voltage source converters are non-ideal. This paper presents a state-of-the-art review of the effect of the semiconductor voltage drop on the output voltage spectrum, using a single-phase H-bridge two-level converter topology with naturally sampled pulse width modulation. The paper describes the analysis of the output voltage spectrum when the semiconductor voltage drop is added. The analysis of the spectral contribution with and without the semiconductor voltage drop reveals good agreement between the theoretical results, simulations and laboratory measurements.

  17. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Bing Yang

    2013-01-01

    A ring-plate-type cycloid speed reducer is one of the most important reducers owing to its small volume, compact structure, smooth operation, high performance and high reliability. The vibration and noise tests of the reducer prototype were completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise were obtained based on coherence analysis and the noise sources were identified. The conclusions provide a basis for further noise research and control of the ring-plate-type cycloid reducer.
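
    The coherence-based identification step can be sketched with scipy: two channels that share a narrowband component show magnitude-squared coherence near 1 at that frequency, pointing to a common source. The 120 Hz line (standing in for a gear-mesh component) and the noise levels below are invented:

```python
import numpy as np
from scipy.signal import coherence

fs = 2000                                   # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(3)
shared = np.sin(2 * np.pi * 120 * t)        # common 120 Hz component
vib = shared + 0.5 * rng.normal(size=t.size)        # vibration channel
mic = 0.8 * shared + 0.5 * rng.normal(size=t.size)  # microphone channel

# Magnitude-squared coherence, Welch-averaged over 512-sample segments
f, Cxy = coherence(vib, mic, fs=fs, nperseg=512)
peak_freq = f[np.argmax(Cxy)]
```

    In a test like the one described above, high coherence between a structural vibration channel and the radiated noise at a specific frequency implicates that component as a noise source; incoherent bands are attributed to independent noise.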

  18. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    This is the first part of a two-part article that surveys data sources likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements that can be extracted from web server logs and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
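
    A minimal sketch of the web-server log parsing that underlies the tools discussed in Part I, using Python's standard library on Common Log Format lines (the sample entries are fabricated):

```python
import re
from collections import Counter

# Apache Common Log Format: client, identd, user, timestamp, request,
# status and response size.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\S+)'
)

lines = [
    '192.0.2.1 - - [10/Mar/2007:12:00:01 -0330] "GET /index.html HTTP/1.1" 200 5120',
    '192.0.2.2 - - [10/Mar/2007:12:00:05 -0330] "GET /opac/search HTTP/1.1" 200 880',
    '192.0.2.1 - - [10/Mar/2007:12:00:09 -0330] "GET /index.html HTTP/1.1" 304 0',
]

hits = Counter()
for line in lines:
    m = LOG_RE.match(line)
    if m:
        hits[m.group("path")] += 1
```

    Log analysis tools build on this extraction step, then aggregate by time, status code, client or path, and filter out crawler traffic before reporting collection and service usage.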

  19. Multicriteria analysis for sources of renewable energy using data from remote sensing

    Science.gov (United States)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis of sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and social acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High-resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study covers long-term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in the mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results are map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class corresponds to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities. The results also show a slight increase in the more …
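
    The reclassify-and-combine step can be sketched with plain numpy arrays standing in for GIS raster layers: each criterion is rescaled to the common 1-7 preference scale and the layers are combined with weights. The grids, weights and criteria below are hypothetical:

```python
import numpy as np

# Hypothetical 5x5 criterion rasters already resampled to a common grid
rng = np.random.default_rng(4)
solar = rng.uniform(900, 1300, (5, 5))    # kWh/m2/yr (higher is better)
wind = rng.uniform(2.0, 8.0, (5, 5))      # m/s (higher is better)
road_dist = rng.uniform(0, 5000, (5, 5))  # metres (lower is better)

def reclassify(raster, n_classes=7, invert=False):
    """Rescale a raster to preference classes 1..n (1 = minimum preference)."""
    norm = (raster - raster.min()) / (raster.max() - raster.min())
    if invert:
        norm = 1.0 - norm
    return 1 + np.floor(norm * (n_classes - 1)).astype(int)

# Raster-algebra overlay: a weighted sum of the reclassified layers
weights = {"solar": 0.5, "wind": 0.3, "road": 0.2}
suitability = (weights["solar"] * reclassify(solar)
               + weights["wind"] * reclassify(wind)
               + weights["road"] * reclassify(road_dist, invert=True))
```

    A masking layer (here, the CORINE-derived restriction to mining and agricultural areas) would then zero out ineligible cells before the preference classes are reported.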

  20. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) searching a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) selecting items of interest for specific verifications; and c) mapping these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a ''Nuclear Security Media Monitor'' (NSMM), a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper recalls the trade data sources relevant for non-proliferation and illustrates the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part presents the main aspects of the NSMM, also by illustrating some of the uses made at JRC. (author)

  1. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    Directory of Open Access Journals (Sweden)

    Kooyman Timothée

    2017-01-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution to implement minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides to be loaded in the blankets must be increased to maintain acceptable performances. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of neutron source for instance. We propose here to implement an optimization methodology of the blankets design with regards to various parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performances of the blankets. In a first stage, an analysis of the various contributors to long- and short-term neutron and gamma source is carried out whereas in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performances compared to more energetic spectrum that could be achieved using metallic fuel for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.

  2. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    Science.gov (United States)

    Kooyman, Timothée; Buiron, Laurent; Rimpault, Gérald

    2017-09-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution to implement minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides to be loaded in the blankets must be increased to maintain acceptable performances. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of neutron source for instance. We propose here to implement an optimization methodology of the blankets design with regards to various parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performances of the blankets. In a first stage, an analysis of the various contributors to long and short term neutron and gamma source is carried out while in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performances compared to more energetic spectrum that could be achieved using metallic fuel for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.

  3. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    International Nuclear Information System (INIS)

    Kooyman, T.; Buiron, L.; Rimpault, G.

    2017-01-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution for implementing minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides loaded in the blankets must be increased to maintain acceptable performance. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of the neutron source, for instance. We propose here an optimization methodology for blanket design with regard to various parameters, such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performance of the blankets. In a first stage, an analysis of the various contributors to the long- and short-term neutron and gamma sources is carried out, whereas in a second stage relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impairing minor actinides transmutation performance, compared to the more energetic spectrum that could be achieved using metallic fuel, for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put on an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing. (authors)

  4. Reliability of MEG source imaging of anterior temporal spikes: analysis of an intracranially characterized spike focus.

    Science.gov (United States)

    Wennberg, Richard; Cheyne, Douglas

    2014-05-01

    To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-07-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71deg.17'N, 156deg.47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO{sub 2}), methane (CH{sub 4}), carbon monoxide (CO), ozone (O{sub 3}), aerosol scattering coefficient ({sigma}{sub sp}), aerosol number concentration (NC{sub asl}), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply three main research tools. First, an isentropic trajectory model is used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure is used to divide the trajectories into groups based on source regions. Third, various statistical tools (exploratory data analysis, two-component correlation analysis, trend analysis, principal component and factor analysis) are used to identify the relationship between the various chemical species and the source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on
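A minimal flavor of the trajectory-to-source-region step can be sketched in plain Python. This is not the clustering procedure the authors use (they cluster full trajectory shapes); it merely assigns each back-trajectory to a compass sector as seen from the Barrow receptor site, with the coordinates, sector bounds, and function name chosen here purely for illustration:

```python
import math

# Compass sectors as (start_bearing, end_bearing) in degrees; "N" wraps.
SECTORS = {"N": (315, 45), "E": (45, 135), "S": (135, 225), "W": (225, 315)}

def mean_bearing_sector(lats, lons, site_lat=71.3, site_lon=-156.8):
    """Crude source-region classifier: the compass sector of a back-
    trajectory's mean position as seen from the receptor site (Barrow
    by default). Real studies cluster full trajectories instead; this
    flat-earth approximation is for intuition only.
    """
    mlat = sum(lats) / len(lats)
    mlon = sum(lons) / len(lons)
    dx = (mlon - site_lon) * math.cos(math.radians(site_lat))  # ~east offset
    dy = mlat - site_lat                                       # north offset
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    for name, (lo, hi) in SECTORS.items():
        if (lo <= bearing < hi) or (lo > hi and (bearing >= lo or bearing < hi)):
            return name
    return "N"  # unreachable fallback

# A trajectory running due south of Barrow classifies as "S".
sector = mean_bearing_sector([60, 65, 70], [-156.8, -156.8, -156.8])
```

Counting how many trajectories fall in each sector per season is the simplest version of the transport-pattern statistics the abstract describes.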

  6. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    International Nuclear Information System (INIS)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-01-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71deg.17'N, 156deg.47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO 2 ), methane (CH 4 ), carbon monoxide (CO), ozone (O 3 ), aerosol scattering coefficient (σ sp ), aerosol number concentration (NC asl ), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply three main research tools. First, an isentropic trajectory model is used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure is used to divide the trajectories into groups based on source regions. Third, various statistical tools (exploratory data analysis, two-component correlation analysis, trend analysis, principal component and factor analysis) are used to identify the relationship between the various chemical species and the source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on atmospheric composition in the Arctic. We

  7. Analysis of the emission characteristics of ion sources for high-quality optical coating processes

    International Nuclear Information System (INIS)

    Beermann, Nils

    2009-01-01

    The production of complex high-quality thin-film systems requires a detailed understanding of all partial processes. One of the most relevant of these is the condensation of the coating material on the substrate surface. The optical and mechanical material properties can be adjusted by the well-defined impingement of energetic ions during deposition, and a variety of different ion sources have accordingly been developed. With respect to present and future challenges in the production of precisely fabricated high-performance optical coatings, however, the ion emission of these sources has so far not been characterized sufficiently. This question is addressed within the framework of this work, which is thematically situated in the field of process development and control for ion-assisted deposition processes. In a first step, a Faraday cup measurement system was developed which allows the spatially resolved determination of the ion energy distribution as well as the ion current distribution. Subsequently, the ion emission profiles of six ion sources were determined as a function of the relevant operating parameters, making available a data pool for process planning and supplementary process analysis. On the basis of the acquired results, the basic correlations between the operating parameters and the ion emission are demonstrated. The specific properties of the individual sources, as well as the respective control strategies, are pointed out with regard to the thin-film properties and production yield. Finally, a synthesis of the results and perspectives for future activities are given. (orig.)

  8. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairments. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limited monetary resources and the large number of waterbodies, available monitoring stations are typically sparse, with intermittent periods of data collection. Hence, scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as Dissolved Oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed on the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. This model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine those practices/changes that led to low DO concentration in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.
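BME itself involves a Bayesian update over space/time covariance models and soft data, which is beyond a short sketch; as a contrast case, the purely deterministic space-time inverse-distance estimator below shows the simpler kind of interpolation BME is claimed to outperform. The function name, the time-to-space scaling `c`, and the data points are all invented for illustration:

```python
def st_idw(obs, x0, t0, c=1.0, p=2.0):
    """Space-time inverse-distance-weighted estimate at location x0, time t0.

    obs: list of (x, t, value) observations; c converts time offsets into
    space-equivalent units; p is the distance-decay exponent.
    A deliberately simple stand-in for BME estimation: BME additionally
    propagates measurement uncertainty ("soft data") through Bayes' rule
    instead of weighting hard values only.
    """
    num = den = 0.0
    for x, t, v in obs:
        d = ((x - x0) ** 2 + (c * (t - t0)) ** 2) ** 0.5
        if d == 0.0:
            return v          # exact hit: return the observation itself
        w = d ** -p
        num += w * v
        den += w
    return num / den

# Midpoint between two equidistant DO readings -> their average.
do_estimate = st_idw([(0, 0, 2.0), (2, 0, 4.0)], 1.0, 0.0)
```

Mapping such estimates over a grid of unmonitored locations and times, then thresholding low-DO cells, is the shape of the CSA-identification step the abstract describes.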

  9. Preliminary radiation transport analysis for the proposed National Spallation Neutron Source (NSNS)

    International Nuclear Information System (INIS)

    Johnson, J.O.; Lillie, R.A.

    1997-01-01

    The use of neutrons in science and industry has increased continuously during the past 50 years, with applications now widely used in physics, chemistry, biology, engineering, and medicine. Within this history, the relative merits of pulsed accelerator spallation sources versus reactors have been debated as the preferred option for future neutron sources. To address this future need, the Department of Energy (DOE) has initiated a pre-conceptual design study for the National Spallation Neutron Source (NSNS) and given preliminary approval for the proposed facility to be built at Oak Ridge National Laboratory (ORNL). The DOE directive is to design and build a short-pulse spallation source in the 1 MW power range with sufficient design flexibility that it can be upgraded and operated at a significantly higher power at a later stage. The pre-conceptual design of the NSNS initially consists of an accelerator system capable of delivering a 1 to 2 GeV proton beam with 1 MW of beam power in an approximately 0.5 microsecond pulse at a 60 Hz frequency onto a single target station. The NSNS will be upgraded in stages to a 5 MW facility with two target stations (a high power station operating at 60 Hz and a low power station operating at 10 Hz). Each target station will contain four moderators (combinations of cryogenic and ambient temperature) and 18 beam lines for a total of 36 experiment stations. This paper summarizes the radiation transport analysis strategies for the proposed NSNS facility
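The quoted beam parameters (1 GeV, 1 MW, 60 Hz, roughly 0.5 microsecond pulses) fix the per-pulse quantities by simple arithmetic; this back-of-envelope check is ours, not part of the paper's analysis:

```python
# Back-of-envelope per-pulse numbers for a 1 GeV, 1 MW, 60 Hz proton beam.
E_CHARGE = 1.602176634e-19          # elementary charge, C
E_PROTON_J = 1.0e9 * E_CHARGE       # 1 GeV expressed in joules
BEAM_POWER_W = 1.0e6                # 1 MW average beam power
REP_RATE_HZ = 60.0                  # pulse repetition rate
PULSE_S = 0.5e-6                    # approximate pulse length

energy_per_pulse = BEAM_POWER_W / REP_RATE_HZ        # ~16.7 kJ per pulse
protons_per_pulse = energy_per_pulse / E_PROTON_J    # ~1.0e14 protons
peak_current = protons_per_pulse * E_CHARGE / PULSE_S  # ~33 A during the pulse
```

So each pulse carries on the order of 10^14 protons, which sets the scale for the source terms the radiation transport analysis must handle.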

  10. Visualization of NO2 emission sources using temporal and spatial pattern analysis in Asia

    Science.gov (United States)

    Schütt, A. M. N.; Kuhlmann, G.; Zhu, Y.; Lipkowitsch, I.; Wenig, M.

    2016-12-01

    Nitrogen dioxide (NO2) is an indicator of population density and level of development, but the contributions of the different emission sources to the overall concentrations remain mostly unknown. In order to allocate fractions of OMI NO2 to emission types, we investigate several temporal cycles and regional patterns. Our analysis is based on daily maps of tropospheric NO2 vertical column densities (VCDs) from the Ozone Monitoring Instrument (OMI). The data set is mapped to a high resolution grid by a histopolation algorithm based on a continuous parabolic spline, producing more realistic smooth distributions while reproducing the measured OMI values when integrating over ground pixel areas. In the resulting sequence of zoom-in maps, we analyze weekly and annual cycles for cities, countryside and highways in China, Japan and the Republic of Korea, look for patterns and trends, and compare the derived results to emission sources in Central Europe and North America. Because heating increases in winter compared to summer and traffic is heavier during the week than on Sundays, we can separate traffic, heating and power plants and visualize maps of the different sources. We will also look into the influence of emission control measures during big events like the Olympic Games 2008 and the World Expo 2010 as a possibility to confirm our classification of NO2 emission sources.
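The weekday/Sunday separation of traffic from heating and power-plant emissions can be illustrated with a toy weekly-cycle metric. This is an illustrative sketch, not the authors' algorithm; the function name and the interpretation of the ratio are assumptions:

```python
from statistics import mean

def weekend_effect(daily_vcd, first_weekday=0):
    """Ratio of mean Sunday to mean weekday (Mon-Fri) NO2 columns.

    daily_vcd: consecutive daily values; first_weekday: weekday of the
    first value, with 0 = Monday. A ratio near 1 suggests sources without
    a weekly cycle (heating, power plants); a ratio well below 1 points
    to traffic-dominated NO2.
    """
    sundays, weekdays = [], []
    for i, v in enumerate(daily_vcd):
        day = (first_weekday + i) % 7
        if day == 6:
            sundays.append(v)
        elif day < 5:
            weekdays.append(v)   # Saturdays (day == 5) are discarded
    return mean(sundays) / mean(weekdays)

# Two synthetic weeks: weekdays at 10 units, Sundays at 5 -> ratio 0.5.
ratio = weekend_effect([10, 10, 10, 10, 10, 8, 5] * 2)
```

Computing this ratio per grid cell and mapping it is the simplest version of the source-separation maps described above.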

  11. Analysis of geological material and especially ores by means of a 252Cf source

    International Nuclear Information System (INIS)

    Barrandon, J.N.; Borderie, B.; Melky, S.; Halfon, J.; Marce, A.

    1976-01-01

    Tests were made of the possibilities for analysis by 252 Cf activation in the earth sciences and mining research. The results obtained show that while 252 Cf activation can only resolve certain very specific geochemical research problems, it does allow the exact and rapid determination of numerous elements whose ores are of great economic importance, such as fluorine, titanium, vanadium, manganese, copper, antimony, barium, and tungsten. The utilization of activation analysis methods in the earth sciences is not a recent phenomenon, but it has generally been limited to the analysis of traces in relatively small volumes by means of irradiation in nuclear reactors; traditional neutron sources were little used and were not very applicable. The development of 252 Cf isotopic sources emitting more intense neutron fluxes makes it possible to consider carrying out more sensitive determinations without making use of a nuclear reactor. In addition, this technique can be adapted for in situ analysis in mines and mine borings. Our work, which is centered upon the possibilities of instrumental laboratory analysis of geological materials through 252 Cf activation, is oriented in two principal directions: the study of the experimental sensitivities of the various elements in different rocks with the usual compositions, and the study of the possibilities for routine ore analyses
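The sensitivity estimates in activation analysis rest on the standard activation equation, A = N·σ·φ·(1 − e^(−λ·t_irr)), which saturates at N·σ·φ for irradiation times long compared with the product's half-life. A sketch with invented values (the numbers below do not come from the paper):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, t_irr, t_half):
    """Activity (decays/s) induced by neutron activation:

        A = N * sigma * phi * (1 - exp(-lambda * t_irr))

    n_atoms: target atoms N; sigma_cm2: capture cross-section (cm^2);
    flux: neutron flux (n/cm^2/s); t_irr and t_half in the same time unit.
    Approaches the saturation value N*sigma*phi for t_irr >> t_half.
    """
    lam = math.log(2) / t_half
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr))

# Hypothetical sample: 1e20 atoms, 1 barn (1e-24 cm^2), 1e7 n/cm^2/s flux.
a_one_halflife = induced_activity(1e20, 1e-24, 1e7, 10.0, 10.0)  # half of saturation
```

Irradiating for one product half-life already yields half the saturation activity, which is why short-lived activation products pair well with a portable 252 Cf source.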

  12. Determination of volatile organic compounds pollution sources in Malaysian drinking water using multivariate analysis.

    Science.gov (United States)

    Soh, Shiau-Chian; Abdullah, Md Pauzi

    2007-01-01

    A field investigation was conducted at all water treatment plants throughout 11 states and the Federal Territory in Peninsular Malaysia. The sampling points in this study include treatment plant operations, service reservoir outlets and auxiliary outlet points along the water pipelines. Analysis was performed by a solid phase micro-extraction technique with a 100 microm polydimethylsiloxane fibre, using gas chromatography with mass spectrometric detection to analyse 54 volatile organic compounds (VOCs) of different chemical families in drinking water. The concentrations of VOCs ranged from undetectable to 230.2 microg/l. Among all VOC species, chloroform had the highest concentration and was detected in all drinking water samples. Average concentrations of total trihalomethanes (THMs) were similar among all states, in the range of 28.4-33.0 microg/l. Apart from THMs, other abundant compounds detected were cis- and trans-1,2-dichloroethylene, trichloroethylene, 1,2-dibromoethane, benzene, toluene, ethylbenzene, chlorobenzene, 1,4-dichlorobenzene and 1,2-dichlorobenzene. Principal component analysis (PCA) with the aid of varimax rotation, and the parallel factor analysis (PARAFAC) method, were used to statistically verify the correlation between VOCs and the sources of pollution. The multivariate analysis pointed out that the maintenance of auxiliary pipelines in the distribution systems is vital, as they can become a significant point source of pollution in Malaysian drinking water.

  13. Critical analysis of documentary sources for Historical Climatology of Northern Portugal (17th-19th centuries)

    Science.gov (United States)

    Amorim, Inês; Sousa Silva, Luís; Garcia, João Carlos

    2017-04-01

    Inês Amorim (CITCEM, Department of History, Political and International Studies, U. of Porto, Portugal), Luís Sousa Silva (CITCEM, PhD Fellowship - FCT) and João Carlos Garcia (CIUHCT, Geography Department, U. of Porto, Portugal). The first major national project on Historical Climatology in Portugal, "KLIMHIST: Reconstruction and model simulations of past climate in Portugal using documentary and early instrumental sources (17th-19th centuries)", coordinated by Maria João Alcoforado, began in March 2012 and ended in September 2015. It brought together an interdisciplinary team of researchers from four Portuguese institutions (Centre of Geographical Studies, University of Trás-os-Montes and Alto Douro, University of Porto, and University of Évora) and from different fields of knowledge (Geography, History, Biology, Climatology and Meteorology). The team networked and collaborated with other international research groups on Climate Change and Historical Climatology, resulting in several publications. The project aimed to reconstruct thermal and rainfall patterns in Portugal between the 17th and 19th centuries, as well as to identify the main hydrometeorological extremes that occurred over that period. The basic methodology consisted in combining information from different types of anthropogenic sources (descriptive and instrumental) and natural sources (tree rings and geothermal holes), so as to develop models of past climate change. The data collected were stored in a digital database, which can be searched by source, date, location and type of event. This database, which will be made publicly available soon, contains about 3500 weather/climate-related records, which have begun to be studied, processed and published. Following this seminal project, other initiatives have taken place in Portugal in the area of Historical Climatology, namely a Ph

  14. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszewski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

    A new generation of specialized scientific instruments called synchrotron light sources allows the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required to translate these data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time and hence interactive analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, next to a parallel reconstruction system, and then to multiple visualization stations. Collaborative analysis tools allow multiple users to control data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous
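The staged structure (beamline → preprocessing → parallel reconstruction → visualization) is essentially a pipeline of queue-connected workers. The toy below mimics only that shape in a single process; the real system distributes the stages across hosts with Globus grid services, and the stage functions here are placeholders:

```python
import queue
import threading

def stage(fn, inbox, outbox):
    """One pipeline stage: apply fn to each item until the None sentinel,
    then pass the sentinel downstream."""
    while (item := inbox.get()) is not None:
        outbox.put(fn(item))
    outbox.put(None)

def run_pipeline(items, *fns):
    """Chain stages with queues, mimicking the acquire -> preprocess ->
    reconstruct -> visualize flow (each stage could live on another host)."""
    qs = [queue.Queue() for _ in range(len(fns) + 1)]
    threads = [threading.Thread(target=stage, args=(f, qs[i], qs[i + 1]))
               for i, f in enumerate(fns)]
    for t in threads:
        t.start()
    for x in items:                 # feed the first stage
        qs[0].put(x)
    qs[0].put(None)                 # end-of-data sentinel
    out = []
    while (r := qs[-1].get()) is not None:
        out.append(r)               # drain the last stage in order
    for t in threads:
        t.join()
    return out

# Two placeholder stages standing in for "preprocess" and "reconstruct".
result = run_pipeline([1, 2, 3], lambda x: x * 2, lambda x: x + 1)
```

Because every stage runs concurrently, a slow reconstruction step overlaps with acquisition of the next frame, which is the property that makes quasi-real-time feedback possible.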

  15. Design and analysis of nuclear battery driven by the external neutron source

    International Nuclear Information System (INIS)

    Wang, Sanbing; He, Chaohui

    2014-01-01

    Highlights: • A new type of space nuclear power source called NBDEx is investigated. • NBDEx with 252 Cf has better performance than an RTG of similar structure. • Its thermal power improves greatly with increasing fuel enrichment. • The service life of NBDEx is about 2.96 years. • The launch abort accident analysis fully demonstrates the advantage of NBDEx. - Abstract: Based on the theory of ADS (Accelerator Driven Subcritical reactor), a new type of nuclear battery was investigated, composed of a subcritical fission module and an isotope neutron source, called NBDEx (Nuclear Battery Driven by External neutron source). Following the structure of the GPHS-RTG (General Purpose Heat Source Radioisotope Thermoelectric Generator), fuel cell and fuel assembly models of NBDEx were set up, and their performance was analyzed with the MCNP code. From these results, it was found that the power and power density of NBDEx were almost six times higher than the RTG's. To fully demonstrate the advantage of NBDEx, an analysis of its impact factors was performed with the MCNP code, and its lifetime was calculated using the Origen code. These results verified that NBDEx is more suitable for space missions than the RTG
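The paper's ~2.96-year service life comes from Origen depletion calculations; a zeroth-order sanity check is the decay of the 252 Cf driver alone (half-life ≈ 2.645 years), which bounds how fast the external neutron source fades. This decay-only sketch ignores fuel burnup and changes in subcritical multiplication:

```python
import math

T_HALF_CF252 = 2.645  # years, approximate 252Cf half-life

def time_to_fraction(f):
    """Years for an un-replenished 252Cf neutron source to decay to a
    fraction f of its initial emission rate, from S(t) = S0 * 2**(-t/T):

        t = T * ln(1/f) / ln(2)
    """
    return T_HALF_CF252 * math.log(1.0 / f) / math.log(2.0)

# After one half-life the driving neutron source (and hence the fission
# power of a fixed-multiplication subcritical module) is halved.
t_half_power = time_to_fraction(0.5)
```

That the decay-only estimate (2.645 years to half power) lands near the quoted 2.96-year service life suggests the source decay, not fuel depletion, dominates the lifetime.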

  16. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    Science.gov (United States)

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
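As a flavor of the "common utilities" mentioned above (peak picking in particular), here is a deliberately minimal, dependency-free 1D peak picker. It is a toy stand-in, not nmrglue's actual interface; real pickers also handle lineshape fitting, multidimensional data, and overlapping clusters:

```python
def pick_peaks(spectrum, threshold):
    """Indices of simple 1D peaks: local maxima above an intensity threshold.

    spectrum: sequence of intensities; threshold: minimum peak height.
    The >= on the right neighbor keeps the left point of a flat-topped peak.
    """
    peaks = []
    for i in range(1, len(spectrum) - 1):
        v = spectrum[i]
        if v > threshold and v > spectrum[i - 1] and v >= spectrum[i + 1]:
            peaks.append(i)
    return peaks

# A synthetic spectrum with two peaks above a threshold of 4.
spec = [0, 1, 5, 1, 0, 2, 8, 3, 0, 0]
found = pick_peaks(spec, 4)
```

In a real workflow the picked indices would then seed lineshape fits, which is where tools like nmrglue take over.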

  17. School adjustment of children in residential care: a multi-source analysis.

    Science.gov (United States)

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

    School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of Perceptive Attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  18. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137 Cs source has been analyzed for the radioactive parent 137 Cs and its stable decay daughter 137 Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
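The dating principle is compact enough to state in code: since every 137 Cs decay ultimately yields stable 137 Ba, the atom ratio R = N(137Ba)/N(137Cs) grows as e^(λt) − 1, so t = ln(1 + R)/λ. The half-life value below is approximate, and the sketch ignores any residual Ba left by imperfect purification, which is part of what the paper's uncertainty analysis addresses:

```python
import math

T_HALF_CS137 = 30.08               # years, approximate 137Cs half-life
LAMBDA = math.log(2) / T_HALF_CS137

def source_age(ba_to_cs_atom_ratio):
    """Years since Cs purification, from the measured 137Ba/137Cs atom ratio.

    With N_Cs(t) = N0 * exp(-lambda*t) and N_Ba(t) = N0 * (1 - exp(-lambda*t)),
    the ratio R = N_Ba/N_Cs = exp(lambda*t) - 1, hence t = ln(1 + R) / lambda.
    """
    return math.log(1.0 + ba_to_cs_atom_ratio) / LAMBDA

# R = 1 means half the original Cs has decayed: exactly one half-life.
age = source_age(1.0)
```

Propagating the ICP-MS uncertainty in R through this logarithm is precisely the error analysis the abstract describes.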

  19. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Helmus, Jonathan J., E-mail: jjhelmus@gmail.com [Argonne National Laboratory, Environmental Science Division (United States); Jaroniec, Christopher P., E-mail: jaroniec@chemistry.ohio-state.edu [Ohio State University, Department of Chemistry and Biochemistry (United States)

    2013-04-15

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  20. Assessing heavy metal sources in sugarcane Brazilian soils: an approach using multivariate analysis.

    Science.gov (United States)

    da Silva, Fernando Bruno Vieira; do Nascimento, Clístenes Williams Araújo; Araújo, Paula Renata Muniz; da Silva, Luiz Henrique Vieira; da Silva, Roberto Felipe

    2016-08-01

    Brazil is the world's largest sugarcane producer, and soils in the northeastern part of the country have been cultivated with the crop for over 450 years. However, so far there has been no study on the status of heavy metal accumulation in these long-cultivated soils. To fill the gap, we collected soil samples from 60 sugarcane fields in order to determine the contents of Cd, Cr, Cu, Ni, Pb, and Zn, and used multivariate analysis to distinguish between natural and anthropogenic sources of these metals in soils. Analytical determinations were performed by ICP-OES after microwave acid digestion. Mean concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were 1.9, 18.8, 6.4, 4.9, 11.2, and 16.2 mg kg(-1), respectively. Principal component one was associated with lithogenic origin and comprised the metals Cr, Cu, Ni, and Zn. Cluster analysis confirmed that 68 % of the evaluated sites have soil heavy metal concentrations close to the natural background. The Cd concentration (principal component two) was clearly associated with anthropogenic sources, with P fertilization being the most likely source of Cd to soils. On the other hand, the third component (Pb concentration) indicates a mixed origin for this metal (natural and anthropogenic); hence, Pb concentrations are probably related not only to the soil parent material but also to industrial emissions and urbanization in the vicinity of the agricultural areas.
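The source-apportionment logic (metals that covary load on one lithogenic component, while an uncorrelated metal flags an anthropogenic input) can be shown on a two-metal toy case, where the 2×2 covariance matrix has a closed-form eigendecomposition. This illustrates the PCA idea only, not the paper's six-metal analysis; all numbers are invented:

```python
import math
from statistics import mean

def pca_2d(x, y):
    """First principal component of two variables via the closed-form
    eigendecomposition of their 2x2 covariance matrix [[a, b], [b, c]].

    Returns (lam1, explained, theta): the leading eigenvalue, its share
    of total variance, and the loading direction in radians.
    """
    n = len(x)
    mx, my = mean(x), mean(y)
    a = sum((u - mx) ** 2 for u in x) / (n - 1)          # var(x)
    c = sum((v - my) ** 2 for v in y) / (n - 1)          # var(y)
    b = sum((u - mx) * (v - my) for u, v in zip(x, y)) / (n - 1)  # cov
    lam1 = (a + c + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    explained = lam1 / (a + c)                # variance share of PC1
    theta = math.atan2(lam1 - a, b) if b else 0.0
    return lam1, explained, theta

# Hypothetical Cr and Ni readings that track each other perfectly:
# one "lithogenic" component explains 100% of their joint variance.
lam1, explained, theta = pca_2d([10, 12, 14, 16, 18], [3, 4, 5, 6, 7])
```

A metal whose values do not follow this shared direction would contribute variance to a second component, which is exactly how Cd stood out as anthropogenic in the study.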

  1. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    Science.gov (United States)

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids, the current source density (CSD) can be reconstructed efficiently using the inverse Current Source Density (iCSD) method. The resultant spatiotemporal information about the current dynamics can then be decomposed into functional components using Independent Component Analysis (ICA). We show on test data, modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points, that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings out new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
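As a hedged illustration of the decomposition step only (toy signals rather than reconstructed CSD, and not the authors' code), a deflationary FastICA in plain NumPy recovers two independent components from their linear mixtures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stand-in "functional components": a sine and a square wave.
t = np.linspace(0, 1, 2000)
S = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])            # mixing matrix (e.g. volume conduction)
X = A @ S                             # observed mixtures

# Center and whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Deflationary FastICA with the tanh nonlinearity.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(300):
        g = np.tanh(w @ Z)
        w_new = (Z * g).mean(axis=1) - (1 - g ** 2).mean() * w
        for j in range(i):            # decorrelate from earlier components
            w_new -= (w_new @ W[j]) * W[j]
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-10
        w = w_new
        if converged:
            break
    W[i] = w
S_est = W @ Z                         # recovered components (up to sign/order)
```

Recovered components match the originals up to permutation, sign, and scale, which is the usual ICA indeterminacy.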

  2. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    International Nuclear Information System (INIS)

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  3. Jet flow analysis of liquid poison injection in a CANDU reactor using source term

    International Nuclear Information System (INIS)

    Chae, Kyung Myung; Choi, Hang Bok; Rhee, Bo Wook

    2001-01-01

For the performance analysis of the Canadian deuterium uranium (CANDU) reactor shutdown system number 2 (SDS2), a computational fluid dynamics model of the poison jet flow has been developed to estimate the flow field and poison concentration formed inside the CANDU reactor calandria. As the ratio of the calandria shell radius to the injection nozzle hole diameter is very large (1055), it is impractical to develop a full-size model encompassing the whole calandria shell. In order to reduce the model to a manageable size, a quarter of a one-pitch-length segment of the shell was modeled using the symmetric nature of the jet, and the injected jet was treated as a source term to avoid the modeling difficulty caused by the large difference in hole sizes. For the analysis of an actual CANDU-6 SDS2 poison injection, the grid structure was determined based on the results of two-dimensional real- and source-jet simulations. The maximum injection velocity of the liquid poison is 27.8 m/s and the mass fraction of the poison is 8000 ppm (mg/kg). The simulation results have shown a well-established jet flow field. In general, the jet develops narrowly at first but then stretches rapidly. The flow recirculates slightly in the r-x plane, while it recirculates strongly in the r-θ plane. As time goes on, the adjacent jets contact each other and form a wavy front such that the whole jet develops into a plate form. This study has shown that the source term model can be effectively used for the analysis of the poison injection, and that the simulation result for the CANDU reactor is consistent with the model currently being used for the safety analysis. In the future, it is strongly recommended to analyze the transient of the poison injection (from the helium tank to the injection nozzle hole) by applying the Bernoulli equation with real boundary conditions

  4. Molecular evolution in court: analysis of a large hepatitis C virus outbreak from an evolving source.

    Science.gov (United States)

    González-Candelas, Fernando; Bracho, María Alma; Wróbel, Borys; Moya, Andrés

    2013-07-19

    Molecular phylogenetic analyses are used increasingly in the epidemiological investigation of outbreaks and transmission cases involving rapidly evolving RNA viruses. Here, we present the results of such an analysis that contributed to the conviction of an anesthetist as being responsible for the infection of 275 of his patients with hepatitis C virus. We obtained sequences of the NS5B and E1-E2 regions in the viral genome for 322 patients suspected to have been infected by the doctor, and for 44 local, unrelated controls. The analysis of 4,184 cloned sequences of the E1-E2 region allowed us to exclude 47 patients from the outbreak. A subset of patients had known dates of infection. We used these data to calibrate a relaxed molecular clock and to determine a rough estimate of the time of infection for each patient. A similar analysis led to an estimate for the time of infection of the source. The date turned out to be 10 years before the detection of the outbreak. The number of patients infected was small at first, but it increased substantially in the months before the detection of the outbreak. We have developed a procedure to integrate molecular phylogenetic reconstructions of rapidly evolving viral populations into a forensic setting adequate for molecular epidemiological analysis of outbreaks and transmission events. We applied this procedure to a large outbreak of hepatitis C virus caused by a single source and the results obtained played a key role in the trial that led to the conviction of the suspected source.

  5. Open source EMR software: profiling, insights and hands-on analysis.

    Science.gov (United States)

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing a clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit potential implementers of electronic health/medical record systems. The number of examined systems and the measures by which to compare them vary across studies, but rewarding insights nevertheless start to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field still lag far behind the highly acknowledged open source products in other domains, e.g. in operating system market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    Science.gov (United States)

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters, as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases, on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  7. Image acquisition and analysis for beam diagnostics, applications of the Taiwan photon source

    International Nuclear Information System (INIS)

    Liao, C.Y.; Chen, J.; Cheng, Y.S.; Hsu, K.T.; Hu, K.H.; Kuo, C.H.; Wu, C.Y.

    2012-01-01

Design and implementation of image acquisition and analysis is in progress for Taiwan Photon Source (TPS) diagnostic applications. The optical system contains a screen, lens, and lighting system. A CCD camera with a Gigabit Ethernet interface (GigE Vision) will be the standard image acquisition device. Image acquisition will be done on an EPICS IOC via PV channel, and the image properties will be analyzed using Matlab tools to evaluate the beam profile (sigma), beam size, position, and tilt angle, etc. The EPICS IOC integrated with Matlab as a data processing system can be used not only for image analysis but also for many other types of equipment data processing applications. Progress of the project is summarized in this report. (authors)
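The beam properties named above (profile sigma, position, tilt angle) are commonly extracted from screen images via image moments. A minimal sketch on a synthetic Gaussian spot, assuming a moment-based analysis rather than the authors' actual Matlab routines:

```python
import numpy as np

# Synthetic "screen image": a 2D Gaussian beam spot on a 200x200 grid.
y, x = np.mgrid[0:200, 0:200]
x0, y0, sx, sy = 120.0, 80.0, 12.0, 7.0
img = np.exp(-((x - x0) ** 2 / (2 * sx ** 2) + (y - y0) ** 2 / (2 * sy ** 2)))

# First moments give the beam position, second central moments the sigmas.
total = img.sum()
cx = (x * img).sum() / total
cy = (y * img).sum() / total
sigma_x = np.sqrt(((x - cx) ** 2 * img).sum() / total)
sigma_y = np.sqrt(((y - cy) ** 2 * img).sum() / total)

# The x-y cross moment gives the tilt angle of the beam ellipse.
sigma_xy = ((x - cx) * (y - cy) * img).sum() / total
tilt = 0.5 * np.arctan2(2 * sigma_xy, sigma_x ** 2 - sigma_y ** 2)
```

For the axis-aligned spot above, the recovered centroid and sigmas match the generating parameters and the tilt is near zero.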

  8. International patent analysis of water source heat pump based on orbit database

    Science.gov (United States)

    Li, Na

    2018-02-01

Using the Orbit database, this paper analysed international patents in the water source heat pump (WSHP) industry with patent analysis methods such as analysis of publication tendency, geographical distribution, technology leaders and top assignees. It is found that the beginning of the 21st century was a period of rapid growth in WSHP patent applications. Germany and the United States carried out research and development of WSHP early on, but now Japan and China have become the most important countries for patent applications. China has been developing faster and faster in recent years, but its patents are concentrated in universities and urgently need to be transferred to industry. Through an objective analysis, this paper aims to provide appropriate decision references for the development of the domestic WSHP industry.

  9. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    International Nuclear Information System (INIS)

    Summerhayes, G.R.; Gosden, C.; Bird, R.; Hotchkis, M.; Specht, J.; Torrence, R.; Fullaga, R.

    1993-01-01

Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.
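A minimal sketch of ratio-based source attribution: each artefact is assigned to the nearest source-group centroid in element-ratio space. The centroid values, ratio subset, and artefact measurements below are entirely hypothetical (the study's actual group statistics are not reproduced here), and real studies would normally standardize each ratio before computing distances:

```python
import math

# Hypothetical element-ratio centroids for two source groups, using a
# subset of the nine ratios from the study (illustrative values only).
RATIOS = ("F/Na", "K/Fe", "Ca/Fe", "Rb/Fe")
sources = {
    "Talasea": (0.021, 2.9, 1.1, 0.012),
    "Mopir":   (0.035, 2.1, 1.8, 0.007),
}

def assign_source(artefact, sources):
    """Nearest-centroid assignment in element-ratio space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(sources, key=lambda name: dist(artefact, sources[name]))

artefact = (0.023, 2.8, 1.2, 0.011)   # measured ratios for one artefact
best = assign_source(artefact, sources)
```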

  10. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G R; Gosden, C [La Trobe Univ., Bundoora, VIC (Australia); Bird, R; Hotchkis, M [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J; Torrence, R; Fullaga, R [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1994-12-31

Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  11. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G.R.; Gosden, C. [La Trobe Univ., Bundoora, VIC (Australia); Bird, R.; Hotchkis, M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J.; Torrence, R.; Fullaga, R. [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1993-12-31

Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A combination of nine element ratios was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation.

  12. SWOT Analysis and Related Countermeasures for Croatia to Explore the Chinese Tourist Source Market

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2017-08-01

Full Text Available Croatia is a land endowed with rich and diversified natural and cultural tourist resources. Traveling around Croatia, I was stunned by its beauty. However, I noticed that there were few Chinese tourists in Croatia. How can we bring more Chinese tourists to Croatia? How can we make them happy and comfortable there? And, at the same time, how can we avoid polluting this tract of pristine land? Based on first-hand research, I make a SWOT analysis of Croatia's Chinese tourist source market and put forward related countermeasures from the perspective of a native Chinese. The positioning of tourism in Croatia should be ingeniously packaged. I recommend developing diversified and specialized tourist products, varied marketing and promotional activities, simple and flexible visa policies and regulations, and other related measures to further explore the Chinese tourist source market of Croatia.

  13. Problems of accuracy and sources of error in trace analysis of elements

    International Nuclear Information System (INIS)

    Porat, Ze'ev.

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be divided into three main groups: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  14. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles still remains unresolved. Given the observed flux, the absence of observations of bright point sources is consistent with the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present here the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
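A two-point angular correlation test of the kind described compares the fraction of close event pairs to the isotropic expectation, which for a pair separation below angle θ is (1 - cos θ)/2. A toy sketch with simulated skies (not IceCube data or the collaboration's analysis code; the clustering scale is arbitrary):

```python
import math
import random

random.seed(1)

def random_directions(n):
    """Isotropic unit vectors on the sphere."""
    dirs = []
    for _ in range(n):
        z = random.uniform(-1, 1)
        phi = random.uniform(0, 2 * math.pi)
        r = math.sqrt(1 - z * z)
        dirs.append((r * math.cos(phi), r * math.sin(phi), z))
    return dirs

def pair_fraction(dirs, max_angle_deg):
    """Fraction of event pairs separated by less than max_angle_deg."""
    cos_min = math.cos(math.radians(max_angle_deg))
    n, close = len(dirs), 0
    for i in range(n):
        for j in range(i + 1, n):
            dot = sum(a * b for a, b in zip(dirs[i], dirs[j]))
            if dot > cos_min:
                close += 1
    return close / (n * (n - 1) / 2)

# Isotropic sky: pair fraction within 10 deg should be ~(1 - cos 10)/2.
iso = random_directions(400)
f_iso = pair_fraction(iso, 10.0)

# "Clustered" sky: half the events drawn within 5 deg of one direction,
# mimicking many events from a single (hypothetical) weak source.
clustered = random_directions(200)
cos5 = math.cos(math.radians(5))
for _ in range(200):
    z = random.uniform(cos5, 1)
    phi = random.uniform(0, 2 * math.pi)
    r = math.sqrt(1 - z * z)
    clustered.append((r * math.cos(phi), r * math.sin(phi), z))
f_clu = pair_fraction(clustered, 10.0)
```

The clustered sky shows a large excess of close pairs over the isotropic expectation, which is the signature an autocorrelation search looks for.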

  15. Analysis of insertion device magnet measurements for the Advanced Light Source

    International Nuclear Information System (INIS)

    Marks, S.; Humphries, D.; Kincaid, B.M.; Schlueter, R.; Wang, C.

    1993-07-01

    The Advanced Light Source (ALS), which is currently being commissioned at Lawrence Berkeley Laboratory, is a third generation light source designed to produce XUV radiation of unprecedented brightness. To meet the high brightness goal the storage ring has been designed for very small electron beam emittance and the undulators installed in the ALS are built to a high degree of precision. The allowable magnetic field errors are driven by electron beam and radiation requirements. Detailed magnetic measurements and adjustments are performed on each undulator to qualify it for installation in the ALS. The first two ALS undulators, IDA and IDB, have been installed. This paper describes the program of measurements, data analysis, and adjustments carried out for these two devices. Calculations of the radiation spectrum, based upon magnetic measurements, are included. Final field integral distributions are also shown. Good field integral uniformity has been achieved using a novel correction scheme, which is also described

  16. Problems of accuracy and sources of error in trace analysis of elements

    Energy Technology Data Exchange (ETDEWEB)

Porat, Ze'ev

    1995-07-01

The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be divided into three main groups: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  17. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for calculating these quantities. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  18. Analysis of internal radiation and radiotoxicity source base on aerosol distribution in RMI

    International Nuclear Information System (INIS)

    Yuwono, I.

    2000-01-01

Destructive testing of nuclear fuel elements during post-irradiation examination in the radiometallurgy installation may cause air contamination in the working area in the form of radioactive aerosol. Inhalation of the radioactive aerosol makes it an internal radiation source for the worker. The potential hazard of a radioactive particle in the body also depends on the particle size. Analysis of the internal radiation source and radiotoxicity showed that in normal operation only natural radioactive materials with high radiotoxicity are found, i.e. Pb-212 and Ac-228. Deposition is as high as 95% in the alveolar-interstitial (AI) region and as low as 1% in the bronchial (BB) region, for particle sizes of 11.7 nm and 350 nm, respectively. (author)

  19. Economic analysis for the electricity production in isolated areas in Cuba using different renewable sources

    International Nuclear Information System (INIS)

    Morales Salas, Joel; Moreno Figueredo, Conrado; Briesemeister, Ludwig; Arzola, Jose

    2015-01-01

Despite the effort and commitment of the Cuban government over more than 50 years, there are still houses without electricity in areas remote from the electricity network. These houses and communities have the promise and commitment of local and national authorities to help improve their quality of life. Because these houses and communities are remote from the electricity network, the cost of extending the network is considerably high. For that reason, the use of renewable sources in these areas is an acceptable proposal. This article presents an analysis to obtain different configurations depending on the number of houses. It proposes the use of the hydrothermal carbonization process in cases where it is not feasible to introduce other renewable sources; this technology is new in Cuba, and advantageous considering the kinds of biomass that exist there. The chemical process of hydrothermal carbonization with Cuban biomass should be further researched. (full text)

  20. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

While looking for information in scientific databases, we are overwhelmed by the amount of information that we encounter. In this big data collection, extracting information with added value could be strategic for nuclear verification. In our study, we have worked on "best practices" in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is made with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store the data generated by the bibliographic research, and analyzing the data with selected tools. With analysis of bibliographic data alone, we are able to obtain: a panoramic view of the countries that publish in the paradigm; co-publication networks; the organizations that contribute to scientific publications; the countries with which a given country collaborates; the areas of interest of a country; and so on. We are thus able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are important; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness of the use of open source S&T information and of the management of that information over time. Examples are shown. (author)
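The bibliometric step described above (publication counts per country and co-publication networks) can be sketched with standard library tools; the records below are invented for illustration, not drawn from any real database:

```python
from collections import Counter
from itertools import combinations

# Hypothetical bibliographic records: country affiliations per publication.
records = [
    ["DE", "FR"],
    ["DE", "FR", "US"],
    ["US"],
    ["DE", "CN"],
    ["FR", "DE"],
]

# Publications per country (each country counted once per record).
pubs_per_country = Counter(c for rec in records for c in set(rec))

# Co-publication network as weighted country pairs.
copub_pairs = Counter(
    pair
    for rec in records
    for pair in combinations(sorted(set(rec)), 2)
)

top_country = pubs_per_country.most_common(1)[0][0]
```

The weighted pair counts are exactly the edge list of a co-publication graph, ready for any network analysis or visualization tool.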

  1. Phenotypic and genotypic analysis of bio-serotypes of Yersinia enterocolitica from various sources in Brazil.

    Science.gov (United States)

    Rusak, Leonardo Alves; dos Reis, Cristhiane Moura Falavina; Barbosa, André Victor; Santos, André Felipe Mercês; Paixão, Renata; Hofer, Ernesto; Vallim, Deyse Christina; Asensi, Marise Dutra

    2014-12-15

Yersinia enterocolitica is a well-known foodborne pathogen widely distributed in nature, with high public health relevance, especially in Europe. This study aimed to analyze the pathogenic potential of Y. enterocolitica strains isolated from human, animal, food, and environmental sources and from different regions of Brazil by detecting the virulence genes inv, ail, ystA, and virF through polymerase chain reaction (PCR), phenotypic tests, and antimicrobial susceptibility analysis. Pulsed-field gel electrophoresis (PFGE) was used for the assessment of phylogenetic diversity. All virulence genes were detected in 11/60 (18%) strains of serotype O:3, biotype 4 isolated from human and animal sources. Ten human strains (4/O:3) presented the three chromosomal virulence genes, and nine strains of biotype 1A presented the inv gene. Six (10%) strains were resistant to sulfamethoxazole-trimethoprim, seven (12%) to tetracycline, and one (2%) to amikacin, all of which are used to treat yersiniosis. AMP-CEF-SXT was the predominant resistance profile. PFGE analysis revealed 36 unique pulsotypes, grouped into nine clusters (A to I) with similarity ≥ 85%, yielding a discriminatory diversity index of 0.957. Cluster A comprised all bio-serotype 4/O:3 strains isolated from animal and human sources. This study shows the existence of strains with the same genotypic profiles, bearing all virulence genes, from human and animal sources, circulating among several Brazilian states. This supports the hypothesis that swine likely serve as a main element in Y. enterocolitica transmission to humans in Brazil, and that it could become a potential threat to public health, as in Europe.
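The discriminatory index reported (0.957) is presumably the Hunter-Gaston application of Simpson's index of diversity to the typing result. A sketch of the computation on hypothetical pulsotype counts (not the study's actual type distribution):

```python
def discriminatory_index(counts):
    """Hunter-Gaston discriminatory index:
    D = 1 - (1 / (N (N - 1))) * sum n_j (n_j - 1),
    where n_j is the number of strains of type j and N the total."""
    n_total = sum(counts)
    return 1 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

# Hypothetical typing result: strains per pulsotype (a few shared types,
# many unique ones, as in the study).
counts = [5, 4, 3, 2, 2] + [1] * 14
D = discriminatory_index(counts)
```

D is the probability that two randomly chosen strains belong to different types; a typing scheme where every strain is unique gives D = 1.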

  2. Source apportionment analysis of atmospheric particulates in an industrialised urban site in southwestern Spain

    International Nuclear Information System (INIS)

    Querol, X.; Alastuey, A.; Sanchez-de-la-Campa, A.; Plana, F.; Ruiz, C.R.; Rosa, J. de la

    2002-01-01

A detailed physical and chemical characterisation of total suspended particles (TSP) in the highly industrialised city of Huelva (southwestern Spain) was carried out. The results evidenced a coarse grain-size prevalence (PM10 accounting for only 40% of TSP mass; 37 and 91 μg/m3, respectively). PM10 levels are in the usual range for urban background sites in Spain. The crustal, anthropogenic and marine components accounted for means of 40%, 24% and 5% of bulk TSP, respectively. As expected from the industrial activities, relatively high PO4(3-) and As levels for an urban site were detected. In addition to the crustal and marine components, source apportionment analysis revealed three additional emission sources influencing the levels and composition of TSP: (a) a petrochemical source; (b) a mixed metallurgical-phosphate source; (c) an unknown source (Sb and NO3(-)). Due to the high local emissions, the mean TSP anthropogenic contribution (mostly PM10) obtained for all possible air mass transport scenarios reached 18-29 μg/m3. The 2010 annual EU PM10 limit value (20 μg/m3) would be exceeded by the anthropogenic load recorded for all the air mass transport scenarios, with the exception of North Atlantic transport (only 15% of the sampling days). Under African air mass transport scenarios (20% of sampling days), the TSP crustal contribution reached nearly three times the local crustal contribution. It must be pointed out that this crustal input should diminish when sampling PM10, due to the dominant coarse size distribution of this type of particle. (author)

  3. Source attribution of Bornean air masses by back trajectory analysis during the OP3 project

    Directory of Open Access Journals (Sweden)

    N. H. Robinson

    2011-09-01

Atmospheric composition affects the radiative balance of the Earth through the creation of greenhouse gases and the formation of aerosols. The latter interact with incoming solar radiation, both directly and indirectly through their effects on cloud formation and lifetime. The tropics have a major influence on incoming sunlight; however, the tropical atmosphere is poorly characterised, especially outside Amazonia. The origins of air masses influencing a measurement site in a protected rainforest in Borneo, South East Asia, were assessed and the likely sources of a range of trace gases and particles were determined. This was conducted by interpreting in situ measurements made at the site in the context of ECMWF backwards air mass trajectories. Two different but complementary methods were employed to interpret the data: comparison of periods classified by cluster analysis of trajectories, and inspection of the dependence of mean measured values on the geographical history of trajectories. Sources of aerosol particles, carbon monoxide and halocarbons were assessed. The likely source influences include: terrestrial organic biogenic emissions; long-range transport of anthropogenic emissions; biomass burning; sulphurous emissions from marine phytoplankton, with a possible contribution from volcanoes; marine production of inorganic mineral aerosol; and marine production of halocarbons. Aerosol sub- and super-saturated water affinity was found to be dependent on source (and therefore composition), with more hygroscopic aerosol and higher numbers of cloud condensation nuclei measured in air masses of marine origin. The prevailing sector during the majority of measurements was south-easterly, which is from the direction of the coast closest to the site, with a significant influence inland from the south-west. This analysis shows that marine and terrestrial air masses have different dominant chemical sources. Comparison with the AMAZE-08 project in the Amazon

  4. Meta-analysis on Methane Mitigating Properties of Saponin-rich Sources in the Rumen: Influence of Addition Levels and Plant Sources

    Directory of Open Access Journals (Sweden)

    Anuraga Jayanegara

    2014-10-01

Saponins have been considered promising natural substances for mitigating methane emissions from ruminants. However, studies have reported that addition of saponin-rich sources often arrived at contrasting results, i.e. either it decreased methane or it did not. The aim of the present study was to assess ruminal methane emissions through a meta-analytical approach, integrating related studies from published papers which described various levels of different saponin-rich sources being added to ruminant feed. A database was constructed from published literature reporting the addition of saponin-rich sources at various levels and then monitoring ruminal methane emissions in vitro. Accordingly, the levels of saponin-rich source addition as well as the different saponin sources were specified in the database. Apart from methane, other related rumen fermentation parameters were also included in the database, i.e. organic matter digestibility, gas production, pH, ammonia concentration, short-chain fatty acid profiles and protozoal count. A total of 23 studies comprising 89 data points met the inclusion criteria. The data obtained were subsequently subjected to a statistical meta-analysis based on mixed model methodology. Accordingly, different studies were treated as random effects whereas the levels of saponin-rich source addition or the different saponin sources were considered as fixed effects. Model statistics used were the p-value and the root mean square error. Results showed that addition of increasing levels of a saponin-rich source decreased methane emission per unit of substrate incubated as well as per unit of total gas produced. Although the sources differed in the magnitude of the effect (tea > quillaja), statistically they did not differ from each other. It can be concluded that the methane-mitigating properties of saponins in the rumen are level- and source-dependent.
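Random-effects pooling of the kind underlying such a meta-analysis can be sketched with the DerSimonian-Laird estimator, a common (though not the only) choice; the paper's exact mixed-model fit is not reproduced here, and the per-study effect sizes and variances below are invented:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird estimator."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the between-study variance tau^2.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight including tau^2 and pool.
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# Hypothetical per-study effects (e.g. change in methane per unit substrate,
# saponin vs control) and their sampling variances.
effects = [-0.05, -0.40, -0.10, -0.35, -0.20]
variances = [0.010, 0.020, 0.015, 0.012, 0.018]
pooled, se, tau2 = dersimonian_laird(effects, variances)
```

A tau² above zero indicates between-study heterogeneity, which is why treating studies as random effects matters in this kind of database.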

  5. MADAM - An open source meta-analysis toolbox for R and Bioconductor

    Directory of Open Access Journals (Sweden)

    Graber Armin

    2010-03-01

    Full Text Available Abstract Background Meta-analysis is a major theme in biomedical research. In the present paper we introduce a package for R and Bioconductor that provides useful tools for performing this type of work. One idea behind the development of MADAM was that many meta-analysis methods available in R cannot yet exploit the capacities of parallel computing. In this first version, we implemented one meta-analysis method in such a parallel manner. Additionally, we provide tools for combining the results from a set of methods in an ensemble approach. Functionality for visualization of results is also provided. Results The presented package enables meta-analysis to be carried out either by providing functions directly or by wrapping existing implementations. Overall, five different meta-analysis methods are now usable through MADAM, along with another three methods for combining the corresponding results. Visualizing the results is eased by three included functions. For developing and testing meta-analysis methods, a mock data generator is integrated. Conclusions The use of MADAM enables a user to focus on one package, in turn enabling them to work with the same data types across a set of methods. By making use of the snow package, MADAM can be made compatible with an existing parallel computing infrastructure. MADAM is open source and freely available within CRAN http://cran.r-project.org.
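
As a hedged illustration of the ensemble idea (combining results from several methods), the sketch below applies Fisher's method to per-feature p-values. Fisher's method is one common combiner; it is not necessarily among the three combiners MADAM implements:

```python
import math

def fisher_combine(pvalues):
    """Combine k p-values with Fisher's method.

    The statistic X = -2 * sum(ln p_i) follows a chi-square distribution
    with 2k degrees of freedom under the null. For even df the survival
    function has the closed form exp(-x/2) * sum_{i<k} (x/2)^i / i!,
    so no external stats library is needed.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

# Hypothetical p-values for one gene reported by three different methods.
combined = fisher_combine([0.01, 0.04, 0.20])
print(round(combined, 4))
```

Combining evidence this way rewards features that several methods flag consistently, even when no single method gives an extreme p-value.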

  6. Modeling and reliability analysis of three phase z-source AC-AC converter

    Directory of Open Access Journals (Sweden)

    Prasad Hanuman

    2017-12-01

    Full Text Available This paper presents small-signal modeling, using the state-space averaging technique, and reliability analysis of a three-phase z-source ac-ac converter. By controlling the shoot-through duty ratio, it can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency than the traditional voltage regulator. The small-signal analysis derives the different control transfer functions, which lead to the design of a suitable controller for a closed-loop system under supply voltage variation. The closed-loop system of the converter with a PID controller eliminates transients in the output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed in a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using the very high speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results during supply voltage variation of the three-phase z-source ac-ac converter. Reliability analysis has been applied to the converter to find the failure rates of its different components.

  7. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  8. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-01-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  9. Polycyclic aromatic hydrocarbons in urban air : concentration levels and patterns and source analysis in Nairobi, Kenya

    Energy Technology Data Exchange (ETDEWEB)

    Muthini, M.; Yoshimichi, H.; Yutaka, K.; Shigeki, M. [Yokohama National Univ., Yokohama (Japan). Graduate School of Environment and Information Sciences

    2005-07-01

    Polycyclic aromatic hydrocarbons (PAHs) present in the environment are often the result of incomplete combustion processes. This paper reported concentration levels and patterns of high molecular weight PAHs in Nairobi, Kenya. Daily air samples for 30 different PAHs were collected at residential, industrial and business sites within the city. Samples were then extracted, using deuterated PAHs as internal standards, with an automated Soxhlet device. Gas chromatography-mass spectrometry (GC-MS) with a capillary column was used to analyze the extracts in selected ion monitoring (SIM) mode. Statistical analyses were then performed. Average, median, standard deviation, range, and Pearson's correlation coefficients were reported for the PAH concentration levels. The data were then analyzed for sources using a principal component analysis (PCA) technique and isomer ratio analysis. Nonparametric testing was then conducted to detect inherent differences in the PAH concentration data obtained from the different sites. Results showed that pyrene was the most abundant PAH. Carcinogenic PAHs were higher in high-traffic areas. The correlation coefficient between coronene and benzo(ghi)perylene was high. The PAH isomer ratio analysis demonstrated that PAHs in Nairobi are the product of traffic emissions and oil combustion. Results also showed that PAH profiles were not well separated. It was concluded that source distinction methods must be improved in order to better evaluate PAH emissions in the city. 9 refs., 2 tabs., 1 fig.

  10. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of the biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.

  11. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
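
The PCA-versus-ICA distinction described here can be demonstrated on toy data: whitening (all PCA provides) only decorrelates, while adding a non-Gaussianity criterion recovers the independent sources. The sketch below uses a simple kurtosis-based rotation search rather than the paper's vbICA algorithm, and all signals are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian sources mixed linearly -- a toy stand-in
# for displacement time series driven by multiple deformation sources.
n = 2000
s1 = np.sign(np.sin(np.linspace(0, 40, n)))   # square wave (sub-Gaussian)
s2 = rng.uniform(-1, 1, n)                    # uniform noise
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # mixing matrix
X = A @ S

# Whiten the data (this is all PCA can do: decorrelate the channels).
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / n
evals, evecs = np.linalg.eigh(cov)
Z = np.diag(evals ** -0.5) @ evecs.T @ Xc     # whitened data

# ICA's extra step: rotate to maximise non-Gaussianity (here |kurtosis|).
def kurt(y):
    return np.mean(y ** 4) - 3.0

best = max(np.linspace(0, np.pi, 180),
           key=lambda t: abs(kurt(np.cos(t) * Z[0] + np.sin(t) * Z[1])))
u = np.cos(best) * Z[0] + np.sin(best) * Z[1]

# The recovered component should align with one true source (up to sign),
# whereas the raw whitened (PCA) axes generally mix both sources.
c = max(abs(np.corrcoef(u, s1)[0, 1]), abs(np.corrcoef(u, s2)[0, 1]))
print(round(c, 2))
```

vbICA replaces this crude one-angle search with a full probabilistic model, but the underlying idea is the same: decorrelation alone does not separate sources; an independence (non-Gaussianity) criterion does.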

  12. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings, restricted to single-particle transverse dynamics. In the first part, tools (Frequency Map Analysis, Hamiltonian, integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators, with only positive steps, are more precise by an order of magnitude than the standard Forest and Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system. Based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupolar strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupolar-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis or beam decoherence. (author) [fr

  13. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    International Nuclear Information System (INIS)

    Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young; Kim, Chang-Lak

    2003-01-01

    Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW disposal facility is discussed. Several features of source-term analysis are embedded in SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the capability of this assessment code. Results from MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to a realistic evaluation of the Korean LILW disposal concept in the near future.

  14. ELATE: an open-source online application for analysis and visualization of elastic tensors

    International Nuclear Information System (INIS)

    Gaillac, Romain; Coudert, François-Xavier; Pullumbi, Pluton

    2016-01-01

    We report on the implementation of a tool for the analysis of second-order elastic stiffness tensors, provided with both an open-source Python module and a standalone online application allowing the visualization of anisotropic mechanical properties. After describing the software features, how we compute the conventional elastic constants and how we represent them graphically, we explain our technical choices for the implementation. In particular, we focus on why a Python module is used to generate the HTML web page with embedded Javascript for dynamical plots. (paper)
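
The conventional averages such a tool derives can be sketched in a few lines. The following is a minimal, hedged illustration (not ELATE's actual code) of Voigt-average bulk and shear moduli from a 6×6 stiffness matrix in Voigt notation, checked against a hypothetical isotropic solid:

```python
import numpy as np

def voigt_moduli(C):
    """Voigt-average bulk (K) and shear (G) moduli from a 6x6 stiffness
    matrix in Voigt notation (GPa). These are among the conventional
    constants derived from a second-order elastic stiffness tensor;
    a full tool also computes Reuss/Hill averages and directional
    properties.
    """
    K = (C[0, 0] + C[1, 1] + C[2, 2]
         + 2 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9.0
    G = (C[0, 0] + C[1, 1] + C[2, 2] - C[0, 1] - C[0, 2] - C[1, 2]
         + 3 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15.0
    return K, G

# Isotropic sanity check with hypothetical values C11 = 165, C12 = 64 GPa;
# for an isotropic solid C44 = (C11 - C12)/2, K = (C11 + 2*C12)/3, G = C44.
c11, c12 = 165.0, 64.0
c44 = (c11 - c12) / 2.0
C = np.array([
    [c11, c12, c12, 0, 0, 0],
    [c12, c11, c12, 0, 0, 0],
    [c12, c12, c11, 0, 0, 0],
    [0, 0, 0, c44, 0, 0],
    [0, 0, 0, 0, c44, 0],
    [0, 0, 0, 0, 0, c44],
])
K, G = voigt_moduli(C)
print(K, G)
```

For an anisotropic crystal the interesting output is the direction dependence of these properties, which is exactly what the online visualization targets.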

  15. Source term analysis for a criticality accident in metal production line glove boxes

    International Nuclear Information System (INIS)

    Nguyen, D.H.

    1991-06-01

    A recent development in criticality accident analysis is the deterministic calculation of the transport of fission products and actinides through the barriers of the physical facility. Knowledge of the redistribution of the materials inside the facility helps determine re-entry and clean-up procedures. The amount of radioactive material released to the environment is the source term for dispersion calculations. We have used an integrated computer model to determine the release of fission products to the environment from a hypothetical criticality event in a glove box of the metal production line (MPL) at the Lawrence Livermore National Laboratory (LLNL)

  16. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples as well as a software analysis method. AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  17. Fiji: an open-source platform for biological-image analysis.

    Science.gov (United States)

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  18. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now being considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems implying an intensive system control variable. (author)

  19. Security analysis of an untrusted source for quantum key distribution: passive approach

    International Nuclear Information System (INIS)

    Zhao Yi; Qi Bing; Lo, H-K; Qian Li

    2010-01-01

    We present a passive approach to the security analysis of quantum key distribution (QKD) with an untrusted source. A complete proof of its unconditional security is also presented. This scheme has significant advantages in real-life implementations as it does not require fast optical switching or a quantum random number generator. The essential idea is to use a beam splitter to split each input pulse. We show that we can characterize the source using a cross-estimate technique without active routing of each pulse. We have derived analytical expressions for the passive estimation scheme. Moreover, using simulations, we have considered four real-life imperfections: additional loss introduced by the 'plug and play' structure, inefficiency of the intensity monitor, noise of the intensity monitor, and statistical fluctuation introduced by finite data size. Our simulation results show that the passive estimate of an untrusted source remains useful in practice, despite these four imperfections. Also, we have performed preliminary experiments, confirming the utility of our proposal in real-life applications. Our proposal makes it possible to implement the 'plug and play' QKD with the security guaranteed, while keeping the implementation practical.

  20. Analysis of the Potential of Low-Temperature Heat Pump Energy Sources

    Directory of Open Access Journals (Sweden)

    Pavel Neuberger

    2017-11-01

    Full Text Available The paper deals with an analysis of the temperatures of ground masses in the proximity of linear and slinky-type HGHEs (horizontal ground heat exchangers). It evaluates and compares the potentials of HGHEs and ambient air. The aim of the verification was to gain knowledge of the temperature course of the monitored low-temperature heat pump energy sources during heating periods and periods of stagnation, and to analyse this knowledge in terms of the potential to use those sources for heat pumps. The study was conducted in the years 2012–2015 during three heating periods and three periods of HGHE stagnation. The results revealed that the linear HGHE had the highest temperature potential of the observed low-temperature heat pump energy sources. The average daily temperatures of the ground mass surrounding the linear HGHE were the highest, ranging from 7.08 °C to 9.20 °C during the heating periods, with the lowest temperature variation range of 12.62–15.14 K; the relative frequency of the average daily temperatures of the ground mass was the highest, at 22.64%, in the temperature range containing the mode of all monitored temperatures in the recorded interval of [4.10, 6.00] °C. Ambient air had a lower temperature potential than the monitored HGHEs.

  1. Qualitative analysis of precipitation distribution in Poland with use of different data sources

    Directory of Open Access Journals (Sweden)

    J. Walawender

    2008-04-01

    Full Text Available Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analyses. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data.

    The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data.

    Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store the data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
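
The reclassification, raster-algebra and contingency-table steps can be sketched with boolean grids. The fields below are hypothetical synthetic layers (random rain/no-rain grids), not the study's data:

```python
import numpy as np

# Hypothetical co-registered GRID layers: rain occurrence from interpolated
# gauge data and from radar, reclassified to boolean arrays on one grid.
rng = np.random.default_rng(1)
ground = rng.random((100, 100)) < 0.3
radar = ground ^ (rng.random((100, 100)) < 0.1)  # radar flips ~10% of cells

def contingency(truth, estimate):
    """2x2 contingency table: hits, misses, false alarms, correct negatives.
    Each entry is a cell count obtained by boolean raster algebra."""
    hits = np.sum(truth & estimate)
    misses = np.sum(truth & ~estimate)
    false_alarms = np.sum(~truth & estimate)
    correct_neg = np.sum(~truth & ~estimate)
    return hits, misses, false_alarms, correct_neg

h, m, f, c = contingency(ground, radar)
pod = h / (h + m)   # probability of detection
far = f / (h + f)   # false alarm ratio
print(int(h + m + f + c), round(pod, 2), round(far, 2))
```

The same table, computed per day and per data-source pair, is what the pie charts and maps in such a comparison summarise.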

  2. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and subsequently solved by the Conjugate Gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  3. Preparation of a Co-57 Moessbauer source for applications in analysis of iron compounds

    International Nuclear Information System (INIS)

    Gonzalez-Ramirez, R.

    1990-01-01

    A report is presented on the preparation of a low-activity 57-Co Mössbauer source in a stainless steel matrix, which may be used both for demonstration experiments and for some simple analysis work. Sources of this kind are available on the market, but there is no general agreement on the particular conditions of preparation. Three series of experiments were performed to find the best conditions to electrodeposit 59-Co, 60-Co and 57-Co, respectively, on a stainless steel foil 25 μm thick and 1 cm² in area. The electrolyte contained Co(NO3)2 in a buffer solution to control the pH in the range of 8.5. Once the best conditions to electrodeposit 57-Co were found, the cobalt was diffused into the stainless steel matrix by annealing at 1100 °C for three hours and then gradually cooling to room temperature over two hours, all under an argon flow. Finally, a 15 μCi 57-Co Mössbauer source in a stainless steel matrix was obtained and used to record a series of Mössbauer spectra; the parameters of these spectra were in close agreement with those given in the literature. (Author)

  4. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-01-01

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and subsequently solved by the Conjugate Gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  5. Analysis of source spectra, attenuation, and site effects from central and eastern United States earthquakes

    International Nuclear Information System (INIS)

    Lindley, G.

    1998-02-01

    This report describes the results from three studies of source spectra, attenuation, and site effects of central and eastern United States earthquakes. In the first study, source parameter estimates taken from 27 previous studies were combined to test the assumption that the earthquake stress drop is roughly constant, independent of earthquake size. 200 estimates of stress drop and seismic moment from eastern North American earthquakes were combined. It was found that the estimated stress drop from the 27 studies increases approximately as the square root of the seismic moment, from about 3 bars at 10²⁰ dyne-cm to 690 bars at 10²⁵ dyne-cm. These results do not support the assumption of a constant stress drop when estimating ground motion parameters from eastern North American earthquakes. In the second study, broadband seismograms recorded by the United States National Seismograph Network and cooperating stations were analysed to determine Q_Lg as a function of frequency in five regions: the northeastern US, southeastern US, central US, northern Basin and Range, and California and western Nevada. In the third study, using spectral analysis, estimates were made of the anelastic attenuation of four regional phases and of the source parameters of 27 earthquakes, including the m_b 5.6 West Texas earthquake of 14 April 1995.
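
The square-root scaling quoted above can be checked directly from the two reported end points: a log-log slope near 0.5 means stress drop grows roughly as the square root of seismic moment.

```python
import math

# Reported trend: stress drop ~3 bars at M0 = 1e20 dyne-cm rising to
# ~690 bars at M0 = 1e25 dyne-cm. The log-log slope between the two
# end points gives the scaling exponent in stress_drop ~ M0**slope.
m0_low, sd_low = 1e20, 3.0
m0_high, sd_high = 1e25, 690.0
slope = ((math.log10(sd_high) - math.log10(sd_low))
         / (math.log10(m0_high) - math.log10(m0_low)))
print(round(slope, 2))
```

A slope near 0.5, rather than 0, is exactly why these results argue against a constant stress drop across magnitudes.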

  6. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease.A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences.In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
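
One concrete piece of such a differential-expression pipeline is the multiple-testing step applied after per-feature tests. The sketch below implements Benjamini-Hochberg FDR control over hypothetical per-gene p-values; it is a generic illustration, not the specific workflow of any package referenced here:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg FDR control.

    Sort the m p-values, find the largest rank k with
    p_(k) <= alpha * k / m, and declare the k smallest p-values
    significant. Returns a per-feature boolean list.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    cutoff = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= alpha * rank / m:
            cutoff = rank  # keep the largest rank passing the criterion
    significant = set(order[:cutoff])
    return [i in significant for i in range(m)]

# Hypothetical per-gene p-values (a real run would have thousands).
pvals = [0.0002, 0.009, 0.012, 0.041, 0.22, 0.48, 0.73, 0.91]
print(benjamini_hochberg(pvals))
```

Controlling the false discovery rate this way, rather than using per-test thresholds, is what keeps feature lists interpretable when tens of thousands of genes are tested at once.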

  7. Performance analysis and experimental study of heat-source tower solution regeneration

    International Nuclear Information System (INIS)

    Liang, Caihua; Wen, Xiantai; Liu, Chengxing; Zhang, Xiaosong

    2014-01-01

    Highlights: • Theoretical analysis is performed on the characteristics of heat-source towers. • Experimental study is performed on the variation of the solution regeneration rate. • The characteristics of solution regeneration vary widely with different demands. • Results are useful for optimizing the process of solution regeneration. - Abstract: By analyzing similarities and differences between the solution regeneration of a heat-source tower and desiccant solution regeneration, this paper points out that solution regeneration of a heat-source tower is characterized by small demands and a regeneration rate that is susceptible to the outdoor ambient environment. A theoretical analysis is performed on the characteristics of a heat-source tower solution in different outdoor environments and different regeneration modes, and an experimental study is performed on the variation of the solution regeneration rate of a cross-flow heat-source tower under different inlet and operating parameters. The experimental results show that: in the operating regeneration mode, as the air volume was increased from 123 m³ h⁻¹ to 550 m³ h⁻¹, the system heat transfer amount increased from 0.42 kW to 0.78 kW, and the regeneration rate increased from 0.03 g s⁻¹ to 0.19 g s⁻¹. Increasing the solution flow may increase the system heat transfer amount; however, the regeneration rate decreased to a certain extent. In the regeneration mode when the system is idle, as the air volume was increased from 136 m³ h⁻¹ to 541 m³ h⁻¹, the regeneration rate increased from 0.03 g s⁻¹ to 0.1 g s⁻¹. The regeneration rate remained almost unchanged, around 0.07 g s⁻¹, as the solution flow was increased. In the regeneration mode with auxiliary heat when the system is idle, increasing the air volume and the solution flow required more auxiliary heat, thereby improving the solution regeneration rate. As the auxiliary heat was increased from 0.33 k

  8. Vibration analysis of the photon shutter designed for the advanced photon source

    International Nuclear Information System (INIS)

    Wang, Z.; Shu, D.; Kuzay, T.M.

    1992-01-01

    The photon shutter is a critical component of the beamline front end for the 7 GeV Advanced Photon Source (APS) project, now under construction at Argonne National Laboratory (ANL). The shutter is designed to close in tens of milliseconds to absorb up to 10 kW heat load (with high heat flux). Our shutter design uses innovative enhanced heat transfer tubes to withstand the high heat load. Although designed to be lightweight and compact, the very fast movement of the shutter gives rise to concern regarding vibration and dynamic sensitivity. To guarantee long-term functionality and reliability of the shutter, its dynamic behavior should be fully studied. In this paper, the natural frequency and transient dynamic analyses for the shutter during operation are presented. Through analysis of the vibration characteristics, as well as stress and deformation, several design options were developed and compared, including the selection of materials for the shutter and structural details.

  9. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.

  10. Distributed source term analysis, a new approach to nuclear material inventory verification

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to γ-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than γ-rays.

  11. Determination of sources and analysis of micro-pollutants in drinking water

    International Nuclear Information System (INIS)

    Md Pauzi Abdullah; Soh Shiau Chian

    2005-01-01

    The objectives of the study are to develop and validate selected analytical methods for the analysis of micro-organics and metals in water; to identify, monitor and assess the levels of micro-organics and metals in drinking water supplies; to evaluate the relevance of the guidelines set in the National Standard of Drinking Water Quality 2001; and to identify the sources of pollution and carry out risk assessment of exposure to drinking water. The presentation discussed the progress of the work, including the determination of VOCs (volatile organic compounds) in drinking water using SPME (solid-phase micro-extraction) techniques, the analysis of heavy metals in drinking water, the determination of Cr(VI) with ICPES (inductively coupled plasma emission spectrometry), and the presence of trace concentrations of halogenated volatile organic compounds (HVOCs), which are heavily used by the agricultural sector, in waters.

  12. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through the development of alternative groundwater fluxes in multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  13. A content analysis of depression-related discourses on Sina Weibo: attribution, efficacy, and information sources.

    Science.gov (United States)

    Pan, Jiabao; Liu, Bingjie; Kreps, Gary L

    2018-06-20

    Depression is a mood disorder that may lead to severe outcomes including mental breakdown, self-injury, and suicide. Potential causes of depression include genetic, sociocultural, and individual-level factors. However, public understandings of depression, guided by a complex interplay of media and other societal discourses, might not be congruent with scientific knowledge. Misunderstandings of depression can lead to under-treatment and stigmatization of depression. Against this backdrop, this study aims to achieve a holistic understanding of the patterns and dynamics in discourses about depression from various information sources in China by looking at related posts on social media. A content analysis was conducted with 902 posts about depression randomly selected within a three-year period (2014 to 2016) on the mainstream social media platform in China, Sina Weibo. Posts were analyzed with a focus on attributions of and solutions to depression, attitudes towards depression, and efficacy indicated by the posts across various information sources. Results suggested that depression was most often attributed to individual-level factors. Across the sources, individual-level attributions were often adopted by state-owned media, whereas health and academic experts and organizations most often mentioned biological causes of depression. Citizen journalists and unofficial social groups tended to make societal-level attributions. Overall, traditional media posts suggested the lowest efficacy in coping with depression and the most severe negative outcomes as compared with other sources. The dominance of individual-level attributions and solutions regarding depression on Chinese social media on the one hand manifests the public's limited understanding of depression and, on the other, may further constrain adoption of scientific explanations about depression and exacerbate stigmatization towards depressed individuals. Mass media's posts centered on description of severe

  14. Bioanalytical and instrumental analysis of thyroid hormone disrupting compounds in water sources along the Yangtze River

    International Nuclear Information System (INIS)

    Shi Wei; Wang Xiaoyi; Hu Guanjiu; Hao Yingqun; Zhang Xiaowei; Liu Hongling; Wei Si; Wang Xinru; Yu Hongxia

    2011-01-01

    Thyroid hormone (TH) agonist and antagonist activities of water sources along the Yangtze River in China were surveyed by a green monkey kidney fibroblast (CV-1) cell-based TH reporter gene assay. Instrumental analysis was conducted to identify the responsible thyroid-active compounds. Instrumentally derived L-3,5,3'-triiodothyronine (T3) equivalents (T3-EQs) and thyroid receptor (TR) antagonist activity equivalents referring to dibutyl phthalate (DBP-EQs) were calculated from the concentrations of individual congeners. The reporter gene assay demonstrated that three out of eleven water sources contained TR agonist activity equivalents (TR-EQs) ranging from 286 to 293 ng T3/L. Anti-thyroid hormone activities were found in all water sources, with TR antagonist activity equivalents referring to DBP (Ant-TR-EQs) ranging from 51.5 to 555.3 μg/L. Comparisons of the equivalents from instrumental and biological assays suggested that high concentrations of DBP and di-2-ethylhexyl phthalate (DEHP) were responsible for the observed TR antagonist activities at some locations along the Yangtze River. - Research highlights: → First, we report the instrumentally derived T3 equivalents (T3-EQs) and TR antagonist activity equivalents referring to DBP (DBP-EQs) for the first time. → Second, high concentrations of DBP and DEHP might be responsible for the observed TR antagonist activities at some locations. → Finally, TR antagonist activities were very common in the Yangtze River; more attention should be paid to these activities and the responsible compounds. - In vitro bioassay responses observed in Yangtze River source water extracts showed strong TR antagonist activities, for which DBP and DEHP were responsible.
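    Equivalents of this kind are conventionally computed by weighting each congener's measured concentration by its relative potency (REP) against the reference compound (T3 for agonism, DBP for antagonism) and summing. A minimal sketch of that calculation; the compound names and REP values below are illustrative, not taken from this paper:

    ```python
    # Hypothetical toxic-equivalent calculation: sum of concentration_i * REP_i
    # over the detected congeners. All numbers below are made up.

    def equivalents(concentrations, relative_potencies):
        """Sum of concentration_i * REP_i over congeners with a known REP."""
        return sum(concentrations[c] * relative_potencies[c]
                   for c in concentrations if c in relative_potencies)

    conc_ug_per_L = {"DBP": 120.0, "DEHP": 85.0}   # measured levels (illustrative)
    rep_vs_dbp = {"DBP": 1.0, "DEHP": 0.4}         # assumed relative potencies
    dbp_eq = equivalents(conc_ug_per_L, rep_vs_dbp)  # 120*1.0 + 85*0.4 = 154.0
    ```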

  15. YouTube as a source of COPD patient education: A social media content analysis

    Science.gov (United States)

    Stellefson, Michael; Chaney, Beth; Ochipa, Kathleen; Chaney, Don; Haider, Zeerak; Hanik, Bruce; Chavarria, Enmanuel; Bernhardt, Jay M.

    2014-01-01

    Objective Conduct a social media content analysis of COPD patient education videos on YouTube. Methods A systematic search protocol was used to locate 223 videos. Two independent coders evaluated each video to determine topics covered, media source(s) of posted videos, information quality as measured by HONcode guidelines for posting trustworthy health information on the Internet, and viewer exposure/engagement metrics. Results Over half the videos (n=113, 50.7%) included information on medication management, with far fewer videos on smoking cessation (n=40, 17.9%). Most videos were posted by a health agency or organization (n=128, 57.4%), and the majority of videos were rated as high quality (n=154, 69.1%). HONcode adherence differed by media source (Fisher’s Exact Test=20.52, p=.01), with user-generated content (UGC) receiving the lowest quality scores. Overall level of user engagement as measured by number of “likes,” “favorites,” “dislikes,” and user comments was low (mdn range = 0–3, interquartile (IQR) range = 0–16) across all sources of media. Conclusion Study findings suggest that COPD education via YouTube has the potential to reach and inform patients; however, existing video content and quality vary significantly. Future interventions should help direct individuals with COPD to increase their engagement with high-quality patient education videos on YouTube that are posted by reputable health organizations and qualified medical professionals. Patients should be educated to avoid and/or critically view low-quality videos posted by individual YouTube users who are not health professionals. PMID:24659212

  16. Bioanalytical and instrumental analysis of thyroid hormone disrupting compounds in water sources along the Yangtze River

    Energy Technology Data Exchange (ETDEWEB)

    Shi Wei [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China); Wang Xiaoyi [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China); Jiangsu Academy of Environmental Science, Nanjing 210036 (China); Hu Guanjiu; Hao Yingqun [State Environmental Protection Key Laboratory of Monitoring and Analysis for Organic Pollutants in Surface Water, Jiangsu Provincial Environmental Monitoring Center, Nanjing 210036 (China); Zhang Xiaowei [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China); Liu Hongling, E-mail: hlliu@nju.edu.c [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China); Wei Si [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China); Wang Xinru [Key Laboratory of Reproductive Medicine and Institute of Toxicology, Nanjing Medical University, Nanjing 210029 (China); Yu Hongxia, E-mail: hongxiayu@nju.edu.c [State Key Laboratory of Pollution Control and Resource Reuse, School of the Environment, Nanjing University, Nanjing 210093 (China)

    2011-02-15

    Thyroid hormone (TH) agonist and antagonist activities of water sources along the Yangtze River in China were surveyed by a green monkey kidney fibroblast (CV-1) cell-based TH reporter gene assay. Instrumental analysis was conducted to identify the responsible thyroid-active compounds. Instrumentally derived L-3,5,3'-triiodothyronine (T3) equivalents (T3-EQs) and thyroid receptor (TR) antagonist activity equivalents referring to dibutyl phthalate (DBP-EQs) were calculated from the concentrations of individual congeners. The reporter gene assay demonstrated that three out of eleven water sources contained TR agonist activity equivalents (TR-EQs) ranging from 286 to 293 ng T3/L. Anti-thyroid hormone activities were found in all water sources, with TR antagonist activity equivalents referring to DBP (Ant-TR-EQs) ranging from 51.5 to 555.3 μg/L. Comparisons of the equivalents from instrumental and biological assays suggested that high concentrations of DBP and di-2-ethylhexyl phthalate (DEHP) were responsible for the observed TR antagonist activities at some locations along the Yangtze River. - Research highlights: First, we report the instrumentally derived T3 equivalents (T3-EQs) and TR antagonist activity equivalents referring to DBP (DBP-EQs) for the first time. Second, high concentrations of DBP and DEHP might be responsible for the observed TR antagonist activities at some locations. Finally, TR antagonist activities were very common in the Yangtze River; more attention should be paid to these activities and the responsible compounds. - In vitro bioassay responses observed in Yangtze River source water extracts showed strong TR antagonist activities, for which DBP and DEHP were responsible.

  17. [Bibliometric analysis of literature regarding integrated schistosomiasis control strategy with emphasis on infectious source control].

    Science.gov (United States)

    Qian, Yi-Li; Wang, Wei; Hong, Qing-Biao; Liang, You-Sheng

    2014-12-01

    To evaluate the outcomes of implementation of the integrated schistosomiasis control strategy with emphasis on infectious source control using a bibliometric method. The literature pertaining to the integrated schistosomiasis control strategy with emphasis on infectious source control was retrieved from CNKI, Wanfangdata, VIP, PubMed, Web of Science, BIOSIS and Google Scholar, and a bibliometric analysis of the literature captured was performed. During the period from January 1, 2004 through September 30, 2014, a total of 94 publications regarding the integrated schistosomiasis control strategy with emphasis on infectious source control were captured, including 78 Chinese articles (82.98%) and 16 English papers (17.02%). The Chinese literature was published in 21 national journals, and Chinese Journal of Schistosomiasis Control had the largest number of publications, accounting for 37.23% of total publications; the 16 English papers were published in 12 international journals, and PLoS Neglected Tropical Diseases had the largest number of publications (3 publications). There were 37 affiliations publishing these 94 articles, and National Institute of Parasitic Diseases, Chinese Center for Disease Control and Prevention (16 publications), Anhui Institute of Schistosomiasis Control (12 publications) and Hunan Institute of Schistosomiasis Control (9 publications) ranked as the top three affiliations in number of publications. A total of 157 persons co-authored these 94 publications, and Wang, Zhou and Zhang ranked as the top three authors in number of publications. The integrated schistosomiasis control strategy with emphasis on infectious source control has been widely implemented in China, and the achievements obtained from the implementation of this strategy should be summarized and transmitted internationally.

  18. Nuclear microprobe analysis and source apportionment of individual atmospheric aerosol particles

    International Nuclear Information System (INIS)

    Artaxo, P.; Rabello, M.L.C.; Watt, F.; Grime, G.; Swietlicki, E.

    1993-01-01

    In atmospheric aerosol research, one key issue is to determine the sources of the airborne particles. Bulk PIXE analysis coupled with receptor modeling provides a useful, but limited, view of the aerosol sources influencing one particular site or sample. The scanning nuclear microprobe (SNM) technique is a microanalytical technique that gives unique information on individual aerosol particles. In the SNM analyses, a 1.0 μm, 2.4 MeV proton beam from the Oxford SNM was used. The trace elements with Z>11 were measured by the particle induced X-ray emission (PIXE) method with detection limits in the 1-10 ppm range. Carbon, nitrogen and oxygen were measured simultaneously using Rutherford backscattering spectrometry (RBS). Atmospheric aerosol particles were collected at the Brazilian Antarctic Station and at biomass burning sites in the Amazon basin tropical rain forest in Brazil. In the Antarctic samples, sea-salt aerosol particles clearly predominated, with NaCl and CaSO4 as major compounds and several trace elements such as Al, Si, P, K, Mn, Fe, Ni, Cu, Zn, Br, Sr, and Pb. Factor analysis of the elemental data showed the presence of four components: 1) soil dust particles; 2) NaCl particles; 3) CaSO4 with Sr; and 4) Br and Mg. Strontium, observed at 20-100 ppm levels, was always present in the CaSO4 particles. The hierarchical cluster procedure gave results similar to the ones obtained through factor analysis. For the tropical rain forest biomass burning aerosol emissions, biogenic particles with a high organic content dominate the particle population, while K, P, Ca, Mg, Zn, and Si are the dominant elements. Zinc at 10-200 ppm is present in biogenic particles rich in P and K. The quantitative aspects and excellent detection limits make SNM analysis of individual aerosol particles a very powerful analytical tool. (orig.)

  19. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    Science.gov (United States)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, so their interpretation is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis applies linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, because of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in this case, homologous gravity and magnetic anomalies appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into the pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homologous condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and linear regression analysis is carried out. The calculated correlation coefficient, slope and intercept indicate the homology level, the Poisson's ratio and the distribution of remanence, respectively. We test the approach on a synthetic model under complex magnetization; the results show that it can still identify a common source under strong remanence and establish the Poisson's ratio. Finally, the approach is applied to field data from China. The results demonstrate that our approach is feasible.
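    The NSS itself is commonly computed from the eigenvalues of the symmetric magnetic gradient tensor; one widely used form is NSS = sqrt(−λ₂² − λ₁λ₃) with eigenvalues ordered λ₁ ≥ λ₂ ≥ λ₃, which is what makes it insensitive to the magnetization (remanence) direction. A sketch under that assumption (the paper's exact formulation may differ):

    ```python
    import numpy as np

    def normalized_source_strength(tensor):
        """NSS of a symmetric (traceless) magnetic gradient tensor.
        With eigenvalues sorted l1 >= l2 >= l3, use the common form
        NSS = sqrt(-l2**2 - l1*l3); clipped at zero for numerical safety."""
        lam = np.sort(np.linalg.eigvalsh(tensor))[::-1]   # descending order
        val = -lam[1] ** 2 - lam[0] * lam[2]
        return float(np.sqrt(max(val, 0.0)))

    # Example: a simple axial tensor diag(2, -1, -1) gives NSS = 1.
    nss = normalized_source_strength(np.diag([2.0, -1.0, -1.0]))
    ```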

  20. Noise source analysis of nuclear ship Mutsu plant using multivariate autoregressive model

    International Nuclear Information System (INIS)

    Hayashi, K.; Shimazaki, J.; Shinohara, Y.

    1996-01-01

    The present study is concerned with the noise sources in the N.S. Mutsu reactor plant. Noise experiments on the Mutsu plant were performed in order to investigate the plant dynamics and the effects of sea conditions and ship motion on the plant. The reactor noise signals as well as the ship motion signals were analyzed by a multivariate autoregressive (MAR) modeling method to clarify the noise sources in the reactor plant. The analysis results confirmed that most of the plant variables were affected mainly by a horizontal component of the ship motion, the sway, through vibrations of the plant structures. Furthermore, the effect of ship motion on the reactor power was evaluated through analysis of wave components extracted by a geometrical transform method. It was concluded that the amplitude of the reactor power oscillation was about 0.15% in normal sea conditions, which was small enough for safe operation of the reactor plant. (authors)
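    A multivariate AR model of order p regresses each signal at time t on the p previous samples of all signals jointly; the fitted coefficient structure is what lets noise-contribution analysis attribute plant variables to sources such as ship sway. A least-squares sketch of the fitting step (illustrative, not the Mutsu analysis code):

    ```python
    import numpy as np

    def fit_mar(X, p):
        """Least-squares fit of a multivariate AR(p) model
            x_t = A_1 x_{t-1} + ... + A_p x_{t-p} + e_t
        to k simultaneously recorded signals X of shape (T, k).
        Returns A with shape (p, k, k) and the residuals e_t."""
        T, k = X.shape
        Y = X[p:]                                                  # targets x_t
        Z = np.hstack([X[p - i - 1:T - i - 1] for i in range(p)])  # lags x_{t-i-1}
        coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)               # (p*k, k)
        A = coef.T.reshape(k, p, k).transpose(1, 0, 2)
        return A, Y - Z @ coef

    # Noise-free AR(1) data generated from a known matrix is recovered exactly.
    A_true = np.array([[0.5, 0.1],
                       [0.0, 0.3]])
    x, rows = np.array([1.0, 1.0]), []
    for _ in range(12):
        rows.append(x)
        x = A_true @ x
    A_est, resid = fit_mar(np.array(rows), 1)
    ```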

  1. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Science.gov (United States)

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    Development of lighting technology has led to the possibility of using LEDs in specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used for solving particular problems. For example, LED devices are widely used for lighting vegetables and fruit (for their sorting or growing), textile products (for quality control) and minerals (for their sorting). The active introduction of LED technology into different systems, including optical-electronic devices and systems, is driven by the large choice of emission colors and LED structures, which define the spatial, power, thermal and other parameters. Furthermore, multi-element, color lighting devices with adjustable illumination properties can be designed and implemented using LEDs. However, devices based on LEDs require more attention when a certain energy or color distribution must be provided over the whole work area (area of analysis or observation) or surface of the object. This paper proposes a method for theoretical modeling of such lighting devices. The authors present models of an RGB multicomponent light source applied to an optical-electronic system for the color analysis of mineral objects. The possibility of forming illumination of the work area that is uniform and homogeneous in energy and color is demonstrated for this system. The authors also show how the parameters and characteristics of the optical radiation receiver of the optical-electronic system affect the energy, spatial, spectral and colorimetric properties of a multicomponent light source.

  2. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

    Deviations in measuring dentofacial components in a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements, except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion predominated error variance. SNB showed the lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes, in the largest number of tracings analysed to date. Variance orthogonal to defining planes was of relevance, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists would be able to perform cephalometry more accurately and accomplish better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
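    Each angular measurement above is simply the angle at the middle landmark between the rays to the other two; for example SNA is the angle at Nasion between Sella and Point A, so an error in Point A's x-coordinate propagates directly into SNA and ANB while leaving SNB untouched. A sketch with made-up (non-clinical) landmark coordinates:

    ```python
    import math

    def angle_deg(p, vertex, q):
        """Angle at `vertex` (degrees) between the rays toward p and q,
        e.g. SNA = angle at Nasion between Sella and Point A."""
        v1 = (p[0] - vertex[0], p[1] - vertex[1])
        v2 = (q[0] - vertex[0], q[1] - vertex[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

    # Illustrative landmark coordinates in mm (not from the study):
    sella, nasion = (0.0, 0.0), (80.0, 10.0)
    point_a, point_b = (75.0, -40.0), (72.0, -60.0)
    SNA = angle_deg(sella, nasion, point_a)
    SNB = angle_deg(sella, nasion, point_b)
    ANB = SNA - SNB   # matches the directly measured A-Nasion-B angle here
    ```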

  3. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of the disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for image and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open source library that implements conversion of the data stored in commonly used research formats into the standard DICOM representation. dcmqi source code is distributed under a BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 American Association for Cancer Research.

  4. Molecular Ionization-Desorption Analysis Source (MIDAS) for Mass Spectrometry: Thin-Layer Chromatography

    Science.gov (United States)

    Winter, Gregory T.; Wilhide, Joshua A.; LaCourse, William R.

    2016-02-01

    Molecular ionization-desorption analysis source (MIDAS), which is a desorption atmospheric pressure chemical ionization (DAPCI) type source for mass spectrometry, has been developed as a multi-functional platform for the direct sampling of surfaces. In this article, its utility for the analysis of thin-layer chromatography (TLC) plates is highlighted. Amino acids, which are difficult to visualize without staining reagents or charring, were detected and identified directly from a TLC plate. To demonstrate the full potential of MIDAS, all active ingredients from an analgesic tablet, separated on a TLC plate, were successfully detected using both positive and negative ion modes. The identity of each of the compounds was confirmed from their mass spectra and compared against standards. Post separation, chemical reference marks (blue permanent marker) placed at the origin and solvent front were used to calculate retention factor (Rf) values from the resulting ion chromatogram. The quantitative capabilities of the device were exhibited by scanning caffeine spots of increasing sample amount on a TLC plate. A linear calibration curve based on peak area (R² = 0.994) was generated for seven spots ranging from 50 to 1000 ng of caffeine per spot.
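    The Rf calculation and the peak-area calibration described above are straightforward to reproduce. A sketch with illustrative numbers (not the paper's data):

    ```python
    def retention_factor(spot_mm, front_mm):
        """Rf = distance migrated by the analyte spot divided by the distance
        migrated by the solvent front, both measured from the origin line."""
        return spot_mm / front_mm

    def linear_fit(x, y):
        """Ordinary least-squares line y = slope*x + intercept, as used for a
        peak-area vs. amount calibration curve."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    rf = retention_factor(25.0, 50.0)   # spot at 25 mm, front at 50 mm -> 0.5
    # Illustrative calibration points (ng of caffeine vs. peak area):
    slope, intercept = linear_fit([50, 250, 500, 1000], [11, 51, 101, 201])
    ```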

  5. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  6. An Analysis of Source Tilting and Sub-cell Opacity Sampling for IMC

    Energy Technology Data Exchange (ETDEWEB)

    Wollaeger, Ryan T. [Los Alamos National Laboratory; Urbatsch, Todd J. [Los Alamos National Laboratory; Wollaber, Allan B. [Los Alamos National Laboratory; Densmore, Jeffery D. [Los Alamos National Laboratory

    2012-08-02

    Implicit Monte Carlo (IMC) is a stochastic method for solving the radiative transfer equations for multiphysics applications with the material in local thermodynamic equilibrium. The IMC method employs a fictitious scattering term that is computed from an implicit discretization of the material temperature equation. Unfortunately, the original histogram representation of the temperature and opacity with respect to the spatial domain leads to nonphysically fast propagation of radiation waves through optically thick material. In the past, heuristic source tilting schemes have been used to mitigate the numerical teleportation error of the radiation particles in IMC that causes this overly rapid radiation wave propagation. While improving the material temperature profile over the time duration, these tilting schemes alone do not generally alleviate the teleportation error to suitable levels. Another means of potentially reducing teleportation error in IMC is implementing continuous sub-cell opacities based on sub-cell temperature profiles. We present here an analysis of source tilting and continuous sub-cell opacity sampling applied to various discretizations of the temperature equation. Through this analysis, we demonstrate that applying both heuristics does not necessarily yield more accurate results if the discretization of the material equation is inconsistent with the Monte Carlo sub-cell transport.

  7. Low temperature heat source for power generation: Exhaustive analysis of a carbon dioxide transcritical power cycle

    International Nuclear Information System (INIS)

    Velez, Fredy; Segovia, Jose; Chejne, Farid; Antolin, Gregorio; Quijano, Ana; Carmen Martin, M.

    2011-01-01

    The main results of a theoretical study on the use of a low-temperature heat source for power generation through a carbon dioxide transcritical power cycle are reported in this paper. The behaviour of the proposed cycle was analyzed by varying the turbine inlet pressure, starting from 66 bar, while holding the turbine inlet temperature constant at each evaluated value (60 °C, 90 °C, 120 °C and 150 °C), until the net work was approximately zero. As a result, the maximum exergy efficiency was 50%, while the energy efficiencies obtained were 9.8%, 7.3%, 4.9% and 2.4% and the net specific work was 18.2 kJ/kg, 12.8 kJ/kg, 7.8 kJ/kg and 3.5 kJ/kg, respectively. Furthermore, the effect of adding an internal heat exchanger, which naturally increases the efficiency, was analyzed. The analysis of the proposed system shows the viability of implementing this type of process as an energy alternative and/or a complement to non-conventional energy sources in unserved zones, or for increasing energy efficiency in industry. -- Highlights: → Energy and exergy analysis of a carbon dioxide transcritical power cycle is reported. → The effect of the inlet temperature to the turbine is evaluated. → Conditions of maximum efficiency and maximum net work are compared. → The inclusion of an IHX is also analysed.

  8. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  9. Reliability and validity analysis of the open-source Chinese Foot and Ankle Outcome Score (FAOS).

    Science.gov (United States)

    Ling, Samuel K K; Chan, Vincent; Ho, Karen; Ling, Fona; Lui, T H

    2017-12-21

    To develop the first reliable and validated open-source outcome scoring system in the Chinese language for foot and ankle problems, the English FAOS was translated into Chinese following standard protocols. First, two forward-translations were created separately; these were then combined into a preliminary version by an expert committee, which was subsequently back-translated into English. The process was repeated until the original and back-translations were congruent. This version was then field-tested on actual patients, who provided feedback for modification. The final Chinese FAOS version was then tested for reliability and validity. Reliability analysis was performed on 20 subjects, while validity analysis was performed on 50 subjects. The tools used to validate the Chinese FAOS were the SF36 and the Pain Numeric Rating Scale (NRS). Internal consistency between the FAOS subgroups was measured using Cronbach's alpha. Spearman's correlation was calculated between each subgroup of the FAOS, the SF36 and the NRS. The Chinese FAOS passed both reliability and validity testing: it is reliable, internally consistent and correlates positively with the SF36 and the NRS. The Chinese FAOS is a free, open-source scoring system that can be used to provide a relatively standardised outcome measure for foot and ankle studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
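Cronbach's alpha, the internal-consistency statistic used in this validation, is straightforward to compute from an item-score matrix. A minimal sketch; the Likert-style scores below are invented, not FAOS data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Invented 5-point responses: rows = subjects, columns = items of one subscale.
scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
print(round(cronbach_alpha(scores), 3))
```

Values above roughly 0.7-0.8 are conventionally read as acceptable internal consistency for a subscale.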

  10. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results obtained show a positive outcome in hyperspectral analysis and visualization compared to previous literature, we suggest using the PlanetServer approach for such investigations.

  11. Characterization of polar organic compounds and source analysis of fine organic aerosols in Hong Kong

    Science.gov (United States)

    Li, Yunchun

    Organic aerosols, as an important fraction of airborne particulate mass, significantly affect the environment, climate, and human health. Compared with inorganic species, characterization of individual organic compounds is much less complete and comprehensive because they number in the thousands or more and are diverse in chemical structures. The source contributions of organic aerosols are far from being well understood because they can be emitted from a variety of sources as well as formed from photochemical reactions of numerous precursors. This thesis work aims to improve the characterization of polar organic compounds and source apportionment analysis of fine organic carbon (OC) in Hong Kong, and consists of two parts: (1) An improved analytical method to determine monocarboxylic acids, dicarboxylic acids, ketocarboxylic acids, and dicarbonyls collected on filter substrates has been established. These oxygenated compounds were determined as their butyl ester or butyl acetal derivatives using gas chromatography-mass spectrometry. The new method improves on the original Kawamura method by eliminating the water extraction and evaporation steps. Aerosol materials were directly mixed with the BF3/BuOH derivatization agent and the extracting solvent hexane. This modification improves recoveries for both the more volatile and the less water-soluble compounds. This improved method was applied to study the abundances and sources of these oxygenated compounds in PM2.5 aerosol samples collected in Hong Kong under different synoptic conditions during 2003-2005. These compounds account for, on average, 5.2% of OC (range: 1.4%-13.6%) on a carbon basis. Oxalic acid was the most abundant species. Six C2 and C3 oxygenated compounds, namely oxalic, malonic, glyoxylic, and pyruvic acids, glyoxal, and methylglyoxal, dominated this suite of oxygenated compounds. More efforts are therefore suggested to focus on these small compounds in understanding the role of oxygenated

  12. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  13. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

    Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. Student's t test and the χ² test were applied. The questionnaire completion rate was 98% (n = 126). The medium with the greatest reach among students was television (TV), followed by the press and magazines/books. In the factor analysis to determine the impact of the information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, the associations were between information about organ donation transmitted by friends and family and having spoken about ODT with them; by TV, radio, and hoardings and not having spoken about it in the family; and by TV/radio and the father's and mother's opinion about ODT. The medium with the greatest reach among students is TV, and the medium with the greatest impact on broadcasting information was conversations with friends, family, and health professionals. This could be useful for society, because people should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Application of radionuclide sources for excitation in energy-dispersive X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Hoffmann, P.

    1986-01-01

    X-ray fluorescence (XRF) analysis is in broad application in many fields of science where elemental determinations are necessary. Solid and liquid samples are analyzed by this method. Solids are introduced as thin or thick samples in the form of melted glass, pellets, powders or original specimens. The excitation of X-ray spectra can be performed by specific and polychromatic radiation from X-ray tubes; by protons, deuterons, α-particles, heavy ions and synchrotron radiation from accelerators; and by α-particles, X- and γ-rays and bremsstrahlung generated by β⁻ particles from radionuclide sources. The radionuclides are divided into groups with respect to their decay mode and the energy of the emitted radiation. The broad application of radionuclides in XRF excitation is illustrated by examples such as the semi-quantitative analysis of glasses, the quantitative analysis of coarse ceramics and the quantitative determination of heavy elements (mainly actinides) in solutions. The advantages and disadvantages of radionuclide excitation in XRF analysis are discussed. (orig.) [de

  15. Preliminary fracture analysis of the core pressure boundary tube for the Advanced Neutron Source Research Reactor

    International Nuclear Information System (INIS)

    Schulz, K.C.

    1995-08-01

    The outer core pressure boundary tube (CPBT) of the Advanced Neutron Source (ANS) reactor being designed at Oak Ridge National Laboratory is currently specified as being composed of 6061-T6 aluminum. ASME Boiler and Pressure Vessel Code fracture analysis rules for nuclear components are based on the use of ferritic steels; the expressions, tables, charts and equations were all developed from tests and analyses conducted for ferritic steels. Because of the nature of the Code, design with thin aluminum requires analytical approaches that do not directly follow the Code. The intent of this report is to present a methodology comparable to the ASME Code for ensuring the prevention of nonductile fracture of the CPBT in the ANS reactor. 6061-T6 aluminum is known to be a relatively brittle material; the linear elastic fracture mechanics (LEFM) approach is utilized to determine allowable flaw sizes for the CPBT. A J-analysis following the procedure developed by the Electric Power Research Institute was conducted as a check; the results matched those of the LEFM analysis for the cases analyzed. Since 6061-T6 is known to embrittle when irradiated, the reduction in K_Q due to irradiation is considered in the analysis. In anticipation of probable requirements regarding maximum allowable flaw size, a survey of nondestructive inspection capabilities is also presented. A discussion of probabilistic fracture mechanics approaches, principally Monte Carlo techniques, is included in this report as an introduction to what quantifying the probability of nonductile failure of the CPBT may entail.
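As a back-of-envelope illustration of the LEFM step: equating the mode-I stress intensity K_I = Y·σ·√(πa) with the (possibly irradiation-reduced) toughness K_Q yields an allowable flaw depth. The geometry factor, stress, and toughness values below are generic textbook-style numbers, not ANS design values:

```python
import math

def allowable_flaw_depth(K_Q, sigma, Y=1.12):
    """Depth a (m) at which K_I = Y * sigma * sqrt(pi * a) reaches K_Q.

    K_Q in MPa*sqrt(m), sigma in MPa; Y ~= 1.12 for a shallow surface flaw.
    """
    return (K_Q / (Y * sigma)) ** 2 / math.pi

# Illustrative only: membrane stress 100 MPa, toughness halved by irradiation.
a_fresh = allowable_flaw_depth(K_Q=20.0, sigma=100.0)
a_irrad = allowable_flaw_depth(K_Q=10.0, sigma=100.0)
print(a_fresh, a_irrad)  # halving K_Q quarters the allowable depth
```

The quadratic dependence on K_Q is why the irradiation embrittlement mentioned in the abstract matters so much for allowable flaw size.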

  16. Updated pipe break analysis for Advanced Neutron Source Reactor conceptual design

    International Nuclear Information System (INIS)

    Wendel, M.W.; Chen, N.C.J.; Yoder, G.L.

    1994-01-01

    The Advanced Neutron Source Reactor (ANSR) is a research reactor to be built at the Oak Ridge National Laboratory that will supply the highest continuous neutron flux levels of any reactor in the world. It uses plate-type fuel with high-mass-flux and highly subcooled heavy water as the primary coolant. The Conceptual Safety Analysis for the ANSR was completed in June 1992. The thermal-hydraulic pipe-break safety analysis (performed with a specialized version of RELAP5/MOD3) focused primarily on double-ended guillotine breaks of the primary piping and some core-damage mitigation options for such an event. Smaller, instantaneous pipe breaks in the cold- and hot-leg piping were also analyzed to a limited extent. Since the initial analysis for the conceptual design was completed, several important changes to the RELAP5 input model have been made reflecting improvements in the fuel grading and changes in the elevation of the primary coolant pumps. Also, a new philosophy for pipe-break safety analysis (similar to that adopted for the New Production Reactor) accentuates instantaneous, limited flow area pipe-break accidents in addition to finite-opening-time, double-ended guillotine breaks of the major coolant piping. This paper discloses the results of the most recent instantaneous pipe-break calculations

  17. Blind source separation analysis of PET dynamic data: a simple method with exciting MR-PET applications

    Energy Technology Data Exchange (ETDEWEB)

    Oros-Peusquens, Ana-Maria; Silva, Nuno da [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Weiss, Carolin [Department of Neurosurgery, University Hospital Cologne, 50924 Cologne (Germany); Stoffels, Gabrielle; Herzog, Hans; Langen, Karl J [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Shah, N Jon [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Jülich-Aachen Research Alliance (JARA) - Section JARA-Brain RWTH Aachen University, 52074 Aachen (Germany)

    2014-07-29

    Denoising of dynamic PET data improves parameter imaging by PET and is gaining momentum. This contribution describes an analysis of dynamic PET data by blind source separation methods and comparison of the results with MR-based brain properties.
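The decomposition idea can be sketched with a toy example: synthetic voxel time-activity curves mixed from two kinetic components are separated by an SVD, a simple linear stand-in for the blind source separation methods the abstract refers to. The kinetics, frame timing, mixing weights, and noise level are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 30)              # 30 frames over 60 minutes
blood = np.exp(-t / 5.0)                    # fast-clearing component (invented)
tissue = 1.0 - np.exp(-t / 20.0)            # slow uptake component (invented)

mix = rng.uniform(0.0, 1.0, size=(500, 2))  # 500 voxels, random mixing weights
tacs = mix @ np.vstack([blood, tissue])     # voxel-by-frame data matrix
tacs += rng.normal(0.0, 0.01, tacs.shape)   # acquisition noise

# The top temporal singular vectors span the two underlying kinetics.
U, s, Vt = np.linalg.svd(tacs - tacs.mean(axis=0), full_matrices=False)
components = Vt[:2]                         # (2, n_frames) temporal "sources"
print(components.shape)
```

Real BSS methods (ICA, NMF) add statistical independence or non-negativity constraints on top of this linear separation; the rank-2 structure recovered here is the common starting point.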

  18. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  19. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    InterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  20. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions.
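The PCA step of such a multivariate source analysis can be sketched with invented concentration data; the values below are illustrative, not measurements from the Shanghai survey:

```python
import numpy as np

# Rows = sampling sites, columns = Cu, Zn, Ni, Pb, Cr, Cd (mg/kg); all invented.
X = np.array([
    [28.1, 82.4, 30.2, 24.9, 70.1, 0.13],
    [35.6, 90.2, 33.8, 30.4, 72.5, 0.18],
    [22.4, 75.9, 27.5, 20.1, 66.3, 0.09],
    [40.2, 95.7, 36.1, 34.8, 75.0, 0.21],
    [30.5, 84.0, 31.0, 26.7, 71.2, 0.15],
    [26.3, 79.8, 29.4, 23.5, 68.4, 0.11],
    [38.0, 93.1, 35.0, 32.2, 74.1, 0.19],
    [24.7, 78.2, 28.6, 21.8, 67.5, 0.10],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize each metal
corr = np.cov(Z, rowvar=False)                    # correlation matrix
eigval, eigvec = np.linalg.eigh(corr)
order = eigval.argsort()[::-1]                    # sort by explained variance
eigval, loadings = eigval[order], eigvec[:, order]
explained = eigval / eigval.sum()
print(explained.round(3))  # fraction of variance per principal component
```

Metals loading heavily on the same component are then interpreted as sharing a source (anthropogenic vs. parent material), as in the study's grouping of Cu/Ni/Pb/Cd versus Zn/Cr.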

  1. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    Science.gov (United States)

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
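As an illustration of the kind of model such software fits, here is a minimal sketch of the standard two-state equilibrium unfolding model with a linear free-energy dependence on denaturant. This is the generic textbook form, not PyFolding's actual API, and all parameter values are invented:

```python
import numpy as np

R, T = 8.314e-3, 298.0  # gas constant (kJ/(mol*K)) and temperature (K)

def two_state_signal(den, dG_h2o, m, S_f=1.0, S_u=0.0):
    """Fraction-weighted observable for two-state unfolding vs denaturant.

    dG_h2o: stability in water (kJ/mol); m: denaturant dependence (kJ/mol/M).
    """
    dG = dG_h2o - m * den          # linear free-energy extrapolation
    K = np.exp(-dG / (R * T))      # unfolding equilibrium constant
    f_u = K / (1.0 + K)            # fraction unfolded
    return S_f * (1.0 - f_u) + S_u * f_u

den = np.linspace(0.0, 8.0, 9)     # denaturant concentrations (M)
curve = two_state_signal(den, dG_h2o=20.0, m=5.0)
# The transition midpoint sits where dG = 0, i.e. [D]50 = dG_h2o / m = 4.0 M.
```

A fitting package would adjust dG_h2o, m, and the baselines to minimize the residual against measured signal; the Ising-model variants mentioned above generalize this to coupled repeat units.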

  2. Transient analysis of the new Cold Source at the FRM-II

    International Nuclear Information System (INIS)

    Gutsmiedl, E.; Posselt, H.; Scheuer, A.

    2003-01-01

    The new Cold Source (CNS) at the FRM-II research reactor is completely installed. This paper reports the results of the transient analysis performed at the design stage of this facility for producing cold neutrons for neutron experiments, the implementation of the results in the design of the mechanical components, the measurements from the cold tests, and the comparison with the data of the transient analysis. The important load cases are fixed in the system description and the design data sheet of the CNS. A transient analysis was done with the computer program ESATAN; the nodal configuration was identical to the planned CNS system, and the boundary conditions were chosen such that conservative results could be expected. The following transients of the load cases in the piping system behind the in-pile part were calculated: (1) normal storage of D2 in the hydride storage vessel; (2) breakdown of the CNS cooling system and transfer of D2 to the buffer tank; (3) rapid charge of D2 to the buffer tank with break of the insulation vacuum and flooding with neon; (4) reloading of the D2 from the buffer tank to the D2 hydride storage vessel. Additionally, the temperature distributions for these transients in the flanges connecting the systems to the in-pile part were analysed. The temperature distributions in the flange region were taken into account in the strength calculation of the flange construction. The chosen construction shows allowable values and a leak-tight flange connection for the load cases. The piping system was designed for the lowest expected temperatures. The load cases in the moderator tank were taken into account in the stress analysis and the fatigue analysis of the vacuum vessel and the moderator vessel. The results show allowable stresses. The results show that a transient analysis is necessary and helpful for a good design of the CNS. (author)

  3. OVAS: an open-source variant analysis suite with inheritance modelling.

    Science.gov (United States)

    Mozere, Monika; Tekman, Mehmet; Kari, Jameela; Bockenhauer, Detlef; Kleta, Robert; Stanescu, Horia

    2018-02-08

    The advent of modern high-throughput genetics continually broadens the gap between the rising volume of sequencing data and the tools required to process them. The need to pinpoint a small subset of functionally important variants has now shifted towards identifying the critical differences between normal variants and disease-causing ones. The ever-increasing reliance on cloud-based services for sequence analysis and the non-transparent methods they utilize has prompted the need for more in-situ services that can provide a safer and more accessible environment to process patient data, especially in circumstances where continuous internet access is limited. To address these issues, we herein propose our standalone Open-source Variant Analysis Sequencing (OVAS) pipeline, consisting of three key stages of processing that pertain to the separate modes of annotation, filtering, and interpretation. Core annotation performs variant mapping to gene isoforms at the exon/intron level, appends functional data pertaining to the type of variant mutation, and determines hetero-/homozygosity. An extensive inheritance-modelling module, in conjunction with 11 other filtering components, can be used in sequence, ranging from single quality control to multi-file penetrance-model specifics such as X-linked recessive or mosaicism. Depending on the type of interpretation required, additional annotation is performed to identify organ specificity through gene expression and protein domains. In this paper, we analysed an autosomal recessive case study. OVAS made effective use of the filtering modules to recapitulate the results of the study by identifying the prescribed compound-heterozygous disease pattern from exome-capture sequence input samples.
    OVAS is an offline open-source modular-driven analysis environment designed to annotate and extract useful variants from Variant Call Format (VCF) files, and to process them under an inheritance context through a top-down filtering schema of
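The inheritance-modelling idea can be sketched with a toy trio filter: for a simple autosomal recessive model, keep variants where the proband is homozygous-alternate and both parents are heterozygous carriers. The record structure, sample names, and genotypes below are invented; no real VCF parsing (as OVAS performs) is shown:

```python
def recessive_filter(variants):
    """Keep variants fitting a simple autosomal recessive trio pattern."""
    keep = []
    for v in variants:
        gt = v["genotypes"]  # sample name -> diploid genotype string
        if (gt["proband"] == "1/1"      # affected child: homozygous alternate
                and gt["mother"] == "0/1"  # both parents: unaffected carriers
                and gt["father"] == "0/1"):
            keep.append(v)
    return keep

variants = [
    {"id": "var1", "genotypes": {"proband": "1/1", "mother": "0/1", "father": "0/1"}},
    {"id": "var2", "genotypes": {"proband": "0/1", "mother": "0/1", "father": "0/0"}},
]
print([v["id"] for v in recessive_filter(variants)])  # ['var1']
```

Other penetrance models (X-linked recessive, mosaicism, compound heterozygosity) amount to different genotype predicates composed in the same top-down fashion.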

  4. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.

  5. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.
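The paper's modified index is not reproduced here, but the general shape of a derivative-based, uncertainty-scaled importance measure can be sketched as follows. The "model" is a cheap stand-in for a TH code run, and every parameter value is invented:

```python
import numpy as np

def model(x):
    """Cheap stand-in for an expensive thermal-hydraulic code response."""
    return x[0] ** 2 + 0.1 * x[1] + 5.0 * x[2]

def local_importance(model, x0, sigma, eps=1e-2):
    """One-at-a-time local derivative index, scaled by each input's uncertainty."""
    base = model(np.asarray(x0, dtype=float))
    imp = []
    for i in range(len(x0)):
        x = np.array(x0, dtype=float)
        x[i] += eps * sigma[i]                  # small push on one input
        imp.append(abs(model(x) - base) / eps)  # response per unit perturbation
    return np.array(imp)

x0 = [1.0, 1.0, 1.0]     # nominal inputs
sigma = [0.1, 0.5, 0.2]  # input uncertainties (standard deviations)
imp = local_importance(model, x0, sigma)
print(imp)               # the third input dominates with these invented values
```

Note the cost: n + 1 model evaluations for n inputs, which is the practical appeal for computationally demanding system codes.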

  6. Adapting Controlled-source Coherence Analysis to Dense Array Data in Earthquake Seismology

    Science.gov (United States)

    Schwarz, B.; Sigloch, K.; Nissen-Meyer, T.

    2017-12-01

    Exploration seismology deals with highly coherent wave fields generated by repeatable controlled sources and recorded by dense receiver arrays, whose geometry is tailored to back-scattered energy normally neglected in earthquake seismology. Owing to these favorable conditions, stacking and coherence analysis are routinely employed to suppress incoherent noise and regularize the data, thereby strongly contributing to the success of subsequent processing steps, including migration for the imaging of back-scattering interfaces or waveform tomography for the inversion of velocity structure. Attempts have been made to utilize wave-field coherence on the length scales of passive-source seismology, e.g. for the imaging of transition-zone discontinuities or the core-mantle boundary using reflected precursors. Results, however, are often degraded by the sparse station coverage and the interference of faint back-scattered phases with transmitted phases. USArray sampled wave fields generated by earthquake sources at an unprecedented density, and similar array deployments are ongoing or planned in Alaska, the Alps and Canada. This makes the local coherence of earthquake data an increasingly valuable resource to exploit. Building on the experience of controlled-source surveys, we aim to extend the well-established concept of beam-forming to the richer toolbox that is nowadays used in seismic exploration. We suggest adapted strategies for local data coherence analysis, where summation is performed with operators that extract the local slope and curvature of wave fronts emerging at the receiver array. Besides estimating wave-front properties, we demonstrate that the inherent data summation can also be used to generate virtual station responses at intermediate locations where no actual deployment was performed. Owing to the fact that stacking acts as a directional filter, interfering coherent wave fields can be efficiently separated from each other by means of coherent subtraction. We

  7. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    The high sensitivity and advanced onboard calibration of the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enable accurate measurements of low-light radiances, which lead to enhanced quantitative applications at night. The finer spatial resolution of the DNB also allows users to examine socioeconomic activities at urban scales. Given the growing interest in the use of DNB data, there is a pressing need for better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low-light calibration accuracy was previously estimated at a moderate 15% using extended sources, while the long-term stability has yet to be characterized. There are also several science-related questions to be answered, for example: how the Earth’s atmosphere and surface variability contribute to the stability of the DNB-measured radiances; how to separate them from instrument calibration stability; whether SI (International System of Units)-traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; and, furthermore, whether such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of the VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be

  8. Comparative analysis of public's perception of economic feasibility and reality for selected energy sources in Korea

    International Nuclear Information System (INIS)

    Roh, Seungkook; Jeong, Ik; Lee, Kibog; Kim, Dongwook; Kim, Hyunjin

    2016-01-01

    Controversy over nuclear energy has persisted for decades, yet nuclear energy has maintained a share of around 30% of electricity generation in Korea. This is partly because Korea wants to secure energy security and diversity of energy sources, but the most rational driver behind nuclear energy is its economic feasibility. Looking at the actual prices of electricity traded on the Korea Power Exchange, the price of electricity generated from nuclear energy is 39.1 Korean won per kWh, lower than that of other sources: 58.9 (bituminous coal), 221.8 (oil), 158.6 (gas), 170.9 (hydropower), 162.8 (wind) and 463.1 (photovoltaic). However, only experts, regulators and people from the electricity generation industry are aware of this fact, and the general public does not seem to perceive it correctly. This research therefore compares the economic feasibility of energy sources with how it is perceived by the public. The research identified a large gap between the public's perception and the reality of the economic feasibility of energy sources. There are two possible reasons for this gap. First, the electricity price paid by the public is agnostic of the energy source, so it is difficult for the public to recognise that electricity from nuclear energy is benefiting them, leaving them indifferent to its real economic feasibility. Second, public awareness of nuclear reactor decommissioning and spent fuel processing, along with easier access to relevant information through the media, may have played a role. In fact, a number of press and media outlets have questioned the economic feasibility of nuclear energy. However, the price of electricity generated from nuclear energy already includes the costs of future activities such as decommissioning, radioactive waste disposal and spent fuel disposal. The public seems unaware of this fact and therefore sides with the media. This analysis leads to two major policy implications. Most importantly, the government should emphasize the

  9. Sources and transformations of nitrate from streams draining varying land uses: Evidence from dual isotope analysis

    Science.gov (United States)

    Burns, Douglas A.; Boyer, E.W.; Elliott, E.M.; Kendall, C.

    2009-01-01

    Knowledge of key sources and biogeochemical processes that affect the transport of nitrate (NO3-) in streams can inform watershed management strategies for controlling downstream eutrophication. We applied dual isotope analysis of NO3- to determine the dominant sources and processes that affect NO3- concentrations in six stream/river watersheds of different land uses. Samples were collected monthly at a range of flow conditions for 15 mo during 2004-05 and analyzed for NO3- concentrations, δ15N-NO3, and δ18O-NO3. Samples from two forested watersheds indicated that NO3- derived from nitrification was dominant at baseflow. A watershed dominated by suburban land use had three δ18O-NO3 values greater than +25‰, indicating a large direct contribution of atmospheric NO3- transported to the stream during some high flows. Two watersheds with large proportions of agricultural land use had many δ15N-NO3 values greater than +9‰, suggesting an animal waste source consistent with regional dairy farming practices. These data showed a linear seasonal pattern with a δ18O-NO3:δ15N-NO3 slope of 1:2, consistent with seasonally varying denitrification that peaked in late summer to early fall with the warmest temperatures and lowest annual streamflow. The large range of δ15N-NO3 values (10‰) indicates that NO3- supply was likely not limiting the rate of denitrification, consistent with ground water and/or in-stream denitrification. Mixing of two or more distinct sources may have affected the seasonal isotope patterns observed in these two agricultural streams. In a mixed land use watershed of large drainage area, none of the source and process patterns observed in the small streams were evident. These results emphasize that observations at watersheds of a few to a few hundred km2 may be necessary to adequately quantify the relative roles of various NO3- transport and process patterns that contribute to streamflow in large basins. Copyright © 2009 by the American Society of
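The 1:2 seasonal co-variation of nitrate oxygen and nitrogen isotopes described in this abstract is the classic fingerprint of denitrification, which can be sketched as Rayleigh fractionation. The enrichment factors and initial compositions below are illustrative assumptions chosen to reproduce a 1:2 slope, not values from the study:

```python
import numpy as np

def rayleigh_delta(delta0, eps, f):
    """Isotopic composition of residual nitrate under Rayleigh fractionation:
    delta = delta0 + eps * ln(f), with f the fraction of nitrate remaining."""
    return delta0 + eps * np.log(f)

f = np.linspace(1.0, 0.3, 8)             # fraction of nitrate remaining
d15n = rayleigh_delta(9.0, -10.0, f)     # eps_15N = -10 per mil (assumed)
d18o = rayleigh_delta(2.0, -5.0, f)      # eps_18O = -5 per mil, i.e. half eps_15N

slope = np.polyfit(d15n, d18o, 1)[0]     # -> 0.5, the 1:2 relation above
```

Because the assumed epsilon for 18O is exactly half that for 15N, the residual nitrate traces a straight line of slope 0.5 in δ18O-δ15N space, regardless of how far denitrification proceeds.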

  10. P-wave pulse analysis to retrieve source and propagation effects in the case of Vrancea earthquakes

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Placinta, A.; Grecu, B.; Radulian, M.

    2004-01-01

    Seismic source parameters and attenuation structure properties are obtained from first P-wave pulse analysis and empirical Green's function deconvolution. The P pulse characteristics are the combined effect of source and path properties. To recover the real source and structure parameters it is crucial to apply a method able to distinguish between the different factors affecting the observed seismograms. For example, the empirical Green's function deconvolution method (Hartzell, 1978) allows the retrieval of the apparent source time function or source spectrum corrected for path, site and instrumental effects. The apparent source duration is given by the width of the deconvolved source pulse and is directly related to the source dimension. Once the source time function is established, we can extract the parameters related to path effects. The difference between the pulse recorded at a given station and the source pulse obtained by deconvolution is a measure of the attenuation along the path from focus to station. On the other hand, the pulse width variations with azimuth depend critically on the fault plane orientation and source directivity. In favourable circumstances (high signal/noise ratio, high resolution and good station coverage), the method of analysis proposed in this paper allows the rupture plane to be constrained between the two nodal planes of the fault plane solution, even for small events. P-wave pulse analysis was applied to 25 Vrancea earthquakes recorded between 1999 and 2003 by the Romanian local network to determine source parameters and attenuation properties. Our results outline high-stress-drop seismic energy release with a relatively simple rupture process for the considered events and strong lateral variation of seismic wave attenuation across the Carpathian Arc. (authors)

  11. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Background: Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria, as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results: TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign unambiguous and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined length are over- or under-expressed relative to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. 
Biologically relevant chromosomal segments and gene
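TRAM's "scaled quantile" variant builds on standard quantile normalization. The details of the scaled variant are in the paper; the sketch below shows only the standard baseline form, where each sample is replaced by the mean expression value at each rank:

```python
import numpy as np

def quantile_normalize(matrix):
    """Standard quantile normalization.

    Columns are samples, rows are genes. Each column is ranked and every
    value is replaced by the mean, across samples, of the values sharing
    its rank. Ties are broken arbitrarily by index order."""
    ranks = np.argsort(np.argsort(matrix, axis=0), axis=0)  # per-column ranks
    sorted_cols = np.sort(matrix, axis=0)                   # rank-ordered values
    rank_means = sorted_cols.mean(axis=1)                   # mean value per rank
    return rank_means[ranks]
```

After normalization every sample shares the same empirical distribution, which is what makes cross-sample comparisons in a transcriptome map meaningful.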

  12. Source Attribution of Cyanides using Anionic Impurity Profiling, Stable Isotope Ratios, Trace Elemental Analysis and Chemometrics

    Energy Technology Data Exchange (ETDEWEB)

    Mirjankar, Nikhil S.; Fraga, Carlos G.; Carman, April J.; Moran, James J.

    2016-01-08

    Chemical attribution signatures (CAS) for chemical threat agents (CTAs) are being investigated to provide an evidentiary link between CTAs and specific sources to support criminal investigations and prosecutions. In a previous study, anionic impurity profiles developed using high performance ion chromatography (HPIC) were demonstrated as CAS for matching samples from eight potassium cyanide (KCN) stocks to their reported countries of origin. Herein, a larger number of solid KCN stocks (n = 13) and, for the first time, solid sodium cyanide (NaCN) stocks (n = 15) were examined to determine what additional sourcing information can be obtained through anion, carbon stable isotope, and elemental analyses of cyanide stocks by HPIC, isotope ratio mass spectrometry (IRMS), and inductively coupled plasma optical emission spectroscopy (ICP-OES), respectively. The HPIC anion data were evaluated using the variable selection methods of Fisher ratio (F-ratio), interval partial least squares (iPLS), and genetic algorithm-based partial least squares (GAPLS) and the classification methods of partial least squares discriminant analysis (PLSDA), K nearest neighbors (KNN), and support vector machines discriminant analysis (SVMDA). In summary, hierarchical cluster analysis (HCA) of anion impurity profiles from multiple cyanide stocks from six reported countries of origin resulted in cyanide samples clustering into three groups: Czech Republic, Germany, and United States, independent of the associated alkali metal (K or Na). The three country groups were independently corroborated by HCA of cyanide elemental profiles and corresponded to countries with known solid cyanide factories. Both the anion and elemental CAS are believed to originate from the aqueous alkali hydroxides used in cyanide manufacture. Carbon stable isotope measurements resulted in two clusters: Germany and United States (the single Czech stock grouped with United States stocks). The carbon isotope CAS is believed to

  13. Application of group analysis to the spatially homogeneous and isotropic Boltzmann equation with source using its Fourier image

    International Nuclear Information System (INIS)

    Grigoriev, Yurii N; Meleshko, Sergey V; Suriyawichitseranee, Amornrat

    2015-01-01

    Group analysis of the spatially homogeneous, molecular-energy-dependent Boltzmann equation with a source term is carried out. The Fourier transform of the Boltzmann equation with respect to the molecular velocity variable is considered. The corresponding determining equation of the admitted Lie group is reduced to a partial differential equation for the admitted source. The latter equation is analyzed by an algebraic method. A complete group classification of the Fourier transform of the Boltzmann equation with respect to the source function is given. The representation of invariant solutions and the corresponding reduced equations for all obtained source functions are also presented. (paper)

  14. Controlled source electromagnetic data analysis with seismic constraints and rigorous uncertainty estimation in the Black Sea

    Science.gov (United States)

    Gehrmann, R. A. S.; Schwalenberg, K.; Hölz, S.; Zander, T.; Dettmer, J.; Bialas, J.

    2016-12-01

    In 2014 an interdisciplinary survey was conducted as part of the German SUGAR project in the Western Black Sea, targeting gas hydrate occurrences in the Danube Delta. Marine controlled source electromagnetic (CSEM) data were acquired with an inline seafloor-towed array (BGR) and a two-polarization horizontal ocean-bottom source and receiver configuration (GEOMAR). The CSEM data are co-located with high-resolution 2-D and 3-D seismic reflection data (GEOMAR). We present results from 2-D regularized inversion (MARE2DEM by Kerry Key), which provides a smooth model of the electrical resistivity distribution beneath the source and multiple receivers. The 2-D approach includes seafloor topography and structural constraints from seismic data. We estimate uncertainties from the regularized inversion and compare them to 1-D Bayesian inversion results. The probabilistic inversion for a layered subsurface treats the parameter values and the number of layers as unknown by applying reversible-jump Markov-chain Monte Carlo sampling. A non-diagonal data covariance matrix obtained from residual error analysis accounts for correlated errors. The resulting resistivity models show generally high resistivity values, between 3 and 10 Ωm on average, which can be partly attributed to pore-water salinities depleted during past sea-level low stands, and locally up to 30 Ωm, which is likely caused by gas hydrates. At the base of the gas hydrate stability zone resistivities rise to more than 100 Ωm, which could be due to gas hydrate as well as a layer of free gas underneath. However, the deeper parts also show the largest model parameter uncertainties. Archie's Law is used to derive estimates of the gas hydrate saturation, which vary between 30 and 80% within the anomalous layers, considering salinity and porosity profiles from a distant DSDP borehole.
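The saturation estimate via Archie's Law mentioned at the end of this abstract can be sketched as follows. The porosity, pore-water resistivity and Archie exponents below are illustrative placeholders, not the profiles from the DSDP borehole:

```python
def archie_water_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation Sw from Archie's Law: Rt = a * Rw * phi**-m * Sw**-n,
    where Rt is formation resistivity and Rw pore-water resistivity (ohm-m)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def hydrate_saturation(rt, rw, phi, **kw):
    """Gas hydrate fills the pore space not occupied by water: Sh = 1 - Sw."""
    return 1.0 - archie_water_saturation(rt, rw, phi, **kw)

# Illustrative values: a 10 ohm-m anomalous layer, brackish pore water of
# 0.3 ohm-m and 50% porosity (all assumed, not from the survey).
sh = hydrate_saturation(10.0, 0.3, 0.5)   # ≈ 0.65
```

With these assumed inputs the estimated saturation of about 65% falls inside the 30-80% range the abstract reports for the anomalous layers.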

  15. Chemical and Microbiological Analysis of Certain Water Sources and Industrial Wastewater Samples in Dakahlia Governorate

    International Nuclear Information System (INIS)

    El-Fadaly, H.; El-Defrawy, M.M.; El-Zawawy, F.; Makia, D.

    1999-01-01

    The chemical analysis comprised quantitative measurement of electrical conductivity, alkalinity, hardness, sulphate, pH, total dissolved solids and chloride, as well as dissolved oxygen. Microbiological examination of the different water sources and industrial wastewater samples was also conducted. Some heavy metals (Co²⁺, Cu²⁺, Fe³⁺ and Mn²⁺) were determined in fresh water, while others, such as Cr⁶⁺, Co²⁺, Zn²⁺ and Ni²⁺, were measured in industrial wastewater. Results of the chemical analysis showed that all measured parameters were within national or international limits, except for some samples that showed values above the permissible limits for certain parameters. The microbiological analysis revealed the presence of yeasts, fungi and bacteria. Most bacterial isolates were short rods, spore formers and coccoid-shaped bacteria. The efficiency of the water treatment process in reducing the microbial load was also calculated. Regarding pathogenic bacteria, the data showed that neither the water samples nor the industrial wastewater contained pathogens when examined on specific cultivation media. Furthermore, the data demonstrated the possibility of recycling the tested industrial wastewater, on which some microorganisms can grow. The data showed that heavy metal removal can exceed 70% in some cases as a result of bacterial treatment of industrial wastewater

  16. Multivariate analysis for source identification of pollution in sediment of Linggi River, Malaysia.

    Science.gov (United States)

    Elias, Md Suhaimi; Ibrahim, Shariff; Samuding, Kamarudin; Rahman, Shamsiah Ab; Wo, Yii Mei; Daung, Jeremy Andy Dominic

    2018-03-29

    Rapid socioeconomic development in the Linggi River Basin has contributed to the significant increase of pollution discharge into the Linggi River and its adjacent coastal areas. The toxic element contents and distributions in the sediment samples collected along the Linggi River were determined using neutron activation analysis (NAA) and inductively coupled plasma-mass spectrometry (ICP-MS) techniques. The measured mean concentrations of As, Cd, Pb, Sb, U, Th and Zn are relatively high compared to the continental crust value of the respective element. Most of the elements (As, Cr, Fe, Pb, Sb and Zn) exceeded the freshwater sediment quality guideline-threshold effect concentration (FSQG-TEC) value. Downstream stations of the Linggi River showed As concentrations in sediment exceeding the freshwater sediment quality guideline-probable effect concentration (FSQG-PEC) value, indicating that the concentration of As will adversely affect the growth of sediment-dwelling organisms. Generally, the Linggi River sediment can be categorised as ranging from unpolluted to strongly polluted, and in places extremely polluted. The correlation matrix of metal-metal relationships, principal component analysis (PCA) and cluster analysis (CA) indicate that the pollution sources of Cu, Ni, Zn, Cd and Pb in sediments of the Linggi River originated from the electronics and electroplating industries. As, Cr, Sb and Fe mainly originated from motor-vehicle workshops and metal work, whilst U and Th originated from natural processes such as terrestrial runoff and land erosion.
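Principal component analysis, one of the multivariate tools named above, can be sketched with numpy alone. The data here are purely synthetic; in the study, rows would be sediment stations and columns element concentrations:

```python
import numpy as np

def pca(data, n_components=2):
    """Principal component analysis via SVD of the mean-centred data matrix.

    Rows are samples (e.g. sediment stations), columns are variables
    (e.g. element concentrations)."""
    centred = data - data.mean(axis=0)
    u, s, vt = np.linalg.svd(centred, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]   # sample coordinates
    loadings = vt[:n_components]                      # element contributions
    explained = (s ** 2) / (s ** 2).sum()             # variance fractions
    return scores, loadings, explained[:n_components]
```

Stations whose scores plot together on the leading components, and elements with similar loadings, are the groupings that such studies interpret as sharing a common pollution source.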

  17. Aluminium and copper analysis in metallic alloys by neutron activation analysis from an ²⁴¹Am-Be source

    International Nuclear Information System (INIS)

    Carvalho, J. de.

    1980-01-01

    Aluminium and copper have been determined in aluminium alloys by the method of activation with neutrons from an ²⁴¹Am-Be source of intensity 9.8 × 10⁶ n/s. The induced activities due to the reactions ²⁷Al(n,γ)²⁸Al and ⁶³Cu(n,γ)⁶⁴Cu were measured with a NaI(Tl) detector coupled to a single-channel system. To obtain samples and standards of about the same composition, the material to be irradiated was powdered. In view of the low intensity of the neutron source it was necessary to use samples of up to 50 g. A series of preliminary irradiations was carried out to ensure that the geometries for irradiation and counting were reproducible. The results have been compared with those obtained by chemical methods. Assuming the results obtained by the chemical method to be exact, a maximum relative error of 3.6% is obtained by this method. The method has good reproducibility. The times needed for analysis of aluminium and copper are 18 min and 2 h 40 min, respectively. Four different samples were analysed. The averages of five measurements for one of the samples were 88.0% for aluminium and 10.0% for copper. The standard deviations and coefficients of variation were 0.8 and 1.0% for aluminium and 0.2 and 2.0% for copper. (author)

  18. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    International Nuclear Information System (INIS)

    Eriksson, E.; Andersen, H. R.; Ledin, A.

    2008-01-01

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.

  19. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, E., E-mail: eve@env.dtu.dk; Andersen, H. R.; Ledin, A. [Technical University of Denmark, Department of Environmental Engineering (Denmark)

    2008-12-15

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.

  20. Petrographic Analysis and Geochemical Source Correlation of Pigeon Peak, Sutter Buttes, CA

    Science.gov (United States)

    Novotny, N. M.; Hausback, B. P.

    2013-12-01

    The Sutter Buttes are a volcanic complex located in the center of the Great Valley north of Sacramento. They comprise numerous inter-intruding andesite and rhyolite lava domes of varying compositions surrounded by a shallow rampart of associated tephras. The Pigeon Peak block-and-ash flow sequence is located in the rampart and is made up of a porphyritic biotite-bearing hornblende andesite. The andesite blocks demonstrate a high degree of propylitization of hornblende crystals, highly zoned plagioclase, trace olivine, and display a red-to-gray color gradation. DAR is an andesite dome located less than one mile from Pigeon Peak. Of the 15 to 25 andesite lava domes within four miles of Pigeon Peak, only DAR displays trace olivine, red-to-grey color stratification, low biotite content, and propylitized hornblende. These shared characteristics suggest that DAR may be the source for Pigeon Peak. My investigation used microprobe analysis of the DAR and Pigeon Peak feldspar crystals to identify the magmatic history of the magma body before emplacement. Correlation of the anorthite zoning within the feldspars from both locations supports my hypothesis that DAR is the source of the Pigeon Peak block-and-ash flow.

  1. Analysis of ultrasonically rotating droplet using moving particle semi-implicit and distributed point source methods

    Science.gov (United States)

    Wada, Yuji; Yuge, Kohei; Tanaka, Hiroki; Nakamura, Kentaro

    2016-07-01

    Numerical analysis of the rotation of an ultrasonically levitated droplet with a free-surface boundary is discussed. The ultrasonically levitated droplet is often reported to rotate owing to the surface tangential component of the acoustic radiation force. To observe the torque from an acoustic wave and clarify the mechanism underlying the phenomenon, it is effective to take advantage of numerical simulation using the distributed point source method (DPSM) and the moving particle semi-implicit (MPS) method, neither of which requires a calculation grid or mesh. In this paper, the numerical treatment of the viscoacoustic torque, which emerges from the viscous boundary layer and governs the acoustical droplet rotation, is discussed. The Reynolds stress traction force is calculated from the DPSM result using the idea of an effective normal particle velocity through the boundary layer and input to the MPS surface particles. A droplet levitated in an acoustic chamber is simulated using the proposed calculation method. The droplet is vertically supported by a plane standing wave from an ultrasonic driver and subjected to a rotating sound field excited by two acoustic sources on the side wall with different phases. The rotation of the droplet is successfully reproduced numerically, and its acceleration is discussed and compared with values in the literature.

  2. Optimization of source and detector configurations based on Cramer-Rao lower bound analysis

    Science.gov (United States)

    Chen, Ling; Chen, Nanguang

    2011-03-01

    Optimization of source and detector (SD) arrangements in a diffuse optical tomography system helps improve the sensitivity of measurements to localized changes in the imaging domain and enhances noise resistance. We introduce a rigorous and computationally efficient methodology, adapted to the diffuse optics field, for optimizing SD arrangements. Our method is based on Cramer-Rao lower bound analysis, which combines the diffusion forward model with a noise model. This method can be used to investigate the performance of SD arrangements through quantitative estimation of lower bounds on the variances of the reconstructed perturbation depths and values. More importantly, it provides direct estimates of these parameters without solving the inverse problem. Simulations are conducted in the reflection geometry to validate the effectiveness of the method in selecting optimized SD sets, with a fixed number of sources and detectors, from an SD group on a planar probe surface. The impacts of different noise levels and target perturbation depths are considered in the simulations. It is demonstrated that the SD sets selected by this method afford better reconstructed images. The methodology can be adapted to other probe surfaces and other imaging geometries.
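The bound computation underlying this kind of SD optimization can be sketched for a Gaussian noise model: the Fisher information is built from the forward-model Jacobian and the noise covariance, and its inverse bounds the parameter variances. The toy Jacobian below is purely illustrative; in the paper it would come from the diffusion forward model:

```python
import numpy as np

def crlb(jacobian, noise_cov):
    """Cramer-Rao lower bounds on parameter variances.

    For Gaussian measurement noise, the Fisher information is
    F = J^T C^-1 J and the variance of any unbiased estimate of
    parameter i is bounded below by diag(F^-1)[i]."""
    fisher = jacobian.T @ np.linalg.solve(noise_cov, jacobian)
    return np.diag(np.linalg.inv(fisher))

# Toy sensitivity matrix: 5 measurements of 2 perturbation parameters,
# with i.i.d. noise of variance 0.01 on each measurement.
J = np.array([[1., 0.], [0., 1.], [1., 1.], [1., -1.], [2., 0.]])
bounds = crlb(J, 0.01 * np.eye(5))
```

Comparing `bounds` across candidate SD sets, without ever solving the inverse problem, is what makes the approach computationally cheap: doubling all sensitivities, for instance, quarters every bound.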

  3. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    This study presents the latest developments of an approach called ‘flash sourcing’, which provides information on the effects of an earthquake within minutes of its occurrence. The information is derived from an analysis of the traffic surges on the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake's occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information and, beyond seismology, we consider what it can teach us about public responses to experiencing an earthquake. Future developments should improve the description of earthquake effects and potentially improve the efficiency of earthquake response by filling the information gap that follows an earthquake's occurrence.

  4. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems using annular sources are widely used in view of their simplicity, wide availability, relatively low price for the complete system, and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental-parameter techniques for quantitative analysis is attempted. These problems stem from the fact that such systems operate with large solid angles for the incoming and emerging radiation, so neither the incident nor the take-off angle is uniquely defined. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF setups and in predicting the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison of the results with experimentally determined values for the incident and take-off angles are also presented. A flexible and user-friendly computer program was developed to perform the lengthy calculations efficiently. (author). 14 refs, 5 figs

  5. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for the energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1 h, 2 h, etc.) and the required calculation time is very short. The heating and cooling loads of the building at the aforementioned time step are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operation characteristic curves of the system's heat pumps, and the basic ground heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The accuracy of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground-coupled heat pump installation over a three-year period. (author)
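The time-step energy balance the abstract describes can be sketched as follows; the load series and COP curve here are illustrative stand-ins, not the tool's actual heat pump characteristic curves or Matlab implementation (Python is used for brevity).

```python
def electricity_consumption(loads_kw, cop, dt_h=1.0):
    """Heat-pump electricity use (kWh) over a load time series.

    loads_kw : building heating (or cooling) load at each step (kW)
    cop      : callable giving the heat pump COP at a given load;
               an assumed stand-in for the operation characteristic
               curves the abstract mentions.
    dt_h     : time step in hours, freely chosen as in the tool.
    """
    return sum(q / cop(q) * dt_h for q in loads_kw if q > 0)

# With a constant COP of 4, electricity is one quarter of delivered heat.
energy = electricity_consumption([8.0, 6.0, 0.0, 10.0], cop=lambda q: 4.0)
```

A real COP curve would depend on the fluid temperature returning from the ground heat exchanger, which is where the analytical ground heat transfer equations enter.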

  6. Analysis of flood inundation in ungauged basins based on multi-source remote sensing data.

    Science.gov (United States)

    Gao, Wei; Shen, Qiu; Zhou, Yuehua; Li, Xin

    2018-02-09

    Floods are among the most expensive natural hazards experienced in many places of the world and can result in heavy losses of life and economic damage. The objective of this study is to analyze flood inundation in ungauged basins by performing near-real-time detection of flood extent and depth based on multi-source remote sensing data. Spatial distribution analysis of flood extent and depth in a time series reflects the inundation conditions and the characteristics of the flood disaster. The results show that multi-source remote sensing data can make up for the lack of hydrological data in ungauged basins, helping to reconstruct the hydrological sequence. The combination of MODIS (moderate-resolution imaging spectroradiometer) surface reflectance products and the DFO (Dartmouth Flood Observatory) flood database enables macro-dynamic monitoring of flood inundation in ungauged basins; differencing of high-resolution optical and microwave images acquired before and after a flood can then be used to calculate the flood extent and reflect spatial changes in inundation. The flood-depth monitoring algorithm, which combines RS and GIS, is simple and can quickly calculate the depth from a flood extent obtained from remote sensing images in ungauged basins. These results can provide effective support for the disaster relief work performed by government departments.

  7. YouTube as a source of information on skin bleaching: a content analysis.

    Science.gov (United States)

    Basch, C H; Brown, A A; Fullwood, M D; Clark, A; Fung, I C-H; Yin, J

    2018-06-01

    Skin bleaching is a common, yet potentially harmful, body modification practice. To describe the characteristics of the most widely viewed YouTube™ videos related to skin bleaching, the search term 'skin bleaching' was used to identify the 100 most popular English-language YouTube videos on the topic. Both descriptive and specific information was noted. Among the 100 manually coded English-language skin-bleaching YouTube videos, there were 21 consumer-created videos, 45 internet-based news videos, 30 television news videos and 4 professional videos. Excluding the 4 professional videos, we limited our content categorization and regression analysis to 96 videos. Approximately 93% (89/96) of the most widely viewed videos mentioned changing how you look, and 74% (71/96) focused on bleaching the whole body. Of the 96 videos, 63 (66%) showed or mentioned a transformation. Only about 14% (13/96) mentioned that skin bleaching is unsafe. The likelihood of a video selling a skin-bleaching product was 17 times higher for internet videos than for consumer videos (OR = 17.00, 95% CI 4.58-63.09). Videos on skin bleaching were most often uploaded by internet sources. Videos made by television sources mentioned more information about skin bleaching being unsafe, while consumer-generated videos focused more on making skin-bleaching products at home. © 2017 British Association of Dermatologists.
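An odds ratio of the form reported above, with its Wald 95% confidence interval, can be computed from a 2×2 table as sketched below; the counts are hypothetical, since the paper's underlying cell counts are not given here.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = outcome present/absent in the exposed group,
    c/d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# Hypothetical counts: 10/5 internet videos selling / not selling a
# product, versus 2/20 consumer videos selling / not selling.
or_, lo, hi = odds_ratio_ci(10, 5, 2, 20)
```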

  8. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--discrimination of ammonium nitrate sources.

    Science.gov (United States)

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Roux, Claude

    2009-06-01

    An evaluation was undertaken to determine whether isotope ratio mass spectrometry (IRMS) could assist in the investigation of complex forensic cases by providing a level of discrimination not achievable utilising traditional forensic techniques. The focus of the research was on ammonium nitrate (AN), a common oxidiser used in improvised explosive mixtures. The potential value of IRMS for attributing Australian AN samples to their manufacturing source was demonstrated through the development of a preliminary AN classification scheme based on nitrogen isotopes. Although the discrimination utilising nitrogen isotopes alone was limited, and only relevant to samples from the three Australian manufacturers during the evaluated time period, the classification scheme has potential as an investigative aid. Combining oxygen and hydrogen stable isotope values permitted the differentiation of AN prills from three different Australian manufacturers. Samples from five different overseas sources could be differentiated utilising a combination of the nitrogen, oxygen and hydrogen isotope values. Limited differentiation between Australian and overseas prills was achieved for the samples analysed. The comparison of nitrogen isotope values from intact AN prill samples with those from post-blast AN prill residues highlighted that the nitrogen isotopic composition of the prills was not maintained post-blast, limiting the technique to the analysis of un-reacted explosive material.

  9. Sources of tropospheric ozone along the Asian Pacific Rim: An analysis of ozonesonde observations

    Science.gov (United States)

    Liu, Hongyu; Jacob, Daniel J.; Chan, Lo Yin; Oltmans, Samuel J.; Bey, Isabelle; Yantosca, Robert M.; Harris, Joyce M.; Duncan, Bryan N.; Martin, Randall V.

    2002-11-01

    The sources contributing to tropospheric ozone over the Asian Pacific Rim in different seasons are quantified by analysis of Hong Kong and Japanese ozonesonde observations with a global three-dimensional (3-D) chemical transport model (GEOS-CHEM) driven by assimilated meteorological observations. Particular focus is placed on the extensive observations available from Hong Kong in 1996. In the middle-upper troposphere (MT-UT), maximum Asian pollution influence along the Pacific Rim occurs in summer, reflecting rapid convective transport of surface pollution. In the lower troposphere (LT), the season of maximum Asian pollution influence shifts from fall at low latitudes to summer at midlatitudes due to monsoonal influence. The UT ozone minimum and high variability observed over Hong Kong in winter reflect frequent tropical intrusions alternating with stratospheric intrusions. Asian biomass burning makes a major contribution to ozone along the Pacific Rim, and North American pollution influence exceeds European influence in the UT-MT, reflecting uplift by convection and by the warm conveyor belts over the eastern seaboard of North America. African outflow makes a major contribution to ozone in the low-latitude MT-UT over the Pacific Rim during November-April. Lightning influence over the Pacific Rim is at a minimum in summer due to westward UT transport at low latitudes associated with the Tibetan anticyclone. The Asian outflow flux of ozone to the Pacific is maximum in spring and fall and includes a major contribution from Asian anthropogenic sources year-round.

  10. Probing the heat sources during thermal runaway process by thermal analysis of different battery chemistries

    Science.gov (United States)

    Zheng, Siqi; Wang, Li; Feng, Xuning; He, Xiangming

    2018-02-01

    Safety is a critical issue for lithium ion batteries used in electric vehicles and other applications. This paper probes the heat sources in the thermal runaway processes of lithium ion batteries of different chemistries using accelerating rate calorimetry (ARC) and differential scanning calorimetry (DSC). The adiabatic thermal runaway features of four types of commercial lithium ion batteries are tested using ARC, whereas the reaction characteristics of the component materials, including the cathode, the anode and the separator, inside the four types of batteries are measured using DSC. The peaks and valleys of the critical component reactions measured by DSC match the fluctuations in the temperature rise rate measured by ARC; this correspondence between the DSC and ARC curves is therefore used to identify the heat sources in the thermal runaway process and reveal the thermal runaway mechanisms. The results and analysis indicate that internal short circuit is not the only path to thermal runaway, but it can contribute extra electrical heat, comparable with the heat released by the chemical reactions. The analytical approach to thermal runaway mechanisms presented in this paper can guide the safety design of commercial lithium ion batteries.

  11. Experimental analysis of a diffusion absorption refrigeration system using alternative energy sources

    International Nuclear Information System (INIS)

    Soezen, A.; Oezbas, E.

    2009-01-01

    The continuous-cycle absorption refrigeration device is widely used in domestic refrigerators and recreational vehicles. It is also used in year-round air conditioning of both homes and larger buildings. The unit consists of four main parts: the boiler, condenser, evaporator and absorber. When the unit operates on kerosene or gas, the heat is supplied by a burner fitted underneath the central tube. When operating on electricity, the heat is supplied by an element inserted in the pocket. No moving parts are employed. The operation of the refrigerating mechanism is based on Dalton's law. In this study, an experimental analysis was performed of a diffusion absorption refrigeration system (DARS) using alternative energy sources such as solar energy and liquefied petroleum gas (LPG). Two basic DAR cycles were set up and investigated: (i) in the first cycle (DARS-1), the condensate is sub-cooled prior to the evaporator entrance by a coupled evaporator/gas heat exchanger, similar to that manufactured by Electrolux of Sweden; (ii) in the second cycle (DARS-2), the condensate is not sub-cooled prior to the evaporator entrance and the gas heat exchanger is separated from the evaporator. (author)

  12. Evaluation of Collateral Source Characteristics With 3-Dimensional Analysis Using Micro-X-Ray Computed Tomography.

    Science.gov (United States)

    Arima, Yuichiro; Hokimoto, Seiji; Tabata, Noriaki; Nakagawa, Osamu; Oshima, Asahi; Matsumoto, Yosuke; Sato, Takahiro; Mukunoki, Toshifumi; Otani, Jun; Ishii, Masanobu; Uchikawa, Michie; Yamamoto, Eiichiro; Izumiya, Yasuhiro; Kaikita, Koichi; Ogawa, Hisao; Nishiyama, Koichi; Tsujita, Kenichi

    2018-03-23

    Collateral arteries provide an alternative blood supply and protect tissues from ischemic damage in patients with peripheral artery disease. However, the mechanism of collateral artery development is difficult to validate. Collateral arteries were visualized using micro-x-ray computed tomography. Developmental characteristics were assessed using confocal microscopy. We conducted a single-center, retrospective, observational study and assessed the dilatation of collateral arteries on ischemic sides. We quantified the vascular volume in both ischemic and nonischemic legs. A prominent increase in vascular volume was observed in the ischemic leg using a murine hind-limb ischemia model. We also performed qualitative assessment and confirmed that the inferior gluteal artery functioned as a major collateral source. Serial analysis of murine hind-limb vessel development revealed that the inferior gluteal artery was a remnant of the ischial artery, which emerged as a representative vessel on the dorsal side during hind-limb organogenesis. We retrospectively analyzed consecutive patients who were admitted for the diagnosis or treatment of peripheral artery disease. The diameter of the inferior gluteal artery on the ischemic side showed significant dilatation compared with that on the nonischemic side. Our findings indicate that an embryonic remnant artery can become a collateral source under ischemic conditions. Flow enhancement in the inferior gluteal artery might become a novel therapeutic approach for patients with peripheral artery disease. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  13. Predictive networks: a flexible, open source, web application for integration and analysis of human gene networks.

    Science.gov (United States)

    Haibe-Kains, Benjamin; Olsen, Catharina; Djebbari, Amira; Bontempi, Gianluca; Correll, Mick; Bouton, Christopher; Quackenbush, John

    2012-01-01

    Genomics has provided us with an unprecedented quantity of data on the genes that are activated or repressed in a wide range of phenotypes. We have increasingly come to recognize that defining the networks and pathways underlying these phenotypes requires both the integration of multiple data types and the development of advanced computational methods to infer relationships between genes and to estimate the predictive power of the networks through which they interact. To address these issues we have developed Predictive Networks (PN), a flexible, open-source, web-based application and data-services framework that enables the integration, navigation, visualization and analysis of gene interaction networks. The primary goal of PN is to allow biomedical researchers to evaluate experimentally derived gene lists in the context of large-scale gene interaction networks. The PN analytical pipeline involves two key steps. The first is the collection of a comprehensive set of known gene interactions derived from a variety of publicly available sources. The second is to use these 'known' interactions together with gene expression data to infer robust gene networks. The PN web application is accessible at http://predictivenetworks.org. The PN code base is freely available at https://sourceforge.net/projects/predictivenets/.

  14. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier Lopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  15. Null stream analysis of Pulsar Timing Array data: localisation of resolvable gravitational wave sources

    Science.gov (United States)

    Goldstein, Janna; Veitch, John; Sesana, Alberto; Vecchio, Alberto

    2018-04-01

    Super-massive black hole binaries are expected to produce a gravitational wave (GW) signal in the nano-Hertz frequency band which may be detected by pulsar timing arrays (PTAs) in the coming years. The signal is composed of both stochastic and individually resolvable components. Here we develop a generic Bayesian method for the analysis of resolvable sources based on the construction of `null-streams' which cancel the part of the signal held in common for each pulsar (the Earth-term). For an array of N pulsars there are N - 2 independent null-streams that cancel the GW signal from a particular sky location. The method is applied to the localisation of quasi-circular binaries undergoing adiabatic inspiral. We carry out a systematic investigation of the scaling of the localisation accuracy with signal strength and with the number of pulsars in the PTA. Additionally, we find that source sky localisation with the first International PTA data release is vastly superior to that achieved by its constituent regional PTAs.
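The null-stream construction can be sketched numerically: for a response matrix mapping the two GW polarizations to N pulsars at a fixed sky location, the left null space yields the N - 2 data combinations that cancel the common Earth-term. The random response matrix below is a hypothetical stand-in for real antenna patterns.

```python
import numpy as np

def null_streams(response):
    """Rows spanning the left null space of an (N, 2) response matrix;
    each row combines the N pulsar data streams so that a common
    (Earth-term) signal entering through `response` cancels exactly."""
    u, s, _ = np.linalg.svd(response)
    rank = int(np.sum(s > 1e-10))
    return u[:, rank:].T  # shape (N - rank, N)

rng = np.random.default_rng(0)
N = 6
F = rng.normal(size=(N, 2))          # hypothetical antenna-pattern matrix
W = null_streams(F)                  # N - 2 = 4 null streams
earth_term = F @ rng.normal(size=2)  # common signal across the array
```

Scanning sky locations and checking where the null streams actually remove the signal is one way such a construction localises a resolvable source.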

  16. Derivation and analysis of the Feynman-alpha formula for deterministically pulsed sources

    International Nuclear Information System (INIS)

    Wright, J.; Pazsit, I.

    2004-03-01

    The purpose of this report is to give a detailed description of the calculation of the Feynman-alpha formula with deterministically pulsed sources. In contrast to previous calculations, Laplace transform and complex function methods are used to arrive at a compact solution in the form of a Fourier series-like expansion. The advantage of this method is that it can treat various pulse shapes. In particular, in addition to square and Dirac delta pulses, a more realistic Gauss-shaped pulse is also considered here. The final solution for the modified variance-to-mean, that is, the Feynman Y(t) function, can be evaluated quantitatively, quickly and with little computational effort. The analytical solutions obtained are then analysed quantitatively. The behaviour of the number of neutrons in the system is investigated in detail, together with the transient that follows the switching on of the source. The behaviour of the Feynman Y(t) function was analysed with respect to the pulse width and repetition frequency. Lastly, the possibility of using the formulae for the extraction of the parameter alpha from a simulated measurement is also investigated.
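For orientation, the classic steady-source (non-pulsed) Feynman Y(t) has the closed form Y(t) = Y∞ (1 - (1 - e^(-αt))/(αt)); the pulsed-source solution derived in the report is a Fourier series-like generalization of this shape. A minimal sketch of the steady-source form:

```python
import math

def feynman_y(t, alpha, y_inf=1.0):
    """Steady-source Feynman variance-to-mean minus one:
    Y(t) = y_inf * (1 - (1 - exp(-alpha*t)) / (alpha*t)).
    Monotonically rises from 0 at t=0 toward the plateau y_inf."""
    if t == 0.0:
        return 0.0
    x = alpha * t
    return y_inf * (1.0 - (1.0 - math.exp(-x)) / x)
```

Fitting measured Y(t) values against a curve of this shape (or against the pulsed generalization) yields the prompt neutron decay constant alpha, which is the extraction discussed at the end of the abstract.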

  17. Advanced neutron source reactor conceptual safety analysis report, three-element-core design: Chapter 15, accident analysis

    International Nuclear Information System (INIS)

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L.; Harrington, R.M.

    1996-02-01

    In order to utilize reduced-enrichment fuel, a three-element-core design has been proposed for the Advanced Neutron Source. The proposed core configuration consists of inner, middle, and outer elements, with the middle element offset axially beneath the inner and outer elements, which are axially aligned. The three-element-core RELAP5 model assumes that the reactor hardware is changed only within the core region, so that the loop piping, heat exchangers, and pumps remain as assumed for the two-element-core configuration. To assess the impact of the changes in the core region configuration and in the thermal-hydraulic steady-state conditions, the safety analysis has been updated. This report gives the safety margins for the loss-of-off-site-power and pressure-boundary-fault accidents based on the RELAP5 results. All margins are greater for the three-element-core simulations than those calculated for the two-element core.

  18. Who bears the environmental burden in China? An analysis of the distribution of industrial pollution sources

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Chunbo [School of Agricultural and Resource Economics, University of Western Australia, 35 Stirling Highway, Crawley, 6009, Western Australia (Australia)

    2010-07-15

    A remaining challenge for environmental inequality researchers is to translate the principles developed in the U.S. to China, which is experiencing the staggering environmental impacts of its astounding economic growth and social change. This study builds on contemporary U.S. environmental justice literature and examines the issue of environmental inequality in China through an analysis of the geographical distribution of industrial pollution sources in Henan province. The study attempts to answer two central questions: (1) does environmental inequality exist in China, and if it does, (2) what socioeconomic lenses can be used to identify it? The study found that: (1) race and income, the two common lenses used in many U.S. studies, play different roles in the Chinese context; and (2) rural residents, and especially rural migrants, are disproportionately exposed to industrial pollution. (author)

  19. Conceptual design loss-of-coolant accident analysis for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L. Jr.

    1994-01-01

    A RELAP5 system model of the Advanced Neutron Source Reactor has been developed for performing conceptual safety analysis report calculations. To better represent the thermal-hydraulic behavior of the core, three specific changes were implemented in the RELAP5 computer code: a turbulent forced-convection heat transfer correlation, a critical heat flux (CHF) correlation, and an interfacial drag correlation. The model consists of the core region, the heat exchanger loop region, and the pressurizing/letdown system region. Results for three loss-of-coolant accident analyses are presented: (a) an instantaneous double-ended guillotine (DEG) core outlet break with a cavitating venturi installed downstream of the core, (b) a core pressure boundary tube outer-wall rupture, and (c) a DEG core inlet break with a finite break-formation time. The results show that the core can survive without exceeding the flow excursion or CHF thermal limits at a 95% probability level if the proper mitigation options are provided.

  20. Analysis and Optimal Condition of the Rear-Sound-Aided Control Source in Active Noise Control

    Directory of Open Access Journals (Sweden)

    Karel Kreuter

    2011-01-01

    Full Text Available An active noise control scenario for simple ducts is considered. The previously suggested technique of using a single loudspeaker and its rear sound to cancel the upstream sound is further examined and compared to the bidirectional solution in order to give theoretical proof of its advantage. First, a model with a new approach to taking damping effects into account is derived based on electrical transmission line theory. By comparison with the old model, the new approach is validated, and the differences that arise are discussed. Moreover, a numerical application with the consideration of damping is implemented for confirmation. The influence of the rear sound strength on the feedback-path system is investigated, and the optimal condition is determined. Finally, it is proven by both frequency-response analysis and numerical calculation of the time response that the proposed source has the advantage of an extended phase lag and a time delay in the feedback-path system.

  1. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Science.gov (United States)

    Mavandadi, Sam; Dimitrov, Stoyan; Feng, Steve; Yu, Frank; Sikora, Uzair; Yaglidere, Oguzhan; Padmanabhan, Swati; Nielsen, Karin; Ozcan, Aydogan

    2012-01-01

    In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used in conducting reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that in the case of binary medical diagnostics decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria infected red blood cells with an accuracy that is within 1.25% of the diagnostics decisions made by a trained medical professional.
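A simple way to see why pooling many non-expert gamers can approach expert accuracy on a binary decision is majority voting over independent raters; the rater accuracies below are illustrative, not the study's measured gamer accuracies (the paper combines games with learning back-ends rather than plain voting).

```python
from math import comb

def majority_accuracy(n, p):
    """Probability that a strict majority of n independent raters,
    each correct with probability p, yields the correct binary label
    (n is odd, so ties cannot occur)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))
```

For p > 0.5 the pooled accuracy rises toward 1 as n grows, which is the statistical leverage behind crowd-sourced diagnosis.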

  2. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for the analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software code and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license at https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  3. Entropy Generation Analysis of Natural Convection in Square Enclosures with Two Isoflux Heat Sources

    Directory of Open Access Journals (Sweden)

    S. Z. Nejad

    2017-04-01

    Full Text Available This study investigates the entropy generation resulting from natural convective heat transfer in square enclosures with local heating of the bottom wall and symmetrical cooling of the sidewalls. The analysis aims to optimize the heat transfer of two semiconductor components in a square electronic package. In this simulation, the heaters are modeled as isoflux heat sources and the sidewalls of the enclosure as isothermal heat sinks. The top wall and the non-heated portions of the bottom wall are adiabatic. Flow and temperature fields are obtained by numerical solution of the conservation equations of mass, momentum and energy for laminar, steady, two-dimensional flow. With constant heat input into the cavity, the effects of Rayleigh number, heater length, heater strength ratio and heater position on the flow and temperature fields and on local entropy generation are evaluated. The results show that the minimum entropy generation rate is obtained under the same conditions as the minimum peak heater temperature.

  4. Review of single particle dynamics for third generation light sources through frequency map analysis

    Directory of Open Access Journals (Sweden)

    L. Nadolski

    2003-11-01

    Full Text Available Frequency map analysis [J. Laskar, Icarus 88, 266 (1990)] is used here to analyze the transverse dynamics of four third generation synchrotron light sources: the ALS, the ESRF, the SOLEIL project, and Super-ACO. Time variations of the betatron tunes give additional information on the global dynamics of the beam. The main resonances are revealed, and a one-to-one correspondence between configuration space and frequency space can be established. We stress that the frequency maps, and therefore the dynamics optimization, are highly sensitive to the sextupolar strengths and vary considerably from one machine to another. The frequency maps can thus be used to characterize the different machines.
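The basic tune measurement underlying a frequency map can be sketched as an FFT peak search on turn-by-turn position data (Laskar's NAFF algorithm refines the estimate well beyond the 1/N FFT resolution); the signal below is a synthetic single-tune example, not tracking data from any of the machines above.

```python
import numpy as np

def fft_tune(turn_by_turn):
    """Fractional betatron tune as the location of the FFT peak
    of mean-subtracted turn-by-turn position data."""
    x = np.asarray(turn_by_turn, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    return int(np.argmax(spectrum)) / len(x)

turns = np.arange(1024)
nu = 0.31                              # hypothetical fractional tune
data = np.cos(2 * np.pi * nu * turns)  # idealized noiseless signal
```

Repeating such a tune estimate over a grid of initial amplitudes, and over early versus late turns to detect tune drift, is what builds up a frequency map.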

  5. Distributed medical image analysis and diagnosis through crowd-sourced games: a malaria case study.

    Directory of Open Access Journals (Sweden)

    Sam Mavandadi

    Full Text Available In this work we investigate whether the innate visual recognition and learning capabilities of untrained humans can be used in conducting reliable microscopic analysis of biomedical samples toward diagnosis. For this purpose, we designed entertaining digital games that are interfaced with artificial learning and processing back-ends to demonstrate that in the case of binary medical diagnostics decisions (e.g., infected vs. uninfected), with the use of crowd-sourced games it is possible to approach the accuracy of medical experts in making such diagnoses. Specifically, using non-expert gamers we report diagnosis of malaria-infected red blood cells with an accuracy that is within 1.25% of the diagnostics decisions made by a trained medical professional.

  6. Numerical analysis of the beam position monitor pickup for the Iranian light source facility

    Energy Technology Data Exchange (ETDEWEB)

    Shafiee, M., E-mail: mehdish@ipm.ir [Radiation Applications Department, Shahid Beheshti University, G. C., Tehran (Iran, Islamic Republic of); Feghhi, S.A.H. [Radiation Applications Department, Shahid Beheshti University, G. C., Tehran (Iran, Islamic Republic of); Rahighi, J. [Iranian Light Source Facility (ILSF), Institute for Research in Fundamental Sciences (IPM), Tehran (Iran, Islamic Republic of)

    2017-03-01

    In this paper, we describe the design of a button-type Beam Position Monitor (BPM) for the low-emittance storage ring of the Iranian Light Source Facility (ILSF). First, we calculate the sensitivities, induced power and intrinsic resolution by solving the Laplace equation numerically with the finite element method (FEM), in order to find the potential at each point of the BPM's electrode surface. After optimization of the designed BPM, trapped higher-order modes (HOM), wakefield and thermal loss effects are calculated. Finally, the fabricated BPM is tested experimentally using a test stand. The results show that the designed BPM has a linear response over an area of 2×4 mm² inside the beam pipe, with sensitivities of 0.080 and 0.087 mm⁻¹ in the horizontal and vertical directions, respectively. The experimental results are in good agreement with the numerical analysis.
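The linear-response region and the quoted sensitivities correspond to the standard difference-over-sum position estimate for a four-button BPM; in the sketch below the button labeling convention is an assumption, and only the 0.080 and 0.087 mm⁻¹ sensitivities are taken from the text.

```python
def button_position(va, vb, vc, vd, sx=0.080, sy=0.087):
    """Difference-over-sum beam position (mm) from four button signals.

    sx, sy are the horizontal/vertical sensitivities (mm^-1) quoted
    in the text; buttons A/D are taken as the +x pair and A/B as the
    +y pair (an assumed layout)."""
    total = va + vb + vc + vd
    x = ((va + vd) - (vb + vc)) / total / sx
    y = ((va + vb) - (vc + vd)) / total / sy
    return x, y
```

Within the linear region, a 1% signal imbalance across the horizontal pair corresponds to roughly 0.01 / 0.080 ≈ 0.125 mm of beam displacement.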

  7. Utilization of the intense pulsed neutron source (IPNS) at Argonne National Laboratory for neutron activation analysis

    International Nuclear Information System (INIS)

    Heinrich, R.R.; Greenwood, L.R.; Popek, R.J.; Schulke, A.W. Jr.

    1983-01-01

    The Intense Pulsed Neutron Source (IPNS) neutron scattering facility (NSF) has been investigated for its applicability to neutron activation analysis. A polyethylene insert has been added to the vertical hole VT3, which enhances the thermal neutron flux by a factor of two. The neutron spectral distribution at this position has been measured by the multiple-foil technique, using 28 activation reactions and the STAYSL computer code. The validity of this spectral measurement was tested by two irradiations of National Bureau of Standards SRM-1571 (orchard leaves), SRM-1575 (pine needles), and SRM-1645 (river sediment). The average thermal neutron flux for these irradiations, normalized to a 10 μA proton beam, is 4.0 x 10{sup 11} n/cm{sup 2}-s. Concentrations of nine trace elements in each of these SRMs have been determined by gamma-ray spectrometry. Agreement of measured values with certified values is demonstrated to be within experimental error
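
The basic activation arithmetic behind such measurements can be sketched as follows. The flux is the value quoted above, but the target nuclide, cross-section, sample mass and irradiation time are illustrative assumptions, not values from the paper.

```python
import math

# Induced activity A = N * sigma * phi * (1 - exp(-lambda * t_irr))
# for a thermal-capture reaction, using the quoted flux of 4.0e11 n/cm^2-s.
PHI = 4.0e11                        # thermal neutron flux, n/cm^2/s (abstract)
SIGMA = 37.18e-24                   # 59Co(n,gamma) cross-section, cm^2 (~37 b)
HALF_LIFE = 5.27 * 365.25 * 86400   # 60Co half-life, s

def induced_activity(n_atoms, t_irr):
    """Activity (Bq) induced in n_atoms target atoms after t_irr seconds."""
    lam = math.log(2) / HALF_LIFE
    return n_atoms * SIGMA * PHI * (1.0 - math.exp(-lam * t_irr))

# e.g. 1 mg of cobalt irradiated for 8 hours
n_atoms = 1e-3 / 58.93 * 6.022e23
activity = induced_activity(n_atoms, 8 * 3600)
```

In practice the measured gamma count rate is worked backwards through this relation (plus decay and counting corrections) to obtain the trace element concentration.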

  8. A systematic analysis of the Braitenberg vehicle 2b for point-like stimulus sources

    International Nuclear Information System (INIS)

    Rañó, Iñaki

    2012-01-01

    Braitenberg vehicles have been used experimentally in robotics for decades, with limited formal understanding. This paper presents the first mathematical model of vehicle 2b, which displays so-called aggression behaviour, and analyses the possible trajectories for point-like smooth stimulus sources. This sensory-motor steering control mechanism is used to implement biologically grounded target-approach, target-seeking or obstacle-avoidance behaviour. However, the analysis of the resulting model reveals that complex and unexpected trajectories can result even for point-like stimuli. We also prove how the implementation of the controller and the vehicle morphology interact to affect the behaviour of the vehicle. This work provides a better understanding of Braitenberg vehicle 2b, explains experimental results and paves the way for a formally grounded application in robotics as well as for a new way of understanding target seeking in biology. (paper)
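
A minimal kinematic sketch of vehicle 2b (crossed excitatory connections) conveys the behaviour the paper analyses. The geometry, gains and stimulus fall-off below are illustrative choices, not the paper's model.

```python
import math

def stimulus(px, py):
    """Point-source intensity at (px, py), falling off with distance."""
    return 1.0 / (1.0 + px * px + py * py)

def simulate(x, y, theta, steps, dt=0.005,
             gain=2.0, base=0.05, width=0.2, wheelbase=0.1):
    """Differential-drive vehicle 2b: each sensor excites the
    contralateral wheel (crossed connections -> 'aggression')."""
    for _ in range(steps):
        # left/right sensors offset perpendicular to the heading
        lx = x + width * math.cos(theta + math.pi / 2)
        ly = y + width * math.sin(theta + math.pi / 2)
        rx = x + width * math.cos(theta - math.pi / 2)
        ry = y + width * math.sin(theta - math.pi / 2)
        v_right = base + gain * stimulus(lx, ly)   # driven by LEFT sensor
        v_left = base + gain * stimulus(rx, ry)    # driven by RIGHT sensor
        v = 0.5 * (v_left + v_right)
        omega = (v_right - v_left) / wheelbase
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
    return x, y

# a vehicle starting at (-2.0, 0.5) turns and accelerates toward the
# source at the origin
final = simulate(-2.0, 0.5, 0.0, steps=500)
```

Even this toy model illustrates the point the paper formalises: the trajectory depends jointly on sensor placement (vehicle morphology) and controller gains, and need not be a straight approach to the source.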

  9. Variability search in M 31 using principal component analysis and the Hubble Source Catalogue

    Science.gov (United States)

    Moretti, M. I.; Hatzidimitriou, D.; Karampelas, A.; Sokolovsky, K. V.; Bonanos, A. Z.; Gavras, P.; Yang, M.

    2018-06-01

    Principal component analysis (PCA) is used extensively in astronomy but has not yet been exhaustively exploited for variability searches. The aim of this work is to investigate the effectiveness of PCA as a method to search for variable stars in large photometric data sets. We apply PCA to variability indices computed for light curves of 18 152 stars in three fields in M 31 extracted from the Hubble Source Catalogue. The projection of the data onto the principal components is used as a stellar variability detection and classification tool, capable of distinguishing between RR Lyrae stars, long-period variables (LPVs) and non-variables. This projection recovered more than 90 per cent of the known variables and revealed 38 previously unknown variable stars (about 30 per cent more), all LPVs except for one object of uncertain variability type. We conclude that this methodology can indeed successfully identify candidate variable stars.
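
The idea of projecting variability indices onto principal components and flagging outliers can be sketched on synthetic data. The indices, the injected "variables" and the 95th-percentile cut below are all illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

# PCA-based variability search on synthetic per-star "variability indices"
# (stand-ins for quantities like a Stetson index or reduced chi-square).
rng = np.random.default_rng(0)
n_stars = 1000
indices = rng.normal(0.0, 1.0, size=(n_stars, 4))   # non-variable scatter
indices[:50] += 5.0                                 # 50 injected "variables"

# standardise each index, then obtain principal-component scores via SVD
z = (indices - indices.mean(axis=0)) / indices.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
pc = z @ vt.T

# stars with extreme first-component scores are variability candidates
threshold = np.percentile(np.abs(pc[:, 0]), 95)
candidates = np.flatnonzero(np.abs(pc[:, 0]) > threshold)
```

Because genuine variables raise several correlated indices at once, the first component tends to align with the "variability" direction, so a cut on its score recovers most of the injected variables.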

  10. Analysis of Nonlinear Dispersion of a Pollutant Ejected by an External Source into a Channel Flow

    Directory of Open Access Journals (Sweden)

    T. Chinyoka

    2010-01-01

    This paper focuses on the transient analysis of nonlinear dispersion of a pollutant ejected by an external source into a laminar flow of an incompressible fluid in a channel. The influence of density variation with pollutant concentration is approximated according to the Boussinesq approximation, and the nonlinear governing equations for momentum and pollutant concentration are obtained. The problem is solved numerically using a semi-implicit finite difference method. Solutions are presented in graphical form in terms of fluid velocity, pollutant concentration, skin friction, and wall mass transfer rate for various parameter values. The model can be a useful tool for understanding the pollution arising from an improper discharge incident and for evaluating the effects of decontamination measures on the water body.
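
A one-dimensional analogue conveys the semi-implicit idea: the stiff diffusion term is treated implicitly (unconditionally stable) while the source term is advanced explicitly. The grid, diffusivity and Gaussian source profile are illustrative; the paper's coupled 2D momentum-concentration system is not reproduced here.

```python
import numpy as np

# Semi-implicit step for dc/dt = D d2c/dx2 + S(x): diffusion implicit,
# source explicit, with c = 0 at the channel walls.
n, dt, dx, d_coef = 101, 0.01, 0.01, 1e-3
x = np.linspace(0.0, 1.0, n)
c = np.zeros(n)                                # pollutant concentration
source = np.exp(-((x - 0.5) / 0.05) ** 2)      # localised external injection

r = d_coef * dt / dx ** 2
a = (np.diag((1 + 2 * r) * np.ones(n))
     + np.diag(-r * np.ones(n - 1), 1)
     + np.diag(-r * np.ones(n - 1), -1))
a[0, :], a[-1, :] = 0.0, 0.0                   # Dirichlet rows for the walls
a[0, 0] = a[-1, -1] = 1.0

for _ in range(200):
    rhs = c + dt * source                      # explicit source term
    rhs[0] = rhs[-1] = 0.0
    c = np.linalg.solve(a, rhs)                # implicit diffusion solve
```

The implicit treatment lets the time step be chosen for accuracy rather than the severe stability limit an explicit diffusion scheme would impose.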

  11. Analysis of particle sources by interferometry in a three-body final state

    International Nuclear Information System (INIS)

    Humbert, P.

    1984-01-01

    This work presents the development of an original interferometric method aimed at accessing the intrinsic parameters (lifetime or natural width) of intermediate resonances created during nuclear collisions. The technique is based on the overlap of two events in the same detection, and shows some analogies with interferometric measurements based on the Hanbury Brown-Twiss effect. It applies to reactions leading to a three-particle final state in which at least two particles are identical. The reactions considered are 11B(α,7Li)αα, 12C(16O,α)12C12C and 11B(p,α)αα, in which the intermediate source is respectively a level of 11B*, 16O* or 8Be*. The results are in qualitative agreement with such an analysis

  12. Force analysis of the advanced neutron source control rod drive latch mechanism

    International Nuclear Information System (INIS)

    Damiano, B.

    1989-01-01

    The Advanced Neutron Source reactor (ANS), a proposed Department of Energy research reactor currently undergoing conceptual design at the Oak Ridge National Laboratory (ORNL), will generate a thermal neutron flux of approximately 10{sup 20} m{sup -2} s{sup -1}. The compact core necessary to produce this flux provides little space for the shim safety control rods, which are located in the central annulus of the core. Without proper control rod drive design, the control rod drive magnets (which hold the control rod latch in a ready-to-scram position) may be unable to support the required load due to their restricted size. This paper describes the force analysis performed on the control rod latch mechanism to determine the fraction of control rod weight transferred to the drive magnet. This information will be useful during latch, control rod drive and magnet design. 5 refs., 12 figs

  13. Physics of the 252Cf-source-driven noise analysis measurement

    International Nuclear Information System (INIS)

    Valentine, T.E.; Mihalczo, J.T.; Perez, R.B.; Mattingly, J.K.

    1997-01-01

    The 252Cf-source-driven noise analysis method is a versatile measurement tool that has been applied to initial loading of reactors, quality assurance of reactor fuel elements, fuel processing facilities, fuel reprocessing facilities, fuel storage facilities, zero-power testing of reactors, verification of calculational methods, process monitoring, characterization of storage vaults, and nuclear weapons identification. This method's broad range of application is due to the wide variety of time- and frequency-domain signatures, each with unique properties, obtained from the measurement. The following parameters are obtained: average detector count rates, detector multiplicities, detector autocorrelations, cross-correlations between detectors, detector autopower spectral densities, cross-power spectral densities between detectors, coherences, and ratios of spectral densities. All of these measured parameters can also be calculated using the MCNP-DSP Monte Carlo code. This paper presents a review of the time-domain signatures obtained from this measurement.
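
One of the frequency-domain signatures listed above, the cross-power spectral density between two detector channels, can be illustrated on synthetic data by averaging FFT cross-products over segments. The sampling rate, tone frequency and noise levels are invented; real 252Cf measurements involve detector physics not modelled here.

```python
import numpy as np

# Cross-power spectral density between two channels driven by a common
# (simulated) source, estimated by segment-averaged FFT cross-products.
rng = np.random.default_rng(1)
fs, n_seg, seg_len = 1000.0, 64, 256
t = np.arange(seg_len) / fs
common_freq = 50.0                       # shared "source" tone, Hz

cpsd = np.zeros(seg_len // 2 + 1, dtype=complex)
for _ in range(n_seg):
    phase = rng.uniform(0.0, 2.0 * np.pi)
    src = np.sin(2.0 * np.pi * common_freq * t + phase)
    det1 = src + rng.normal(0.0, 1.0, seg_len)   # detector 1: source + noise
    det2 = src + rng.normal(0.0, 1.0, seg_len)   # detector 2: own noise
    cpsd += np.fft.rfft(det1) * np.conj(np.fft.rfft(det2))
cpsd /= n_seg

freqs = np.fft.rfftfreq(seg_len, 1.0 / fs)
peak_freq = freqs[int(np.argmax(np.abs(cpsd)))]
```

Uncorrelated detector noise averages toward zero in the cross-products, so the shared source component stands out, which is why cross-channel spectral signatures are so useful in these measurements.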

  14. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    Science.gov (United States)

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  15. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    Science.gov (United States)

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Raw image file (.rif) data from an imaging flow cytometer are compensated and corrected (yielding the proprietary .cif file format) and imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches in "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery.

  16. Source apportionment of elevated wintertime PAHs by compound-specific radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    R. J. Sheesley

    2009-05-01

    Natural-abundance radiocarbon analysis facilitates distinct source apportionment between contemporary biomass/biofuel (14C "alive") and fossil fuel (14C "dead") combustion. Here, the first compound-specific radiocarbon analysis (CSRA) of atmospheric polycyclic aromatic hydrocarbons (PAHs) was demonstrated for a set of samples collected in Lycksele, Sweden, a small town with frequent episodes of severe atmospheric pollution in winter. Renewed interest in residential wood combustion (RWC) means that this type of seasonal pollution is of increasing concern in many areas. Five individual/paired PAH isolates from three pooled fortnight-long filter collections were analyzed by CSRA: phenanthrene, fluoranthene, pyrene, benzo[b+k]fluoranthene, and indeno[cd]pyrene plus benzo[ghi]perylene; phenanthrene was the only compound also analyzed in the gas phase. The measured Δ14C for PAHs spanned from −138.3‰ to 58.0‰. A simple isotopic mass balance model was applied to estimate the fraction biomass (fbiomass) contribution, which was constrained to 71–87% for the individual PAHs. Indeno[cd]pyrene plus benzo[ghi]perylene had an fbiomass of 71%, while fluoranthene and phenanthrene (gas phase) had the highest biomass contribution at 87%. The fbiomass of the total organic carbon (TOC, defined as carbon remaining after removal of inorganic carbon) was estimated to be 77%, which falls within the range for the PAHs. These CSRA data on atmospheric PAHs establish that RWC is the dominant source of atmospheric PAHs in this region of the boreal zone, with some variation among RWC contributions to specific PAHs.
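
The isotopic mass balance referred to above reduces to a two-endmember mixing calculation. The endmember values below are assumptions for illustration (fossil carbon is 14C-free at −1000‰; +225‰ is a commonly adopted contemporary-wood endmember, not a value reported in this paper); only the measured Δ14C range is taken from the abstract.

```python
# Two-endmember 14C mass balance: fraction of carbon from biomass burning.
D14C_FOSSIL = -1000.0     # permil, 14C "dead" fossil endmember
D14C_BIOMASS = 225.0      # permil, assumed biomass-burning endmember

def fraction_biomass(d14c_sample):
    """fbiomass for a measured Delta14C value (permil)."""
    return (d14c_sample - D14C_FOSSIL) / (D14C_BIOMASS - D14C_FOSSIL)

# the abstract reports measured Delta14C from -138.3 to +58.0 permil
f_low = fraction_biomass(-138.3)    # least biomass-influenced PAH
f_high = fraction_biomass(58.0)     # most biomass-influenced PAH
```

With these assumed endmembers the measured range maps to fbiomass of roughly 0.70 to 0.86, consistent with the 71–87% constraint quoted in the abstract.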

  17. Validation of botanical origins and geographical sources of some Saudi honeys using ultraviolet spectroscopy and chemometric analysis.

    Science.gov (United States)

    Ansari, Mohammad Javed; Al-Ghamdi, Ahmad; Khan, Khalid Ali; Adgaba, Nuru; El-Ahmady, Sherweit H; Gad, Haidy A; Roshan, Abdulrahman; Meo, Sultan Ayoub; Kolyali, Sevgi

    2018-02-01

    This study aims to distinguish honeys based on their botanical and geographical sources. Different floral honey samples were collected from diverse geographical locations in Saudi Arabia. UV spectroscopy in combination with chemometric analysis, including Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and Soft Independent Modeling of Class Analogy (SIMCA), was used to classify the honey samples. HCA and PCA revealed initial clustering patterns differentiating both botanical and geographical sources. The SIMCA model clearly separated the Ziziphus sp. and other monofloral honey samples based on their locations and botanical sources. The results successfully discriminated honey samples of different botanical and geographical sources, validating the segregation observed using the few physicochemical parameters regularly used for discrimination.

  18. A tsunami wave propagation analysis for the Ulchin Nuclear Power Plant considering the tsunami sources of western part of Japan

    International Nuclear Information System (INIS)

    Rhee, Hyun Me; Kim, Min Kyu; Sheen, Dong Hoon; Choi, In Kil

    2013-01-01

    The accident caused by the Great East Japan earthquake and tsunami in 2011 occurred at the Fukushima Nuclear Power Plant (NPP) site, demonstrating that an NPP accident can be triggered by a tsunami. A Probabilistic Tsunami Hazard Analysis (PTHA) should therefore be required for NPP sites in Korea. The PTHA methodology builds on the Probabilistic Seismic Hazard Analysis (PSHA) method and is performed using various tsunami sources and their weights. In this study, fault sources in the northwestern part of Japan, suggested by the Atomic Energy Society of Japan (AESJ), were used as the tsunami sources. Performing a PTHA requires calculating the maximum and minimum wave elevations from tsunami simulations. Accordingly, tsunami wave propagation analyses were performed in this study as groundwork for a future PTHA

  19. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2011-12-01

    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising the immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" that addresses these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake occurrence. More precisely, their use of the EMSC earthquake information website (www.emsc-csem.org) is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence of eyewitnesses on the website, who rush to the Internet to investigate the cause of the shaking they just felt, causing traffic to surge. The area where an earthquake was felt is mapped simply by locating the Internet Protocol (IP) addresses active during these traffic surges. In addition, the presence of eyewitnesses browsing the website within minutes of an earthquake occurrence excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparisons with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash crowd" and "crowdsourcing", intended to reflect the rapidity of the data collation from the public. For computer scientists, a flash crowd names a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a

  20. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in source term estimations by a large computer code, such as MELCOR or MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure, the response surface method (RSM) with input determined from a statistical design together with the Latin hypercube sampling (LHS) technique, to the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as the principal tool for an overall uncertainty analysis in source term quantification, while the LHS is used to calculate standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC), to determine the subset of the most important input parameters in the final screening step, and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed here. In the present study a new measure has been developed that utilizes the metric distance between cumulative distribution functions (cdfs). The measure has been evaluated for three cases of distributions in order to assess its characteristics: in the first two cases the distribution is known analytically, while in the third it is unknown. The first case is given by symmetric analytical distributions. The second case consists of two asymmetric distributions with non-zero skewness
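
The LHS technique mentioned above can be sketched in a few lines: each of n samples occupies a distinct stratum of every input's range, which gives better space coverage than plain random sampling for the same budget. The dimensions and sample count below are illustrative.

```python
import numpy as np

# Minimal Latin hypercube sampler over the unit hypercube [0, 1)^n_params.
def latin_hypercube(n_samples, n_params, rng):
    u = rng.uniform(size=(n_samples, n_params))
    samples = np.empty_like(u)
    for j in range(n_params):
        # one point per stratum; stratum order shuffled per parameter
        perm = rng.permutation(n_samples)
        samples[:, j] = (perm + u[:, j]) / n_samples
    return samples

rng = np.random.default_rng(42)
lhs = latin_hypercube(100, 3, rng)   # e.g. 100 code runs over 3 inputs
```

Mapped through each input's inverse CDF, these uniform samples give the stratified input vectors for the code runs from which quantities like SRC and SRRC are then computed.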

  1. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    Science.gov (United States)

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations.
The latter approach is

  2. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    Background: The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods: For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results: The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion: Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a
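
The draw-adjust-repeat loop described in the Methods can be sketched as a toy Monte Carlo bias analysis. The lognormal bias-parameter distribution below is invented for illustration and is not the distribution assigned in the study; only the conventional estimate of 2.6 is taken from the abstract.

```python
import numpy as np

# Toy probabilistic bias analysis: draw a bias factor from an assigned
# distribution, adjust the conventional estimate, and summarise the
# frequency distribution of adjusted results.
rng = np.random.default_rng(7)
conventional_rr = 2.6          # conventional hazard ratio from the abstract
n_iter = 20000

# assigned (illustrative) distribution for the multiplicative bias factor
bias_factor = rng.lognormal(mean=np.log(1.7), sigma=0.4, size=n_iter)
adjusted = conventional_rr / bias_factor   # estimate adjusted for the bias

median = float(np.median(adjusted))
lo, hi = np.percentile(adjusted, [2.5, 97.5])
```

The resulting median and 2.5th/97.5th percentiles play the role of the point estimate and simulation interval described in the abstract; sampling random error as well would widen the interval further.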

  3. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and by limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process of conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
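
At the core of raster travel-time tools like these is a cost-distance computation, essentially Dijkstra's algorithm over a grid whose cell values are traversal times (derived, for example, from road class or land cover). The grid values below are invented; this is a generic sketch, not the toolkit's implementation.

```python
import heapq

def travel_time(cost, start):
    """Minimum cumulative traversal cost from `start` to every cell,
    moving between 4-connected neighbours; entering a cell pays its cost."""
    rows, cols = len(cost), len(cost[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    dist[start[0]][start[1]] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r][c]:
            continue                       # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist

# toy friction surface: low-cost "roads" around a high-cost obstacle
grid = [[1, 1, 5],
        [1, 9, 5],
        [1, 1, 1]]
times = travel_time(grid, (0, 0))
```

Interactive scenario modelling then amounts to editing the friction surface (new roads, seasonal barriers, different travel modes) and re-running this computation.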

  4. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Full Text Available Abstract Background There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass. From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies, focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient, and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles. Our illustrative dispersion model illustrated the complex interplay of spatial extent definitions, emission rates

  5. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    Science.gov (United States)

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
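
The independent component analysis step at EEGLAB's core can be illustrated with a minimal two-channel example in NumPy. EEGLAB itself ships its own MATLAB ICA routines (e.g. runica); the whitening-plus-rotation search below is a generic sketch with invented signals.

```python
import numpy as np

# Minimal two-channel ICA: whiten the mixed channels, then scan rotation
# angles for the most non-Gaussian projection.
rng = np.random.default_rng(3)
n = 5000
t = np.arange(n)
s1 = np.sign(np.sin(2 * np.pi * t / 60.0))   # sub-Gaussian square wave
s2 = rng.laplace(size=n)                     # super-Gaussian source

x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([s1, s2])  # mixtures

# whiten: zero mean, identity covariance
x = x - x.mean(axis=1, keepdims=True)
evals, evecs = np.linalg.eigh(np.cov(x))
white = np.diag(evals ** -0.5) @ evecs.T @ x

def kurt(v):
    """Excess kurtosis, a simple non-Gaussianity measure."""
    return np.mean(v ** 4) - 3.0

angles = np.linspace(0.0, np.pi, 400)
best = max(angles, key=lambda a: abs(kurt(np.cos(a) * white[0]
                                          + np.sin(a) * white[1])))
est1 = np.cos(best) * white[0] + np.sin(best) * white[1]   # ~ one source
est2 = -np.sin(best) * white[0] + np.cos(best) * white[1]  # ~ the other
```

After whitening, independent sources lie along orthogonal directions, so maximising non-Gaussianity over a single rotation angle recovers them up to sign and order; production ICA algorithms generalise this idea to many channels.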

  6. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the lack of automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Activation analysis of stainless steel flux monitors using 252Cf neutron sources

    International Nuclear Information System (INIS)

    Williams, J.G.; Newton, T.H. Jr.; Cogburn, C.O.

    1984-01-01

    Activation analysis was performed on stainless steel beads from a chain which is used in reactor pressure vessel surveillance experiments at the Arkansas Power and Light Company reactors. The beads allow monitoring of two fast and three thermal neutron induced reactions: ⁵⁸Ni(n,p)⁵⁸Co, ⁵⁴Fe(n,p)⁵⁴Mn, ⁵⁸Fe(n,γ)⁵⁹Fe, ⁵⁹Co(n,γ)⁶⁰Co and ⁵⁰Cr(n,γ)⁵¹Cr. The analysis was performed using 12 beads from various positions along 5 different batches of chain, together with standard materials, in an H₂O moderator tank using two intense californium sources which had a total neutron emission rate of 3.97 × 10¹⁰/s. Semiconductor gamma spectrometers were used to count the products of the above reactions in the specimens. The percentages by weight of iron, chromium and cobalt in the beads were found to be 62.1%, 20.2% and 0.120%, respectively. The excellent uniformity found in the bead compositions demonstrates the reproducibility of the experimental techniques and enhances considerably the value of the beads as neutron flux monitors.
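
    The comparator arithmetic behind this kind of activation analysis is compact enough to sketch. The count rates and masses below are illustrative stand-ins, not values from the study:

```python
# Comparator-method sketch: the weight fraction of an element in a bead follows
# from the ratio of its specific induced activity to that of a standard of known
# composition irradiated under the same flux.

def weight_fraction(counts_sample, mass_sample, counts_standard,
                    mass_standard, fraction_standard):
    """Specific-activity ratio times the standard's known weight fraction."""
    specific_sample = counts_sample / mass_sample
    specific_standard = counts_standard / mass_standard
    return fraction_standard * specific_sample / specific_standard

# A hypothetical bead giving 12 400 net 59Fe counts per gram against a
# pure-iron standard (fraction 1.0) giving 19 950 counts per gram:
f = weight_fraction(12400, 1.0, 19950, 1.0, 1.0)
print(round(f * 100, 1))  # → 62.2  (i.e. ~62% Fe by weight)
```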

  8. Pre-2014 mudslides at Oso revealed by InSAR and multi-source DEM analysis

    Science.gov (United States)

    Kim, J. W.; Lu, Z.; QU, F.

    2014-12-01

    The landslide is a process that results in the downward and outward movement of slope-reshaping materials, including rocks and soils, and annually causes the loss of approximately $3.5 billion and tens of casualties in the United States. The 2014 Oso mudslide was an extreme event, causing nearly 40 deaths and damaging civilian property. Landslides are often unpredictable, but in many cases catastrophic events are repetitive. Historical records at the Oso mudslide site indicate a series of events over recent decades, though the extent of the sliding varied from time to time. In our study, the combination of multi-source DEMs, InSAR, and time-series InSAR analysis has enabled us to characterize the Oso mudslide. InSAR results from ALOS PALSAR show that there was no significant deformation between mid-2006 and 2011. The combination of time-series InSAR analysis and an older DEM revealed topographic changes associated with the 2006 sliding event, which is confirmed by differencing multiple LiDAR DEMs. Precipitation and discharge measurements before the 2006 and 2014 landslide events did not exhibit extremely anomalous records, suggesting that precipitation is not the controlling factor in determining the sliding events at Oso. The lack of surface deformation during 2006-2011 and the weak correlation between precipitation and sliding suggest that other factors (such as porosity) might play a critical role in the run-away events at Oso and other similar landslides.
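
    The DEM-differencing step used to confirm the 2006 event can be sketched with two synthetic rasters; the grids and the 0.5 m noise floor below are illustrative, not survey values:

```python
import numpy as np

def dem_change(dem_new, dem_old, noise_floor=0.5):
    """Elevation-change map, masking differences below the vertical noise floor."""
    diff = dem_new - dem_old
    return np.where(np.abs(diff) >= noise_floor, diff, 0.0)

old = np.full((5, 5), 100.0)     # synthetic pre-event DEM (metres)
new = old.copy()
new[1:4, 1:4] -= 3.0             # a 3 m deficit where material slid away
change = dem_change(new, old)
print(change.sum())              # → -27.0 (nine cells, 3 m loss each)
```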

  9. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
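
    A minimal sketch of the proposed 'learning mode'/'detection mode' split, using the common gamma-Poisson conjugate model; the Jeffreys-style prior and the example counts are our assumptions for illustration, not the author's:

```python
import math

def predictive_tail(background_counts, new_count, alpha=0.5, beta=0.0):
    """P(X >= new_count) under the gamma-Poisson posterior predictive."""
    a = alpha + sum(background_counts)   # posterior shape
    b = beta + len(background_counts)    # posterior rate (per counting interval)
    p = b / (b + 1.0)                    # negative-binomial parameter
    # Negative-binomial pmf C(k+a-1, k) p^a (1-p)^k summed for k < new_count.
    cdf = 0.0
    for k in range(new_count):
        log_pmf = (math.lgamma(k + a) - math.lgamma(a) - math.lgamma(k + 1)
                   + a * math.log(p) + k * math.log(1.0 - p))
        cdf += math.exp(log_pmf)
    return 1.0 - cdf

background = [48, 52, 50, 47, 53, 49, 51, 50]   # 'learning mode' counts
print(predictive_tail(background, 50))           # unremarkable count: large tail prob
print(predictive_tail(background, 90))           # strong excess: tiny tail prob
```

    A detection threshold then becomes a cut on this tail probability, which automatically tightens as more background intervals are accumulated.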

  10. Steady-state thermal-hydraulic design analysis of the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Yoder, G.L. Jr.; Dixon, J.R.; Elkassabgi, Y.; Felde, D.K.; Giles, G.E.; Harrington, R.M.; Morris, D.G.; Nelson, W.R.; Ruggles, A.E.; Siman-Tov, M.; Stovall, T.K.

    1994-05-01

    The Advanced Neutron Source (ANS) is a research reactor that is planned for construction at Oak Ridge National Laboratory. This reactor will be a user facility with the major objective of providing the highest continuous neutron beam intensities of any reactor in the world. Additional objectives for the facility include providing materials irradiation facilities and isotope production facilities as good as, or better than, those in the High Flux Isotope Reactor. To achieve these objectives, the reactor design uses highly subcooled heavy water as both coolant and moderator. Two separate core halves of 67.6-L total volume operate at an average power density of 4.5 MW(t)/L, and the coolant flows upward through the core at 25 m/s. Operating pressure is 3.1 MPa at the core inlet with a 1.4-MPa pressure drop through the core region. Finally, in order to make the resources available for experimentation, the fuel is designed to provide a 17-d fuel cycle with an additional 4 d planned in each cycle for the refueling process. This report examines the codes and models used to develop the thermal-hydraulic design for ANS, as well as the correlations and physical data; evaluates thermal-hydraulic uncertainties; reports on thermal-hydraulic design and safety analysis; describes experimentation in support of the ANS reactor design and safety analysis; and provides an overview of the experimental plan
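
    A quick worked check of the quoted core figures:

```python
# 67.6 L total core volume at an average power density of 4.5 MW(t)/L
# implies roughly 304 MW(t) total core power.
core_volume_L = 67.6
power_density_MW_per_L = 4.5
total_power_MW = core_volume_L * power_density_MW_per_L
print(round(total_power_MW, 1))   # → 304.2

# Core pressure drop as a fraction of inlet pressure: 1.4 MPa of 3.1 MPa.
print(round(1.4 / 3.1, 2))        # → 0.45
```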

  11. Sequence-based analysis of the microbial composition of water kefir from multiple sources.

    Science.gov (United States)

    Marsh, Alan J; O'Sullivan, Orla; Hill, Colin; Ross, R Paul; Cotter, Paul D

    2013-11-01

    Water kefir is a water-sucrose-based beverage, fermented by a symbiosis of bacteria and yeast to produce a final product that is lightly carbonated, acidic and that has a low alcohol percentage. The microorganisms present in water kefir are introduced via water kefir grains, which consist of a polysaccharide matrix in which the microorganisms are embedded. We aimed to provide a comprehensive sequencing-based analysis of the bacterial population of water kefir beverages and grains, while providing an initial insight into the corresponding fungal population. To facilitate this objective, four water kefirs were sourced from the UK, Canada and the United States. Culture-independent, high-throughput, sequencing-based analyses revealed that the bacterial fraction of each water kefir and grain was dominated by Zymomonas, an ethanol-producing bacterium, which has not previously been detected at such a scale. The other genera detected were representatives of the lactic acid bacteria and acetic acid bacteria. Our analysis of the fungal component established that it was comprised of the genera Dekkera, Hanseniaspora, Saccharomyces, Zygosaccharomyces, Torulaspora and Lachancea. This information will assist in the ultimate identification of the microorganisms responsible for the potentially health-promoting attributes of these beverages. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  12. Manganese determination in minerals by activation analysis, using californium-252 as a neutron source

    International Nuclear Information System (INIS)

    Cardoso, Antonio

    1976-01-01

    Neutron Activation Analysis, using a californium-252 neutron source, has been applied to the determination of manganese in ores such as pyrolusite and rhodonite (manganese silicate), and in blends used in dry batteries. The favorable nuclear properties of manganese, such as the high thermal neutron cross-section for the reaction ⁵⁵Mn(n,γ)⁵⁶Mn, the high concentration of manganese in the matrix and the short half-life of ⁵⁶Mn, are an ideal combination for non-destructive analysis of manganese in ores. Samples and standards of manganese dioxide were irradiated for about 20 minutes, followed by a 4 to 15 minute decay, and counted using single-channel pulse-height discrimination with a NaI(Tl) scintillation detector. Counting time was 10 minutes. The interference of the nuclear reactions ⁵⁶Fe(n,p)⁵⁶Mn and ⁵⁹Co(n,α)⁵⁶Mn was studied, as well as problems connected with neutron shadowing during irradiation, gamma-ray attenuation during counting and the influence of sample granulometry. One sample was also analysed by a wet-chemical method (sodium bismuthate) in order to compare results. As a whole, it was shown that neutron activation analysis of manganese in ores and blends is simple, rapid, and has good precision and accuracy. (author)
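
    The irradiate-decay-count timing in the abstract maps directly onto the standard activation equation. A sketch for ⁵⁶Mn, taking the timing values from the abstract and the half-life (about 2.58 h) from standard nuclear data:

```python
import math

HALF_LIFE_S = 2.58 * 3600                 # 56Mn half-life in seconds
LAMBDA = math.log(2) / HALF_LIFE_S        # decay constant

def saturation_factor(t_irr):
    """1 - exp(-lambda*t): fraction of saturation activity reached."""
    return 1.0 - math.exp(-LAMBDA * t_irr)

def decay_factor(t_decay):
    """exp(-lambda*t): activity remaining after the decay interval."""
    return math.exp(-LAMBDA * t_decay)

# 20 min irradiation; 10 min decay chosen as a round number inside the
# 4-15 min window quoted in the abstract:
S = saturation_factor(20 * 60)
D = decay_factor(10 * 60)
print(round(S, 3))   # → 0.086 (short irradiation: far from saturation)
print(round(D, 3))   # → 0.956
```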

  13. Neutron fluctuation analysis in a subcritical multiplying system with a stochastically pulsed Poisson source

    International Nuclear Information System (INIS)

    Kostic, Lj.

    2003-01-01

    The influence of a stochastically pulsed Poisson source on the statistical properties of a subcritical multiplying system is analyzed in the paper. A strong dependence on the pulse period and pulse width of the source is shown (author)

  14. Analysis of the causes of several recent radioactive source accidents and suggestions for management countermeasures

    International Nuclear Information System (INIS)

    Su Yongjie; Feng Youcai; Song Chenxiu; Gao Huibin; Xing Jinsong; Pang Xinxin; Wang Xiaoqing; Wei Hong

    2007-01-01

    The article introduces several recent radioactive source accidents in China and analyses their causes. Some important issues in the implementation of the new regulation are summarized, and suggestions for managing radioactive sources are made. (authors)

  15. Detailed budget analysis of HONO in central London reveals a missing daytime source

    Directory of Open Access Journals (Sweden)

    J. D. Lee

    2016-03-01

    Measurements of HONO were carried out at an urban background site near central London as part of the Clean air for London (ClearfLo) project in summer 2012. Data were collected from 22 July to 18 August 2012, with peak values of up to 1.8 ppbV at night and non-zero daytime values of between 0.2 and 0.6 ppbV. A wide range of other gas-phase, aerosol, radiation, and meteorological measurements were made concurrently at the same site, allowing a detailed analysis of the chemistry to be carried out. The peak HONO/NOx ratio of 0.04 is seen at ~02:00 UTC, with a second, daytime, peak in HONO/NOx of similar magnitude to the night-time peak, suggesting a significant secondary daytime HONO source. A photostationary state calculation of HONO involving formation from the reaction of OH and NO and loss from photolysis, reaction with OH, and dry deposition shows a significant underestimation during the day, with calculated values close to 0, compared to the measurement average of 0.4 ppbV at midday. The addition of further HONO sources from the literature (dark conversion of NO2 on surfaces, direct emission, photolysis of ortho-substituted nitrophenols, the postulated formation from the reaction of HO2·H2O with NO2, photolysis of adsorbed HNO3 on ground and aerosols, and HONO produced by photosensitized conversion of NO2 on surfaces) increases the daytime modelled HONO to 0.1 ppbV, still leaving a significant missing daytime source. The missing HONO is plotted against a series of parameters including NO2 and OH reactivity (used as a proxy for organic material), with little correlation seen. Much better correlation is observed with the product of these species with j(NO2), in particular NO2 and the product of NO2 with OH reactivity. This suggests the missing HONO source is in some way related to NO2 and also requires sunlight. Increasing the photosensitized surface conversion rate of NO2 by a
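
    The photostationary-state calculation described above can be sketched numerically. All rate constants and concentrations below are typical literature magnitudes chosen for illustration, not the values used in the study:

```python
# HONO photostationary state: production from OH + NO balanced against
# photolysis, reaction with OH, and dry deposition.
k_oh_no = 7.4e-12      # cm3 molec-1 s-1, OH + NO -> HONO (illustrative)
k_oh_hono = 6.0e-12    # cm3 molec-1 s-1, OH + HONO loss (illustrative)
j_hono = 1.0e-3        # s-1, midday HONO photolysis frequency (illustrative)
dep = 3.0e-5           # s-1, deposition velocity / mixing height (illustrative)

oh = 2.0e6             # molec cm-3
no = 5.0e10            # molec cm-3 (roughly 2 ppbV at surface conditions)

hono_pss = (k_oh_no * oh * no) / (j_hono + k_oh_hono * oh + dep)
ppb = hono_pss / 2.46e10          # molec cm-3 per ppbV at 298 K, 1 atm
print(round(ppb, 3))              # → 0.029
```

    The result, a few hundredths of a ppbV, illustrates the abstract's point: the simple OH + NO balance falls far short of the ~0.4 ppbV observed at midday.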

  16. MorphoTester: An Open Source Application for Morphological Topographic Analysis.

    Directory of Open Access Journals (Sweden)

    Julia M Winchester

    The increased prevalence and affordability of 3D scanning technology is beginning to have significant effects on the research questions and approaches available for studies of morphology. As the current trend of larger and more precise 3D datasets is unlikely to slow in the future, there is a need for efficient and capable tools for high-throughput quantitative analysis of biological shape. The promise and the challenge of implementing relatively automated methods for characterizing surface shape can be seen in the example of dental topographic analysis. Dental topographic analysis comprises a suite of techniques for quantifying tooth surfaces and component features. Topographic techniques have provided insight on mammalian molar form-function relationships, and these methods could be applied to address other topics and questions. At the same time, implementing multiple complementary topographic methods can have high time and labor costs, and comparability of data formats and approaches is difficult to predict. To address these challenges I present MorphoTester, an open source application for visualizing and quantifying topography from 3D triangulated polygon meshes. This application is Python-based and is free to use. MorphoTester implements three commonly used dental topographic metrics: Dirichlet normal energy, relief index, and orientation patch count rotated (OPCR). Previous OPCR algorithms have used raster-based grid data, which is not directly interchangeable with vector-based triangulated polygon meshes. A 3D-OPCR algorithm is provided here for quantifying complexity from polygon meshes. The efficacy of this metric is tested in a sample of mandibular second molars belonging to four species of cercopithecoid primates. Results suggest that 3D-OPCR is at least as effective for quantifying complexity as previous approaches, and may be more effective due to the finer resolution of surface data considered here. MorphoTester represents an advancement
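
    The raster-based OPCR that the abstract contrasts with its mesh-based 3D-OPCR can be sketched on a height grid: bin each cell's aspect into 8 orientations, count contiguous same-orientation patches, and average the count over rotated bin boundaries. This is a simplified illustration of the metric, not MorphoTester's algorithm:

```python
import numpy as np

def patch_count(bins):
    """Count 4-connected patches of equal orientation bin (iterative flood fill)."""
    seen = np.zeros(bins.shape, dtype=bool)
    count = 0
    for i in range(bins.shape[0]):
        for j in range(bins.shape[1]):
            if seen[i, j]:
                continue
            count += 1
            stack = [(i, j)]
            seen[i, j] = True
            while stack:
                a, b = stack.pop()
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    x, y = a + da, b + db
                    if (0 <= x < bins.shape[0] and 0 <= y < bins.shape[1]
                            and not seen[x, y] and bins[x, y] == bins[a, b]):
                        seen[x, y] = True
                        stack.append((x, y))
    return count

def opcr(z, rotations=8, n_bins=8):
    """Orientation patch count, averaged over rotated bin boundaries."""
    dzy, dzx = np.gradient(z)
    aspect = np.arctan2(dzy, dzx)            # surface orientation per cell
    counts = []
    for r in range(rotations):
        offset = r * (2 * np.pi / n_bins) / rotations
        bins = ((aspect + offset) % (2 * np.pi)) // (2 * np.pi / n_bins)
        counts.append(patch_count(bins))
    return float(np.mean(counts))

# A smooth cone has exactly 8 orientation wedges, so OPCR should be near 8.
y, x = np.mgrid[-20:21, -20:21]
cone = -np.sqrt(x ** 2 + y ** 2)
print(opcr(cone))
```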

  17. An open source software for analysis of dynamic contrast enhanced magnetic resonance images: UMMPerfusion revisited.

    Science.gov (United States)

    Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald

    2016-01-14

    Perfusion imaging has become an important image based tool to derive the physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-) vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without a broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one compartment model (1CP), a two compartment exchange model (2CXM), a two compartment uptake model (2CUM), a two compartment filtration model (2FM) and the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during the perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it can also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated automatically during data analysis ensure a certain level of quality control.
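
    Of the listed models, the one-compartment model is the simplest to sketch: tissue concentration is the arterial input function convolved with a single-exponential impulse response. The toy AIF and parameter values below are illustrative, not from the paper:

```python
import numpy as np

def one_compartment(aif, dt, flow, volume):
    """Tissue curve C(t) = flow * conv(AIF, exp(-t*flow/volume)) * dt."""
    t = np.arange(len(aif)) * dt
    residue = np.exp(-t * flow / volume)       # unit-amplitude impulse response
    return flow * np.convolve(aif, residue)[:len(aif)] * dt

dt = 1.0                                       # s
t = np.arange(0, 120, dt)
aif = (t / 10.0) ** 2 * np.exp(-t / 10.0)      # crude gamma-variate stand-in AIF
ct = one_compartment(aif, dt, flow=0.02, volume=0.3)
print(ct.argmax() > aif.argmax())              # tissue curve peaks later → True
```

    Fitting `flow` and `volume` to measured curves is then a small nonlinear least-squares problem per voxel or region.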

  19. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    Lucia, Silvio Rogerio de; Maihara, Vera Akiko; Menezes, Mario O. de

    2009-01-01

    In this work, a new software package, SAANI (Instrumental Neutron Activation Analysis Software), was developed and used for gamma-ray spectrum analysis in the Neutron Activation Laboratory (LAN) of the Nuclear and Energetic Research Institute (IPEN-CNEN/SP). The software was developed to completely replace the old one, VISPECT. Besides the visual improvement in the user interface, the new software allows the standardization of several procedures which are currently done in different ways by each researcher, avoiding intermediate steps in the calculations. By using a modern programming language, Python, together with the Qt graphical library (by Trolltech), both multi-platform, the new software runs on Windows, Linux and other platforms. In addition, the new software has been designed to be extensible through plug-ins. In order to achieve the proposed initial scope, that is, to completely replace the old software, SAANI has undergone several different kinds of tests, using spectra from certified reference materials, standards, and spectra already analyzed by other software or used in international inter-comparisons. The results obtained by SAANI in all tests were considered very good. Some small discrepancies were found and, after careful investigation, their source was identified as an accuracy bug in the old software. Usability and robustness tests were conducted by installing SAANI on several laboratory computers and following its daily use. The results of these tests also indicated that SAANI was ready to be used by all researchers in the LAN-IPEN. (author)
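
    One of the procedures such a package standardizes, net peak-area extraction, can be sketched as gross counts minus a linear baseline estimated from wing channels (a Covell-style estimate; the synthetic spectrum below is illustrative, not SAANI's algorithm):

```python
import numpy as np

def net_area(spectrum, lo, hi, wings=3):
    """Net counts in channels [lo, hi], linear baseline from wing averages."""
    left = spectrum[lo - wings:lo].mean()
    right = spectrum[hi + 1:hi + 1 + wings].mean()
    n = hi - lo + 1
    baseline = (left + right) / 2.0 * n
    return spectrum[lo:hi + 1].sum() - baseline

# Synthetic Gaussian peak (true area 5000) on a flat background of 40 counts.
ch = np.arange(200)
peak = 5000.0 / (np.sqrt(2 * np.pi) * 3.0) * np.exp(-((ch - 100) ** 2) / 18.0)
spectrum = peak + 40.0
print(round(net_area(spectrum, 91, 109)))   # a few percent below 5000
```

    The small deficit comes from peak tails leaking into the wing channels, the kind of systematic a production code corrects for.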

  20. ML-Ask: Open Source Affect Analysis Software for Textual Input in Japanese

    Directory of Open Access Journals (Sweden)

    Michal Ptaszynski

    2017-06-01

    We present ML-Ask, the first Open Source Affect Analysis system for textual input in Japanese. ML-Ask analyses the contents of an input (e.g., a sentence) and annotates it with information regarding the contained general emotive expressions, specific emotional words, valence-activation dimensions of overall expressed affect, and particular emotion types expressed with their respective expressions. ML-Ask also incorporates the Contextual Valence Shifters model for handling negation in sentences to deal with grammatically expressible shifts in the conveyed valence. The system, designed to work mainly under Linux and MacOS, can be used for research on, or applying the techniques of, Affect Analysis within the framework of the Japanese language. It can also be used as an experimental baseline for specific research in Affect Analysis, and as a practical tool for written contents annotation. Funding statement: This research has been supported by: a Research Grant from the Nissan Science Foundation (years 2009–2010); the GCOE Program founded by Japan's Ministry of Education, Culture, Sports, Science and Technology (years 2009–2010); a JSPS KAKENHI Grant-in-Aid for JSPS Fellows (Project Number: 22-00358, years 2010–2012); a JSPS KAKENHI Grant-in-Aid for Scientific Research (Project Number: 24600001, years 2012–2015); a JSPS KAKENHI Grant-in-Aid for Research Activity Start-up (Project Number: 25880003, years 2013–2015); and a JSPS KAKENHI Grant-in-Aid for Encouragement of Young Scientists (B) (Project Number: 15K16044, years 2015–present, project estimated to end in March 2018).
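
    The lexicon-plus-valence-shifter idea can be sketched in a few lines. The English three-word lexicon and negator list below are invented for illustration; ML-Ask's actual databases are much larger and operate on Japanese text:

```python
# Minimal affect-annotation sketch: lexicon lookup of emotive words plus a
# Contextual Valence Shifters pass that flips valence under nearby negation.
LEXICON = {"happy": ("joy", "positive"),
           "sad": ("sadness", "negative"),
           "angry": ("anger", "negative")}
NEGATORS = {"not", "never", "no"}
FLIP = {"positive": "negative", "negative": "positive"}

def analyze(sentence):
    tokens = sentence.lower().rstrip(".!?").split()
    results = []
    for i, tok in enumerate(tokens):
        if tok in LEXICON:
            emotion, valence = LEXICON[tok]
            # Valence shifter: a negator in the two preceding tokens flips valence.
            if any(t in NEGATORS for t in tokens[max(0, i - 2):i]):
                valence = FLIP[valence]
            results.append((tok, emotion, valence))
    return results

print(analyze("I am happy"))        # → [('happy', 'joy', 'positive')]
print(analyze("I am not happy"))    # → [('happy', 'joy', 'negative')]
```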

  1. Evaluating laser-driven Bremsstrahlung radiation sources for imaging and analysis of nuclear waste packages

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Christopher P., E-mail: cj0810@bristol.ac.uk [Interface Analysis Centre, HH Wills Physics Laboratory, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Brenner, Ceri M. [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Stitt, Camilla A. [Interface Analysis Centre, HH Wills Physics Laboratory, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Armstrong, Chris; Rusby, Dean R. [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Department of Physics, SUPA, University of Strathclyde, Glasgow G4 0NG (United Kingdom); Mirfayzi, Seyed R. [Centre for Plasma Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Wilson, Lucy A. [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Alejo, Aarón; Ahmed, Hamad [Centre for Plasma Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Allott, Ric [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Butler, Nicholas M.H. [Department of Physics, SUPA, University of Strathclyde, Glasgow G4 0NG (United Kingdom); Clarke, Robert J.; Haddock, David; Hernandez-Gomez, Cristina [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Higginson, Adam [Department of Physics, SUPA, University of Strathclyde, Glasgow G4 0NG (United Kingdom); Murphy, Christopher [Department of Physics, University of York, York YO10 5DD (United Kingdom); Notley, Margaret [Central Laser Facility, STFC, Rutherford Appleton Laboratory, Didcot, Oxon OX11 0QX (United Kingdom); Paraskevoulakos, Charilaos [Interface Analysis Centre, HH Wills Physics Laboratory, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Jowsey, John [Ground Floor North B582, Sellafield Ltd, Seascale, Cumbria CA20 1PG (United Kingdom); and others

    2016-11-15

    Highlights: • X-ray generation was achieved via laser interaction with a tantalum thin foil target. • Picosecond X-ray pulse from a sub-mm spot generated high resolution images. • MeV X-ray emission is possible, permitting analysis of full scale waste containers. • In parallel neutron emission of 10⁷–10⁹ neutrons per steradian per pulse was attained. • Development of a 10 Hz diode pumped laser system for waste monitoring is envisioned. - Abstract: A small scale sample nuclear waste package, consisting of a 28 mm diameter uranium penny encased in grout, was imaged by absorption contrast radiography using a single pulse exposure from an X-ray source driven by a high-power laser. The Vulcan laser was used to deliver a focused pulse of photons to a tantalum foil, in order to generate a bright burst of highly penetrating X-rays (with energy >500 keV), with a source size of <0.5 mm. BAS-TR and BAS-SR image plates were used for image capture, alongside a newly developed Thallium-doped Caesium Iodide scintillator-based detector coupled to CCD chips. The uranium penny was clearly resolved to sub-mm accuracy over a 30 cm² scan area from a single shot acquisition. In addition, neutron generation was demonstrated in situ with the X-ray beam, with a single shot, thus demonstrating the potential for multi-modal criticality testing of waste materials. This feasibility study successfully demonstrated non-destructive radiography of encapsulated, high density, nuclear material. With recent developments of high-power laser systems, to 10 Hz operation, a laser-driven multi-modal beamline for waste monitoring applications is envisioned.

  2. Performance analysis of a full-field and full-range swept-source OCT system

    Science.gov (United States)

    Krauter, J.; Boettcher, T.; Körner, K.; Gronle, M.; Osten, W.; Passilly, N.; Froehly, L.; Perrin, S.; Gorecki, C.

    2015-09-01

    In recent years, optical coherence tomography (OCT) has gained importance in medical disciplines like ophthalmology, due to its noninvasive optical imaging with micrometer resolution and short measurement time. It enables, for example, the measurement and visualization of the depth structure of the retina. In other medical disciplines, like dermatology, histopathological analysis is still the gold standard for skin cancer diagnosis. The EU-funded project VIAMOS (Vertically Integrated Array-type Mirau-based OCT System) proposes a new type of OCT system combined with micro-technologies to provide a hand-held, low-cost and miniaturized OCT system. The concept is a combination of full-field and full-range swept-source OCT (SS-OCT) detection in a multi-channel sensor based on a micro-optical Mirau-interferometer array, which is fabricated by means of wafer fabrication. This paper presents the study of an experimental proof-of-concept OCT system as a one-channel sensor with bulk optics. This sensor is of the Linnik-interferometer type with optical parameters similar to those of the Mirau-interferometer array. A commercial wavelength-tunable light source with a center wavelength of 845 nm and 50 nm spectral bandwidth is used with a camera for parallel OCT A-scan detection. In addition, the reference microscope objective lens of the Linnik interferometer is mounted on a piezo-actuated phase-shifter. Phase-shifting interferometry (PSI) techniques are applied to resolve the complex conjugate artifact and consequently contribute to an increase of image quality and depth range. A suppression ratio of the complex conjugate term of 36 dB is shown and a system sensitivity greater than 96 dB could be measured.
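
    The phase-shifting step that suppresses the complex conjugate term can be sketched with a scalar toy signal: four interferograms at reference shifts of 0, π/2, π and 3π/2 combine into a complex value whose phase sign is unambiguous (a textbook 4-step PSI combination, evaluated here for a single point rather than per pixel):

```python
import numpy as np

phi = 1.2                                  # unknown sample phase (radians)
a, b = 3.0, 1.0                            # DC level and fringe amplitude
I = [a + b * np.cos(phi + d) for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]

# I0 - I2 = 2b*cos(phi), I3 - I1 = 2b*sin(phi): DC cancels, sign survives.
complex_signal = (I[0] - I[2]) + 1j * (I[3] - I[1])
print(round(float(np.angle(complex_signal)), 6))   # → 1.2
```

    Because the reconstructed signal is complex rather than real, its Fourier transform is no longer conjugate-symmetric, which is what frees the full (positive and negative) depth range.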

  3. ARCHITECTURE AND FUNCTIONALITY OF INTEGRATED INFORMATION SYSTEM FOR ANALYSIS OF POTENTIAL OF RENEWABLE ENERGY SOURCES

    Directory of Open Access Journals (Sweden)

    B. A. Tonkonogov

    2017-01-01

    The aim of this work was the development of an original architecture for an integrated information system for analysis of the potential of renewable energy sources (RES). The required functionality of the system led to the solution of a number of problems in the development of the corresponding software modules, which implement methods, models and algorithms for assessing the energy potential and economic efficiency of the use of RES. This required solving the following problems: adapting existing methods, and developing new ones, for analyzing the potential of RES at various territorial levels using modern geographic information system and computer technologies; developing models for the assessment and calculation of the potential of renewable energy resources; adapting techniques for assessing the economic effectiveness of decisions made on the use of RES; designing the architecture of the information system and choosing the technologies and means for its implementation; and developing the algorithms of the software modules and their interaction as parts of the information system. Distinctive features of the architecture are its flexibility and openness to the expansion and implementation of additional functionality, in particular special algorithms and software modules for interacting with the database, and a graphical Web-based user interface that provides the ability to work with cartographic information. The development and implementation of this system is a timely scientific and practical task, the solution of which will create conditions for increased use of RES in RB and improve the country's energy security. The results of the conducted research and completed developments can be used in the system of the Ministry of Natural Resources and Environmental Protection of RB, in particular for maintaining the state cadastre of RES and making

  5. Solving the forward problem in EEG source analysis by spherical and FDM head modeling: a comparative analysis - Biomed 2009

    NARCIS (Netherlands)

    Vatta, F.; Meneghini, F.; Esposito, F.; Mininel, S.; Di Salle, F.

    2009-01-01

    Neural source localization techniques based on electroencephalography (EEG) use scalp potential data to infer the location of underlying neural activity. This procedure entails modeling the sources of EEG activity and modeling the head volume conduction process to link the modeled sources to the

  6. BSDWormer; an Open Source Implementation of a Poisson Wavelet Multiscale Analysis for Potential Fields

    Science.gov (United States)

    Horowitz, F. G.; Gaede, O.

    2014-12-01

    Wavelet multiscale edge analysis of potential fields (a.k.a. "worms") has been known since Moreau et al. (1997) and was independently derived by Hornby et al. (1999). The technique is useful for producing a scale-explicit overview of the structures beneath a gravity or magnetic survey, including establishing the location and estimating the attitude of surface features, as well as incorporating information about the geometric class (point, line, surface, volume, fractal) of the underlying sources — in a fashion much like traditional structural indices from Euler solutions albeit with better areal coverage. Hornby et al. (2002) show that worms form the locally highest concentration of horizontal edges of a given strike — which in conjunction with the results from Mallat and Zhong (1992) induces a (non-unique!) inversion where the worms are physically interpretable as lateral boundaries in a source distribution that produces a close approximation of the observed potential field. The technique has enjoyed widespread adoption and success in the Australian mineral exploration community — including "ground truth" via successfully drilling structures indicated by the worms. Unfortunately, to our knowledge, all implementations of the code to calculate the worms/multiscale edges (including Horowitz' original research code) are either part of commercial software packages, or have copyright restrictions that impede the use of the technique by the wider community. The technique is completely described mathematically in Hornby et al. (1999) along with some later publications. This enables us to re-implement from scratch the code required to calculate and visualize the worms. We are freely releasing the results under an (open source) BSD two-clause software license. A git repository is available at . We will give an overview of the technique, show code snippets using the codebase, and present visualization results for example datasets (including the Surat basin of Australia
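The record above describes the worm technique operationally: smooth the field by upward continuation to a set of heights and track the local maxima of the horizontal gradient ("multiscale edges"). A minimal one-dimensional Python sketch of that idea follows; this is not BSDWormer's actual code, and the function names and the discretized Poisson kernel are our own illustration.

```python
import math

def upward_continue(profile, dx, h):
    # Upward continuation of a 1-D potential-field profile: convolve with
    # the Poisson kernel h / (pi * (x^2 + h^2)), which integrates to 1.
    n = len(profile)
    out = []
    for i in range(n):
        acc = 0.0
        for j in range(n):
            x = (i - j) * dx
            acc += profile[j] * (h / (math.pi * (x * x + h * h))) * dx
        out.append(acc)
    return out

def edge_maxima(profile, dx):
    # "Worm" points at one scale: local maxima of the horizontal-gradient
    # magnitude (central differences). Returns indices into `profile`.
    grad = [abs((profile[i + 1] - profile[i - 1]) / (2 * dx))
            for i in range(1, len(profile) - 1)]
    return [i + 1 for i in range(1, len(grad) - 1)
            if grad[i] > grad[i - 1] and grad[i] > grad[i + 1]]
```

Running `edge_maxima` on the raw profile and on upward-continued versions at increasing heights, then stacking the resulting maxima, yields the scale-explicit "worm" picture the abstract describes; edges over a vertical contact stay put while shallow noise fades with height.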

  7. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Science.gov (United States)

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  8. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Directory of Open Access Journals (Sweden)

    Lum Karl

    2011-03-01

    countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  9. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    Science.gov (United States)

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  10. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG--1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  11. Physical performance analysis and progress of the development of the negative ion RF source for the ITER NBI system

    International Nuclear Information System (INIS)

    Fantz, U.; Franzen, P.; Kraus, W.; Berger, M.; Christ-Koch, S.; Falter, H.; Froeschle, M.; Gutser, R.; Heinemann, B.; Martens, C.; McNeely, P.; Riedl, R.; Speth, E.; Staebler, A.; Wuenderlich, D.

    2009-01-01

    For heating and current drive the neutral beam injection (NBI) system for ITER requires a 1 MeV deuterium beam for up to 1 h pulse length. In order to inject the required 17 MW the large area source (1.9 m x 0.9 m) has to deliver 40 A of negative ion current at the specified source pressure of 0.3 Pa. In 2007, the IPP RF-driven negative hydrogen ion source was chosen by the ITER board as the new reference source for the ITER NBI system due to its, in principle, maintenance-free operation and the progress in the RF source development. The performance analysis of the IPP RF sources is strongly supported by an extensive diagnostic program and modelling of the source and beam extraction. The control of the plasma chemistry and the processes in the plasma region near the extraction system are the most critical topics for source optimization, both for long pulse operation and for source homogeneity. The long pulse stability has been demonstrated at the test facility MANITU, which is now operating routinely at stable pulses of up to 10 min with parameters near the ITER requirements. A quite uniform plasma illumination of a large area source (0.8 m x 0.8 m) has been demonstrated at the ion source test facility RADI. The new test facility ELISE, presently planned at IPP, is being designed for long pulse plasma operation and short pulse, but large-scale, extraction from a half-size ITER source, which is an important intermediate step towards ITER NBI.

  12. Beyond the double banana

    DEFF Research Database (Denmark)

    Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger

    2014-01-01

    PURPOSE: To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source......). Spectral source analysis used source montage to calculate density spectral array, defining the earliest oscillatory onset. From this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical...... performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). RESULTS: Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest...

  13. Dataset on statistical analysis of editorial board composition of Hindawi journals indexed in Emerging sources citation index

    Directory of Open Access Journals (Sweden)

    Hilary I. Okagbue

    2018-04-01

    This data article contains the statistical analysis of the total, percentage and distribution of editorial board composition of 111 Hindawi journals indexed in Emerging Sources Citation Index (ESCI) across the continents. The reliability of the data was shown using correlation, goodness-of-fit test, analysis of variance and statistical variability tests. Keywords: Hindawi, Bibliometrics, Data analysis, ESCI, Random, Smart campus, Web of science, Ranking analytics, Statistics

  14. INLINING 3D RECONSTRUCTION, MULTI-SOURCE TEXTURE MAPPING AND SEMANTIC ANALYSIS USING OBLIQUE AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    D. Frommholz

    2016-06-01

    This paper proposes an in-line method for the simplified reconstruction of city buildings from nadir and oblique aerial images that at the same time are being used for multi-source texture mapping with minimal resampling. Further, the resulting unrectified texture atlases are analyzed for façade elements like windows to be reintegrated into the original 3D models. Tests on real-world data of Heligoland/Germany comprising more than 800 buildings exposed a median positional deviation of 0.31 m at the façades compared to the cadastral map, a correctness of 67% for the detected windows and good visual quality when being rendered with GPU-based perspective correction. As part of the process building reconstruction takes the oriented input images and transforms them into dense point clouds by semi-global matching (SGM). The point sets undergo local RANSAC-based regression and topology analysis to detect adjacent planar surfaces and determine their semantics. Based on this information the roof, wall and ground surfaces found get intersected and limited in their extension to form a closed 3D building hull. For texture mapping the hull polygons are projected into each possible input bitmap to find suitable color sources regarding the coverage and resolution. Occlusions are detected by ray-casting a full-scale digital surface model (DSM) of the scene and stored in pixel-precise visibility maps. These maps are used to derive overlap statistics and radiometric adjustment coefficients to be applied when the visible image parts for each building polygon are being copied into a compact texture atlas without resampling whenever possible. The atlas bitmap is passed to a commercial object-based image analysis (OBIA) tool running a custom rule set to identify windows on the contained façade patches. Following multi-resolution segmentation and classification based on brightness and contrast differences, potential window objects are evaluated against geometric
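The RANSAC-based plane regression the abstract mentions can be sketched in a few lines: repeatedly fit a candidate plane through three random points and keep the model that explains the most points within a distance tolerance. This is an illustrative stand-alone sketch, not the authors' implementation; all names and tolerances are ours.

```python
import random

def plane_from_points(p1, p2, p3):
    # Unit normal via the cross product of two in-plane vectors,
    # plus offset d so that n . p + d = 0 on the plane.
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0.0:          # degenerate (colinear) sample
        return None
    n = [c / norm for c in n]
    d = -sum(n[i] * p1[i] for i in range(3))
    return n, d

def ransac_plane(points, iters=200, tol=0.05, seed=0):
    # Keep the candidate plane with the largest inlier set.
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(iters):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = model, inliers
    return best, best_inliers
```

In a building-reconstruction pipeline this step would be run locally on point-cloud segments, with the surviving inlier sets passed on to the topology analysis that labels them as roof, wall or ground.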

  15. Sensitivity analysis on the performances of a closed-loop Ground Source Heat Pump

    Science.gov (United States)

    Casasso, Alessandro; Sethi, Rajandrea

    2014-05-01

    Ground Source Heat Pumps (GSHPs) permit a significant reduction of greenhouse gas emissions, and the margins for economic saving of this technology are strongly correlated with the long-term sustainability of the exploitation of the heat stored in the soil. The operation of a GSHP over its lifetime should therefore be modelled considering realistic conditions, and a thorough characterization of the physical properties of the soil is essential to avoid large errors of prediction. In this work, a BHE modelling procedure with the finite-element code FEFLOW is presented. Starting from the governing equations of heat transport in the soil around a GSHP and inside the BHE, the most important parameters are identified and the adopted program settings are explained. A sensitivity analysis is then carried out both on the design parameters of the heat exchanger, in order to understand the margins of improvement of a careful design and installation, and on the physical properties of the soil, with the aim of quantifying the uncertainty induced by their variability. The relative importance of each parameter is then assessed by comparing the statistical distributions of the fluid temperatures and estimating the energy consumption of the heat pump, and practical conclusions are drawn from these results about the site characterization, the design and the installation of a BHE. References: Casasso A., Sethi R., 2014, Efficiency of closed loop geothermal heat pumps: A sensitivity analysis, Renewable Energy 62 (2014), pp. 737-746; Chiasson A.C., Rees S.J., Spitler J.D., 2000, A preliminary assessment of the effects of groundwater flow on closed-loop ground-source heat pump systems, ASHRAE Transactions 106 (2000), pp. 380-393; Delaleux F., Py X., Olives R., Dominguez A., 2012, Enhancement of geothermal borehole heat exchangers performances by improvement of bentonite grouts conductivity, Applied Thermal Engineering 33-34, pp. 92-99; Diao N., Li Q., Fang Z., 2004, Heat transfer in
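The sensitivity analysis described above (perturb each design or soil parameter and compare the resulting fluid-temperature distributions) can be illustrated with a one-at-a-time sketch. The toy line-source-style temperature function below is purely illustrative, not the authors' FEFLOW model; all names and coefficients are assumptions.

```python
import math

def fluid_temp(k=2.0, q=40.0, t0=12.0):
    # Toy proxy for mean borehole fluid temperature: undisturbed ground
    # temperature t0 plus a rise that scales with the heat load q (W/m)
    # and inversely with the soil thermal conductivity k (W/m/K).
    return t0 + q / (2 * math.pi * k) * math.log(1000.0)

def oat_sensitivity(model, base, deltas):
    # One-at-a-time sensitivity: perturb each parameter by +/- delta
    # around the base case and record the spread of the model output.
    spread = {}
    for name, d in deltas.items():
        lo = dict(base); lo[name] = base[name] - d
        hi = dict(base); hi[name] = base[name] + d
        spread[name] = abs(model(**hi) - model(**lo))
    return spread
```

Ranking the spreads then answers the question the abstract poses: which uncertain soil property or design choice moves the predicted fluid temperature (and hence heat-pump consumption) the most.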

  16. Analysis of the Source System of Nantun Group in Huhehu Depression of Hailar Basin

    Science.gov (United States)

    Li, Yue; Li, Junhui; Wang, Qi; Lv, Bingyang; Zhang, Guannan

    2017-10-01

    Huhehu Depression will be the new battlefield in Hailar Basin in the future, though at present its exploration level is low. Little work has been done on the source system of the Nantun Group, so a fine depiction of the source system would be significant for sedimentary system reconstruction, reservoir distribution and the prediction of favorable areas. This paper comprehensively uses many methods, such as ancient landform analysis, light and heavy mineral assemblages and seismic reflection characteristics, to study in detail the source system of the Nantun Group from different views and at different levels. The results show that the source system in Huhehu Depression is fed from the Xilinbeir bulge in the east and the Bayan Mountain uplift in the west, which surround the basin. The slope belt is the main source, and the southern bulge is the secondary source. The distribution of the source system determines the distribution of the sedimentary system and the regularity of the distribution of sand bodies.

  17. TITANIUM ISOTOPE SOURCE RELATIONS AND THE EXTENT OF MIXING IN THE PROTO-SOLAR NEBULA EXAMINED BY INDEPENDENT COMPONENT ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Steele, Robert C. J.; Boehnke, Patrick [Department of Earth, Planetary, and Space Sciences, University of California, Los Angeles, CA 90095 (United States)

    2015-04-01

    The Ti isotope variations observed in hibonites represent some of the largest isotope anomalies observed in the solar system. Titanium isotope compositions have previously been reported for a wide variety of different early solar system materials, including calcium-aluminum-rich inclusions (CAIs) and CM hibonite grains, some of the earliest materials to form in the solar system, and bulk meteorites, which formed later. These data have the potential to allow mixing of material to be traced between many different regions of the early solar system. We have used independent component analysis to examine the mixing end-members required to produce the compositions observed in the different data sets. The independent component analysis yields results identical to a linear regression for the bulk meteorites. The components identified for hibonite suggest that most of the grains are consistent with binary mixing from one of three highly anomalous nucleosynthetic sources. Comparison of these end-members shows that the source which dominates the variation of compositions in the meteorite parent-body forming regions was not present in the region in which the hibonites formed. This suggests that the source which dominates variation in Ti isotope anomalies between the bulk meteorites was not present when the hibonite grains were forming. One explanation is that the bulk meteorite source may not be a primary nucleosynthetic source but was created by mixing two or more of the hibonite sources. Alternatively, the hibonite sources may have been diluted during subsequent nebular processing and are not dominant solar system signatures.
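The binary-mixing interpretation above has a simple closed form: for a sample assumed to be a mixture of two end-member compositions A and B, the least-squares mixing fraction follows from projecting the sample onto the A-B mixing line. A hedged sketch (our own illustration, not the authors' ICA code):

```python
def mixing_fraction(sample, end_a, end_b):
    # Least-squares fraction f of end-member A in a binary A/B mixture:
    # minimize || sample - (f*A + (1-f)*B) ||^2  over f, which gives
    # f = (s - B) . (A - B) / ((A - B) . (A - B)).
    # Vectors are lists of isotope anomalies (e.g. epsilon values).
    diff = [a - b for a, b in zip(end_a, end_b)]
    resid = [s - b for s, b in zip(sample, end_b)]
    denom = sum(d * d for d in diff)
    return sum(r * d for r, d in zip(resid, diff)) / denom
```

Independent component analysis generalizes this: instead of assuming the end-members, it estimates them from the scatter of many samples, which is why, for data lying on a single mixing line, it reduces to the linear regression the abstract mentions.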

  18. A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.

    Science.gov (United States)

    Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram

    2017-11-01

    Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MSn proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analyses are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MSn data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectra for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan
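The decoy-based FDR scheme in point (3) can be sketched generically: build decoys by shuffling the peptide sequence and perturbing the glycan mass, score them with the same search engine as the targets, and estimate FDR as the fraction of decoy hits above a score threshold. This is a schematic illustration of the general target-decoy idea, not GlycoPAT's actual code; the mass-shift range and function names are assumptions.

```python
import random

def make_decoy(peptide, glycan_mass, rng, max_shift=3.0):
    # Decoy glycopeptide: shuffled peptide backbone plus a glycan mass
    # perturbed by a random offset (stand-in for artificial sugars).
    seq = list(peptide)
    rng.shuffle(seq)
    return "".join(seq), glycan_mass + rng.uniform(-max_shift, max_shift)

def fdr(target_scores, decoy_scores, threshold):
    # Target-decoy FDR estimate at a score threshold:
    # (decoys passing) / (targets passing).
    t = sum(1 for s in target_scores if s >= threshold)
    d = sum(1 for s in decoy_scores if s >= threshold)
    return d / t if t else 0.0
```

In practice the threshold is swept until the estimated FDR falls below a chosen level (commonly 1%), and only target identifications above that threshold are reported.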

  19. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    Science.gov (United States)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This results in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in the different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for a comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities to facilitate the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels across a spatio-temporal domain and can provide better and more complete

  20. LESTO: an Open Source GIS-based toolbox for LiDAR analysis

    Science.gov (United States)

    Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino

    2015-04-01

    During the last five years different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them have also released publicly available software. In the field of forestry there are different examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), which means that the source code is not shared with the public for anyone to look at or make changes to. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters over large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to 1) preprocessing of LiDAR raw data mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of the intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of the vegetation parameters. We decided to follow the single-tree-based approach, starting with the implementation of some of the most used algorithms in the literature. These have been tweaked and applied on LiDAR-derived raster datasets (DTM, DSM) as well as point clouds of raw data. The methods range between the simple extraction of tops and crowns from local maxima, the region growing method, the watershed method and individual tree segmentation on point clouds. The validation procedure consists in finding the matching between field and LiDAR-derived measurements at individual tree and plot level. An automatic validation procedure has been developed
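The simplest of the single-tree methods listed above, extraction of tree tops from local maxima, can be sketched directly: on a canopy height model (CHM = DSM − DTM), a candidate top is a cell that is the strict maximum of its moving window and exceeds a minimum height. This is a generic illustration, not LESTO's implementation; window size and threshold are assumptions.

```python
def tree_tops(chm, window=1, min_height=2.0):
    # Candidate tree tops on a canopy height model (list of rows):
    # cells that are the strict maximum of their (2*window+1)^2
    # neighbourhood and exceed a minimum height threshold.
    rows, cols = len(chm), len(chm[0])
    tops = []
    for r in range(window, rows - window):
        for c in range(window, cols - window):
            v = chm[r][c]
            if v < min_height:
                continue
            neigh = [chm[i][j]
                     for i in range(r - window, r + window + 1)
                     for j in range(c - window, c + window + 1)
                     if (i, j) != (r, c)]
            if all(v > x for x in neigh):
                tops.append((r, c))
    return tops
```

Region-growing and watershed methods then delineate a crown around each detected top, which is what makes per-tree validation against field measurements possible.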

  1. Characteristics and sources analysis of riverine chromophoric dissolved organic matter in Liaohe River, China.

    Science.gov (United States)

    Shao, Tiantian; Song, Kaishan; Jacinthe, Pierre-Andre; Du, Jia; Zhao, Ying; Ding, Zhi; Guan, Ying; Bai, Zhang

    2016-12-01

    Chromophoric dissolved organic matter (CDOM) in riverine systems can be affected by environmental conditions and land-use, and thus could provide important information regarding human activities in surrounding landscapes. The optical properties of water samples collected at 42 locations across the Liaohe River (LHR, China) watershed were examined using UV-Vis and fluorescence spectroscopy to determine CDOM characteristics, composition and sources. Total nitrogen (TN) and total phosphorus (TP) concentrations at all sampling sites exceeded the GB3838-2002 (national quality standards for surface waters, China) standards for Class V waters of 2.0 mg N/L and 0.4 mg P/L respectively, while the trophic state index (TSI_M) indicated that all the sites investigated were at least mesotrophic, with 64% of them being eutrophic. Redundancy analysis showed that total suspended matter (TSM), dissolved organic carbon (DOC), and turbidity had a strong correlation with CDOM, while the other parameters (Chl a, TN, TP and TSI_M) exhibited weak correlations with CDOM absorption. High spectral slope values and low SUVA254 (the specific UV absorption) values indicated that CDOM in the LHR was primarily comprised of low molecular weight organic substances. Analysis of excitation-emission matrix contour plots showed that CDOM in water samples collected from upstream locations exhibited fulvic-acid-like characteristics, whereas protein-like substances were most likely predominant in samples collected in estuarine areas and downstream from large cities. These patterns were interpreted as indicative of water pollution from urban and industrial activities in several downstream sections of the LHR watershed.
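The two CDOM indices used above are simple to compute from measured spectra: the spectral slope S is the decay constant of the (approximately exponential) absorption spectrum, obtained from a log-linear fit, and SUVA254 is conventionally the UV absorbance at 254 nm (per metre) divided by the DOC concentration (mg/L). A minimal sketch, assuming an exponential model a(λ) = a₀ exp(−S·λ); the wavelength window is an assumption, not taken from this study:

```python
import math

def spectral_slope(wl, a):
    # CDOM spectral slope S (nm^-1) from ln a(lambda) = ln a0 - S*lambda,
    # fitted by ordinary least squares over the supplied window.
    y = [math.log(v) for v in a]
    n = len(wl)
    mx = sum(wl) / n
    my = sum(y) / n
    sxy = sum((x - mx) * (yy - my) for x, yy in zip(wl, y))
    sxx = sum((x - mx) ** 2 for x in wl)
    return -sxy / sxx   # slope of ln a is -S

def suva254(a254_per_m, doc_mg_per_l):
    # SUVA254 (L mg^-1 m^-1): absorbance at 254 nm (m^-1) / DOC (mg/L).
    return a254_per_m / doc_mg_per_l
```

Higher S and lower SUVA254 both point toward smaller, less aromatic molecules, which is the basis for the low-molecular-weight interpretation in the abstract.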

  2. Exergoeconomic analysis of a solar assisted ground-source heat pump greenhouse heating system

    International Nuclear Information System (INIS)

    Ozgener, Onder; Hepbasli, Arif

    2005-01-01

    EXCEM analysis may prove useful to investigators in engineering and other disciplines because the methodology is based on the quantities exergy, cost, energy and mass. The main objective of the present study is to investigate the relation between capital costs and thermodynamic losses for devices in a solar assisted ground-source heat pump greenhouse heating system (SAGSHPGHS) with a 50 m vertical, 32 mm nominal diameter U-bend ground heat exchanger. This system was designed and installed at the Solar Energy Institute, Ege University, Izmir, Turkey. Thermodynamic loss rate-to-capital cost ratios are used to show that, for components and the overall system, a systematic correlation appears to exist between capital cost and exergy loss (total or internal), but not between capital cost and energy loss or external exergy loss. This correlation may imply that devices in successful air-conditioning systems are configured so as to achieve an overall optimal design, by appropriately balancing the thermodynamic (exergy-based) and economic characteristics of the overall system and its devices. The results may (i) provide useful insights into the relations between thermodynamics and economics, both in general and for the SAGSHPGHS, and (ii) help demonstrate the merits of second-law analysis. It is observed from the results that the maximum exergy destructions in the system occur particularly due to the electrical, mechanical and isentropic efficiencies, which emphasizes the need to pay close attention to the selection of this type of equipment, since components of inferior performance can considerably reduce the overall performance of the system. In conjunction with this, the total exergy loss values obtained range from 0.010 kW to 0.480 kW for the system. As expected, the largest energy and exergy losses occur in the greenhouse and compressor. The ratios of thermodynamic loss rate to capital cost are obtained over a range from 0.035 to 1.125

  3. Open source platform for collaborative construction of wearable sensor datasets for human motion analysis and an application for gait analysis.

    Science.gov (United States)

    Llamas, César; González, Manuel A; Hernández, Carmen; Vegas, Jesús

    2016-10-01

    Nearly every practical improvement in modeling human motion is well founded in a properly designed collection of data or datasets. These datasets must be made publicly available so that the community can validate and accept them. It is reasonable to concede that a collective, guided enterprise could serve to devise solid and substantial datasets as a result of a collaborative effort, in the same sense as the open software community does. In this way datasets could be complemented, extended and expanded in size with, for example, more individuals, samples and human actions. For this to be possible some commitments must be made by the collaborators, one of them being to share the same data acquisition platform. In this paper, we offer an affordable open source hardware and software platform based on inertial wearable sensors in a way that several groups could cooperate in the construction of datasets through common software suitable for collaboration. Some experimental results about the throughput of the overall system are reported, showing the feasibility of acquiring data from up to 6 sensors with a sampling frequency of no less than 118 Hz. Also, a proof-of-concept dataset is provided comprising sampled data from 12 subjects suitable for gait analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or FTP interfaces. Empirical models are mostly Fortran-based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) is made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
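The availability-based retrieval described above (probe local directories, then a remote database, then an FTP server, and return the first source that has the data) is a fallback chain. A minimal generic sketch of that pattern; these names are illustrative and are not DaViTpy's API:

```python
def fetch(key, sources):
    # Try each (name, getter) data source in priority order, e.g.
    # local cache -> remote database -> FTP mirror, and return the
    # first non-empty result. Failures and misses fall through.
    for name, getter in sources:
        try:
            data = getter(key)
        except Exception:
            continue          # source unreachable: try the next one
        if data is not None:
            return name, data
    raise LookupError(key)
```

A caller registers its sources once, e.g. `fetch("omni_2013", [("local", read_local), ("db", query_db), ("ftp", ftp_download)])`, so user code never needs to know where the file ultimately came from.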

  5. Preliminary design and off-design performance analysis of an Organic Rankine Cycle for geothermal sources

    International Nuclear Information System (INIS)

    Hu, Dongshuai; Li, Saili; Zheng, Ya; Wang, Jiangfeng; Dai, Yiping

    2015-01-01

    Highlights: • A method for preliminary design and performance prediction is established. • Preliminary data of radial inflow turbine and plate heat exchanger are obtained. • Off-design performance curves of critical components are researched. • Performance maps in sliding pressure operation are illustrated. - Abstract: Geothermal fluid of 90 °C and 10 kg/s can be exploited together with oil in Huabei Oilfield of China. The Organic Rankine Cycle is regarded as a reasonable method to utilize these geothermal sources. This study conducts a detailed design and off-design performance analysis based on the preliminary design of turbines and heat exchangers. A radial inflow turbine and plate heat exchangers are selected in this paper. Sliding pressure operation is applied in the simulation and three parameters are considered: geothermal fluid mass flow rate, geothermal fluid temperature and condensing pressure. The results indicate that in all considered conditions the designed radial inflow turbine has smooth off-design performance and no choke or supersonic flow is found at the nozzle and rotor exit. A larger geothermal fluid mass flow rate, a higher geothermal fluid temperature and a lower condensing pressure contribute to the increase of cycle efficiency and net power. Performance maps are illustrated to make the system meet different load requirements, especially when the geothermal fluid temperature and condensing pressure deviate from the design condition. This model can be used to provide basic data for future detailed design, and to predict off-design performance in the initial design phase

  6. Crystallization Analysis and Control of Ammonia-Based Air Source Absorption Heat Pump in Cold Regions

    Directory of Open Access Journals (Sweden)

    Wei Wu

    2013-01-01

    Full Text Available Energy consumption for heating and domestic hot water is very high and will keep increasing. The air source absorption heat pump (ASAHP) was proposed to overcome the low energy efficiency and high air pollution of boiler systems, as well as the poor performance of electrical heat pumps at low ambient temperatures. To investigate the crystallization risk of ammonia-salt ASAHPs, the crystallization margin (evaluated by solution mass concentration) is analyzed at generating temperatures ranging from 100 to 150°C, evaporating temperatures from −30 to 10°C, and condensing temperatures from 30 to 65°C. To prevent the NH3–NaSCN solution from crystallizing, an ASAHP integrated with a pressure booster located between the evaporator and the absorber is simulated. Analysis and comparisons show that NH3–NaSCN crystallizes readily at relatively high generating temperatures, low evaporating temperatures, and low condensing temperatures. In contrast, the crystallization margin of NH3–LiNO3 stays above 5% under most conditions, keeping well clear of crystallization. The pressure booster can effectively avoid the crystallization problem that would otherwise occur in the NH3–NaSCN ASAHP system.
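
    The crystallization-margin criterion used above can be sketched in a few lines. The concentrations below are invented for illustration; the real saturation limit depends on the salt and on the local temperature in the cycle.

```python
def crystallization_margin(x_operating, x_saturation):
    """Margin, in mass-fraction percentage points, between the working
    solution concentration and the saturation (crystallization) limit.
    A positive margin means the solution stays dissolved; the study
    treats a margin above 5 points as safely clear of crystallization."""
    return (x_saturation - x_operating) * 100.0

# Hypothetical state point: 52 % salt in solution, 58 % saturation limit.
margin = crystallization_margin(0.52, 0.58)
safe = margin > 5.0
```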

  7. Laser-plasma sourced, temperature dependent, VUV spectrophotometer using dispersive analysis

    International Nuclear Information System (INIS)

    French, R.H.

    1990-01-01

    We have developed a vacuum ultraviolet spectrophotometer with wide energy and temperature range coverage, utilizing a laser-plasma light source (LPLS), CO₂-laser sample heating and time-resolved dispersive analysis. Reflection and transmission spectra can be taken from 1.7 to 40 eV (31-700 nm) on samples at 15-1800 K with a time resolution of 20-400 ns. These capabilities permit the study of the temperature dependence of the electronic structure, encompassing the effects of thermal lattice expansion and electron-phonon interaction, and changes in the electronic structure associated with equilibrium and metastable phase transitions and stress relaxation. The LPLS utilizes a samarium laser-plasma created by a Q-switched Nd:YAG laser (500 mJ/pulse) to produce high-brightness, stable, continuum radiation. The spectrophotometer is of a single-beam design using calibrated iridium reference mirrors. White light is imaged off the sample into the entrance slit of a 1-m polychromator. The resolution is 0.1 to 0.3 nm. The dispersed light is incident on a focal plane phosphor, fiber-optic-coupled to an image-intensified reticon detector. For spectroscopy between 300 and 1800 K, the samples are heated in situ with a 150 W CO₂ laser. The signal to noise ratio in the VUV, for samples at 1800 K, is excellent. From 300 K to 15 K samples are cooled using a He cryostat. (orig.)
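
    In a single-beam design with calibrated reference mirrors, absolute sample reflectance is recovered by ratioing against the reference signal. A minimal sketch of that calibration step follows; the detector counts and the iridium reflectance value are made-up numbers for illustration.

```python
def sample_reflectance(i_sample, i_reference, r_reference):
    """Single-beam reflectance calibration: ratio the sample signal
    against the signal from a calibrated reference mirror, then scale
    by the mirror's known reflectance at the same wavelength."""
    return (i_sample / i_reference) * r_reference

# Hypothetical detector counts and an assumed Ir mirror reflectance.
r = sample_reflectance(i_sample=4200.0, i_reference=6000.0,
                       r_reference=0.25)
```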

  8. ISAMBARD: an open-source computational environment for biomolecular analysis, modelling and design.

    Science.gov (United States)

    Wood, Christopher W; Heal, Jack W; Thomson, Andrew R; Bartlett, Gail J; Ibarra, Amaurys Á; Brady, R Leo; Sessions, Richard B; Woolfson, Derek N

    2017-10-01

    The rational design of biomolecules is becoming a reality. However, further computational tools are needed to facilitate and accelerate this, and to make it accessible to more users. Here we introduce ISAMBARD, a tool for structural analysis, model building and rational design of biomolecules. ISAMBARD is open-source, modular, computationally scalable and intuitive to use. These features allow non-experts to explore biomolecular design in silico. ISAMBARD addresses a standing issue in protein design, namely, how to introduce backbone variability in a controlled manner. This is achieved through the generalization of tools for parametric modelling, describing the overall shape of proteins geometrically, and without input from experimentally determined structures. This will allow backbone conformations for entire folds and assemblies not observed in nature to be generated de novo, that is, to access the 'dark matter of protein-fold space'. We anticipate that ISAMBARD will find broad applications in biomolecular design, biotechnology and synthetic biology. A current stable build can be downloaded from the Python Package Index (https://pypi.python.org/pypi/isambard/) with development builds available on GitHub (https://github.com/woolfson-group/) along with documentation, tutorial material and all the scripts used to generate the data described in this paper. d.n.woolfson@bristol.ac.uk or chris.wood@bristol.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
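
    To illustrate what "describing the overall shape of proteins geometrically" means, the sketch below generates ideal alpha-helix Cα positions from three parameters. This is not the ISAMBARD API, only a toy stand-in for the parametric-backbone idea; the defaults are textbook alpha-helix figures in angstroms.

```python
import math

def helix_backbone(n_residues, radius=2.3, pitch=5.4, res_per_turn=3.6):
    """Place Calpha atoms on an ideal helix defined purely by geometry:
    helix radius, rise per turn (pitch), and residues per turn. No
    experimentally determined structure is consulted."""
    points = []
    for i in range(n_residues):
        theta = 2.0 * math.pi * i / res_per_turn
        points.append((radius * math.cos(theta),
                       radius * math.sin(theta),
                       pitch * i / res_per_turn))  # z rises 1.5 A/residue
    return points

ca = helix_backbone(18)  # five turns of an ideal alpha-helix
```

    Varying the parameters sweeps out backbone conformations continuously, which is what makes controlled backbone variability possible.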

  9. Source Term Analysis of the Irradiated Graphite in the Core of HTR-10

    Directory of Open Access Journals (Sweden)

    Xuegang Liu

    2017-01-01

    Full Text Available The high temperature gas-cooled reactor (HTGR) has potential for utilization due to its featured characteristics such as inherent safety and wide diversity of utilization. One distinct difference between the HTGR and the traditional pressurized water reactor (PWR) is the large inventory of graphite in the core acting as reflector, moderator, or structural material. Some radionuclides are generated in the graphite during the period of irradiation, which play significant roles in reactor safety, environmental release, waste disposal, and so forth. Based on the actual operation of the 10 MW pebble bed high temperature gas-cooled reactor (HTR-10) at Tsinghua University, China, an experimental study on source term analysis of the irradiated graphite has been performed. An irradiated graphite sphere was randomly collected from the core of HTR-10 as a sample in this study. This paper focuses on the analytical procedure and the establishment of the analytical methodology, including the sample collection, graphite sample preparation, and analytical parameters. The results reveal that Co-60, Cs-137, Eu-152, and Eu-154 are the major γ contributors, while H-3 and C-14 are the dominating β-emitting nuclides in the postirradiation graphite material of HTR-10. The distribution profiles of the above four nuclides are also presented.
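
    Source-term measurements of this kind are typically decay-corrected back to a reference date (e.g. the end of irradiation). A minimal sketch of that standard correction follows; the initial activity and decay time are invented numbers, with only the Co-60 half-life being a known physical constant.

```python
import math

def activity_after(a0, half_life_y, t_y):
    """Radioactive decay correction: A(t) = A0 * exp(-ln(2) * t / T_half).
    Times and half-life in years; activity in whatever unit A0 uses."""
    return a0 * math.exp(-math.log(2.0) * t_y / half_life_y)

# Hypothetical example: a Co-60 (T_half = 5.27 y) activity of 1000 Bq
# at end of irradiation, measured 10 years later.
a = activity_after(a0=1000.0, half_life_y=5.27, t_y=10.0)
```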

  10. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    International Nuclear Information System (INIS)

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

  11. Risk analysis of NPP in multi-unit site for configuration of AAC power source

    International Nuclear Information System (INIS)

    Kim, Myung Ki

    2000-01-01

    Because of the difficulties in finding new sites for nuclear power plants, more units are being added to existing sites. In these multi-unit sites, appropriate countermeasures should be established to cope with a potential station blackout (SBO) accident. Currently, installation of an additional diesel generator (DG) is considered to ensure an alternative AC power source, but it has not yet been decided how many DGs should be installed in a multi-unit site. In this paper, a risk-informed decision-making method, which evaluates the reliability of the electrical system, core damage frequency, and site average core damage frequency, is introduced to determine the suitable number of DGs in a multi-unit site. The analysis results show that installing two DGs lowered the site average core damage frequency by 1.4% compared to one DG in a six-unit site. In light of the risk-informed decision criteria in Regulatory Guide 1.174, there is no difference in safety between the two alternatives. It is concluded that one emergency diesel generator sufficiently guarantees safety against station blackout of nuclear power plants in a multi-unit site. (author)
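
    The comparison metric above, site average core damage frequency, is simply the mean of the per-unit CDFs. The sketch below reproduces the shape of the comparison with invented per-unit numbers chosen so the two-DG case comes out 1.4 % lower, as in the abstract; the actual CDF values are not given in this record.

```python
def site_average_cdf(unit_cdfs):
    """Site average core damage frequency: mean of the per-unit CDFs
    (events per reactor-year)."""
    return sum(unit_cdfs) / len(unit_cdfs)

# Six-unit site, hypothetical per-unit CDFs with one vs. two AAC DGs.
one_dg = site_average_cdf([5.00e-6] * 6)
two_dg = site_average_cdf([4.93e-6] * 6)
reduction = (one_dg - two_dg) / one_dg  # relative improvement
```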

  12. Natural circulation analysis for the advanced neutron source reactor refueling process

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, R.F.; Dasardhi, S.; Elkassabgi, Y. [Texas A&M Univ., Kingsville, TX (United States); Yoder, G.L. [Oak Ridge National Lab., TN (United States)

    1995-09-01

    During the refueling process of the Advanced Neutron Source Reactor (ANSR), the spent fuel elements must be moved from the primary coolant loop (containing D₂O), through a heavy water pool, and finally into a light water spent fuel storage area. The present refueling scheme utilizes remote refueling equipment to move the spent fuel elements through a D₂O-filled stack and tunnel into a temporary storage canal. A transfer lock is used to move the spent fuel elements from the D₂O-filled interim storage canal to a light water pool. Each spent fuel element must be cooled during this process, using either natural circulation or forced convection. This paper presents a summary of the numerical techniques used to analyze natural circulation cooling of the ANSR fuel elements as well as selected results of the calculations. Details of the analysis indicate that coolant velocities below 10 cm/s exist in the coolant channels under single-phase natural circulation conditions. Also, boiling does not occur within the channels if power levels are below a few hundred kW when the core transitions to natural circulation conditions.

  13. Exploring sources of biogenic secondary organic aerosol compounds using chemical analysis and the FLEXPART model

    Directory of Open Access Journals (Sweden)

    J. Martinsson

    2017-09-01

    Full Text Available Molecular tracers in secondary organic aerosols (SOAs) can provide information on the origin of SOA, as well as the regional-scale processes involved in their formation. In this study 9 carboxylic acids, 11 organosulfates (OSs) and 2 nitrooxy organosulfates (NOSs) were determined in daily aerosol particle filter samples from the Vavihill measurement station in southern Sweden during June and July 2012. Several of the observed compounds are photo-oxidation products of biogenic volatile organic compounds (BVOCs). The highest average mass concentrations were observed for carboxylic acids derived from fatty acids and monoterpenes (12.3 ± 15.6 and 13.8 ± 11.6 ng m−3, respectively). The FLEXPART model was used to link nine specific surface types to single measured compounds. It was found that the surface category sea and ocean dominated the air mass exposure (56 %) but contributed little to the mass concentration of the observed chemical compounds. A principal component (PC) analysis identified four components, of which the one with the highest explanatory power (49 %) displayed a clear impact of coniferous forest on the measured mass concentrations of a majority of the compounds. The three remaining PCs were more difficult to interpret, although azelaic, suberic, and pimelic acid were closely related to each other but not to any clear surface category. Hence, future studies should aim to deduce the biogenic sources and surface categories of these compounds. This study bridges micro-level chemical speciation to air mass surface exposure at the macro level.
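
    The core of a principal component analysis is eigendecomposition of the covariance matrix; the fraction of total variance carried by the first component measures its "explanatory power" as quoted above. The sketch below does this in closed form for just two variables as a toy stand-in for the study's multi-compound PCA; the concentration series are invented.

```python
import math

def first_pc_variance_share(xs, ys):
    """Share of total variance captured by the first principal component
    of two variables, using the closed-form eigenvalues of the 2x2
    sample covariance matrix [[sxx, sxy], [sxy, syy]]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam1 = tr / 2.0 + math.sqrt(tr * tr / 4.0 - det)  # largest eigenvalue
    return lam1 / tr  # total variance equals the trace

# Two strongly correlated hypothetical compound concentrations:
share = first_pc_variance_share([1.0, 2.0, 3.0, 4.0],
                                [1.1, 1.9, 3.2, 3.8])
```

    Strongly correlated tracers collapse onto one component, which is how a single PC can represent a whole family of coniferous-forest-derived compounds.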

  14. Occurrence, fluxes and sources of perfluoroalkyl substances with isomer analysis in the snow of northern China.

    Science.gov (United States)

    Shan, Guoqiang; Chen, Xinwei; Zhu, Lingyan

    2015-12-15

    In this study, perfluoroalkyl substances (PFASs) and the isomers of perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) were analyzed in fresh snow samples collected from 19 cities in northern China in 2013. The levels of total PFASs in the snow samples were 33.5-229 ng/L, suggesting heavy atmospheric pollution by PFASs in northern China. PFOA (9.08-107 ng/L), PFOS (3.52-54.3 ng/L), perfluoroheptanoate (PFHpA) (3.66-44.8 ng/L), and perfluorohexanoate (PFHxA) (3.21-23.6 ng/L) were predominant, with a summed contribution of 82% to the total PFASs. The particulate matter (PM)-associated PFASs contributed 21.5-56.2% of the total PFASs in the snow, suggesting PMs are vital for the transport and deposition of airborne PFASs. Partitioning of PFASs between the PM and dissolved phases depended on the carbon chain length and end functional groups. Isomer profiles of PFOA and PFOS in the snow were in agreement with the signature of the historical 3M electrochemical fluorination (ECF) products, suggesting that ECF products were still produced and used in China. Further source analysis showed that the airborne PFASs in ur