WorldWideScience

Sample records for source montage analysis

  1. Analysis of infant cortical synchrony is constrained by the number of recording electrodes and the recording montage.

    Science.gov (United States)

    Tokariev, Anton; Vanhatalo, Sampsa; Palva, J Matias

    2016-01-01

    To assess how the recording montage in the neonatal EEG influences the detection of cortical source signals and their phase interactions. Scalp EEG was simulated by forward modeling 20-200 simultaneously active sources covering the cortical surface of a realistic neonatal head model. We assessed systematically how the number of scalp electrodes (11-85), the analysis montage, or the size of cortical sources affects the detection of cortical phase synchrony. Statistical metrics were developed for quantifying the resolution and reliability of the montages. The findings converge to show that an increase in the number of recording electrodes leads to a systematic improvement in the detection of true cortical phase synchrony. While there is always a ceiling effect with respect to discernible cortical details, we show that the average and Laplacian montages exhibit superior specificity and sensitivity compared to other conventional montages. Reliability in assessing true neonatal cortical synchrony is directly related to the choice of EEG recording and analysis configurations. Because of the high conductivity of the neonatal skull, conventional neonatal EEG recordings are spatially far too sparse for pertinent studies, and this loss of information cannot be recovered by re-montaging during analysis. Future neonatal EEG studies will need prospective planning of the recording configuration to allow analysis of the spatial details required by each study question. Our findings also advise on the level of detail in brain synchrony that can be studied with existing datasets or by using conventional EEG recordings. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
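
    Phase synchrony between sources is commonly quantified with the phase-locking value (PLV). The sketch below is a toy numpy illustration of that metric on synthetic phases, not the authors' simulation pipeline; all signal parameters are assumptions for illustration:

```python
import numpy as np

def plv(phase_a, phase_b):
    """Phase-locking value: modulus of the mean phase-difference vector.
    1.0 = perfectly locked phases; values near 0 = no consistent relation."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
phase1 = 2 * np.pi * 10 * t                 # 10 Hz oscillation
phase2 = phase1 + 0.3                       # constant lag -> phase-locked
phase3 = rng.uniform(0, 2 * np.pi, t.size)  # random phases -> unlocked

print(plv(phase1, phase2))  # close to 1.0
print(plv(phase1, phase3))  # small (near 1/sqrt(N))
```

    With a constant phase lag the PLV is 1; with independent random phases it falls toward 1/√N for N samples.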

  2. 32-Channel banana-avg montage is better than 16-channel double banana montage to detect epileptiform discharges in routine EEGs.

    Science.gov (United States)

    Ochoa, Juan; Gonzalez, Walter; Bautista, Ramon; DeCerce, John

    2008-10-01

    We designed a study comparing the yield of a standard 16-channel longitudinal bipolar montage (double banana) versus a combined 32-channel longitudinal bipolar plus average referential montage (banana-plus) to detect epileptic abnormalities. We selected 25 consecutive routine EEG samples with a diagnosis of spikes or sharp waves in the temporal regions, 25 consecutive samples with focal slowing, and 50 normal EEGs. A total of 100 samples were printed in both montages and randomized for reading. Thirty independent EEG readers blinded to the EEG diagnosis were invited to participate. Twenty-two readers successfully completed the test, for a total of 4400 answers collected for analysis. The average sensitivity to detect epileptiform discharges for the 16- and 32-channel montages was 36.5% and 61%, respectively (Pdouble banana montage. Residents and EEG fellows could improve EEG-reading accuracy if taught on a combined 32-channel montage.

  3. Design as Montage

    DEFF Research Database (Denmark)

    Smith, Rachel Charlotte; Kjærsgaard, Mette

    The paper explores the role of video and (visual) anthropology in design and product development processes, from creating insights about users to developing design ideas in interdisciplinary teams. In the paper we suggest montage as a metaphor for understanding how meaning and ideas are created in design processes. Rather than viewing montage as a particular style of filmmaking, we see the design process itself as a montage, a process where different heterogeneous materials – sketches, video, prototypes, etc. – as well as different professional and disciplinary perspectives are brought together...

  4. TOASTing Your Images With Montage

    Science.gov (United States)

    Berriman, G. Bruce; Good, John

    2017-01-01

    The Montage image mosaic engine is a scalable toolkit for creating science-grade mosaics of FITS files, according to the user's specifications of coordinates, projection, sampling, and image rotation. It is written in ANSI C and runs on all common *nix-based platforms. The code is freely available and is released with a BSD 3-clause license. Version 5 is a major upgrade to Montage, and provides support for creating images that can be consumed by the World Wide Telescope (WWT). Montage treats the TOAST sky tessellation scheme, used by the WWT, as a spherical projection like those in the WCStools library. Thus images in any projection can be converted to the TOAST projection by Montage's reprojection services. These reprojections can be performed at scale on high-performance platforms and on desktops. WWT consumes PNG or JPEG files, organized according to WWT's tiling and naming scheme. Montage therefore provides a set of dedicated modules to create the required files from FITS images that contain the TOAST projection. There are two other major features of Version 5: it supports processing of HEALPix files to any projection in the WCStools library, and it can be built as a library that can be called from other languages, primarily Python. Web site: http://montage.ipac.caltech.edu. GitHub download page: https://github.com/Caltech-IPAC/Montage. ASCL record: ascl:1010.036. DOI: dx.doi.org/10.5281/zenodo.49418. Montage is funded by the National Science Foundation under Grant Number ACI-1440620.

  5. Montage and Image as Paradigm

    Directory of Open Access Journals (Sweden)

    Cesar Huapaya

    2016-01-01

    Thought as montage and image has become a revealing method in the practical and theoretical study processes of artists and researchers of the 20th and 21st centuries. This article aims to articulate three ways of thinking through montage in the works of Bertolt Brecht, Sergei Eisenstein and Georges Didi-Huberman. The philosopher and art historian Georges Didi-Huberman re-inaugurates the debate and the exercise of thinking the anthropology of image and montage as a metalanguage and a form of knowledge.

  6. SEP Montage Variability Comparison during Intraoperative Neurophysiologic Monitoring.

    Science.gov (United States)

    Hanson, Christine; Lolis, Athena Maria; Beric, Aleksandar

    2016-01-01

    Intraoperative monitoring is performed to provide real-time assessment of the neural structures that can be at risk during spinal surgery. Somatosensory evoked potentials (SEPs) are the most commonly used modality for intraoperative monitoring. SEP stability can be affected by many factors during surgery. This study is a prospective review of SEP recordings obtained during intraoperative monitoring of instrumented spinal surgeries that were performed for chronic underlying neurologic and neuromuscular conditions, such as scoliosis, myelopathy, and spinal stenosis. We analyzed multiple montages at baseline and then followed their development throughout the procedure. Our intention was to examine the stability of the SEP recordings throughout the surgical procedure on multiple montages of cortical SEP recordings, with the goal of identifying the smallest combination of montages that gives the highest yield of monitorable surgeries. Our study shows that it is necessary to have multiple montages for SEP recordings, as this reduces the number of non-monitorable cases, improves IOM reliability, and therefore could reduce false-positive warnings to the surgeons. Of all the typical montages available for use, our study has shown that the recording montage Cz-C4/Cz-C3 (Cz-Cc) is the most reliable and stable throughout the procedure and should be the preferred montage followed throughout the surgery.
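
    Because any analysis montage is a linear recombination of the recorded channels, a derivation such as Cz-C4/Cz-C3 (Cz-Cc) can be computed offline from referential recordings. A minimal numpy sketch on toy data (the channel ordering and data are assumptions for illustration):

```python
import numpy as np

# Toy referential recordings: rows = channels, columns = time samples.
# Channel order is an assumption for illustration: Cz, C3, C4.
rng = np.random.default_rng(1)
ref_data = rng.normal(size=(3, 1000))
CZ, C3, C4 = 0, 1, 2

# The common reference cancels in any bipolar derivation:
# Cz-C4 = (Cz - ref) - (C4 - ref).
cz_c4 = ref_data[CZ] - ref_data[C4]
cz_c3 = ref_data[CZ] - ref_data[C3]
montage = np.vstack([cz_c4, cz_c3])  # the Cz-Cc pair discussed above
print(montage.shape)  # (2, 1000)
```

    The same subtraction pattern yields any other bipolar chain, which is why montages can be switched after recording without re-acquisition.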

  7. Illness, everyday life and narrative montage

    DEFF Research Database (Denmark)

    Henriksen, Nina; Tjørnhøj-Thomsen, Tine; Hansen, Helle Ploug

    2011-01-01

    …created by the reader. It points to the effect of the aesthetics of disguise and carnival implicit in the visual-verbal montage and argues that these generate a third meaning. This meaning is associated with the breast cancer experience but is not directly discernible in the montage. The article concludes by discussing...

  8. The Montage Image Mosaic Toolkit As A Visualization Engine.

    Science.gov (United States)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has been used since 2003 to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and built on standard astrophysics toolkits, making it easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image-stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: 1. Creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created. 2. Integration into a web-based image-processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to JavaScript/WebAssembly using the Emscripten compiler, which allows the reprojection algorithms to run in browsers at close to native speed. 3. Creation of complex sky-coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453.

  9. Montage Version 3.0

    Science.gov (United States)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

    The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  10. Model and Montage

    DEFF Research Database (Denmark)

    Meldgaard, Morten

    2012-01-01

    The contribution seeks to link different model practices with the thinking of montage theory. The cases used were made partly on a laser cutter and partly executed at scale 1:1, both by students at Kunstakademiets Arkitektskole. This empirical material meets a more theoretically grounded reflection in the article, which discusses what a model is and what the relationship is between an analog and a digital practice.

  11. Use of Computational Modeling to Inform tDCS Electrode Montages for the Promotion of Language Recovery in Post-stroke Aphasia.

    Science.gov (United States)

    Galletta, Elizabeth E; Cancelli, Andrea; Cottone, Carlo; Simonelli, Ilaria; Tecchio, Franca; Bikson, Marom; Marangolo, Paola

    2015-01-01

    Although pilot trials of transcranial direct current stimulation (tDCS) in aphasia are encouraging, protocol optimization is needed. Notably, it has not yet been clarified which of the varied electrode montages investigated is the most effective in enhancing language recovery. To consider and contrast the predicted brain current flow patterns (electric field distribution) produced by varied 1×1 tDCS (1 anode, 1 cathode, 5 × 7 cm pad electrodes) montages used in aphasia clinical trials, a finite element model of the head of a single left frontal stroke patient was developed in order to study the pattern of the cortical EF magnitude and inward/outward radial EF under five different electrode montages: anodal tDCS (A-tDCS) over the left Wernicke's area (Montage A) and over the left Broca's area (Montage B); cathodal tDCS (C-tDCS) over the right homologue of Wernicke's area (Montage C) and of Broca's area (Montage D), where for all montages A-D the "return" electrode was placed over the contralateral supraorbital forehead; and bilateral stimulation with A-tDCS over the left Broca's area and C-tDCS over the right Broca's homologue (Montage E). In all cases, the "return" electrode over the contralesional supraorbital forehead was not inert and influenced the current path through the entire brain. Montage B, although similar to montage D in focusing the current in the perilesional area, exerted the greatest effect over the left perilesional cortex, an effect that was even stronger in montage E. The position and influence of both electrodes must be considered in the design and interpretation of tDCS clinical trials for aphasia. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Evaluation of a Modified High-Definition Electrode Montage for Transcranial Alternating Current Stimulation (tACS) of Pre-Central Areas

    DEFF Research Database (Denmark)

    Heise, Kirstin Friederike; Kortzorg, Nick; Saturnino, Guilherme Bicalho

    2016-01-01

    Objective: To evaluate a modified electrode montage with respect to its effect on tACS-dependent modulation of corticospinal excitability and discomfort caused by neurosensory side effects accompanying stimulation. Methods: In a double-blind cross-over design, the classical electrode montage for primary motor cortex (M1) stimulation (two patch electrodes over M1 and the contralateral supraorbital area) was compared with an M1 centre-ring montage. Corticospinal excitability was evaluated before, during, immediately after and 15 minutes after tACS (10 min., 20 Hz vs. 30 s low-frequency transcranial...). Conclusions: In comparison to the classic montage, the M1 centre-ring montage enables a more focal stimulation of the target area and, at the same time, significantly reduces neurosensory side effects, essential for placebo-controlled study designs.

  13. Transcranial direct current stimulation in obsessive-compulsive disorder: emerging clinical evidence and considerations for optimal montage of electrodes.

    Science.gov (United States)

    Senço, Natasha M; Huang, Yu; D'Urso, Giordano; Parra, Lucas C; Bikson, Marom; Mantovani, Antonio; Shavitt, Roseli G; Hoexter, Marcelo Q; Miguel, Eurípedes C; Brunoni, André R

    2015-07-01

    Neuromodulation techniques for obsessive-compulsive disorder (OCD) treatment have expanded with greater understanding of the brain circuits involved. Transcranial direct current stimulation (tDCS) might be a potential new treatment for OCD, although the optimal montage is unclear. The aim was to perform a systematic review of meta-analyses of repetitive transcranial magnetic stimulation (rTMS) and deep brain stimulation (DBS) trials for OCD, aiming to identify brain stimulation targets for future tDCS trials and to support the empirical evidence with computational head modeling analysis. PubMed/MEDLINE was searched for systematic reviews of rTMS and DBS trials on OCD. For the tDCS computational analysis, we employed head models with the goal of optimally targeting current delivery to structures of interest. Only three references matched our eligibility criteria. We simulated four different electrode montages and analyzed current direction and intensity. Although DBS, rTMS and tDCS are not directly comparable and our theoretical model, based on DBS and rTMS targets, needs empirical validation, we found that the tDCS montage with the cathode over the pre-supplementary motor area and an extra-cephalic anode seems to activate most of the areas related to OCD.

  14. Montage: Improvising in the Land of Action Research

    Science.gov (United States)

    Windle, Sheila; Sefton, Terry

    2011-01-01

    This paper and its appended multi-media production describe the rationale and process of creating and presenting a "digitally saturated" (Lankshear & Knobel, 2003), multi-layered, synchronous "montage" (Denzin & Lincoln, 2003) of educational Action Research findings. The authors contend that this type of presentation, arising from the fusion of…

  15. The Next Generation of the Montage Image Mosaic Engine

    Science.gov (United States)

    Berriman, G. Bruce; Good, John; Rusholme, Ben; Robitaille, Thomas

    2016-01-01

    We have released a major upgrade of the Montage image mosaic engine (http://montage.ipac.caltech.edu), as part of a program to develop the next generation of the engine in response to the rapid changes in the data-processing landscape in astronomy, which is generating ever larger data sets in ever more complex formats. The new release (version 4) contains modules dedicated to creating and managing mosaics of data stored as multi-dimensional arrays ("data cubes"). The new release inherits the architectural benefits of portability and scalability of the original design. The code is publicly available on GitHub and the Montage web page. The release includes a command-line tool that supports visualization of large images, and the beta release of a Python interface to the visualization tool. We will provide examples of how to use these features. We are generating a mosaic of the Galactic Arecibo L-band Feed Array HI (GALFA-HI) Survey maps of neutral hydrogen in and around our Milky Way Galaxy, to assess performance at scale and to develop tools and methodologies that will enable scientists inexpert in cloud processing to exploit cloud platforms for data processing and product generation at scale. Future releases will include support for an R-tree-based mechanism for fast discovery of and access to large data sets, and on-demand access to calibrated SDSS DR9 data that exploits it; support for the Hierarchical Equal Area isoLatitude Pixelization (HEALPix) scheme, now standard for projects investigating the cosmic background radiation (Gorski et al 2005); support for the Tessellated Octahedral Adaptive Subdivision Transform (TOAST), the sky-partitioning scheme used by the WorldWide Telescope (WWT); and a public applications programming interface (API) in C that can be called from other languages, especially Python.

  16. Model-based analysis and optimization of the mapping of cortical sources in the spontaneous scalp EEG

    NARCIS (Netherlands)

    Sazonov, A.; Bergmans, J.W.M.; Cluitmans, P.J.M.; Griep, P.A.M.; Arends, J.B.A.M.; Boon, P.A.J.M.

    2007-01-01

    The mapping of brain sources into the scalp electroencephalogram (EEG) depends on volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the
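
    The linearity of this source mapping is easy to make concrete. In the toy numpy sketch below (random numbers, not a real volume-conduction model), a lead field with an extra row for the reference electrode yields an observation-function (OF) matrix, and referential scalp EEG follows by matrix multiplication; all names and dimensions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_channels, n_samples = 5, 4, 200

# Lead field for the scalp electrodes plus one extra row for the
# reference electrode (all values are toy random numbers).
L_full = rng.normal(size=(n_channels + 1, n_sources))
sources = rng.normal(size=(n_sources, n_samples))

# Ideal potentials of every electrode against "infinity".
v = L_full @ sources

# The observation-function matrix folds the reference into the mapping:
# each channel row minus the reference row.
OF = L_full[:-1] - L_full[-1]
eeg = OF @ sources

# Referential EEG equals the infinity potentials minus the reference trace.
print(np.allclose(eeg, v[:-1] - v[-1]))  # True
```

    The point of the sketch is only that the montage and reference are absorbed into one matrix, which is what makes the mapping fully determined by the OF matrix.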

  17. Open ends: an ethnographic radio montage about post-diagnosis lives in Denmark and South Africa

    DEFF Research Database (Denmark)

    Houmøller, Kathrin; Steno, Anne Mia

    This presentation takes the form of a radio montage and presents stories of post-diagnosis lives in Denmark and urban South Africa. Based on ethnographic fieldwork with young people in psychiatric treatment (Denmark) and among HIV-positive people in anti-retroviral therapy (South Africa), the montage...

  18. Radial Peripapillary Capillary Network Visualized Using Wide-Field Montage Optical Coherence Tomography Angiography.

    Science.gov (United States)

    Mase, Tomoko; Ishibazawa, Akihiro; Nagaoka, Taiji; Yokota, Harumasa; Yoshida, Akitoshi

    2016-07-01

    We quantitatively analyzed the features of a radial peripapillary capillary (RPC) network visualized using wide-field montage optical coherence tomography (OCT) angiography in healthy human eyes. Twenty eyes of 20 healthy subjects were recruited. En face 3 × 3-mm OCT angiograms of multiple locations in the posterior pole were acquired using the RTVue XR Avanti, and wide-field montage images of the RPC were created. To evaluate the RPC density, the montage images were binarized and skeletonized. The correlation between the RPC density and the retinal nerve fiber layer (RNFL) thickness measured by an OCT circle scan was investigated. The RPC at the temporal retina was detected as far as 7.6 ± 0.7 mm from the edge of the optic disc but not around the perifoveal area within 0.9 ± 0.1 mm of the fovea. Capillary-free zones beside the first branches of the arterioles were significantly (P optic disc edge were 13.6 ± 0.8, 11.9 ± 0.9, and 10.4 ± 0.9 mm-1. The RPC density also was correlated significantly (r = 0.64, P network. The RPC is present in the superficial peripapillary retina in proportion to the RNFL thickness, supporting the idea that the RPC may be the vascular network primarily responsible for RNFL nourishment.

  19. Mystery Montage: A Holistic, Visual, and Kinesthetic Process for Expanding Horizons and Revealing the Core of a Teaching Philosophy

    Science.gov (United States)

    Ennis, Kim; Priebe, Carly; Sharipova, Mayya; West, Kim

    2012-01-01

    Revealing the core of a teaching philosophy is the key to a concise and meaningful philosophy statement, but it can be an elusive goal. This paper offers a visual, kinesthetic, and holistic process for expanding the horizons of self-reflection, self-analysis, and self-knowledge. Mystery montage, a variation of visual mapping, storyboarding, and…

  20. Naked, Deformed, Violated Body. A Montage in the Histoire(s) du cinéma of Jean-Luc Godard

    Directory of Open Access Journals (Sweden)

    Alberto Brodesco

    2013-07-01

    The article analyses Histoire(s) du cinéma (1988-1998), a cinematic essay by Jean-Luc Godard, and in particular focuses on the controversial montage in which the French director aligns extracts from a pornographic film, Tod Browning's Freaks, and footage from the concentration camps. With this sequence Godard interrogates his own theory of montage: the idea of a productive reconciliation between opposing realities. This shocking sequence (the violence of images) is compared to a similar shock (the violence of asking to witness) produced by a scene of the documentary Shoah by Claude Lanzmann. The trauma of Godard's editing choice induces the viewer to examine the issues of the degradation of the indexical status of the film, the limits of representation and the ethics of the gaze.

  1. Pharmaceutical structure montages as catalysts for design and discovery.

    Science.gov (United States)

    Njarðarson, Jon T

    2012-05-01

    The majority of pharmaceuticals are small-molecule organic compounds. Their structures are most effectively described and communicated using the graphical language of organic chemistry. A few years ago we decided to harness this powerful language to create new educational tools that could serve well for data mining and as catalysts for discovery. The results were the Top 200 drug posters, which we have posted online for everyone to enjoy and which we update yearly. This article details the origin and motivation for our design and highlights the value of this graphical format by presenting and analyzing a new pharmaceutical structure montage (poster) focused on US FDA-approved drugs in 2011.

  2. Spatial Montage and Multimedia Ethnography: Using Computers to Visualise Aspects of Migration and Social Division Among a Displaced Community

    Directory of Open Access Journals (Sweden)

    Judith Aston

    2010-05-01

    This paper discusses how computer-based techniques of spatial montage can be used to visualise aspects of migration and social division among a displaced community. It is based on an ongoing collaboration between the author and the anthropologist, Wendy JAMES. The work is based on a substantial archive of ethnographic photographs, audio, cine and video recordings collected by JAMES in the Sudan/Ethiopian borderlands over four decades. Initially recording the way of life of several minority peoples, she was later able to follow their fortunes during the repeated war displacements and separations they suffered from the 1980s onwards. The recordings document work rhythms, dance, song and storytelling, music and other sensory rich performances alongside spoken memories of past events. The research is developing spatial montage techniques to draw comparisons across time, between multiple points of view, and between recordings of events and spoken memories of these events. It is argued that these techniques can be used to facilitate direct engagement with ethnographic recordings, creating multimedia experiences which can flexibly integrate fieldwork data into academic discourse. In so doing it is proposed that these techniques offer new tools to enhance the analysis and understanding of issues relating to migration and social division. URN: urn:nbn:de:0114-fqs1002361

  3. Discours en circulation et (dé)montage filmique dans Fahrenheit 9/11 [Circulating discourse and film editing in Fahrenheit 9/11]

    Directory of Open Access Journals (Sweden)

    Andrea Landvogt

    2010-12-01

    Documentaries can be seen as a social practice of filmic discourse. They are essentially based on citation techniques ranging from vague allusions to exact reproductions. The present study emphasizes a characteristic effect of Michael Moore's film rhetoric, which consists in the use of montage techniques to make verbal, visual and acoustic discourses circulate in decontextualized form. However, Moore's excessive use of discordant montage eventually oversteps the standards of the documentary genre, leading to something new we would like to call docu-satire.

  4. Beyond the double banana

    DEFF Research Database (Denmark)

    Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger

    2014-01-01

    PURPOSE: To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source analysis (...). Spectral source analysis used a source montage to calculate the density spectral array, defining the earliest oscillatory onset. From this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). RESULTS: Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest...

  5. Can Film Show the Invisible? The Work of Montage in Ethnographic Filmmaking

    DEFF Research Database (Denmark)

    Suhr, Christian; Willerslev, Rane

    2012-01-01

    This article suggests that film can evoke hidden dimensions of ethnographic reality, not by striving for ever more realistic depictions – a position often associated with observational cinema – but rather by exploiting the artificial means through which human vision can be transcended. Achieved particularly through the use of montage, such disruptions can multiply the perspectives from which filmic subject matter is perceived, thus conveying its invisible and irreducible otherness. This, however, is an argument not for dismissing the realism of much ethnographic filmmaking, but rather to demonstrate...

  6. How do reference montage and electrodes setup affect the measured scalp EEG potentials?

    Science.gov (United States)

    Hu, Shiang; Lai, Yongxiu; Valdes-Sosa, Pedro A.; Bringas-Vega, Maria L.; Yao, Dezhong

    2018-04-01

    Objective. Human scalp electroencephalogram (EEG) is widely applied in cognitive neuroscience and clinical studies due to its non-invasiveness and ultra-high time resolution. However, how representative the measured EEG potentials are of the underlying neural activity is still a problem under debate. This study aims to investigate systematically how both reference montage and electrode setup affect the accuracy of EEG potentials. Approach. First, standard EEG potentials are generated by forward calculation with a single dipole in the neural source space, for eleven channel numbers (10, 16, 21, 32, 64, 85, 96, 128, 129, 257, 335). Here, the reference is the ideal infinity implicitly determined by forward theory. Then, the standard EEG potentials are transformed to recordings with different references, including five mono-polar references (left earlobe, Fz, Pz, Oz, Cz) and three re-references (linked mastoids (LM), average reference (AR) and the reference electrode standardization technique (REST)). Finally, the relative errors between the standard EEG potentials and the transformed ones are evaluated in terms of channel number, scalp region, electrode layout, dipole source position and orientation, as well as sensor noise and head model. Main results. Mono-polar reference recordings usually show large distortions; thus, a re-reference after online mono-polar recording should in general be adopted to mitigate this effect. Among the three re-references, REST is generally superior to AR for all factors compared, and LM performs worst. REST is insensitive to head model perturbation. AR is subject to electrode coverage and dipole orientation but shows no close relation with channel number. Significance. These results indicate that REST should be the first choice of re-reference and AR may be an alternative option for cases with high sensor noise. Our findings may provide helpful suggestions on how to obtain EEG potentials as accurately as possible for
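
    Except for REST, which requires a head model, the re-references compared here are simple linear transforms that can be applied offline to any mono-polar recording. A toy numpy sketch (electrode labels and data are illustrative assumptions, not a forward simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
labels = ["Fz", "Cz", "Pz", "Oz", "M1", "M2"]  # M1/M2: mastoid electrodes
v_inf = rng.normal(size=(len(labels), 400))    # toy potentials vs. ideal infinity

# A mono-polar recording against Cz implicitly sets the Cz trace to zero.
v_cz = v_inf - v_inf[labels.index("Cz")]

# Offline re-references are linear transforms of the recorded channels.
v_ar = v_cz - v_cz.mean(axis=0)              # average reference (AR)
m1, m2 = labels.index("M1"), labels.index("M2")
v_lm = v_cz - 0.5 * (v_cz[m1] + v_cz[m2])    # linked mastoids (LM)

# AR is independent of the original mono-polar reference: re-referencing
# the Cz recording matches averaging the ideal infinity potentials.
print(np.allclose(v_ar, v_inf - v_inf.mean(axis=0)))  # True
```

    The final check illustrates why a re-reference can be "adopted after online mono-polar recording": the initial reference cancels out of the linear transform.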

  7. Aussteigen (getting out Impossible—Montage and Life Scenarios in Andres Veiel’s Film Black Box BRD

    Directory of Open Access Journals (Sweden)

    Anja Katharina Seiler

    2016-02-01

    Andres Veiel's 2001 documentary film, Black Box BRD, links the biography of Alfred Herrhausen, an RAF victim, with that of one of the third-generation RAF terrorists, Wolfgang Grams. In my paper, I trace how the film's aesthetics introduce an image montage of two life scenarios by establishing both parallels and contrasts and thereby, following Susan Hayward's definition, "creates a third meaning" (112). I examine how the film establishes an aesthetic concept of Aussteigen (getting out) along the living, visible bodies of the contemporary interviewees and the dead, invisible bodies of Herrhausen and Grams.

  8. Visualizing spikes in source-space

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Duez, Lene; Scherg, Michael

    2016-01-01

    OBJECTIVE: Reviewing magnetoencephalography (MEG) recordings is time-consuming: signals from the 306 MEG sensors are typically reviewed divided into six arrays of 51 sensors each, thus browsing each recording six times in order to evaluate all signals. A novel method of reconstructing the MEG signals in source-space was developed using a source-montage of 29 brain regions and two spatial components to remove magnetocardiographic (MKG) artefacts. Our objective was to evaluate the accuracy of reviewing MEG in source-space. METHODS: In 60 consecutive patients with epilepsy, we prospectively evaluated the accuracy of reviewing the MEG signals in source-space as compared to the classical method of reviewing them in sensor-space. RESULTS: All 46 spike-clusters identified in sensor-space were also identified in source-space. Two additional spike-clusters were identified in source-space. As 29…

  9. (58) Indices, Metaphors and Montages. The Heterogeneous Work in Current Latin American Literary Studies

    Directory of Open Access Journals (Sweden)

    Francisco Gelman Constantin

    2017-09-01

    As contemporary literary scholars challenge the ruling exclusionary criteria for the homogenization of their objects, while at the same time the biopolitical turn in literary theory criticizes representational understandings of the bond between language and the body, this paper proposes to address that relationship through the Lacanian notion of the 'montage of the heterogeneous', which was brought forth toward a redefinition of the psychoanalytical concept of the drive. Drawing on the notion of 'heterogeneous literatures', I advocate a theoretical genealogy from Bataille to Lacan (while Nancy, Foucault and Butler are also summoned to the discussion) in order to come to terms with the rethinking of the objects of literary scholarship demanded by works such as Emilio García Wehbi's performance piece 58 indicios sobre el cuerpo, along with his and Nora Lezano's poetical-photographical essay Communitas.

  10. Model-Based Analysis and Optimization of the Mapping of Cortical Sources in the Spontaneous Scalp EEG

    Directory of Open Access Journals (Sweden)

    Andrei V. Sazonov

    2007-01-01

    The mapping of brain sources into the scalp electroencephalogram (EEG) depends on the volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the OF-matrix for a generation model of the desynchronized spontaneous EEG. The model involves a four-shell spherical volume conductor containing dipolar sources that are mutually uncorrelated so as to reflect the desynchronized EEG. The reference is optimized in order to minimize the impact in the SM of the sources located distant from the electrodes. The resulting reference is called the localized reference (LR). The OF-matrix is analyzed in terms of the relative power contribution of the sources and the cross-channel correlation coefficient for five existing references as well as for the LR. It is found that the Hjorth Laplacian reference is a fair approximation of the LR, and thus is close to optimal for practical intents and purposes. The other references have significantly poorer performance. Furthermore, the OF-matrix is analyzed for limits to the spatial resolution of the EEG. These are estimated to be around 2 cm.

  11. Producing an Infrared Multiwavelength Galactic Plane Atlas Using Montage, Pegasus, and Amazon Web Services

    Science.gov (United States)

    Rynge, M.; Juve, G.; Kinney, J.; Good, J.; Berriman, B.; Merrihew, A.; Deelman, E.

    2014-05-01

    In this paper, we describe how to leverage cloud resources to generate large-scale mosaics of the galactic plane in multiple wavelengths. Our goal is to generate a 16-wavelength infrared Atlas of the Galactic Plane at a common spatial sampling of 1 arcsec, processed so that the images appear to have been measured with a single instrument. This will be achieved by using the Montage image mosaic engine to process observations from the 2MASS, GLIMPSE, MIPSGAL, MSX and WISE datasets, over a wavelength range of 1 μm to 24 μm, and by using the Pegasus Workflow Management System to manage the workload. When complete, the Atlas will be made available to the community as a data product. We are generating images that cover ±180° in Galactic longitude and ±20° in Galactic latitude, to the extent permitted by the spatial coverage of each dataset. Each image will be 5° × 5° in size (including an overlap of 1° with neighboring tiles), resulting in an atlas of 1,001 images. The final size will be about 50 TB. This paper focuses on the computational challenges, solutions, and lessons learned in producing the Atlas. To manage the computation we are using the Pegasus Workflow Management System, a mature, highly fault-tolerant system now in release 4.2.2 that has found wide applicability across many science disciplines. A scientific workflow describes the dependencies between tasks; in most cases the workflow is described as a directed acyclic graph, where the nodes are tasks and the edges denote the task dependencies. A defining property of a scientific workflow is that it manages the data flow between tasks. Applied to the galactic plane project, each 5° × 5° mosaic is a Pegasus workflow. Pegasus is used to fetch the source images, execute the image mosaicking steps of Montage, and store the final outputs in a storage system. As these workflows are very I/O intensive, care has to be taken when choosing what infrastructure to execute the workflow on. In our setup, we choose…
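    The workflow structure described in this record (tasks as nodes of a directed acyclic graph, edges as dependencies) can be sketched with Python's standard-library topological sorter. The task names below are illustrative stand-ins for the Montage/Pegasus mosaicking steps, not the actual Pegasus API.

```python
from graphlib import TopologicalSorter

# Hypothetical tasks for one mosaic tile; each entry maps a task to the
# tasks it depends on (the real workflow has many more nodes).
dag = {
    "fetch_images": [],
    "reproject": ["fetch_images"],
    "background_match": ["reproject"],
    "coadd": ["background_match"],
    "store_mosaic": ["coadd"],
}

# A valid execution order: every dependency precedes its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

    A workflow manager like Pegasus does essentially this scheduling, but additionally moves the data along the edges and recovers from task failures.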

  12. Auditory mismatch negativity in schizophrenia: topographic evaluation with a high-density recording montage.

    Science.gov (United States)

    Hirayasu, Y; Potts, G F; O'Donnell, B F; Kwon, J S; Arakaki, H; Akdag, S J; Levitt, J J; Shenton, M E; McCarley, R W

    1998-09-01

    The mismatch negativity, a negative component in the auditory event-related potential, is thought to index automatic processes involved in sensory or echoic memory. The authors' goal in this study was to examine the topography of auditory mismatch negativity in schizophrenia with a high-density, 64-channel recording montage. Mismatch negativity topography was evaluated in 23 right-handed male patients with schizophrenia who were receiving medication and in 23 nonschizophrenic comparison subjects who were matched in age, handedness, and parental socioeconomic status. The Positive and Negative Syndrome Scale was used to measure psychiatric symptoms. Mismatch negativity amplitude was reduced in the patients with schizophrenia. They showed a greater left-less-than-right asymmetry than comparison subjects at homotopic electrode pairs near the parietotemporal junction. There were correlations between mismatch negativity amplitude and hallucinations at left frontal electrodes and between mismatch negativity amplitude and passive-apathetic social withdrawal at left and right frontal electrodes. Mismatch negativity was reduced in schizophrenia, especially in the left hemisphere. This finding is consistent with abnormalities of primary or adjacent auditory cortex involved in auditory sensory or echoic memory.

  13. Discours en circulation et (dé)montage filmique dans Fahrenheit 9/11 [Circulating discourse and film editing in Fahrenheit 9/11

    OpenAIRE

    Andrea Landvogt; Kathrin Sartingen

    2010-01-01

    The documentary film is a discursive social media practice founded essentially on citational practices ranging from vague allusion to exact quotation. This study focuses on one effect of Michael Moore's characteristic filmic rhetoric: the putting into circulation of decontextualized discourses, verbal, visual and acoustic, by means of montage techniques. Since in most cases this is a discordant montage, Moore succeeds in …

  14. Cognition through montage and mechanisms of individual memory in Bogusław Bachorczyk's art on the example of the artist's apartment-studio

    Directory of Open Access Journals (Sweden)

    Antos, Janusz

    2014-12-01

    The present text discusses Bogusław Bachorczyk's apartment-studio in Krakow. The decorations he has been making there since 2003 have transformed into a kind of work-in-progress. These decorations, just like Bachorczyk's art, are related to the issues of memory and identity. In 2013 he started the transformation of his apartment by "lacing up the wall" with polychrome in the library room, later extending it to other rooms. He installed new elements into the existing polychromes according to the rule of montage, which has recently constituted the basic strategy of his work.

  15. Detecting Large-Scale Brain Networks Using EEG: Impact of Electrode Density, Head Modeling and Source Localization

    Science.gov (United States)

    Liu, Quanying; Ganzetti, Marco; Wenderoth, Nicole; Mantini, Dante

    2018-01-01

    Resting state networks (RSNs) in the human brain were recently detected using high-density electroencephalography (hdEEG). This was done by using an advanced analysis workflow to estimate neural signals in the cortex and to assess functional connectivity (FC) between distant cortical regions. FC analyses were conducted either using temporal (tICA) or spatial independent component analysis (sICA). Notably, EEG-RSNs obtained with sICA were very similar to RSNs retrieved with sICA from functional magnetic resonance imaging data. It still remains to be clarified, however, what technological aspects of hdEEG acquisition and analysis primarily influence this correspondence. Here we examined to what extent the detection of EEG-RSN maps by sICA depends on the electrode density, the accuracy of the head model, and the source localization algorithm employed. Our analyses revealed that the collection of EEG data using a high-density montage is crucial for RSN detection by sICA, but also the use of appropriate methods for head modeling and source localization have a substantial effect on RSN reconstruction. Overall, our results confirm the potential of hdEEG for mapping the functional architecture of the human brain, and highlight at the same time the interplay between acquisition technology and innovative solutions in data analysis. PMID:29551969

  16. Montage, Militancy, Metaphysics: Chris Marker and André Bazin

    Directory of Open Access Journals (Sweden)

    Sarah Cooper

    2010-01-01

    Abstract (English): This article focuses on the relationship between the work of André Bazin and Chris Marker from the late 1940s through to the late 1950s and beyond. The division between Bazin's 'Right Bank' affiliation with Les Cahiers du Cinéma on the one hand, and Marker's 'Left Bank' allegiances on the other, is called into question here as my argument seeks to muddy the waters of their conventional ideological separation across the river Seine. Working alliteratively through Marker's well-known talent for deft montage along with his militancy, I consider Bazin's praise for Marker's editing technique, in spite of his famously expressing a preference elsewhere for the long take and deep-focus cinematography, and I address their political differences and convergences. Yet I also explore the rather more unexpected question of metaphysics in order to further emphasize a closer relationship between these two figures. I chart the emergence of an enduring spiritual bond between critic and filmmaker that surfaces first in Marker's writings for the left-wing Catholic journal L'Esprit…

  17. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various source media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared environmental complex mixtures (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but not the only one. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information about complex mixtures in the environment than a profile-based approach that focuses only on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample.
    Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • The similarity analysis method can encompass more relevant information about pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required.
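    The cosine similarity used in this record is straightforward to compute once peaks have been matched into a common vector of relative peak areas. The fingerprint vectors below are hypothetical, not the study's PBDE data.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two fingerprint vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical relative peak areas at the same matched retention times
# for a candidate source medium and an environmental mixture.
plastic  = [0.40, 0.25, 0.15, 0.10, 0.10]
sediment = [0.10, 0.30, 0.20, 0.25, 0.15]

s = cosine_similarity(plastic, sediment)
print(round(s, 2))
```

    A value near 1 would indicate a fingerprint match (and hence a plausible source-sink link); identical fingerprints give exactly 1.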

  18. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  19. The value and cost of complexity in predictive modelling: role of tissue anisotropic conductivity and fibre tracts in neuromodulation

    Science.gov (United States)

    Salman Shahid, Syed; Bikson, Marom; Salman, Humaira; Wen, Peng; Ahfock, Tony

    2014-06-01

    Objectives. Computational methods are increasingly used to optimize transcranial direct current stimulation (tDCS) dose strategies and yet complexities of existing approaches limit their clinical access. Since predictive modelling indicates the relevance of subject/pathology based data and hence the need for subject specific modelling, the incremental clinical value of increasingly complex modelling methods must be balanced against the computational and clinical time and costs. For example, the incorporation of multiple tissue layers and measured diffusion tensor (DTI) based conductivity estimates increase model precision but at the cost of clinical and computational resources. Costs related to such complexities aggregate when considering individual optimization and the myriad of potential montages. Here, rather than considering if additional details change current-flow prediction, we consider when added complexities influence clinical decisions. Approach. Towards developing quantitative and qualitative metrics of value/cost associated with computational model complexity, we considered field distributions generated by two 4 × 1 high-definition montages (m1 = 4 × 1 HD montage with anode at C3 and m2 = 4 × 1 HD montage with anode at C1) and a single conventional (m3 = C3-Fp2) tDCS electrode montage. We evaluated statistical methods, including residual error (RE) and relative difference measure (RDM), to consider the clinical impact and utility of increased complexities, namely the influence of skull, muscle and brain anisotropic conductivities in a volume conductor model. Main results. Anisotropy modulated current-flow in a montage and region dependent manner. However, significant statistical changes, produced within montage by anisotropy, did not change qualitative peak and topographic comparisons across montages. Thus for the examples analysed, clinical decision on which dose to select would not be altered by the omission of anisotropic brain conductivity

  1. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS conjoins digital image analysis and database systems, which makes GIS widely applicable and demanding of very high skills. There is a lot of commercial GIS software that is well advertised and whose functionality is fairly well known, while open source software is often overlooked. This diploma work analyses the open source GIS software available on the Internet, in the scope of different projects interr…

  2. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in the field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book then focuses on the non-negativity-based methods, the time-frequency-analysis-based methods, and the pre-coding-based methods, respectively.

  3. [MEG]PLS: A pipeline for MEG data analysis and partial least squares statistics.

    Science.gov (United States)

    Cheung, Michael J; Kovačević, Natasa; Fatima, Zainab; Mišić, Bratislav; McIntosh, Anthony R

    2016-01-01

    The emphasis of modern neurobiological theories has recently shifted from the independent function of brain areas to their interactions in the context of whole-brain networks. As a result, neuroimaging methods and analyses have also increasingly focused on network discovery. Magnetoencephalography (MEG) is a neuroimaging modality that captures neural activity with a high degree of temporal specificity, providing detailed, time-varying maps of neural activity. Partial least squares (PLS) analysis is a multivariate framework that can be used to isolate distributed spatiotemporal patterns of neural activity that differentiate groups or cognitive tasks, to relate neural activity to behavior, and to capture large-scale network interactions. Here we introduce [MEG]PLS, a MATLAB-based platform that streamlines MEG data preprocessing, source reconstruction and PLS analysis in a single unified framework. [MEG]PLS facilitates MRI preprocessing (segmentation and coregistration); MEG preprocessing (filtering, epoching, and artifact correction); MEG sensor analysis in both the time and frequency domains; and MEG source analysis (multiple head models and beamforming algorithms); and it combines these with a suite of PLS analyses. The pipeline is open-source and modular, utilizing functions from FieldTrip (Donders, NL), AFNI (NIMH, USA), SPM8 (UCL, UK) and PLScmd (Baycrest, CAN), which are extensively supported and continually developed by their respective communities. [MEG]PLS is flexible, providing both a graphical user interface and command-line options, depending on the needs of the user. A visualization suite allows multiple types of data and analyses to be displayed and includes 4-D montage functionality. [MEG]PLS is freely available under the GNU public license (http://meg-pls.weebly.com). Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver…

  5. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis based on open source information. We employed unsupervised data mining methods to explore the facts regarding the crimes in an area of interest. The analysis is based on well-known clustering and association techniques. The results show…

  6. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and to provide an overview of source term implementation analysis in regulatory decisions.

  7. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  8. Quantifying Porosity through Automated Image Collection and Batch Image Processing: Case Study of Three Carbonates and an Aragonite Cemented Sandstone

    Directory of Open Access Journals (Sweden)

    Jim Buckman

    2017-08-01

    Modern scanning electron microscopes often include software that allows for the possibility of obtaining large-format high-resolution image montages over areas of several square centimeters. Such montages are typically automatically acquired and stitched, comprising many thousands of individual tiled images. Images collected over a regular grid pattern are a rich source of information on factors such as variability in porosity and the distribution of mineral phases, but can be hard to interpret visually. Additional quantitative data can be accessed through the application of image analysis. We use backscattered electron (BSE) images, collected from polished thin sections of two limestone samples from the Cretaceous of Brazil, a Carboniferous limestone from Scotland, and a carbonate cemented sandstone from Northern Ireland, with up to 25,000 tiles per image, to collect quantitative data on the distribution of porosity. Images were automatically collected using the FEI software Maps, batch processed by image analysis (through ImageJ), and the results plotted on 2D contour plots with MATLAB. These plots clearly express the collected porosity data, both numerically and visually, in an easily accessible form, and the approach has application for the display of other data such as pore size, shape, grain size/shape, orientation and mineral distribution, as well as being of relevance to sandstone, mudrock and other porous media.
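    At its core, the porosity quantification described in this record reduces to thresholding a backscattered-electron image into pore/solid and averaging the pore mask per tile. A minimal numpy sketch on a synthetic image follows; the threshold and gray values are invented (the study used Maps, ImageJ and MATLAB).

```python
import numpy as np

# Synthetic stand-in for a BSE montage: low gray values = pore space.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))

threshold = 60            # gray levels below this count as pores (assumed)
pore_mask = img < threshold

# Overall porosity fraction, plus a 4x4 grid of per-tile porosities,
# i.e. the kind of values one would feed to a 2D contour plot.
porosity = pore_mask.mean()
tiles = pore_mask.reshape(4, 16, 4, 16).mean(axis=(1, 3))

print(round(float(porosity), 3), tiles.shape)
```

    Because the tiles are equal-sized, the mean of the per-tile porosities equals the overall porosity; the spatial variation across tiles is what the contour plots visualize.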

  9. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of the uncertainties that are involved in seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the most seismically active zone of Turkey. The second analysis is for Akkuyu…

  10. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here.

  11. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1, where infrastructure and…

  12. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

    Capt Jacques Lamoureux, USAF. On some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett… The analysis focuses on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA)…

  13. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and of independent component analysis is not very clear, and their solutions are not unique. To address these disadvantages, a new linear, instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear, instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the uncorrelatedness of the source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then turned into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both the waveform and amplitude information of the uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only the waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and the source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or the mixing is not linear.
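    A minimal numpy sketch of the claim in this record: two zero-mean, mutually uncorrelated sources with different variances, mixed by a column-orthogonal and normalized (here, rotation) matrix, are recovered up to sign by PCA of the observations. The signals and mixing angle are invented for illustration.

```python
import numpy as np

n = 1000
t = np.arange(n) / n

# Two zero-mean, mutually uncorrelated sources with different variances.
s1 = 3.0 * np.sin(2 * np.pi * 5 * t)
s2 = 1.0 * np.sin(2 * np.pi * 13 * t)
S = np.vstack([s1, s2])

# Column-orthogonal, normalized mixing matrix (a rotation by 30 degrees).
theta = np.deg2rad(30)
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = A @ S  # observed (mixed) signals

# PCA: eigendecomposition of the observation covariance matrix.
C = X @ X.T / n
eigvals, U = np.linalg.eigh(C)
U = U[:, ::-1]        # reorder so the largest-variance component comes first
Y = U.T @ X           # recovered sources, up to sign

corr = abs(np.corrcoef(Y[0], s1)[0, 1])
print(corr > 0.999)  # True: the high-variance source is recovered
```

    The sign ambiguity of eigenvectors is exactly the "up to sign" recovery the abstract describes; with a non-orthogonal mixing matrix the eigenvectors no longer align with the mixing columns and the separation fails.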

  14. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

Full Text Available The presented thesis, entitled Comparative analysis of traditional and alternative energy sources, includes, on the basis of theoretical information sources, research in the firm, internal data, trends in company and market development, a description of the problem, and its application. The theoretical part is dedicated to traditional and alternative energy resources, their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part of the thesis reflects the profile of the company and an evaluation of the thermal pump market using the General Electric method. As the company is implementing, among other products, thermal pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for effectiveness improvement and customer satisfaction analysis, and the expected possibilities of support for alternative energy resources (benefits from the government and EU funds).

  15. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting their differences and potential implications for policy. When the relationships between concentrations and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used interchangeably, both for air quality planning purposes and for quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable for retrieving source contributions, and source apportionment methods are not appropriate for evaluating the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When the nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
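The linear/nonlinear distinction can be made concrete with a toy model (assumed here purely for illustration; it is not from the paper): under a quadratic concentration-emission relationship, the brute-force impact of a source and its mass-fraction contribution are different numbers, whereas in the linear case they coincide.

```python
# Toy illustration: two emission sources feeding one receptor.

def concentration(e1, e2):
    # Hypothetical nonlinear chemistry: quadratic in total emissions.
    return (e1 + e2) ** 2

e1, e2 = 3.0, 1.0
c_total = concentration(e1, e2)              # (3 + 1)^2 = 16

# Brute-force "impact" of source 1: remove it entirely, take the difference.
impact_1 = c_total - concentration(0.0, e2)  # 16 - 1 = 15

# Mass-fraction ("tagged-species"-like) "contribution" of source 1.
contribution_1 = c_total * e1 / (e1 + e2)    # 16 * 3/4 = 12

print(impact_1, contribution_1)
```

With a linear relationship `concentration = e1 + e2`, both numbers would equal 3.0, which is exactly the equivalence of impacts and contributions the abstract describes for the linear case.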

  16. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  17. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied, metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are not applicable. The presented plasma source is currently used in research on hydrogen production from liquids.

  18. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
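The inseparability of Gaussian components can be seen in a few lines of numpy (an illustrative sketch, not the mixed ICA/PCA implementation): rotating a pair of Gaussian sources leaves their joint likelihood exactly unchanged, so no contrast based on the data can recover the rotation, whereas rotating a Laplace source together with a Gaussian one measurably reduces its excess kurtosis and is therefore detectable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Two unit-variance Gaussian sources: any rotation of them is an equally
# good "separation", so no method can single out the true one.
g = rng.standard_normal((2, n))

def avg_loglik(z):
    # Average log-likelihood under independent standard-normal sources.
    return float(np.mean(-0.5 * np.sum(z ** 2, axis=0) - np.log(2 * np.pi)))

theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
ll_orig, ll_rot = avg_loglik(g), avg_loglik(R @ g)  # identical up to rounding

# A non-Gaussian (Laplace) source, by contrast, loses excess kurtosis when
# rotated together with a Gaussian one, so the mixing is detectable.
def excess_kurtosis(x):
    return float(np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0)

lap = rng.laplace(size=n) / np.sqrt(2.0)  # unit variance, excess kurtosis ~3
mix = (lap + g[0]) / np.sqrt(2.0)         # 45-degree rotation with a Gaussian

print(round(ll_orig - ll_rot, 9),
      round(excess_kurtosis(lap), 2), round(excess_kurtosis(mix), 2))
```

The likelihood invariance is exact (rotations preserve the squared norm of each sample), which is precisely why the Gaussian subspace can only be characterized as a whole, e.g. by PCA, rather than separated into components.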

  19. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  20. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

Full Text Available Assembly precision optimization of complex products offers great benefits for improving product quality. Owing to the coupling of a variety of deviation sources, however, the goal of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, a sensitivity analysis of the deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of the assembly dimension variation to the deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating scheme, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations of each dimension are established. Then, the assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
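The sensitivity definition above (ratio of assembly dimension variation to deviation source variation, obtained from a first-order Taylor expansion) can be sketched as follows; the two-part stack-up function and the nominal values are hypothetical, not the paper's wing-flap rocker model.

```python
import numpy as np

# Hypothetical stack-up (illustrative only): the assembly dimension depends
# on two part lengths and one angular deviation source.
def assembly_dim(x):
    a, b, theta = x
    return a + b * np.cos(theta)

x0 = np.array([10.0, 5.0, 0.2])  # nominal values (mm, mm, rad)

# Deviation source sensitivities = first-order partial derivatives,
# approximated here by central differences at the nominal point.
eps = 1e-6
J = np.array([(assembly_dim(x0 + eps * e) - assembly_dim(x0 - eps * e)) / (2 * eps)
              for e in np.eye(3)])
# Analytically: dY/da = 1, dY/db = cos(theta), dY/dtheta = -b*sin(theta)

# Linearized (first-order Taylor) assembly deviation for small source deviations.
delta_x = np.array([0.02, -0.01, 0.005])
delta_y = float(J @ delta_x)
print(np.round(J, 4), round(delta_y, 5))
```

The largest entry of `J` identifies the deviation source whose tolerance most strongly drives the assembly dimension, which is the quantity the optimization then targets.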

  1. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    using- spss - statistics.php Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis ...Contract Source Selection: an Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies 15 June 2016 LCDR Jamal M. Osman, USN...ACQUISITION RESEARCH PROGRAM SPONSORED REPORT SERIES Contract Source Selection: an Analysis of Lowest Price Technically Acceptable and Tradeoff

  2. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Being different from the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period, unlike the traditional Z-source inverter. This feature is interesting when PV panels or fuel cells are assumed to power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden ... circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Further proceeding to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of the topological configuration and operational principles, showing that they are the superior ...

  3. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some of the isotopes which are not yet in use but look very promising are indicated, and their data are tabulated. A more or less "perfect" source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) an intense gamma transition near the absorption edge of the element(s) of interest, with no high-energy gammas; (3) a sufficiently long half-life (of the order of years), for both economic and calibration reasons; (4) a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should either be reasonably short-lived or be separable from the isotope chemically with a minimum of difficulty. (T.G.)

  4. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied, metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are not applicable. The presented plasma source is currently used in research on hydrogen production from liquids.

  5. Modelling the effect of electrode displacement on transcranial direct current stimulation (tDCS)

    Science.gov (United States)

    Ramaraju, Sriharsha; Roula, Mohammed A.; McCarthy, Peter W.

    2018-02-01

    Objective. Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers a low-intensity, direct current to cortical areas with the purpose of modulating underlying brain activity. Recent studies have reported inconsistencies in tDCS outcomes. The underlying assumption of many tDCS studies has been that replication of electrode montage equates to replicating stimulation conditions. It is possible however that anatomical difference between subjects, as well as inherent inaccuracies in montage placement, could affect current flow to targeted areas. The hypothesis that stimulation of a defined brain region will be stable under small displacements was tested. Approach. Initially, we compared the total simulated current flowing through ten specific brain areas for four commonly used tDCS montages: F3-Fp2, C3-Fp2, Fp1-F4, and P3-P4 using the software tool COMETS. The effect of a slight (~1 cm in each of four directions) anode displacement on the simulated regional current density for each of the four tDCS montages was then determined. Current flow was calculated and compared through ten segmented brain areas to determine the effect of montage type and displacement. The regional currents, as well as the localised current densities, were compared with the original electrode location, for each of these new positions. Main results. Recommendations for montages that maximise stimulation current for the ten brain regions are considered. We noted that the extent to which stimulation is affected by electrode displacement varies depending on both area and montage type. The F3-Fp2 montage was found to be the least stable with up to 38% change in average current density in the left frontal lobe while the Fp1-F4 montage was found to the most stable exhibiting only 1% change when electrodes were displaced. Significance. These results indicate that even relatively small changes in stimulation electrode placement appear to result in surprisingly large

  6. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

We have been studying the electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and an "external standard" method were employed for nondestructive analysis of the 163Ho source, supplemented by an "internal standard" method. (author)

  7. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  8. Development of procedures for spectrometer brand Spectral Products to capture spectra of incoherent optical radiation for the Laboratorio de Fotonica y Tecnologia Laser Aplicada

    International Nuclear Information System (INIS)

    Arias Avendano, Fabio Andres

    2008-01-01

The procedure for capturing spectra of incoherent optical radiation for the Laboratorio de Fotonica y Tecnologia Laser Aplicada (LAFTLA) of the Escuela de Ingenieria Electrica de la Universidad de Costa Rica was developed using a Spectral Products spectrometer. A thorough understanding of the Spectral Products spectrometer manuals was necessary for the satisfactory development of the project. The spectrometer and the National Instruments card were installed, and both devices were run with a suitable laboratory setup. Two spectrum captures were performed for two different sources of optical radiation, since damage to the .ddl files prevented the SM 240 spectrometer from working properly enough to take further captures of other optical radiation sources. A final report containing the two captures, with the respective analysis, was produced. (author) [es]

  9. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

Gravitational waves (GWs) are one of the most important predictions of general relativity. Besides the search for direct proof of the existence of GWs, there are already several ground-based detectors (such as LIGO, GEO, etc.) and a planned future space mission (LISA) which aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in the search for GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum-likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme-mass-ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and a slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very suitable for the data analysis of EMRI signals.
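The phenomenological template described in the second work (constant-amplitude harmonics with a Taylor-expanded phase) can be sketched as follows; the harmonic coefficients, noise level and matched-filter-style overlap are illustrative assumptions, not values from the thesis.

```python
import numpy as np

t = np.linspace(0.0, 100.0, 5000)

def harmonic(t, amp, phi0, f0, fdot):
    # Constant amplitude; the phase is a short Taylor series in time
    # (initial phase, frequency, and a slow frequency drift).
    phase = phi0 + 2.0 * np.pi * (f0 * t + 0.5 * fdot * t ** 2)
    return amp * np.cos(phase)

# Template: a sum of a few such harmonics (coefficients are invented).
params = [(1.0, 0.0, 0.10, 1e-4), (0.5, 1.2, 0.20, 2e-4), (0.2, 0.4, 0.30, 3e-4)]
h = sum(harmonic(t, *p) for p in params)

# "Detection": normalized overlap of the template with noisy simulated data.
rng = np.random.default_rng(2)
data = h + 0.5 * rng.standard_normal(t.size)
overlap = float(h @ data / np.sqrt((h @ h) * (data @ data)))
print(round(overlap, 2))
```

In an actual search the Taylor phase coefficients of each harmonic would be free parameters maximized over, but the overlap statistic above conveys the basic template-matching idea.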

  10. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....

  11. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d,xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE) [de]

  12. Tectonics of montage

    DEFF Research Database (Denmark)

    Bundgaard, Charlotte

    2013-01-01

    We build in accordance with specific contemporary conditions, defined by production methods, construction and materials as well as ethics, meaning and values. Exactly this relationship between the work as such and the conditions behind its coming into being is a crucial point. The simultaneity of...

  13. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    An open-source Pteros library for molecular modeling and analysis of molecular dynamics trajectories for C++ programming language is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. Pteros programming interface is very simple and intuitive while the source code is well documented and easily extendible. Pteros is available for free under open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  14. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.
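The basic quantity behind biPISA, a cross-bispectrum estimated across epochs, can be sketched in numpy (an illustrative toy with assumed frequencies and coupling, not the authors' implementation): a rhythm at the sum frequency whose phase is locked to the two driving phases produces a large cross-bispectral magnitude, while an uncoupled frequency triple averages toward zero.

```python
import numpy as np

rng = np.random.default_rng(3)
N, n_epochs = 256, 300
k1, k2 = 10, 22          # driving frequency bins; the sum bin is k1 + k2
n = np.arange(N)

Xs, Ys = [], []
for _ in range(n_epochs):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    # Channel x carries the two driving rhythms; channel y carries a rhythm
    # at the sum frequency whose phase is locked to p1 + p2.
    x = (np.cos(2 * np.pi * k1 * n / N + p1)
         + np.cos(2 * np.pi * k2 * n / N + p2)
         + 0.5 * rng.standard_normal(N))
    y = (np.cos(2 * np.pi * (k1 + k2) * n / N + p1 + p2)
         + 0.5 * rng.standard_normal(N))
    Xs.append(np.fft.fft(x))
    Ys.append(np.fft.fft(y))
X, Y = np.array(Xs), np.array(Ys)

def cross_bispec(A, B, C, i, j):
    # B_abc(i, j) = E[ A(i) * B(j) * conj(C(i + j)) ], averaged over epochs.
    return np.mean(A[:, i] * B[:, j] * np.conj(C[:, i + j]))

b_xxy = cross_bispec(X, X, Y, k1, k2)  # phase-coupled triple: large magnitude
b_xxx = cross_bispec(X, X, X, k1, k2)  # no rhythm at the sum bin in x: small

# Antisymmetric combination of channel orderings, of the kind biPISA uses
# to suppress spurious coupling produced by instantaneous mixing.
b_anti = cross_bispec(X, X, Y, k1, k2) - cross_bispec(Y, X, X, k1, k2)
print(round(abs(b_xxy)), round(abs(b_xxx)))
```

The antisymmetrization step matters because a single source leaking into several channels contributes identically to all channel orderings and therefore cancels in `b_anti`, which is the robustness-to-mixing property the abstract emphasizes.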

  15. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

Full Text Available Based on a geospatial analysis model, this paper analyzes the relationship between the source and sink landscape patterns in urban areas and atmospheric haze pollution. First, the land-cover classification result and the aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. The source and sink landscapes of atmospheric haze pollution are then selected based on an analysis of the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, the retained indices are screened using the correlation coefficients between them. Finally, because of the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression (GWR) model are used to analyze the atmospheric haze effect of the source and sink landscape at the global and local levels. The results show that the source landscape of atmospheric haze pollution is built-up land, and the sink landscapes are shrubland and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of the source and sink landscape. Comparing the models, the fit of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the influence of the explanatory factors on atmospheric haze varies across geographic locations. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could reduce the aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce the aerosol optical thickness, which represents

  16. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

Based on a geospatial analysis model, this paper analyzes the relationship between the source and sink landscape patterns in urban areas and atmospheric haze pollution. First, the land-cover classification result and the aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. The source and sink landscapes of atmospheric haze pollution are then selected based on an analysis of the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, the retained indices are screened using the correlation coefficients between them. Finally, because of the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression (GWR) model are used to analyze the atmospheric haze effect of the source and sink landscape at the global and local levels. The results show that the source landscape of atmospheric haze pollution is built-up land, and the sink landscapes are shrubland and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of the source and sink landscape. Comparing the models, the fit of the SLM, SEM and GWR models is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the influence of the explanatory factors on atmospheric haze varies across geographic locations. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could reduce the aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce the aerosol optical thickness, which represents atmospheric haze.
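The spatial dependency that motivates SLM/SEM/GWR over plain OLS is typically diagnosed with a statistic such as Moran's I; a from-scratch sketch on a toy grid (assumed data, not the Wuhan AOD grids) follows.

```python
import numpy as np

rng = np.random.default_rng(4)
side = 12
# A smooth spatial field standing in for gridded AOD, plus a little noise.
xx, yy = np.meshgrid(np.arange(side), np.arange(side))
z = (np.sin(xx / 3.0) + np.cos(yy / 4.0)
     + 0.1 * rng.standard_normal((side, side))).ravel()

# Rook-contiguity weights: grid cells sharing an edge are neighbours.
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        a = i * side + j
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < side and 0 <= jj < side:
                W[a, ii * side + jj] = 1.0

def morans_i(z, W):
    # I = (n / sum(W)) * (zc' W zc) / (zc' zc), with zc the centred values.
    zc = z - z.mean()
    return float(len(z) / W.sum() * (zc @ W @ zc) / (zc @ zc))

i_field = morans_i(z, W)                    # strongly positive: clustered
i_shuffled = morans_i(rng.permutation(z), W)  # near zero: no spatial structure
print(round(i_field, 2), round(i_shuffled, 2))
```

A clearly positive Moran's I on the residuals of an OLS fit is the usual signal that a spatial lag or spatial error specification, as used in the paper, is warranted.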

  17. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result compared with conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
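The variance-reduction idea behind stratified source-sampling can be illustrated generically (a toy integrand, not an actual eigenvalue calculation): forcing an equal number of source samples into each stratum of the source distribution removes the between-stratum component of the sampling noise.

```python
import numpy as np

rng = np.random.default_rng(5)

def f(x):
    return x ** 2  # toy "response" of a source particle born at position x

n, n_strata = 1000, 10

def simple_estimate():
    # Conventional sampling: all n source positions drawn from U(0, 1).
    return float(f(rng.uniform(0.0, 1.0, n)).mean())

def stratified_estimate():
    # Stratified sampling: n/n_strata positions drawn from each equal-width
    # stratum, so every stratum is represented in exact proportion.
    edges = np.linspace(0.0, 1.0, n_strata + 1)
    per = n // n_strata
    vals = [f(rng.uniform(lo, hi, per)).mean()
            for lo, hi in zip(edges[:-1], edges[1:])]
    return float(np.mean(vals))

simple = [simple_estimate() for _ in range(500)]
strat = [stratified_estimate() for _ in range(500)]
ratio = float(np.std(simple) / np.std(strat))
print(round(ratio, 1))  # standard-deviation reduction factor
```

Both estimators are unbiased for the same integral; stratification only suppresses the run-to-run scatter, which is the "reduced probability of an anomalous result" the abstract reports, here in a deliberately simplified setting.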

  18. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  19. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source-code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  20. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on-board nuclear material must be approved by the Office of the President. To be approved, the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material that might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high-velocity impacts resulting from launch aborts and reentries.

  1. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  2. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market using SPSS 23.0, based on the inbound tourism data of Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province has developed rapidly and that the diversified range of inbound source countries indicates a stable inbound tourism market. According to the cluster analysis, the inbound tourist source market of Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key to inbound tourism in Fujian Province.
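The clustering step can be sketched without SPSS. The code below runs a minimal k-means (pure NumPy, deterministic initialization) on invented arrival figures, not the actual 2006-2015 Fujian data, purely to illustrate how source countries separate into groups:

```python
import numpy as np

# Hypothetical (illustrative) figures for six source countries on two
# features: mean annual arrivals (thousands) and mean annual growth rate.
countries = ["USA", "Japan", "Malaysia", "Singapore", "France", "Brazil"]
X = np.array([[320, 0.08], [300, 0.05], [280, 0.07], [260, 0.06],
              [40, 0.02], [15, 0.01]], dtype=float)

# Standardize features so scale does not dominate the distance metric.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Minimal k-means (Lloyd's algorithm), k = 2, deterministic initialization
# from the first and last observations.
centers = Z[[0, -1]].copy()
for _ in range(20):
    dists = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    labels = dists.argmin(axis=1)
    for j in range(2):
        if np.any(labels == j):
            centers[j] = Z[labels == j].mean(axis=0)
```

With these toy numbers the four high-volume markets fall into one cluster and the two minor markets into another; the paper's four-category result comes from richer data and a larger k.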

  3. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.
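The pooling behind such a meta-analysis can be sketched as an inverse-variance weighted average of Fisher-transformed correlations (a fixed-effect sketch; the effect sizes and sample sizes below are invented, and the study itself may well have used a different, e.g. random-effects, model):

```python
import numpy as np

# Invented correlation effect sizes r and sample sizes n from hypothetical
# coaching-efficacy studies (not the 20 studies in the meta-analysis).
r = np.array([0.25, 0.31, 0.18, 0.40, 0.22])
n = np.array([120, 85, 200, 60, 150])

# Fisher z-transform stabilizes the variance: var(z) = 1 / (n - 3),
# so the inverse-variance weight of each study is simply n - 3.
z = np.arctanh(r)
w = n - 3.0

z_bar = np.sum(w * z) / np.sum(w)   # pooled estimate on the z scale
r_bar = float(np.tanh(z_bar))       # back-transformed pooled correlation
```

Moderator effects (coach gender, level coached) are then tested by comparing pooled estimates across subgroups or by meta-regression on the z-scale estimates.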

  4. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprints based PAH source apportionment method.
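The PCA step, which groups PAH compounds by their correlation structure, can be sketched with a synthetic fingerprint matrix (the two source patterns and all concentrations below are invented for illustration, not the Illinois River data):

```python
import numpy as np

# Toy PAH concentration matrix: rows are sediment samples, columns compounds.
# Two latent source patterns drive the correlations between compounds.
rng = np.random.default_rng(1)
pattern_a = np.array([1.0, 0.9, 0.8, 0.1, 0.2])   # loads on compounds 0-2
pattern_b = np.array([0.1, 0.2, 0.1, 1.0, 0.9])   # loads on compounds 3-4
scores = rng.uniform(0.5, 2.0, size=(30, 2))      # per-sample source strengths
X = scores @ np.vstack([pattern_a, pattern_b]) + rng.normal(0, 0.02, (30, 5))

# PCA via SVD of the column-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per component
```

With two latent patterns the first two components capture nearly all of the variance, and the compound loadings separate into the two groups; as the abstract notes, this grouping by itself does not identify the physical sources, which is why PMF and Bayesian CMB follow.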

  5. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article analyzes methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects implemented to date is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and of financial support for the activities of the transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  6. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State Evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information; and information and results from design information verifications (DIVs), inspections, and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications, and complementary access, and enables both more reliable cross-examination for consistency and completeness and in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing the deterrence value of a safeguards system that is fully information-driven and increasing confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  7. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  8. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen' s Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure.
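For contrast with WTML, the classical time-of-arrival calculation is a one-liner: for two sensors a distance D apart along the pipe and a single assumed wave speed v, the arrival-time difference fixes the source position. The numbers below are illustrative, and the single-speed assumption is exactly what Lamb-wave dispersion undermines:

```python
# Linear TOA location between two sensors (sketch; all values illustrative).
v = 5100.0    # assumed wave speed in steel, m/s
D = 1.2       # sensor separation along the pipe, m
dt = 50e-6    # measured arrival-time difference t2 - t1, s

# With d1, d2 the source distances from sensors 1 and 2:
#   d1 + d2 = D  and  d1 - d2 = v*(t1 - t2) = -v*dt
# => d1 = (D - v*dt) / 2
d1 = (D - v * dt) / 2.0   # source sits 0.4725 m from sensor 1
```

Because different Lamb-wave modes travel at different, frequency-dependent speeds, a single v mislocates real sources; WTML's wavelet analysis identifies the mode and frequency before applying the arrival times.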

  9. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure. (author)

  10. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure.

  11. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  12. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious, so it is necessary to study which factors in a storage tank area directly or indirectly lead to the occurrence of such accidents. Based on the three kinds of hazard source theory and a consequence-cause analysis of large safety accidents, this paper analyzes the hazard sources of large safety accidents in the tank area from four aspects: energy sources, causes of large safety accidents, missing management, and environmental impact. From the analysis of the three kinds of hazard sources and the environment, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four categories of risk factors, and of the factors within each category, are obtained. The analytic hierarchy process shows that management causes are the most important, followed by environmental factors, direct causes, and energy sources. It should be noted that although the direct causes have relatively low overall importance, within them the failure of emergency measures and the failure of prevention and control facilities carry greater weight.

  13. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.

  14. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10⁻⁴ to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  15. Dosimetric analysis of radiation sources for use dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration orthovoltage X-ray beams, electron beams, and radioactive sources (192Ir, 198Au, and 90Sr) arranged in a surface mold or in a metal applicator. This study analyzes the therapeutic radiation dose profiles produced by the radiation sources used in skin lesion radiotherapy procedures. Experimental dosimetric measurements of the radiation sources were compared with calculations obtained from a computer code based on the Monte Carlo method. The computational results obtained with the MCNP4C code showed good agreement with the experimental measurements, and both were physically consistent, as expected. These comparisons have been used to validate the MCNP calculations and to provide a reliable basis for the medical application in each clinical case. (author)

  16. Medio-Frontal and Anterior Temporal abnormalities in children with attention deficit hyperactivity disorder (ADHD) during an acoustic antisaccade task as revealed by electro-cortical source reconstruction

    Directory of Open Access Journals (Sweden)

    Rockstroh Brigitte

    2011-01-01

    Abstract. Background: Attention deficit hyperactivity disorder (ADHD) is one of the most prevalent disorders in children and adolescents. Impulsivity is one of its three core symptoms and is likely associated with inhibition difficulties. To date, the neural correlate of the antisaccade task, a test of response inhibition, has not been studied in children with (or without) ADHD. Methods: Antisaccade responses to visual and acoustic cues were examined in nine unmedicated boys with ADHD (mean age 122.44 ± 20.81 months) and 14 healthy control children (mean age 115.64 ± 22.87 months, three girls) while an electroencephalogram (EEG) was recorded. Brain activity before saccade onset was reconstructed using a 23-source montage. Results: When cues were acoustic, children with ADHD had higher source activity than control children in the Medio-Frontal Cortex (MFC) between -230 and -120 ms, and in the left-hemispheric Temporal Anterior Cortex (TAC) between -112 and 0 ms before saccade onset, despite both groups performing similarly behaviourally (antisaccade errors and saccade latency). When visual cues were used, EEG activity preceding antisaccades did not differ between groups. Conclusion: Children with ADHD exhibit altered functioning of the TAC and MFC during an antisaccade task elicited by acoustic cues; they need more source activation to reach the same behavioural level as control children.

  17. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Science.gov (United States)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.

  18. Soprano and source: A laryngographic analysis

    Science.gov (United States)

    Bateman, Laura Anne

    2005-04-01

    Popular music in the 21st century uses a particular singing quality for female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various different styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data was elicited from a professional soprano and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data was generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG is used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.
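The LGG-derived measures mentioned here are simple ratios over one glottal cycle. The sketch below estimates a contact quotient from a synthetic Lx waveform by the criterion-level method (threshold at 50% of peak-to-peak amplitude); the waveform, threshold fraction, and f0 are all invented, not the soprano data:

```python
import numpy as np

# Synthetic Lx (laryngograph) signal: pulses that are "high" during vocal-fold
# contact. A squared half-rectified sine gives a short contact phase per cycle.
fs = 44100                       # sample rate, Hz
f0 = 440.0                       # illustrative fundamental frequency, Hz
t = np.arange(int(fs * 0.1)) / fs
lx = np.maximum(np.sin(2 * np.pi * f0 * t), 0.0) ** 2

# Criterion-level method: contact quotient (CQ) = fraction of the cycle the
# waveform spends above a threshold, here 50% of peak-to-peak amplitude.
level = lx.min() + 0.5 * (lx.max() - lx.min())
cq = float(np.mean(lx > level))   # ~0.25 for this synthetic waveform
```

Speed quotient and ascending slope are computed analogously from the durations and slopes of the rising and falling portions of each detected cycle; belt-style phonation typically shows a higher CQ than classical, which is the kind of contrast the LGG analysis quantifies.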

  19. Analysis of the Structure Ratios of the Funding Sources

    Directory of Open Access Journals (Sweden)

    Maria Daniela Bondoc

    2014-06-01

    The funding sources of the assets and liabilities in the balance sheet comprise the equity capital and the debts of the entity. The analysis of the structure rates of the funding sources allows assessments of the funding policy, highlighting financial autonomy and how resources are provided. Drawing on the literature on economic and financial analysis, this paper presents these rates, which reflect, on the one hand, the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of on-term financial autonomy) and, on the other hand, the debt structure (the rate of short-term debts, the global indebtedness rate, the on-term indebtedness rate). Based on the financial statements of an entity in Argeş County, I analysed these indicators, drew conclusions, and made assessments related to the autonomy, indebtedness, and financial stability of the studied entity.
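On a toy balance sheet the named rates reduce to simple quotients (the figures are invented, and the formulas follow common textbook usage, which may differ in detail from the article's definitions):

```python
# Invented balance-sheet figures (thousands of monetary units).
equity = 500.0
long_term_debt = 300.0
short_term_debt = 200.0
total_sources = equity + long_term_debt + short_term_debt   # 1000.0

# Permanent capital = equity + long-term debt (stable funding).
permanent_capital = equity + long_term_debt                 # 800.0

# Degree of financial dependence:
rate_financial_stability = permanent_capital / total_sources    # 0.80
rate_global_fin_autonomy = equity / total_sources               # 0.50
rate_onterm_fin_autonomy = equity / permanent_capital           # 0.625

# Debt structure:
total_debt = short_term_debt + long_term_debt
rate_short_term_debts = short_term_debt / total_debt            # 0.40
global_indebtedness_rate = total_debt / total_sources           # 0.50
onterm_indebtedness_rate = long_term_debt / permanent_capital   # 0.375
```

The two families of rates are complementary: the autonomy rates and the indebtedness rates over the same base always sum to one (e.g. global financial autonomy 0.50 + global indebtedness 0.50).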

  20. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of the neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computation speed of the spectrum analysis system. Multi-core processing and the multi-threaded programming techniques of LabVIEW are employed to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum, and ratio of spectral densities. The results show that the LabVIEW-based analysis tools improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and verify the feasibility of using LabVIEW for spectrum analysis. (authors)
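The heart of such a system, fast correlation of a sparse 0/1 pulse series, rests on the Wiener-Khinchin relation: correlate via the FFT instead of directly. A minimal NumPy sketch (synthetic pulse train, not 252Cf data):

```python
import numpy as np

# Synthetic 0/1 neutron pulse series: ~1% of time bins hold a count.
rng = np.random.default_rng(0)
n = 1 << 14
pulses = (rng.random(n) < 0.01).astype(float)

# Wiener-Khinchin: the auto-correlation is the inverse FFT of the auto-power
# spectrum. Zero-pad to length 2n so the correlation is linear, not circular.
spec = np.fft.rfft(pulses, 2 * n)
power = np.abs(spec) ** 2               # auto-power spectrum
acorr = np.fft.irfft(power)[:n]         # auto-correlation at lags 0..n-1
```

This turns an O(n²) direct correlation into O(n log n); the cross-correlation and cross-power spectrum of two detector channels follow the same pattern with `rfft` of each channel and a conjugate product.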

  1. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
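The feature-extraction stage that pyAudioAnalysis performs can be illustrated generically: window the signal into short overlapping frames and compute per-frame features. The sketch below uses NumPy only, with an invented frame size and two classic features; it is not the library's actual API or default configuration:

```python
import numpy as np

# A 1-second, 220 Hz test tone stands in for real audio.
fs = 16000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 220 * t)

# Short-term windowing: 25 ms frames with 50% overlap (illustrative values).
frame, hop = 400, 200
starts = np.arange(0, len(signal) - frame + 1, hop)
frames = np.stack([signal[s:s + frame] for s in starts])

# Two classic short-term features, one value per frame:
energy = np.mean(frames ** 2, axis=1)                  # short-term energy
zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)  # zero-crossing rate
```

pyAudioAnalysis applies the same windowing idea but emits a much larger per-frame feature matrix (MFCCs, chroma, spectral statistics), which then feeds its classification and segmentation stages.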

  2. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
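The mode decomposition referred to here has a compact form for a Gaussian Schell-model source: the coherent-mode eigenvalues fall off geometrically, β_n ∝ qⁿ (the classical Starikov-Wolf result), so the number of significant modes follows directly from q. The value of q below is illustrative, not the SASE1 parameter:

```python
import numpy as np

# Gaussian Schell-model coherent-mode weights: beta_n = beta_0 * q**n,
# normalized so that the weights sum to 1. q is an illustrative value;
# a highly coherent source has small q, hence few significant modes.
q = 0.2
n = np.arange(50)
beta = (1 - q) * q**n

# Effective number of modes (inverse participation ratio) and the number of
# modes needed to carry 99% of the total power.
n_eff = 1.0 / np.sum(beta**2)                            # (1+q)/(1-q) = 1.5
significant = int(np.sum(np.cumsum(beta) < 0.99)) + 1    # 3 modes
```

For q = 0.2 only three modes carry 99% of the power, which is the quantitative sense of the paper's statement that "only a few modes contribute significantly" to the XFEL radiation field.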

  3. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  4. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Science.gov (United States)

    Muthuraman, Muthuraman; Hellriegel, Helge; Hoogenboom, Nienke; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan

    2014-01-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.
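The spatial filtering underlying DICS-style beamforming can be sketched in a few lines: for a source with leadfield (forward) vector l and a sensor covariance or cross-spectral matrix C, the linearly constrained minimum-variance weights are w = C⁻¹l / (lᵀC⁻¹l), which pass the source with unit gain while suppressing variance from everywhere else. The leadfield and covariance below are random stand-ins, not a head model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors = 32

# Toy forward model: leadfield vector of one candidate source location.
l = rng.normal(size=n_sensors)

# Well-conditioned stand-in for the sensor covariance (or, in DICS, the
# cross-spectral density matrix at the frequency of interest, e.g. 2-4 Hz).
noise = rng.normal(size=(n_sensors, n_sensors))
C = noise @ noise.T + n_sensors * np.eye(n_sensors)

# LCMV / DICS spatial filter: w = C^-1 l / (l^T C^-1 l).
Ci_l = np.linalg.solve(C, l)
w = Ci_l / (l @ Ci_l)

unit_gain = float(w @ l)   # 1.0 by construction (the distortionless constraint)
```

Scanning such filters over a source grid and mapping the output power (or coherence with a reference) at the tapping frequency is what localizes the coherent network; the reconstructed source time series w·x(t) then feed the RPDC connectivity analysis.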

  5. How to understand Film, Video, Television

    DEFF Research Database (Denmark)

    Juel, Henrik

    2017-01-01

Theory on how camera work, cuts, and audio-visual montage (horizontal and vertical) defines content and impact of media with moving images...

  6. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a MATLAB toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages.

  7. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
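The age-dating step above reduces to simple decay arithmetic: every decayed 137Cs atom ends up as stable 137Ba, so the atom ratio R = N_Ba/N_Cs grows as exp(lambda*t) - 1, giving t = ln(1 + R)/lambda. A minimal sketch (the half-life value and the example ratio are illustrative, not taken from the paper):

```python
import math

CS137_HALF_LIFE_Y = 30.08  # years (commonly cited value; assumption here)

def source_age_years(ba_to_cs_atom_ratio: float) -> float:
    """Age since Cs purification from the 137Ba/137Cs atom ratio R.

    N_Ba/N_Cs = exp(lambda*t) - 1  =>  t = ln(1 + R) / lambda
    """
    lam = math.log(2) / CS137_HALF_LIFE_Y  # decay constant, 1/years
    return math.log(1.0 + ba_to_cs_atom_ratio) / lam

# A hypothetical measured ratio of 0.26 corresponds to roughly 10 years
print(round(source_age_years(0.26), 1))
```

The uncertainty in the measured ratio propagates directly into the age through the logarithm, which is why the paper's ID-ICP-MS work on tightening the Ba measurement matters.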

  8. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  9. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning, with restrictive assumptions in the regression set-up, restricts the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its...

  10. Analysis of the TMI-2 source range detector response

    International Nuclear Information System (INIS)

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector and the reduction in the multiplication of the Am-Be startup sources, with the subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented

  11. Dosimetric analysis of radiation sources to use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low penetration beams and orthovoltage X-rays, electron beams and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mold or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the analysis of the dosimetric radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results were in good agreement with the experimental measurements. The experimental measurements were used to validate the calculations obtained with the MCNP4C code and to provide a reliable basis for medical application in each clinical case. (author)

  12. Soundwalking: Deep Listening and Spatio-Temporal Montage

    Directory of Open Access Journals (Sweden)

    Andrew Brown

    2017-08-01

The bicentenary of the 1817 Pentrich Revolution provided an opportunity for the composition of a series of soundwalks that, in turn, offer themselves up as a case study in an exposition of spatial bricolage, from the perspective of an interdisciplinary artist working with the medium of locative sound. Informed by Doreen Massey’s definition of space as ‘a simultaneity of stories so far’, the author’s approach involves extracting sounds from the contemporary soundscape and re-introducing them in the form of multi-layered compositions. This article conducts an analysis of the author’s soundwalking practice according to Max van Manen’s formulation of four essential categories of experience through which to consider our ‘lived world’: spatiality, temporality, corporeality, and relationality. Drawing upon theorists whose concerns include cinematic, mobile and environmental sound, such as Chion, Chambers and Schafer, the author proposes the soundwalk as an expanded form of cinema, with the flexibility to provoke states of immersion as well as critical detachment. A case is made for the application of the medium within the artistic investigation into ecological and socio-political issues alongside aesthetic concerns.

  13. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  14. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Directory of Open Access Journals (Sweden)

    Muthuraman Muthuraman

Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  15. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    NARCIS (Netherlands)

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  16. SWOT analysis of the renewable energy sources in Romania - case study: solar energy

    Science.gov (United States)

    Lupu, A. G.; Dumencu, A.; Atanasiu, M. V.; Panaite, C. E.; Dumitrașcu, Gh; Popescu, A.

    2016-08-01

The evolution of the energy sector worldwide has triggered intense preoccupation with both finding alternative renewable energy sources and addressing environmental issues. Romania is considered to have the technological potential and a geographical location suitable for using renewable energy for electricity generation. But this high potential is not fully exploited in the context of the policies and regulations adopted globally and, more specifically, European Union (EU) environmental and energy strategies and legislation related to renewable energy sources. This SWOT analysis of the solar energy source presents the state of the art, the potential and the future prospects for the development of renewable energy in Romania. The analysis concluded that the development of the solar energy sector in Romania depends largely on: the viability of the legislative framework on renewable energy sources, increased subsidies for solar R&D, a simplified methodology for green certificates, and educating the public, investors, developers and decision-makers.

  17. USING THE METHODS OF WAVELET ANALYSIS AND SINGULAR SPECTRUM ANALYSIS IN THE STUDY OF RADIO SOURCE BL LAC

    OpenAIRE

    Donskykh, G. I.; Ryabov, M. I.; Sukharev, A. I.; Aller, M.

    2014-01-01

We investigated the monitoring data of the extragalactic source BL Lac. This monitoring was performed with the University of Michigan 26-meter radio telescope. To study the flux density of the extragalactic source BL Lac at frequencies of 14.5, 8 and 4.8 GHz, wavelet analysis and singular spectrum analysis were used. Calculating the integral wavelet spectra revealed long-term components (~7-8 years) and short-term components (~1-4 years) in BL Lac. Studying of VLBI radio maps (by the program Mojave) ...

  18. Open Source Parallel Image Analysis and Machine Learning Pipeline, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Continuum Analytics proposes a Python-based open-source data analysis and machine learning pipeline toolkit for satellite data processing, weather and climate data...

  19. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  20. Obsidian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  1. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  2. Time Series Analysis of Monte Carlo Fission Sources - I: Dominance Ratio Computation

    International Nuclear Information System (INIS)

    Ueki, Taro; Brown, Forrest B.; Parsons, D. Kent; Warsa, James S.

    2004-01-01

    In the nuclear engineering community, the error propagation of the Monte Carlo fission source distribution through cycles is known to be a linear Markov process when the number of histories per cycle is sufficiently large. In the statistics community, linear Markov processes with linear observation functions are known to have an autoregressive moving average (ARMA) representation of orders p and p - 1. Therefore, one can perform ARMA fitting of the binned Monte Carlo fission source in order to compute physical and statistical quantities relevant to nuclear criticality analysis. In this work, the ARMA fitting of a binary Monte Carlo fission source has been successfully developed as a method to compute the dominance ratio, i.e., the ratio of the second-largest to the largest eigenvalues. The method is free of binning mesh refinement and does not require the alteration of the basic source iteration cycle algorithm. Numerical results are presented for problems with one-group isotropic, two-group linearly anisotropic, and continuous-energy cross sections. Also, a strategy for the analysis of eigenmodes higher than the second-largest eigenvalue is demonstrated numerically
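The paper's full method fits an ARMA(p, p - 1) model to the binned source; as a toy illustration of the underlying idea, a first-order fit reduces to estimating the lag-1 autocorrelation of the binned source tally across cycles, which for an AR(1) process converges to the AR coefficient, the stand-in for the dominance ratio. A sketch on synthetic data (all numbers below are hypothetical, not from the paper):

```python
import random

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation: the AR(1) coefficient estimate."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

# Synthetic stand-in for a binned fission-source tally drifting through
# cycles as an AR(1) process whose coefficient plays the role of the
# dominance ratio (DR).
random.seed(1)
rho_true = 0.8  # hypothetical dominance ratio
x, xs = 0.0, []
for _ in range(20000):
    x = rho_true * x + random.gauss(0.0, 1.0)
    xs.append(x)

print(round(lag1_autocorr(xs), 2))  # close to 0.8
```

The real method is richer: because the observation is a linear function of a linear Markov process, the correct model order is ARMA(p, p - 1) rather than AR(1), and the fit also yields confidence information on the estimated ratio.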

  3. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Science.gov (United States)

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied to impose constraints on the famous ICA framework. We introduced an alternative approach based on the null space component analysis (NCA) framework and referred to it as the c-NCA approach. We also presented the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we showed that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed by the optimization community for solving the BSS problem. As examples, we demonstrated that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.

  4. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

One assumption frequently made is that of ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is agreeable from the standpoint of conservatism, but a quantitative examination of the effect of this assumption on the results of consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of the differences between their results was the different effective source release height assumed by each study. This supports the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and its influence on the total effective dose was quantitatively examined. A difference of more than 20% is maintained even at longer distances when the dose assuming ground-level release is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of ground-level release fundamentally precludes detailed analysis, including diffusion of the plume from the effective plume height to the ground, even though its influence is relatively lower at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites which have low surface roughness, such as Barakah site in
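The conservatism of the ground-level assumption can be illustrated with the textbook Gaussian plume model: the ground-level centerline air concentration scales with exp(-H^2 / (2*sigma_z^2)), so setting the effective release height H to zero maximizes it. A sketch with made-up dispersion parameters (this is the generic formula, not the HotSpot implementation):

```python
import math

def centerline_concentration(q, u, sigma_y, sigma_z, release_height):
    """Ground-level centerline concentration for a Gaussian plume with
    full ground reflection: chi = Q/(pi*u*sy*sz) * exp(-H^2/(2*sz^2))."""
    return (q / (math.pi * sigma_y * sigma_z * u)) * math.exp(
        -release_height ** 2 / (2.0 * sigma_z ** 2)
    )

# Hypothetical numbers: unit source, 3 m/s wind, sigma_y=50 m, sigma_z=25 m
ground = centerline_concentration(1.0, 3.0, 50.0, 25.0, 0.0)
elevated = centerline_concentration(1.0, 3.0, 50.0, 25.0, 50.0)

# With H = 2*sigma_z the ratio is exp(2), i.e. about 7.4x higher for
# the ground-level assumption at this receptor.
print(f"ground/elevated concentration ratio: {ground / elevated:.1f}")
```

Since sigma_z grows with downwind distance, the exponential penalty for an elevated release fades far from the source, consistent with the abstract's note that the release-height effect is strongest at short distances.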

  5. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    Science.gov (United States)

The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  6. Dynamic Stability Analysis of Autonomous Medium-Voltage Mixed-Source Microgrid

    DEFF Research Database (Denmark)

    Zhao, Zhuoli; Yang, Ping; Guerrero, Josep M.

    2015-01-01

A state-space model of the autonomous MV mixed-source microgrid containing a diesel generator set (DGS), grid-supporting battery energy storage system (BESS), squirrel cage induction generator (SCIG) wind turbine and network is developed. Sensitivity analysis is carried out to reveal the dynamic stability margin...

  7. Dynamic response analysis of the LBL Advanced Light Source synchrotron radiation storage ring

    International Nuclear Information System (INIS)

    Leung, K.

    1993-05-01

This paper presents the dynamic response analysis of the photon-source synchrotron radiation storage ring excited by ground motion measured at the Lawrence Berkeley Laboratory Advanced Light Source building site. The high spectral brilliance required of the photon beams of the Advanced Light Source storage ring specifies displacements of the quadrupole focusing magnets on the order of 1 micron in vertical motion. There are 19 magnets supported by a 430-inch steel box beam girder. The girder and all magnets are supported by the kinematic mount system normally used in optical equipment. This kinematic mount, called a six-strut magnet support system, is now considered as an alternative system for supporting SSC magnets in the Super Collider. The six-strut support system is successfully operated for the Advanced Light Source (ALS) accelerator at the Lawrence Berkeley Laboratory. This paper presents the method of analysis and the results of the dynamic motion study at the center of the magnets under the most critical excitation source as recorded at the LBL site

  8. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

This is the first part of a two‐part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
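As a flavor of the kind of data element Part I describes, per-page hit counts can be pulled out of an NCSA Common Log Format web server log with a few lines of scripting. A minimal sketch (the sample log lines are fabricated for illustration):

```python
import re
from collections import Counter

# NCSA Common Log Format: host ident user [time] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
)

sample = [
    '10.0.0.1 - - [05/Mar/2007:10:00:00 -0330] "GET /index.html HTTP/1.1" 200 4523',
    '10.0.0.2 - - [05/Mar/2007:10:00:05 -0330] "GET /opac/search HTTP/1.1" 200 1842',
    '10.0.0.1 - - [05/Mar/2007:10:00:09 -0330] "GET /index.html HTTP/1.1" 304 0',
]

hits = Counter()
for line in sample:
    m = LOG_RE.match(line)
    if m:
        hits[m.group("path")] += 1

print(hits.most_common(1))  # the most requested page and its count
```

Dedicated web log analysis tools of the kind the article surveys do essentially this at scale, adding sessionization, robot filtering, and reporting.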

  9. Off-design performance analysis of Kalina cycle for low temperature geothermal source

    International Nuclear Information System (INIS)

    Li, Hang; Hu, Dongshuai; Wang, Mingkun; Dai, Yiping

    2016-01-01

Highlights: • The off-design performance analysis of the Kalina cycle is conducted. • The off-design models are established. • The genetic algorithm is used in the design phase. • The sliding pressure control strategy is applied. - Abstract: Low temperature geothermal sources with brilliant prospects have attracted more and more attention. A Kalina cycle system using ammonia water as the working fluid can exploit geothermal energy effectively. In this paper, a quantitative analysis of the off-design performance of the Kalina cycle for a low temperature geothermal source is conducted. Off-design models including the turbine, pump and heat exchangers are established. A genetic algorithm is used to maximize the net power output and determine the thermodynamic parameters in the design phase. The sliding pressure control strategy, applied widely in existing Rankine cycle power plants, is adopted to respond to variations in geothermal source mass flow rate ratio (70–120%), geothermal source temperature (116–128 °C) and heat sink temperature (0–35 °C). Within the off-design research scope, guidance for pump rotational speed adjustment is listed to provide a reference for off-design operation of geothermal power plants. The required adjustment rate of pump rotational speed is more sensitive to a per-unit change in geothermal source temperature than in heat sink temperature. The influence of heat sink variation on the ranges of net power output and thermal efficiency is greater than that of geothermal source variation.

  10. Multicriteria analysis for sources of renewable energy using data from remote sensing

    Science.gov (United States)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis for sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and socially acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities. 
The results also show a slight increase in the more
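The reclassify-and-overlay step described above can be sketched in a few lines of raster algebra; the layers, weights and the mapping onto the 1-7 preference scale below are hypothetical illustrations, not the study's actual GIS inputs:

```python
import numpy as np

# Hypothetical input layers on different native scales (tiny rasters for illustration)
solar = np.array([[820., 900.], [760., 880.]])   # kWh/m^2/yr
wind  = np.array([[4.2, 6.1], [5.0, 3.8]])       # mean wind speed, m/s

def reclassify(layer, n_classes=7):
    """Rescale a raster onto the common 1..n_classes preference scale."""
    lo, hi = layer.min(), layer.max()
    classes = 1 + (n_classes - 1) * (layer - lo) / (hi - lo)
    return np.round(classes).astype(int)

# Raster-algebra overlay: weighted sum of the reclassified layers
weights = {"solar": 0.6, "wind": 0.4}            # assumed weights
suitability = (weights["solar"] * reclassify(solar)
               + weights["wind"] * reclassify(wind))
```

In a real workflow this overlay would run on full-resolution GIS layers (e.g. in a GIS raster calculator) and the class breaks would come from domain criteria rather than the data range.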

  11. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
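The final step, estimating a Brune-type stress drop from a measured corner frequency and moment, follows the standard Brune (1970) relations; the event parameters below are hypothetical:

```python
import math

def brune_stress_drop(moment_nm, fc_hz, beta_m_s=3500.0):
    """Brune (1970) stress drop from seismic moment and corner frequency:
    source radius r = 2.34 * beta / (2 * pi * fc), delta_sigma = 7 * M0 / (16 * r^3)."""
    r = 2.34 * beta_m_s / (2.0 * math.pi * fc_hz)   # source radius [m]
    return 7.0 * moment_nm / (16.0 * r ** 3)        # stress drop [Pa]

# Hypothetical M ~2 event: M0 from Mw via M0 = 10^(1.5*Mw + 9.1) N*m, fc = 8 Hz
m0 = 10 ** (1.5 * 2.0 + 9.1)
print(brune_stress_drop(m0, 8.0) / 1e6, "MPa")
```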

  12. Analysis of the monitoring system for the spallation neutron source 'SINQ'

    International Nuclear Information System (INIS)

    Badreddin, E.

    1998-01-01

    Petri Net models (PN) and Fault-Tree Analysis (FTA) are employed for the purpose of reliability analysis of the spallation neutron source SINQ. The monitoring and shut-down system (SDS) structure is investigated using a Petri-Net model. The reliability data are processed using a Fault-Tree model of the dominant part. Finally, suggestions for the improvement of system availability are made. (author)
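For the Fault-Tree part, the top-event probability is combined from basic-event probabilities through AND/OR gates; a minimal sketch with hypothetical failure probabilities for an SDS-like tree (not the actual SINQ reliability data):

```python
def or_gate(probs):
    """Probability that at least one of several independent basic events occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """Probability that all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: top = OR(sensor AND backup sensor, logic unit, actuator)
top = or_gate([and_gate([1e-3, 1e-2]), 1e-4, 5e-4])
```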

  13. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  14. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

The Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though ... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  15. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
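A Fisher Linear Discriminant with leave-one-out cross-validation, as used above, can be sketched as follows; the feature matrix here is simulated, not the study's pose/shape/composition data:

```python
import numpy as np

def fld_direction(X, y):
    """Fisher linear discriminant direction w = Sw^-1 (mu1 - mu0)."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (sum of per-class scatter)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    return np.linalg.solve(Sw, mu1 - mu0)

def loo_accuracy(X, y):
    """Leave-one-out cross-validation with a nearest-class-mean rule
    along the FLD projection."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        w = fld_direction(X[mask], y[mask])
        proj = X[mask] @ w
        m0, m1 = proj[y[mask] == 0].mean(), proj[y[mask] == 1].mean()
        pred = int(abs(X[i] @ w - m1) < abs(X[i] @ w - m0))
        hits += (pred == y[i])
    return hits / len(y)

# Simulated two-group feature data standing in for the extracted features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 3)),   # controls
               rng.normal(3.0, 1.0, (20, 3))])  # patient group (simulated)
y = np.array([0] * 20 + [1] * 20)
acc = loo_accuracy(X, y)
```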

  16. Systems analysis and engineering of the X-1 Advanced Radiation Source

    International Nuclear Information System (INIS)

    Rochau, G.E.; Hands, J.A.; Raglin, P.S.; Ramirez, J.J.

    1998-01-01

    The X-1 Advanced Radiation Source, which will produce ∼ 16 MJ in x-rays, represents the next step in providing US Department of Energy's Stockpile Stewardship program with the high-energy, large volume, laboratory x-ray sources needed for the Radiation Effects Science and Simulation (RES), Inertial Confinement Fusion (ICF), and Weapon Physics (WP) Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator in 1997 provide sufficient basis for pursuing the development of X-1. This paper will introduce the X-1 Advanced Radiation Source Facility Project, describe the systems analysis and engineering approach being used, and identify critical technology areas being researched

  17. Frontoparietal tDCS Benefits Visual Working Memory in Older Adults With Low Working Memory Capacity.

    Science.gov (United States)

    Arciniega, Hector; Gözenman, Filiz; Jones, Kevin T; Stephens, Jaclyn A; Berryhill, Marian E

    2018-01-01

Working memory (WM) permits maintenance of information over brief delays and is an essential executive function. Unfortunately, WM is subject to age-related decline. Some evidence supports the use of transcranial direct current stimulation (tDCS) to improve visual WM. A gap in knowledge is an understanding of the mechanism characterizing these tDCS-linked effects. To address this gap, we compared the effects of two tDCS montages on visual working memory (VWM) performance. The bifrontal montage was designed to stimulate the heightened bilateral frontal activity observed in aging adults. The unilateral frontoparietal montage was designed to stimulate activation patterns observed in young adults. Participants completed three sessions (bilateral frontal, right frontoparietal, sham) of anodal tDCS (20 min, 2 mA). During stimulation, participants performed a visual long-term memory (LTM) control task and a visual WM task. There was no effect of tDCS on the LTM task. Participants receiving right unilateral tDCS showed a WM benefit. This pattern was most robust in older adults with low WM capacity. To address the concern that the key difference between the two tDCS montages could be tDCS over the posterior parietal cortex (PPC), we included new analyses from a previous study applying tDCS targeting the PPC paired with a recognition VWM task. No significant main effects were found. A subsequent experiment in young adults found no significant effect of either tDCS montage on either task. These data indicate that tDCS montage, age and WM capacity should be considered when designing tDCS protocols. We interpret these findings as suggesting that protocols designed to restore more youthful patterns of brain activity are superior to those that compensate for age-related changes.

  18. Energy and exergy analysis of a double effect absorption refrigeration system based on different heat sources

    International Nuclear Information System (INIS)

    Kaynakli, Omer; Saka, Kenan; Kaynakli, Faruk

    2015-01-01

Highlights: • Energy and exergy analysis was performed on a double effect series flow absorption refrigeration system. • The refrigeration system runs on various heat sources such as hot water, hot air and steam. • A comparative analysis was carried out on these heat sources in terms of exergy destruction and mass flow rate of heat source. • The effect of heat sources on the exergy destruction of the high pressure generator was investigated. - Abstract: Absorption refrigeration systems are environmentally friendly since they can utilize industrial waste heat and/or solar energy. In terms of heat source, researchers usually prefer a single type, such as hot water or steam. Some studies can be free from environment. In this study, energy and exergy analysis is performed on a double effect series flow absorption refrigeration system with water/lithium bromide as the working fluid pair. The refrigeration system runs on various heat sources such as hot water, hot air and steam via the High Pressure Generator (HPG), because hot water/steam and hot air are the most commonly available heat sources for absorption applications; however, the first law of thermodynamics may not be sufficient to analyze the absorption refrigeration system and to show the differences between the heat source types. On the other hand, operation temperatures of the overall system and its components have a major effect on their performance and functionality. In this regard, a parametric study is conducted here to investigate this effect on the heat capacity and exergy destruction of the HPG, the coefficient of performance (COP) of the system, and the mass flow rate of the heat sources. Also, a comparative analysis is carried out on several heat sources (e.g. hot water, hot air and steam) in terms of exergy destruction and mass flow rate of heat source. From the analyses it is observed that the exergy destruction of the HPG increases at higher temperatures of the heat sources, condenser and absorber, and lower

  19. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    Science.gov (United States)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

Waveform inversion to determine the seismic moment tensor is a standard approach to determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether or not the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. Our synthetic data tests indicate that biases in scalar

  20. School adjustment of children in residential care: a multi-source analysis.

    Science.gov (United States)

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

    School adjustment is one the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of perceptive attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  1. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profiles and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation punc gives the best choice for the model performance evaluation when a conservative approach is adopted.
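Source identification by similarity between a factor and candidate chemical profiles can be illustrated with a simple Pearson-distance match; the species list and profiles below are hypothetical, not drawn from the DeltaSA databases:

```python
import numpy as np

def pearson_distance(factor, profile):
    """1 - Pearson correlation between a factor and a candidate source profile."""
    return 1.0 - np.corrcoef(factor, profile)[0, 1]

# Hypothetical chemical profiles (relative abundances over the same species list)
factor = np.array([0.50, 0.20, 0.15, 0.10, 0.05])
candidates = {
    "traffic":  np.array([0.48, 0.22, 0.14, 0.11, 0.05]),
    "sea_salt": np.array([0.05, 0.10, 0.15, 0.30, 0.40]),
}
# Assign the factor to the most similar candidate profile
best = min(candidates, key=lambda k: pearson_distance(factor, candidates[k]))
```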

  2. Time-correlated neutron analysis of a multiplying HEU source

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.C., E-mail: Eric.Miller@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Kalter, J.M.; Lavelle, C.M. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Watson, S.M.; Kinlaw, M.T.; Chichester, D.L. [Idaho National Laboratory, Idaho Falls, ID (United States); Noonan, W.A. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States)

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated {sup 3}He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.
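The gap-based approach for grouping detections into fission-chain bursts can be sketched as follows (the arrival times and the 100 ns gap threshold are hypothetical, chosen only for illustration):

```python
def split_into_bursts(times_ns, max_gap_ns=100.0):
    """Gap-based clustering: successive detections separated by no more than
    max_gap_ns are attributed to the same fission chain (times must be sorted)."""
    bursts, current = [], [times_ns[0]]
    for t_prev, t in zip(times_ns, times_ns[1:]):
        if t - t_prev <= max_gap_ns:
            current.append(t)
        else:
            bursts.append(current)
            current = [t]
    bursts.append(current)
    return bursts

arrivals = [0, 40, 95, 900, 930, 5000]  # hypothetical detection times [ns]
bursts = split_into_bursts(arrivals)
```

Statistics such as the distribution of arrival times within each burst can then be computed per burst.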

  3. Time-correlated neutron analysis of a multiplying HEU source

    Science.gov (United States)

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.

  4. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated 3 He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations

  5. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82

  6. Bulk - Samples gamma-rays activation analysis (PGNAA) with Isotopic Neutron Sources

    International Nuclear Information System (INIS)

    HASSAN, A.M.

    2009-01-01

An overview is given of research towards the Prompt Gamma-ray Neutron Activation Analysis (PGNAA) of bulk samples. Some aspects of bulk-sample PGNAA are discussed, where irradiation by isotopic neutron sources is used mostly for in-situ or on-line analysis. The research was carried out in a comparative and/or qualitative way or by using prior knowledge about the sample material. Sometimes we need to use the assumption that the mass fractions of all determined elements add up to 1. The sensitivity curves are also used for some elements in such complex samples, just to estimate the exact percentage concentration values. The uses of 252 Cf, 241 Am/Be and 239 Pu/Be isotopic neutron sources for elemental investigation of hematite, ilmenite, coal, petroleum, edible oils, phosphates and pollutant lake water samples are mentioned.
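The normalization assumption mentioned above, that the mass fractions of all determined elements add up to 1, can be sketched as follows; the counts and sensitivities are hypothetical illustrations, not measured values:

```python
# Convert raw gamma-line intensities to mass fractions by dividing out
# per-element sensitivities and normalizing so the fractions sum to 1.
counts = {"Fe": 5200.0, "Ti": 1800.0, "Si": 2400.0}   # hypothetical net counts
sensitivity = {"Fe": 1.0, "Ti": 0.6, "Si": 0.8}       # counts per unit mass fraction

raw = {el: counts[el] / sensitivity[el] for el in counts}
total = sum(raw.values())
fractions = {el: v / total for el, v in raw.items()}
```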

  7. IQM: an extensible and portable open source application for image and signal analysis in Java.

    Science.gov (United States)

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  8. P-wave pulse analysis to retrieve source and propagation effects in the case of Vrancea earthquakes

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Placinta, A.; Grecu, B.; Radulian, M.

    2004-01-01

Seismic source parameters and attenuation structure properties are obtained from first P-wave pulse analysis and empirical Green's function deconvolution. The P pulse characteristics are combined effects of source and path properties. To recover the real source and structure parameters it is crucial to apply a method able to distinguish between the different factors affecting the observed seismograms. For example, the empirical Green's function deconvolution method (Hartzell, 1978) allows the retrieval of the apparent source time function or source spectrum corrected for path, site and instrumental effects. The apparent source duration is given by the width of the deconvolved source pulse and is directly related to the source dimension. Once the source time function is established, we can next extract the parameters related to path effects. The difference between the pulse recorded at a given station and the source pulse obtained by deconvolution is a measure of the attenuation along the path from focus to station. On the other hand, the pulse width variations with azimuth depend critically on the fault plane orientation and source directivity. In favourable circumstances (high signal/noise ratio, high resolution and station coverage), the method of analysis proposed in this paper allows the rupture plane to be constrained between the two nodal planes characterizing the fault plane solution, even for small events. P-wave pulse analysis was applied to 25 Vrancea earthquakes recorded between 1999 and 2003 by the Romanian local network to determine source parameters and attenuation properties. Our results outline high stress-drop seismic energy release with a relatively simple rupture process for the considered events and strong lateral variation of attenuation of seismic waves across the Carpathians Arc. (authors)
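The empirical Green's function deconvolution step can be sketched as a water-level spectral division; this is a generic illustration of the technique, not the authors' exact processing:

```python
import numpy as np

def egf_deconvolve(main, egf, water_level=0.01):
    """Estimate the apparent source time function of a larger event by
    spectral division with a co-located small-event (empirical Green's
    function) record; a water level stabilizes small spectral amplitudes."""
    M = np.fft.rfft(main)
    G = np.fft.rfft(egf)
    floor = water_level * np.max(np.abs(G))
    denom = np.where(np.abs(G) < floor, floor, np.abs(G))
    ratio = M * np.conj(G) / denom ** 2
    return np.fft.irfft(ratio, n=len(main))
```

With an idealized impulse-like EGF the division simply returns the larger event's pulse; real records additionally require windowing and band-limiting.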

  9. Effects of prefrontal tDCS on executive function: Methodological considerations revealed by meta-analysis.

    Science.gov (United States)

    Imburgio, Michael J; Orr, Joseph M

    2018-05-01

A meta-analysis of studies using single-session transcranial direct current stimulation (tDCS) to target the dorsolateral prefrontal cortex (DLPFC) was undertaken to examine the effect of stimulation on executive function (EF) in healthy samples. 27 studies were included in analyses, yielding 71 effect sizes. The most relevant measure for each task was determined a priori and used to calculate Hedges' g. Methodological characteristics of each study were examined individually as potential moderators of effect size. Stimulation effects on three domains of EF (inhibition of prepotent responses, mental set shifting, and information updating and monitoring) were analyzed separately. In line with previous work, the current study found no significant effect of anodal unilateral tDCS, cathodal unilateral tDCS, or bilateral tDCS on EF. Further moderator and subgroup analyses were only carried out for anodal unilateral montages due to the small number of studies using other montages. Subgroup analyses revealed a significant effect of anodal unilateral tDCS on updating tasks, but not on inhibition or set-shifting tasks. Cathode location significantly moderated the effect of anodal unilateral tDCS. Extracranial cathodes yielded a significant effect on EF while cranial cathodes yielded no effect. Anode size also significantly moderated the effect of anodal unilateral tDCS, with smaller anodes being more effective than larger anodes. In summary, anodal DLPFC stimulation is more effective at improving updating ability than inhibition and set-shifting ability, but anodal stimulation can significantly improve general executive function when extracranial cathodes or small anodes are used. Future meta-analyses may examine how stimulation's effects on specific behavioral tasks, rather than broader domains, might be affected by methodological moderators. Copyright © 2018 Elsevier Ltd. All rights reserved.
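The effect-size computation can be sketched as follows; the group statistics below are hypothetical, not taken from the included studies:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # Pooled standard deviation
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Small-sample correction factor J = 1 - 3 / (4*df - 1), df = n1 + n2 - 2
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)
    return j * d

# Hypothetical active vs. sham task accuracy (%)
g = hedges_g(78.0, 10.0, 20, 72.0, 11.0, 20)
```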

  10. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  11. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  12. Validation of botanical origins and geographical sources of some Saudi honeys using ultraviolet spectroscopy and chemometric analysis.

    Science.gov (United States)

    Ansari, Mohammad Javed; Al-Ghamdi, Ahmad; Khan, Khalid Ali; Adgaba, Nuru; El-Ahmady, Sherweit H; Gad, Haidy A; Roshan, Abdulrahman; Meo, Sultan Ayoub; Kolyali, Sevgi

    2018-02-01

This study aims at distinguishing honey based on botanical and geographical sources. Different floral honey samples were collected from diverse geographical locations of Saudi Arabia. UV spectroscopy in combination with chemometric analysis, including Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and Soft Independent Modeling of Class Analogy (SIMCA), was used to classify honey samples. HCA and PCA presented the initial clustering pattern to differentiate between botanical as well as geographical sources. The SIMCA model clearly separated the Ziziphus sp. and other monofloral honey samples based on different locations and botanical sources. The results successfully discriminated the honey samples of different botanical and geographical sources, validating the segregation observed using the few physicochemical parameters that are regularly used for discrimination.
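The PCA step behind the initial clustering can be sketched as follows; this is a minimal numpy illustration on hypothetical data, not the study's spectra or software:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project samples (rows) onto the leading principal components."""
    centered = spectra - spectra.mean(axis=0)          # mean-center each variable
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)             # eigh returns ascending order
    order = np.argsort(eigvals)[::-1][:n_components]   # keep largest-variance axes
    return centered @ eigvecs[:, order]

# Hypothetical data: six "spectra" drawn from two well-separated groups
rng = np.random.default_rng(0)
spectra = np.vstack([rng.normal(0.0, 0.1, (3, 50)),
                     rng.normal(1.0, 0.1, (3, 50))])
scores = pca_scores(spectra)
```

On data like this, the first principal component separates the two groups, which is the clustering pattern such studies inspect visually.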

  13. Hydrodynamic analysis of potential groundwater extraction capacity increase: case study of 'Nelt' groundwater source at Dobanovci

    Directory of Open Access Journals (Sweden)

    Bajić Dragoljub I.

    2017-01-01

Full Text Available A comprehensive hydrodynamic analysis of the groundwater regime undertaken to assess the potential for expanding the 'Nelt' groundwater source at Dobanovci, or developing a new groundwater source for a future baby food factory, including the quantification of the impact on the production wells of the nearby 'Pepsi' groundwater source, is presented in the paper. The existing Nelt source comprises three active production wells that tap a subartesian aquifer formed in sands and gravelly sands; however, the analysis considers only the two nearest wells. A long-term group pumping test was conducted of production wells N-1 and N-2 (Nelt source) and production wells B-1 and B-2 (Pepsi source), while the piezometric head in the vicinity of these wells was monitored at observation well P-1, which is located in the area considered for Nelt source expansion. Data were collected at maximum pumping capacity of all the production wells. A hydrodynamic model of groundwater flow in the extended area of the Nelt source was generated for the purposes of the comprehensive hydrodynamic analysis. Hydrodynamic prognostic calculations addressed two solution alternatives for the capacity increase over a period of ten years. Licensed Visual MODFLOW Pro software, regarded as a leading tool in this field, was used for the calculations.

  14. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns. However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  15. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszeski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

A new generation of specialized scientific instruments called synchrotron light sources allows the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required to translate these data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time and hence interactive analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, next to a parallel reconstruction system, and then to multiple visualization stations. Collaborative analysis tools allow multiple users to control data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous.

  16. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

The Canadian Fusion Fuels Technology Project (CFFTP) contribution forms part of ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure, and an isotope separation system process boundary failure. 9 figs

  17. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  18. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
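The core quantity such toolboxes estimate is the mutual information between stimulus and response. A minimal plug-in estimator over paired discrete observations is sketched below; this is an illustration only, and is biased upward for small samples, which is precisely why the reviewed toolboxes implement bias corrections:

```python
from collections import Counter
from math import log2

def mutual_information_bits(stimuli, responses):
    """Plug-in estimate of I(S; R) in bits from paired discrete observations."""
    n = len(stimuli)
    count_s = Counter(stimuli)                 # marginal counts of stimuli
    count_r = Counter(responses)               # marginal counts of responses
    count_sr = Counter(zip(stimuli, responses))  # joint counts
    # I(S;R) = sum_{s,r} p(s,r) * log2( p(s,r) / (p(s) p(r)) )
    return sum((c / n) * log2(c * n / (count_s[s] * count_r[r]))
               for (s, r), c in count_sr.items())

# Perfectly informative responses carry log2(#stimuli) bits
mi = mutual_information_bits([0, 0, 1, 1], ["a", "a", "b", "b"])
```

A response that deterministically identifies one of two equiprobable stimuli carries exactly 1 bit, while statistically independent responses carry none.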

  19. Thermal hydraulic analysis of the encapsulated nuclear heat source

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J.; Wade, D.C. [Argonne National Lab., IL (United States)

    2001-07-01

    An analysis has been carried out of the steady state thermal hydraulic performance of the Encapsulated Nuclear Heat Source (ENHS) 125 MWt, heavy liquid metal coolant (HLMC) reactor concept at nominal operating power and shutdown decay heat levels. The analysis includes the development and application of correlation-type analytical solutions based upon first principles modeling of the ENHS concept that encompass both pure as well as gas injection augmented natural circulation conditions, and primary-to-intermediate coolant heat transfer. The results indicate that natural circulation of the primary coolant is effective in removing heat from the core and transferring it to the intermediate coolant without the attainment of excessive coolant temperatures. (authors)

  20. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    International Nuclear Information System (INIS)

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear related activities. A number of these open sources include information that could be loosely categorized as "news": international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and assist with making sure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  1. Dialectical Images in the cinema

    Directory of Open Access Journals (Sweden)

    Angélica Antonechen Colombo

    2013-04-01

Full Text Available This paper aims to discuss cinema as a work of art and its main elements, such as image technique and montage, and montage's role as an essential factor in aesthetic, perceptive and cognitive variation since the medium's advent, analyzing Eisenstein's intellectual montage on the basis of the theories of Walter Benjamin, Vilém Flusser and Christian Metz.

  2. Analysis of a carbon dioxide transcritical power cycle using a low temperature source

    International Nuclear Information System (INIS)

    Cayer, Emmanuel; Galanis, Nicolas; Desilets, Martin; Nesreddine, Hakim; Roy, Philippe

    2009-01-01

A detailed analysis of a carbon dioxide transcritical power cycle using an industrial low-grade stream of process gases as its heat source is presented. The methodology is divided into four steps: energy analysis, exergy analysis, finite size thermodynamics and calculation of the heat exchangers' surface. The results have been calculated for fixed temperature and mass flow rate of the heat source, fixed maximum and minimum temperatures in the cycle and a fixed sink temperature by varying the high pressure of the cycle and its net power output. The main results show the existence of an optimum high pressure for each of the four steps; in the first two steps, the optimum pressure maximises the thermal or exergetic efficiency while in the last two steps it minimises the product UA or the heat exchangers' surface. These high pressures are very similar for the energy and exergy analyses. The last two steps also have nearly identical optimizing high pressures that are significantly lower than the ones for the first two steps. In addition, the results show that the augmentation of the net power output produced from the limited energy source has no influence on the results of the energy analysis, decreases the exergetic efficiency and increases the heat exchangers' surface. Changing the net power output has no significant impact on the high pressures optimizing each of the four steps.
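The exergy-analysis step compares net power against the exergy carried by the finite heat source. A hedged sketch using the standard flow-exergy expression for a sensible-heat stream with constant specific heat follows; the numbers are illustrative, not the paper's:

```python
from math import log

def source_exergy_rate(m_dot, cp, t_in, t0):
    """Flow-exergy rate (W) of a sensible-heat stream that could ideally be
    cooled from t_in to the dead-state temperature t0 (both in kelvin)."""
    return m_dot * cp * ((t_in - t0) - t0 * log(t_in / t0))

def exergetic_efficiency(w_net, m_dot, cp, t_in, t0):
    """Second-law efficiency: net power over the source's exergy rate."""
    return w_net / source_exergy_rate(m_dot, cp, t_in, t0)

# Illustrative case: 1 kg/s of gas (cp = 1000 J/kg.K) at 373 K, ambient 293 K
eta_ex = exergetic_efficiency(5.0e3, 1.0, 1000.0, 373.0, 293.0)
```

Because the source is finite, raising the net power output drawn from it lowers this second-law efficiency, which is the trade-off the abstract reports.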

  3. Application of Abaqus to analysis of the temperature field in elements heated by moving heat sources

    Directory of Open Access Journals (Sweden)

    W. Piekarska

    2010-10-01

Full Text Available Numerical analysis of thermal phenomena occurring during laser beam heating is presented in this paper. Numerical models of surface and volumetric heat sources were presented and the influence of different laser beam heat source power distributions on the temperature field was analyzed. The temperature field was obtained by numerical solution of the transient heat transfer equation with the activity of inner heat sources, using the finite element method. Temperature distribution analysis in the welded joint was performed in the ABAQUS/Standard solver. The DFLUX subroutine was used for implementation of the movable welding heat source model. Temperature-dependent thermophysical properties for steel were assumed in the computer simulations. The temperature distribution in laser beam surface heated and butt welded plates was numerically estimated.
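One common surface heat-source model used in such simulations is a Gaussian flux distribution whose integral over the plane recovers the total beam power; this is a sketch of that standard form only, not the paper's exact distribution or parameters:

```python
from math import pi, exp

def surface_gaussian_flux(r, q_beam, r0):
    """Heat flux (W/m^2) at radius r from the beam axis for a Gaussian
    surface source of total power q_beam (W) and beam radius r0 (m)."""
    return 3.0 * q_beam / (pi * r0 ** 2) * exp(-3.0 * r ** 2 / r0 ** 2)

def integrated_power(q_beam, r0, n=20000):
    """Midpoint-rule check that the flux integrates back to q_beam."""
    r_max = 6.0 * r0               # flux is negligible beyond a few beam radii
    dr = r_max / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += surface_gaussian_flux(r, q_beam, r0) * 2.0 * pi * r * dr
    return total
```

In an Abaqus DFLUX subroutine, an expression like this is evaluated at each integration point, with r measured from the instantaneous (moving) beam center.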

  4. Review on solving the inverse problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Fabri Simon G

    2008-11-01

Full Text Available Abstract In this primer, we give a review of the inverse problem for EEG source localization. This is intended for researchers new to the field to gain insight into the state-of-the-art techniques used to find approximate solutions of the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these different inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non-parametric algorithms and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods which were developed to solve the EEG inverse problem, namely the non-parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori or not. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA FOCUSS (SLF), SSLOFO and ALF for non-parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one could conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher resolution algorithms such as MUSIC or FINES are however preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results.
The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
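Of the non-parametric methods listed, the minimum norm estimate admits a compact closed form: for lead field L, scalp potentials v and regularization lambda, the Tikhonov-regularized estimate is j = L^T (L L^T + lambda I)^(-1) v. A toy numpy sketch with hypothetical dimensions and data:

```python
import numpy as np

def minimum_norm_estimate(leadfield, scalp, lam=1e-6):
    """Tikhonov-regularized minimum norm estimate:
    j = L^T (L L^T + lam * I)^-1 v."""
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, scalp)

# Hypothetical toy problem: 8 sensors, 20 candidate dipole sources
rng = np.random.default_rng(1)
L = rng.normal(size=(8, 20))
j_true = np.zeros(20)
j_true[5] = 1.0                       # a single active source
v = L @ j_true                        # simulated (noise-free) scalp recording
j_hat = minimum_norm_estimate(L, v)   # smallest-norm current distribution
```

The estimate reproduces the scalp data while spreading current over many sources, which is the characteristic low-resolution behavior the generalizations (LORETA, sLORETA, etc.) attempt to sharpen.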

  5. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

The study of volcano infrasound focuses on low-frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid search technique and common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
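Semblance, the grid-search statistic used above, measures waveform coherence across time-aligned sensor traces; in a source search, the travel-time delays predicted for each candidate grid node are applied to the traces before evaluating it. A minimal sketch on hypothetical data:

```python
def semblance(traces):
    """Semblance of N time-aligned traces: 1.0 for identical signals,
    near 1/N for incoherent ones."""
    n = len(traces)
    length = len(traces[0])
    # Energy of the stacked (summed) trace ...
    stack_energy = sum(sum(tr[t] for tr in traces) ** 2 for t in range(length))
    # ... normalized by the total energy of the individual traces
    trace_energy = sum(tr[t] ** 2 for tr in traces for t in range(length))
    return stack_energy / (n * trace_energy)

coherent = semblance([[0.0, 1.0, -1.0], [0.0, 1.0, -1.0]])  # identical traces
incoherent = semblance([[1.0, -1.0], [-1.0, 1.0]])          # perfectly opposed
```

The grid node whose predicted delays maximize this ratio is taken as the source location, which is why unmodeled topographic path effects bias the result as described above.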

  6. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    International Nuclear Information System (INIS)

    Hoenraet, C.

    1999-01-01

Under the authority of the episcopacy of Brugge in Belgium, an independent working group, Ethics and Nuclear Energy, was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Attention was also paid to economic and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book, one should be able to objectively define one's position in future debates on this subject

  7. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ~6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 × 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes
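The detection limits above follow from the standard activation equation: a target irradiated at flux phi approaches the saturation activity N·sigma·phi with a time constant set by the product nuclide's half-life. A sketch with illustrative (not measured) inputs:

```python
from math import exp, log

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """End-of-irradiation activity (decays/s):
    A = N * sigma * phi * (1 - exp(-lambda * t_irr))."""
    decay_const = log(2.0) / half_life_s          # lambda = ln(2) / T_half
    return n_atoms * sigma_cm2 * flux * (1.0 - exp(-decay_const * t_irr_s))

# Illustrative numbers only: 1e20 target atoms, a 1-barn cross section,
# the facility's ~2e7 n/cm^2/s flux, and a hypothetical 100-s half-life
saturation = 1e20 * 1e-24 * 2e7   # activity ceiling N * sigma * phi
```

Irradiating for one half-life yields half the saturation activity; irradiating much longer gains essentially nothing, which is why low-flux isotopic sources favor nuclides with short-lived activation products.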

  8. Fiji: an open-source platform for biological-image analysis.

    Science.gov (United States)

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  9. The Islamic State’s Tactics in Syria: Role of Social Media in Shifting a Peaceful Arab Spring into Terrorism

    Science.gov (United States)

    2017-06-09

and victims, such as the beheading of the Japanese citizens Haruna Yukawa and Kenji Goto in January.101 In February, it killed the U.S. aid worker... Baghdad... Sulil al-Sowarm, which means "Screech for Invading... 175 Elements of Cinema, "Montage," accessed April 24, 2017, http://elementsofcinema.com/editing/montage.html. 176 Elements of Cinema, "Documentary Filmmaking," accessed April 24, 2017

  10. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical... 4. Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  11. Surface-Source Downhole Seismic Analysis in R

    Science.gov (United States)

    Thompson, Eric M.

    2007-01-01

This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram.
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
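The straight-ray core of such an interpretation can be sketched briefly; the published routines are in R and add interface selection and weighted inversion, so the following simplified Python illustration only shows the slant-time correction and the interval-slowness step:

```python
from math import sqrt

def vertical_travel_times(depths, slant_times, offset):
    """Correct slant travel times to pseudo-vertical times for a surface
    source offset horizontally from the borehole (straight-ray assumption)."""
    return [t * z / sqrt(z ** 2 + offset ** 2)
            for z, t in zip(depths, slant_times)]

def interval_slownesses(depths, vertical_times):
    """Interval slowness (s/m) between successive receiver depths;
    the layer velocity is the reciprocal of each value."""
    return [(t2 - t1) / (z2 - z1)
            for (z1, t1), (z2, t2) in zip(zip(depths, vertical_times),
                                          zip(depths[1:], vertical_times[1:]))]

# Synthetic check: a uniform 500 m/s half-space with a 2 m source offset
depths = [5.0, 10.0, 15.0, 20.0]
slant_times = [sqrt(z ** 2 + 2.0 ** 2) / 500.0 for z in depths]
slowness = interval_slownesses(depths,
                               vertical_travel_times(depths, slant_times, 2.0))
```

For the uniform synthetic profile, every interval recovers the true slowness of 1/500 s/m, confirming the offset correction.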

  12. System optimization for continuous on-stream elemental analysis using low-output isotopic neutron sources

    International Nuclear Information System (INIS)

    Rizk, R.A.M.

    1989-01-01

In continuous on-stream neutron activation analysis, the material to be analyzed may be continuously recirculated in a closed loop system between an activation source and a shielded detector. In this paper an analytical formulation of the detector response for such a system is presented. This formulation should be useful in optimizing the system design parameters for specific applications. A study has been made of all parameters that influence the detector response during on-stream analysis. Feasibility applications of the method to solutions of manganese and vanadium using a 5-μg 252Cf neutron source are demonstrated. (author)

  13. Meta-analysis on Methane Mitigating Properties of Saponin-rich Sources in the Rumen: Influence of Addition Levels and Plant Sources

    Directory of Open Access Journals (Sweden)

    Anuraga Jayanegara

    2014-10-01

Full Text Available Saponins have been considered as promising natural substances for mitigating methane emissions from ruminants. However, studies reported that addition of saponin-rich sources often arrived at contrasting results, i.e. it either decreased methane or it did not. The aim of the present study was to assess ruminal methane emissions through a meta-analytical approach of integrating related studies from published papers which described various levels of different saponin-rich sources being added to ruminant feed. A database was constructed from published literature reporting the addition of saponin-rich sources at various levels and then monitoring ruminal methane emissions in vitro. Accordingly, levels of saponin-rich source additions as well as different saponin sources were specified in the database. Apart from methane, other related rumen fermentation parameters were also included in the database, i.e. organic matter digestibility, gas production, pH, ammonia concentration, short-chain fatty acid profiles and protozoal count. A total of 23 studies comprised of 89 data points met the inclusion criteria. The data obtained were subsequently subjected to a statistical meta-analysis based on mixed model methodology. Accordingly, different studies were treated as random effects whereas levels of saponin-rich source additions or different saponin sources were considered as fixed effects. Model statistics used were p-value and root mean square error. Results showed that an addition of increasing levels of a saponin-rich source decreased methane emission per unit of substrate incubated as well as per unit of total gas produced. Although the apparent magnitude of the effect ranked tea above quillaja, statistically the saponin sources did not differ from each other. It can be concluded that the methane-mitigating properties of saponins in the rumen are level- and source-dependent.

  14. Modeling and analysis of a transcritical Rankine power cycle with a low grade heat source

    DEFF Research Database (Denmark)

    Nguyen, Chan; Veje, Christian

    efficiency, exergetic efficiency and specific net power output. A generic cycle configuration has been used for analysis of a geothermal energy heat source. This model has been validated against similar calculations using industrial waste heat as the energy source. Calculations are done with fixed...

  15. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

Free/Open Source Software (FOSS) has been used in science long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high quality FOSS, available also for free, that can be used for complex data analysis tasks. We show the results and data analysis process, aiming to expose the high quality and highly productive ways of both results and processes, while highlighting the different approach used in some of the FOSS. We show that scientists have today in FOSS a viable, high quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of great customization or extension to fit very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB - free alternatives to MATLAB; Gnuplot - free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, that can be used both to do data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of easy programming. (author)

  16. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom of the 1980s, commercial closed-source software for data analysis became widely available to scientists on this platform. In this paper, we study some high-quality FOSS, also available free of charge, that can be used for complex data analysis tasks. We present results and the data analysis process, aiming to demonstrate the quality and productivity of both, while highlighting the different approaches taken by some of the FOSS packages. We show that in FOSS scientists today have a viable, high-quality alternative to commercial closed-source software which, besides being ready to use, also offers extensive customization and extension to fit the particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB, free alternatives to MATLAB, and Gnuplot, a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used both for data analysis and for data manipulation, allowing very complex tasks to be automated with a few lines of simple programming. (author)
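
    A minimal sketch of the kind of few-line analysis these papers attribute to FOSS scripting languages such as Python; the data values are invented for illustration:

    ```python
    # Hypothetical detector counts; a few lines of stdlib Python suffice
    # for basic descriptive statistics, as the abstract above suggests.
    import statistics

    counts = [112, 108, 121, 117, 109, 115]
    mean = statistics.mean(counts)
    sigma = statistics.stdev(counts)
    print(f"mean = {mean:.1f}, stdev = {sigma:.2f}")
    ```

    The same computation would be equally short in GNU Octave or SCILAB, which is the point the authors make about productivity.
    
    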

  17. Polarisation analysis of elastic neutron scattering using a filter spectrometer on a pulsed source

    International Nuclear Information System (INIS)

    Mayers, J.; Williams, W.G.

    1981-05-01

    The experimental and theoretical aspects of the polarisation analysis technique in elastic neutron scattering are described. An outline design is presented for a filter polarisation analysis spectrometer on the Rutherford Laboratory Spallation Neutron Source and estimates made of its expected count rates and resolution. (author)

  18. Analysis of filtration properties of locally sourced base oil for the ...

    African Journals Online (AJOL)

    This study examines the use of locally sourced oils such as groundnut oil, melon oil, vegetable oil, soya oil and palm oil as substitutes for diesel oil in formulating oil-based drilling fluids, with respect to filtration properties. The filtrate volume of each oil was obtained for filtration control analysis. With increasing potash and ...

  19. Fiber Based Mid Infrared Supercontinuum Source for Spectroscopic Analysis in Food Production

    DEFF Research Database (Denmark)

    Ramsay, Jacob; Dupont, Sune Vestergaard Lund; Keiding, Søren Rud

    Optimization of sustainable food production is a worldwide challenge that is undergoing continuous development as new technologies emerge. Applying solutions for food analysis with novel bright and broad mid-infrared (MIR) light sources has the potential to meet the increasing demands for food...

  20. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and dramatically indicated the true cause of failure, impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulse arc technique did not have this impurity buildup in the ramp-down zone

  1. Design and analysis of nuclear battery driven by the external neutron source

    International Nuclear Information System (INIS)

    Wang, Sanbing; He, Chaohui

    2014-01-01

    Highlights: • A new type of space nuclear power source called NBDEx is investigated. • NBDEx with 252Cf has better performance than an RTG of similar structure. • Its thermal power improves greatly with increasing fuel enrichment. • The service life of NBDEx is about 2.96 years. • The launch-abort accident analysis fully demonstrates the advantage of NBDEx. - Abstract: Based on the theory of ADS (Accelerator Driven Subcritical reactor), a new type of nuclear battery was investigated, composed of a subcritical fission module and an isotope neutron source, called NBDEx (Nuclear Battery Driven by External neutron source). Following the structure of the GPHS-RTG (General Purpose Heat Source Radioisotope Thermoelectric Generator), fuel cell and fuel assembly models of NBDEx were set up, and their performance was analyzed with the MCNP code. From these results, it was found that the power and power density of NBDEx were almost six times higher than those of the RTG. To fully demonstrate the advantage of NBDEx, an analysis of its impact factors was performed with the MCNP code, and its lifetime was calculated using the Origen code. These results verified that NBDEx is more suitable for space missions than the RTG

  2. Spectrographic analysis of plutonium (1960); L'analyse spectrographique du plutonium (1960)

    Energy Technology Data Exchange (ETDEWEB)

    Artaud, J; Chaput, M; Robichet, J [Commissariat a l' Energie Atomique, Saclay (France).Centre d' Etudes Nucleaires

    1960-07-01

    Various possibilities for the spectrographic determination of impurities in plutonium are considered. The application of the 'copper spark' method, of sparking on graphite and of fractional distillation in the arc are described and discussed in some detail (apparatus, accessories, results obtained). (author) [French] On examine diverses possibilites pour le dosage spectrographique des impuretes dans le plutonium. On decrit et discute plus particulierement de l'application des methodes 'copper spark', de l'etincelage sur graphite et de la distillation fractionnee dans l'arc (montages, accessoires, resultats obtenus). (auteur)

  3. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
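
    The core idea of keyword-based set analysis described in this abstract can be sketched in a few lines; the protein identifiers and keywords below are invented, and this is not PANDORA's implementation:

    ```python
    # Hypothetical protein -> annotation-keyword mapping. The sketch inverts
    # the mapping and intersects keyword sets to find subsets of proteins
    # that share biological properties, as PANDORA's approach describes.
    annotations = {
        "P1": {"kinase", "membrane"},
        "P2": {"kinase", "nucleus"},
        "P3": {"kinase", "membrane"},
        "P4": {"transport", "membrane"},
    }

    # invert: keyword -> set of proteins carrying it
    by_keyword = {}
    for protein, keywords in annotations.items():
        for kw in keywords:
            by_keyword.setdefault(kw, set()).add(protein)

    # intersection of two keyword sets: proteins annotated with both terms
    shared = by_keyword["kinase"] & by_keyword["membrane"]
    print(sorted(shared))
    ```
    
    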

  4. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    Science.gov (United States)

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
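
    One standard disproportionality statistic used in spontaneous-report signal detection is the proportional reporting ratio (PRR); the 2x2 table counts below are hypothetical, and the paper itself does not prescribe this particular formula:

    ```python
    # PRR from a 2x2 contingency table of spontaneous reports:
    #   a: target drug with target event      b: target drug, other events
    #   c: other drugs with target event      d: other drugs, other events
    def prr(a, b, c, d):
        return (a / (a + b)) / (c / (c + d))

    # hypothetical counts; PRR well above 1 suggests a potential signal
    value = prr(a=20, b=180, c=100, d=9700)
    print(round(value, 2))
    ```
    
    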

  5. Analysis of rod drop and pulsed source measurements of reactivity in the Winfrith SGHWR

    International Nuclear Information System (INIS)

    Brittain, I.

    1970-05-01

    Reactivity measurements by the rod-drop and pulsed source methods in the Winfrith SGHWR are seriously affected by spatial harmonics. A method of calculation is described which enables the spatial harmonics to be calculated in non-uniform cores in two or three dimensions, and thus allows a much more rigorous analysis of the experimental results than the usual point model. The method is used to analyse all the rod-drop measurements made during commissioning of the Winfrith SGHWR, and to comment on the results of pulsed source measurements. The reactivity worths of banks of ten and twelve shut-down tubes deduced from rod-drop and pulsed source experiments are in satisfactory agreement with each other and also with AIMAZ calculated values. The ability to calculate higher spatial harmonics in non-uniform cores is thought to be new, and may have a wider application to reactor kinetics through the method of Modal Analysis. (author)

  6. Irradiation Pattern Analysis for Designing Light Sources-Based on Light Emitting Diodes

    International Nuclear Information System (INIS)

    Rojas, E.; Stolik, S.; La Rosa, J. de; Valor, A.

    2016-01-01

    Nowadays it is possible to design light sources with a specific irradiation pattern for many applications. Light-emitting diodes offer high luminous efficiency, durability, reliability and flexibility, among other features, as a result of their rapid development. In this paper an analysis of the irradiation pattern of light-emitting diodes is presented. The approximation of these irradiation patterns by both Lambertian and Gaussian functions for the design of light sources is proposed. Finally, the results obtained and the practicality of fitting the irradiation pattern of light-emitting diodes with these functions are discussed. (Author)
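
    The Lambertian approximation mentioned above is commonly written as I(θ) = I₀ cosᵐ(θ); a quick sketch recovers the order m from a (hypothetical) measured half-intensity angle. This is a textbook relation, not code from the paper:

    ```python
    import math

    # Solve I(theta_half) = 0.5 * I0 for m in the Lambertian model
    # I(theta) = I0 * cos(theta)**m.
    def lambertian_order(theta_half_deg):
        return math.log(0.5) / math.log(math.cos(math.radians(theta_half_deg)))

    m = lambertian_order(60.0)   # an ideal Lambertian LED halves at 60 degrees
    print(round(m, 3))
    ```

    An ideal Lambertian source gives m = 1; narrower LED beams (smaller half-angles) yield larger m.
    
    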

  7. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs

  8. Montage in future building practice

    DEFF Research Database (Denmark)

    Bundgaard, Charlotte

    2002-01-01

    In this essay Charlotte Bundgaard describes the vision of Le Corbusier's Dom-ino project as an icon of the modernist dream, strongly connected with its contemporary industrial possibilities. She proceeds to examine a number of other projects from the 1920s to the 1970s that are also based...

  9. Noise source analysis of nuclear ship Mutsu plant using multivariate autoregressive model

    International Nuclear Information System (INIS)

    Hayashi, K.; Shimazaki, J.; Shinohara, Y.

    1996-01-01

    The present study is concerned with the noise sources in the N.S. Mutsu reactor plant. Noise experiments on the Mutsu plant were performed in order to investigate the plant dynamics and the effect of sea conditions and ship motion on the plant. The reactor noise signals as well as the ship motion signals were analyzed by a multivariate autoregressive (MAR) modeling method to clarify the noise sources in the reactor plant. The analysis results confirmed that most of the plant variables were affected mainly by a horizontal component of the ship motion, namely the sway, through vibrations of the plant structures. Furthermore, the effect of ship motion on the reactor power was evaluated through the analysis of wave components extracted by a geometrical transform method. It was concluded that the amplitude of the reactor power oscillation was about 0.15% in normal sea conditions, which is small enough for safe operation of the reactor plant. (authors)
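
    The MAR modeling used in this study amounts to fitting a multivariate autoregressive model to coupled time series. A minimal sketch on synthetic two-channel data (not Mutsu plant data) estimates the lag-1 coefficient matrix by least squares:

    ```python
    import numpy as np

    # Synthetic VAR(1) process x_t = A_true @ x_{t-1} + noise, standing in
    # for two coupled plant signals (e.g. ship sway and a reactor variable).
    rng = np.random.default_rng(0)
    A_true = np.array([[0.8, 0.1],
                       [0.0, 0.5]])
    x = np.zeros((500, 2))
    for t in range(1, 500):
        x[t] = A_true @ x[t - 1] + rng.normal(scale=0.1, size=2)

    # Least-squares fit: stack x_{t-1} as regressors, x_t as targets.
    X, Y = x[:-1], x[1:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # solves X @ B ≈ Y
    A_hat = B.T                                  # so A ≈ B.T
    print(np.round(A_hat, 2))
    ```

    In the actual study, the fitted MAR model is further decomposed to attribute power in each signal to the various noise sources.
    
    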

  10. A THEORETICAL ANALYSIS OF KEY POINTS WHEN CHOOSING OPEN SOURCE ERP SYSTEMS

    Directory of Open Access Journals (Sweden)

    Fernando Gustavo Dos Santos Gripe

    2011-08-01

    The present work presents a theoretical analysis of the main features of Open Source ERP systems, herein identified as technical success factors, in order to contribute to the establishment of parameters to be used in decision-making processes when choosing a system that fulfills the organization's needs. Initially, the life cycle of ERP systems is contextualized, highlighting the features of Open Source ERP systems. It was verified that, when carefully analyzed, these systems need further attention regarding issues of project continuity and maturity, structure, transparency, updating frequency, and support, all of which are inherent to the reality of this type of software. Nevertheless, advantages were observed regarding flexibility, costs, and non-discontinuity. The main goal is to broaden the discussion about the adoption of Open Source ERP systems.

  11. Phase 2 safety analysis report: National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stefan, P.

    1989-06-01

    The Phase II program was established in order to provide additional space for experiments, and also staging and equipment storage areas. It also provides additional office space and new types of advanced instrumentation for users. This document will deal with the new safety issues resulting from this extensive expansion program, and should be used as a supplement to BNL Report No. 51584, "National Synchrotron Light Source Safety Analysis Report," July 1982 (hereafter referred to as the Phase I SAR). The initial NSLS facility is described in the Phase I SAR. It comprises two electron storage rings, an injection system common to both, experimental beam lines and equipment, and office and support areas, all of which are housed in a 74,000 sq. ft. building. The X-ray Ring provides for 28 primary beam ports and the VUV Ring, 16. Each port is capable of division into 2 or 3 separate beam lines. All ports receive their synchrotron light from conventional bending magnet sources, the magnets being part of the storage ring lattice. 4 refs

  12. Open source EMR software: profiling, insights and hands-on analysis.

    Science.gov (United States)

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing a clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but rewarding insights are nonetheless starting to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g. operating systems market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. AtomicJ: An open source software for analysis of force curves

    Science.gov (United States)

    Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina

    2014-06-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.

  14. AtomicJ: An open source software for analysis of force curves

    International Nuclear Information System (INIS)

    Hermanowicz, Paweł; Gabryś, Halina; Sarna, Michał; Burda, Kvetoslava

    2014-01-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh
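
    A common contact-mechanics model of the kind AtomicJ supports is the Hertz spherical model, F = (4/3)·E/(1-ν²)·√R·δ^(3/2). The sketch below fits synthetic, noise-free force-indentation data to recover Young's modulus; it illustrates the principle only and is not AtomicJ code:

    ```python
    import math

    # Generate synthetic force-curve data from the Hertz spherical model.
    nu, R = 0.5, 1e-6                 # Poisson ratio, tip radius (m), assumed
    E_true = 10e3                     # 10 kPa sample, hypothetical
    delta = [i * 1e-8 for i in range(1, 6)]            # indentation depths (m)
    F = [(4 / 3) * E_true / (1 - nu**2) * math.sqrt(R) * d**1.5 for d in delta]

    # A linear fit of F against delta**1.5 through the origin gives the
    # Hertz prefactor, from which E follows.
    x = [d**1.5 for d in delta]
    slope = sum(fi * xi for fi, xi in zip(F, x)) / sum(xi * xi for xi in x)
    E_fit = slope * (1 - nu**2) / ((4 / 3) * math.sqrt(R))
    print(round(E_fit), "Pa")
    ```

    Real force curves also require contact-point detection and corrections for deviations from the model, which is where tools like AtomicJ add value.
    
    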

  15. Municipal solid waste source-separated collection in China: A comparative analysis

    International Nuclear Information System (INIS)

    Tai Jun; Zhang Weiqian; Che Yue; Feng Di

    2011-01-01

    A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of eight years of implementation in those cities. This paper provides an overview of the different methods of collection, transportation, and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that its characteristics are similar across cities: low calorific value, high moisture content, and a high proportion of organic matter. Differences among the eight cities in municipal solid waste management (MSWM) are also presented. Only Beijing and Shanghai demonstrated relatively effective implementation of MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables should be separated at the source. Stakeholders play an important role in MSWM, and their responsibilities should be clearly identified. Improvements in legislation, coordination mechanisms and public education are problematic issues that need to be addressed.

  16. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The

  17. Performance analysis and experimental study of heat-source tower solution regeneration

    International Nuclear Information System (INIS)

    Liang, Caihua; Wen, Xiantai; Liu, Chengxing; Zhang, Xiaosong

    2014-01-01

    Highlights: • Theoretical analysis is performed on the characteristics of the heat-source tower. • Experimental study is performed on the variation rules of the solution regeneration rate. • The characteristics of solution regeneration vary widely with different demands. • Results are useful for optimizing the process of solution regeneration. - Abstract: By analyzing the similarities and differences between the solution regeneration of a heat-source tower and desiccant solution regeneration, this paper points out that solution regeneration of a heat-source tower is characterized by small demands and a regeneration rate that is susceptible to the outdoor ambient environment. A theoretical analysis is performed on the characteristics of a heat-source tower solution in different outdoor environments and different regeneration modes, and an experimental study is performed on the variation rules of the solution regeneration rate of a cross-flow heat-source tower under different inlet and operating parameters. The experimental results show that, in the operating regeneration mode, as the air volume was increased from 123 m³ h⁻¹ to 550 m³ h⁻¹, the system heat transfer increased from 0.42 kW to 0.78 kW, and the regeneration rate increased from 0.03 g s⁻¹ to 0.19 g s⁻¹. Increasing the solution flow may increase the system heat transfer; however, the regeneration rate decreased to a certain extent. In the regeneration mode when the system is idle, as the air volume was increased from 136 m³ h⁻¹ to 541 m³ h⁻¹, the regeneration rate increased from 0.03 g s⁻¹ to 0.1 g s⁻¹. The regeneration rate remained almost unchanged, around 0.07 g s⁻¹, as the solution flow was increased. In the regeneration mode with auxiliary heat when the system is idle, increasing the air volume and the solution flow required more auxiliary heat, thereby improving the solution regeneration rate. As the auxiliary heat was increased from 0.33 k

  18. Acoustic Source Analysis of Magnetoacoustic Tomography With Magnetic Induction for Conductivity Gradual-Varying Tissues.

    Science.gov (United States)

    Wang, Jiawei; Zhou, Yuqi; Sun, Xiaodong; Ma, Qingyu; Zhang, Dong

    2016-04-01

    As a multiphysics imaging approach, magnetoacoustic tomography with magnetic induction (MAT-MI) works on the physical mechanism of magnetic excitation, acoustic vibration, and transmission. Based on a theoretical analysis of the source vibration, numerical studies are conducted to simulate the pathological changes of tissues for a single-layer cylindrical conductivity gradual-varying model and to estimate the strengths of the sources inside the model. The results suggest that the inner source is generated by the product of the conductivity and the curl of the induced electric intensity inside a homogeneous-conductivity medium, while the boundary source is produced by the cross product of the conductivity gradient and the induced electric intensity at a conductivity boundary. For a biological tissue with low conductivity, the strength of the boundary source is much higher than that of the inner source only when the conductivity transition zone is small. In this case, the tissue can be treated as a conductivity abrupt-varying model, ignoring the influence of the inner source. Otherwise, the contributions of the inner and boundary sources should be evaluated together quantitatively. This study provides a basis for further study of precise image reconstruction of MAT-MI for pathological tissues.
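
    The two source terms described in this abstract, an inner term σ(∇×E) and a boundary term (∇σ)×E, can be evaluated pointwise; all field values below are hypothetical single-point numbers chosen for illustration:

    ```python
    import numpy as np

    sigma = 0.5                                # conductivity (S/m), assumed
    curl_E = np.array([0.0, 0.0, 2.0])         # curl of induced E field
    grad_sigma = np.array([10.0, 0.0, 0.0])    # conductivity gradient at boundary
    E = np.array([0.0, 1.0, 0.0])              # induced electric intensity

    # inner source: product of conductivity and curl of E
    inner_source = sigma * curl_E
    # boundary source: cross product of conductivity gradient and E
    boundary_source = np.cross(grad_sigma, E)
    print(inner_source, boundary_source)
    ```

    With a steep gradient (narrow transition zone), the boundary term dominates, matching the abstract's abrupt-varying approximation.
    
    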

  19. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    Science.gov (United States)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought, defined as the optimal solution to a minimal-error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed; a penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is also clearly identified. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified; in particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is also observed. Indications of acoustic reflections on the airfoil are also discerned.
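
    The penalized inverse problem described above can be sketched in its simplest form: recover source strengths q from microphone pressures p = Gq by penalized least squares. Here the paper's localization penalty is replaced by a plain Tikhonov (L2) term, and G, q, and the noise are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    G = rng.normal(size=(8, 4))          # propagation matrix: 8 mics, 4 sources
    q_true = np.array([1.0, 0.0, 0.5, 0.0])
    p = G @ q_true + rng.normal(scale=0.01, size=8)   # measured pressures

    # Regularized normal equations: (G'G + lam*I) q = G'p.
    lam = 1e-2                           # penalty weight, assumed
    q_hat = np.linalg.solve(G.T @ G + lam * np.eye(4), G.T @ p)
    print(np.round(q_hat, 2))
    ```

    The actual technique operates on cross-spectra rather than raw pressures, but the role of the penalty term in stabilizing an ill-posed inversion is the same.
    
    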

  20. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G R; Gosden, C [La Trobe Univ., Bundoora, VIC (Australia); Bird, R; Hotchkis, M [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J; Torrence, R; Fullaga, R [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1994-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between the results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation. ills.

  1. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    International Nuclear Information System (INIS)

    Summerhayes, G.R.; Gosden, C.; Bird, R.; Hotchkis, M.; Specht, J.; Torrence, R.; Fullaga, R.

    1993-01-01

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between the results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation. ills

  2. Obsidian sourcing studies in Papua New Guinea using PIXE-PIGME analysis

    Energy Technology Data Exchange (ETDEWEB)

    Summerhayes, G.R.; Gosden, C. [La Trobe Univ., Bundoora, VIC (Australia); Bird, R.; Hotchkis, M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Specht, J.; Torrence, R.; Fullaga, R. [Australian Museum, Sydney, NSW (Australia). Div. of Anthropology

    1993-12-31

    Over 100 obsidian samples were analysed using PIXE-PIGME in 1990. These samples were collected during intensive surveys of the source areas around Talasea, Garua Island, and the Mopir area in 1988, 1989 and 1990. A ratio combination of 9 elements was used to separate out groups, as in previous studies: F/Na, Al/Na, K/Fe, Ca/Fe, Mn/Fe, Rb/Fe, Y/Zr, Sr/Fe and Zr/Fe. In spite of variations in major elements, the close agreement between the results for minor and trace element concentrations in artefacts and known source material indicates that the provenance of each artefact can be reliably determined. This conclusion provides important validation of the use of ion beam analysis in artefact characterisation. ills.
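
    The sourcing logic in these records, forming element ratios and assigning each artefact to the nearest source group, can be sketched as follows; the concentrations and ratio values are invented, and only two of the nine ratios are used:

    ```python
    # Hypothetical reference ratio signatures for two source areas.
    sources = {
        "Talasea": {"K/Fe": 3.2, "Rb/Fe": 0.012},
        "Mopir":   {"K/Fe": 2.1, "Rb/Fe": 0.007},
    }
    artefact = {"K/Fe": 3.1, "Rb/Fe": 0.011}   # measured ratios, hypothetical

    def distance(a, b):
        # Euclidean distance in ratio space
        return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

    # assign the artefact to the closest source signature
    best = min(sources, key=lambda name: distance(artefact, sources[name]))
    print(best)
    ```

    In practice the published studies use all nine ratios and multivariate grouping rather than a single nearest-neighbour match.
    
    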

  3. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    Science.gov (United States)

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  4. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Helmus, Jonathan J., E-mail: jjhelmus@gmail.com [Argonne National Laboratory, Environmental Science Division (United States); Jaroniec, Christopher P., E-mail: jaroniec@chemistry.ohio-state.edu [Ohio State University, Department of Chemistry and Biochemistry (United States)

    2013-04-15

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  5. Nmrglue: an open source Python package for the analysis of multidimensional NMR data

    International Nuclear Information System (INIS)

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.

  6. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    Science.gov (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd, in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The levels of heavy metals in the soils were greatly affected by soil formation, atmospheric deposition, and human activities. These findings provide essential information on the possible sources of heavy metals, which should contribute to the monitoring and assessment of agricultural soils in regions worldwide.
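The principal component analysis step used in such source-apportionment studies can be sketched in NumPy: standardize the concentration matrix, diagonalize its correlation matrix, and inspect how much variance the leading components explain. The data below are synthetic, with two hidden "sources" mimicking the anthropogenic vs. natural grouping reported in the study, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# Rows: soil samples; columns: Cu, Zn, Ni, Pb, Cr, Cd concentrations.
anthropogenic = rng.normal(size=(100, 1))
natural = rng.normal(size=(100, 1))
loadings = np.array([[1.0, 0.1, 0.9, 1.0, 0.2, 0.8],   # anthropogenic factor
                     [0.1, 1.0, 0.2, 0.1, 0.9, 0.1]])  # natural factor
X = (anthropogenic * loadings[0] + natural * loadings[1]
     + 0.1 * rng.normal(size=(100, 6)))

# Standardize, then diagonalize the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = Z.T @ Z / len(Z)
eigval, eigvec = np.linalg.eigh(corr)
explained = eigval[np.argsort(eigval)[::-1]] / eigval.sum()
print(explained[:2])  # the two hidden sources dominate the variance
```

With a genuine two-source structure, the first two components carry most of the variance, and the signs of their loadings group the metals by source, which is the basis for the interpretation above.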

  7. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the position of the virtual sources and of the velocity. A linear system connecting the perturbation of the position of those virtual sources and the velocity is derived and subsequently solved by the conjugate gradient method. In theory, the perturbation of the position of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.
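The conjugate gradient solve of the linear system mentioned above follows the standard CG recursion for a symmetric positive-definite operator. A generic sketch on a small dense system (the WEMVA operators themselves are not reproduced here):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Small SPD test system standing in for the linearized WEMVA operator.
rng = np.random.default_rng(2)
M = rng.normal(size=(20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.normal(size=20)
x = conjugate_gradient(A, b)
```

In practice the matrix is never formed explicitly; `A @ p` is replaced by an application of the forward and adjoint modeling operators.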

  8. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-01-01

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbation of the position of the virtual sources and of the velocity. A linear system connecting the perturbation of the position of those virtual sources and the velocity is derived and subsequently solved by the conjugate gradient method. In theory, the perturbation of the position of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  9. Multivariate spectral-analysis of movement-related EEG data

    International Nuclear Information System (INIS)

    Andrew, C. M.

    1997-01-01

    The univariate method of event-related desynchronization (ERD) analysis, which quantifies the temporal evolution of power within specific frequency bands from electroencephalographic (EEG) data recorded during a task or event, is extended to an event-related multivariate spectral analysis method. With this method, time courses of cross-spectra, phase spectra, coherence spectra, band-averaged coherence values (event-related coherence, ERCoh), partial power spectra and partial coherence spectra are estimated from an ensemble of multivariate event-related EEG trials. This provides a means of investigating relationships between EEG signals recorded over different scalp areas during the performance of a task or the occurrence of an event. The multivariate spectral analysis method is applied to EEG data recorded during three different movement-related studies involving discrete right index finger movements. The first study investigates the impact of the EEG derivation type on the temporal evolution of interhemispheric coherence between activity recorded at electrodes overlying the left and right sensorimotor hand areas during cued finger movement. The results raise the question of whether changes in coherence necessarily reflect changes in functional coupling of the cortical structures underlying the recording electrodes. In the second study, the method is applied to data recorded during voluntary finger movement, and a hypothesis, based on an existing global/local model of neocortical dynamics, is formulated to explain the coherence results. The third study applies partial spectral analysis to, and investigates phase relationships of, movement-related data recorded from a full-head montage, thereby providing further results strengthening the global/local hypothesis. (author)
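The coherence spectra at the heart of the method are built from averaged cross- and auto-spectra of two channels. A generic Welch-style magnitude-squared coherence estimator on synthetic two-channel data (an illustration of the quantity, not the author's implementation):

```python
import numpy as np

def coherence(x, y, nperseg=256, fs=1.0):
    """Magnitude-squared coherence via averaged windowed periodograms."""
    win = np.hanning(nperseg)
    step = nperseg // 2
    n_seg = (len(x) - nperseg) // step + 1
    pxx = pyy = pxy = 0
    for k in range(n_seg):
        seg = slice(k * step, k * step + nperseg)
        fx = np.fft.rfft(win * x[seg])
        fy = np.fft.rfft(win * y[seg])
        pxx += np.abs(fx) ** 2
        pyy += np.abs(fy) ** 2
        pxy += fx * np.conj(fy)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.abs(pxy) ** 2 / (pxx * pyy)

# Two channels sharing a 10 Hz component plus independent noise:
rng = np.random.default_rng(3)
t = np.arange(4096) / 128.0                  # 128 Hz sampling, EEG-like
shared = np.sin(2 * np.pi * 10 * t)
x = shared + 0.5 * rng.normal(size=t.size)
y = shared + 0.5 * rng.normal(size=t.size)
freqs, coh = coherence(x, y, fs=128.0)
```

Coherence approaches 1 at the shared 10 Hz component and stays near the bias floor elsewhere; averaging over segments (here 31) is what makes the estimate meaningful, since a single-segment estimate is identically 1.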

  10. THE CHANDRA SOURCE CATALOG

    International Nuclear Information System (INIS)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Harbo, Peter N.; He Xiangqun; Karovska, Margarita; Kashyap, Vinay L.; Davis, John E.; Houck, John C.; Hall, Diane M.

    2010-01-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  11. The Chandra Source Catalog

    Science.gov (United States)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2010-07-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a
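The catalog's inclusion criterion, flux at least 3 times its estimated 1σ uncertainty in at least one energy band, is a simple per-band significance test. A hypothetical sketch (the record type and field names are illustrative, not the CSC schema):

```python
from dataclasses import dataclass

@dataclass
class BandFlux:
    band: str
    flux: float      # e.g. erg cm^-2 s^-1
    sigma: float     # 1-sigma flux uncertainty

def passes_detection(bands, threshold=3.0):
    """True if flux >= threshold * sigma in at least one energy band."""
    return any(b.flux >= threshold * b.sigma for b in bands)

source = [BandFlux("broad", 5.2e-14, 1.2e-14),
          BandFlux("soft", 0.8e-14, 0.9e-14)]
print(passes_detection(source))  # True: the broad band exceeds 3 sigma
```

A source marginal in every individual band is excluded even if several bands are mildly elevated, which is part of how the spurious-source rate is kept at the quoted level.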

  12. Sensitivity analysis of the relationship between disease occurrence and distance from a putative source of pollution

    Directory of Open Access Journals (Sweden)

    Emanuela Dreassi

    2008-05-01

    The relation between disease risk and a point source of pollution is usually investigated using distance from the source as a proxy of exposure. The analysis may be based on case-control data or on aggregated data. The definition of the function relating risk of disease and distance is critical, both in a classical and in a Bayesian framework, because the likelihood is usually very flat, even with large amounts of data. In this paper we investigate how the specification of the function relating risk of disease with distance from the source, and of the prior distributions on the parameters of the function, affects the results when case-control data and Bayesian methods are used. We consider different popular parametric models for the risk-distance function in a Bayesian approach, comparing estimates with those derived by maximum likelihood. As an example, we have analyzed the relationship between a putative source of environmental pollution (an asbestos cement plant) and the occurrence of pleural malignant mesothelioma in the area of Casale Monferrato (Italy) in 1987-1993. Risk of pleural malignant mesothelioma turns out to be strongly related to distance from the asbestos cement plant. However, as the models appeared to be sensitive to modeling choices, we suggest that any analysis of disease risk around a putative source should be integrated with a careful sensitivity analysis and possibly with prior knowledge. The choice of prior distribution is extremely important and should be based on epidemiological considerations.
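One popular parametric form for such a risk-distance function is an exponential excess-risk decay toward a background of 1. A minimal sketch of that family (the specific functional form and parameter values are illustrative, not necessarily those used in the paper):

```python
import math

def relative_risk(d, alpha, beta):
    """RR(d) = 1 + alpha * exp(-(d / beta)**2): excess risk alpha at the
    source, decaying to background (RR = 1) with distance scale beta (km)."""
    return 1.0 + alpha * math.exp(-(d / beta) ** 2)

# Excess risk of 4x at the plant, decay scale 2 km:
for d in (0.0, 2.0, 10.0):
    print(d, round(relative_risk(d, alpha=4.0, beta=2.0), 3))
```

The flat-likelihood problem described above arises because quite different (alpha, beta) pairs can produce nearly identical risk profiles over the range of distances where cases are actually observed, which is why the choice of prior matters so much.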

  13. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    This project had three major objectives. The first objective was to develop a fossil fuel combustion source inventory (NOx, SOx, and hydrocarbon emissions) that would be relatively easy to use and update for analyzing the impact of combustion emissions on acid deposition in the eastern United States. The second objective of the project was to use the inventory data as a basis for selection of a number of areas that, by virtue of their importance in the acid rain issue, could be further studied to assess the impact of local and intraregional combustion sources. The third objective was to conduct an analysis of wet deposition monitoring data in the areas under study, along with pertinent physical characteristics, meteorological conditions, and emission patterns of these areas, to investigate probable relationships between local and intraregional combustion sources and the deposition of acidic material. The combustion source emissions inventory has been developed for the eastern United States. It characterizes all important area sources and point sources on a county-by-county basis. Its design provides flexibility and simplicity and makes it uniquely useful in overall analysis of emission patterns in the eastern United States. Three regions with basically different emission patterns have been identified and characterized. The statistical analysis of wet deposition monitoring data in conjunction with emission patterns, wind direction, and topography has produced consistent results for each study area and has demonstrated that the wet deposition in each area reflects the characteristics of the localized area around the monitoring sites (typically 50 to 150 miles). 8 references, 28 figures, 39 tables.

  14. Spectrum analysis of a voltage source converter due to semiconductor voltage drops

    DEFF Research Database (Denmark)

    Rasmussen, Tonny Wederberg; Eltouki, Mustafa

    2017-01-01

    It is known that power electronic voltage source converters are non-ideal. This paper presents a state-of-the-art review of the effect of semiconductor voltage drop on the output voltage spectrum, using a single-phase H-bridge two-level converter topology with naturally sampled pulse width modulation. The paper describes the analysis of the output voltage spectrum when the semiconductor voltage drop is added. The results of the analysis of the spectral contribution, including and excluding the semiconductor voltage drop, reveal good agreement between the theoretical results, simulations and laboratory...

  15. Design and study of the performance of a Raman lidar model, combining a pulsed laser source and a holographic grating double monochromator

    Energy Technology Data Exchange (ETDEWEB)

    Nacass, Philippe

    1976-03-16

    The various techniques for the analysis of air constituents are studied briefly to help design an apparatus for detecting, localizing, identifying and measuring atmospheric pollution. The optical methods known under the name of lidar (light detection and ranging) appear to give good qualitative and quantitative results, since they do not involve any sampling of the observed medium. Amongst these methods, Raman laser backscattering, in which the characteristic frequency of a molecule can be isolated from those of the other constituents of air, is studied in more detail. The design and realization based on the conclusions of this study, and the measurements of the performance of a preliminary Raman lidar model, are then described. Its originality lies in the use of holographic grating monochromators and the overall simplicity of operation of the system. Using this system, it was possible to make in-situ Raman backscattering measurements on N{sub 2}, O{sub 2} and H{sub 2}O in the atmosphere, and on large concentrations of CO{sub 2}, at distances between 30 and 40 m, which give a reasonable estimate of the sensitivity and range of a full-scale, higher-performance final design. (author)

  16. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
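The two avian ratios and their thresholds, as stated above, translate directly into a small decision rule over measured sterol concentrations. A sketch (the function and variable names are ours, for illustration):

```python
def avian_ratio_1(ethylcholestanol, ethylcoprostanol, ethylepicoprostanol):
    """24-ethylcholestanol / (24-ethylcholestanol + 24-ethylcoprostanol
    + 24-ethylepicoprostanol)."""
    return ethylcholestanol / (ethylcholestanol + ethylcoprostanol
                               + ethylepicoprostanol)

def avian_ratio_2(cholestanol, coprostanol, epicoprostanol):
    """cholestanol / (cholestanol + coprostanol + epicoprostanol)."""
    return cholestanol / (cholestanol + coprostanol + epicoprostanol)

def indicates_avian(r1, r2):
    """Avian pollution indicated when ratio 1 >= 0.4 and ratio 2 >= 0.5."""
    return r1 >= 0.4 and r2 >= 0.5

# Illustrative concentrations (arbitrary units):
r1 = avian_ratio_1(6.0, 5.0, 2.0)   # 6 / 13, about 0.46
r2 = avian_ratio_2(5.0, 3.0, 1.0)   # 5 / 9, about 0.56
print(indicates_avian(r1, r2))  # True
```

As the abstract notes, this rule is applied only after the herbivore-mammal and human indicative ratios have failed to explain the profile, and a positive result can then be confirmed with targeted PCR markers.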

  17. Regional Moment Tensor Source-Type Discrimination Analysis

    Science.gov (United States)

    2015-11-16

    [Abstract not available; the record retains only figure-caption fragments: moment tensor solutions shown as unique normalized eigenvalues or unique source types on (a) the fundamental Lune (Tape and Tape, 2012a,b) and (b) the Hudson source-type plot (Hudson), color-coded by variance reduction (VR).]

  18. Multi-Criteria Analysis to Prioritize Energy Sources for Ambience in Poultry Production

    Directory of Open Access Journals (Sweden)

    DC Collatto

    This paper outlines a multi-criteria analysis model to pinpoint the most suitable energy source for heating aviaries in poultry broiler production, from the point of view of the farmer and under environmental logic. The criteria were identified through an exploratory study in three poultry broiler production units located in the mountain region of Rio Grande do Sul. To rank the energy sources, the Analytic Hierarchy Process was applied. The criteria determined and validated in the research comprised the cost of the energy source, lead time, investment in equipment, energy efficiency, quality of life and environmental impacts. The result of applying the method revealed firewood as the most appropriate energy source for heating. The decision support model developed could be replicated in order to strengthen the criteria and energy alternatives presented, besides identifying new criteria and alternatives that were not considered in this study.

  19. Application of Open Source Technologies for Oceanographic Data Analysis

    Science.gov (United States)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal. We call it OceanXtremes.

  20. Analysis of source regions and meteorological factors for the variability of spring PM10 concentrations in Seoul, Korea

    Science.gov (United States)

    Lee, Jangho; Kim, Kwang-Yul

    2018-02-01

    Cyclostationary empirical orthogonal function (CSEOF) analysis is applied to the springtime (March, April, May) daily PM10 concentrations measured at 23 Ministry of Environment stations in Seoul, Korea for the period 2003-2012. Six meteorological variables at 12 pressure levels are also acquired from the ERA-Interim reanalysis datasets. CSEOF analysis is conducted for each meteorological variable over East Asia. Regression analysis is conducted in CSEOF space between the PM10 concentrations and individual meteorological variables to identify the associated atmospheric conditions for each CSEOF mode. By adding the regressed loading vectors to the mean meteorological fields, the daily atmospheric conditions are obtained for the first five CSEOF modes. The HYSPLIT model is then run with the atmospheric conditions for each CSEOF mode in order to back-trace the air parcels and dust reaching Seoul. The K-means clustering algorithm is applied to identify major source regions for each CSEOF mode of the PM10 concentrations in Seoul. The three main source regions identified based on the mean fields are: (1) the northern Taklamakan Desert (NTD), (2) the Gobi Desert (GD), and (3) the East China industrial area (ECI). The main source regions for the mean meteorological fields are consistent with those of a previous study; 41% of the source locations are located in GD, followed by ECI (37%) and NTD (21%). Back-trajectory calculations based on CSEOF analysis of meteorological variables identify distinct source characteristics associated with each CSEOF mode and greatly facilitate the interpretation of the PM10 variability in Seoul in terms of transport routes and meteorological conditions, including the source area.
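The K-means step that groups back-trajectory endpoints into source regions can be sketched in NumPy. The endpoint coordinates below are synthetic, loosely placed near the three regions named above; they are not the study's trajectories:

```python
import numpy as np

def kmeans(points, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points
        # (keep the old center if a cluster goes empty).
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

# Three synthetic endpoint clusters in (lon, lat):
rng = np.random.default_rng(4)
regions = [np.array([82.0, 40.0]),    # near the Taklamakan Desert
           np.array([105.0, 43.0]),   # near the Gobi Desert
           np.array([118.0, 35.0])]   # near the East China industrial area
points = np.vstack([c + rng.normal(0, 1.0, size=(50, 2)) for c in regions])
centers, labels = kmeans(points, k=3)
```

With well-separated clusters like these, the recovered centers sit near the three region centroids; on real trajectory data the choice of k and the distance metric (great-circle vs. Euclidean) both matter.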

  1. OVAS: an open-source variant analysis suite with inheritance modelling.

    Science.gov (United States)

    Mozere, Monika; Tekman, Mehmet; Kari, Jameela; Bockenhauer, Detlef; Kleta, Robert; Stanescu, Horia

    2018-02-08

    The advent of modern high-throughput genetics continually broadens the gap between the rising volume of sequencing data and the tools required to process them. The need to pinpoint a small subset of functionally important variants has now shifted towards identifying the critical differences between normal variants and disease-causing ones. The ever-increasing reliance on cloud-based services for sequence analysis, and the non-transparent methods they utilize, has prompted the need for more in-situ services that can provide a safer and more accessible environment in which to process patient data, especially in circumstances where continuous internet access is limited. To address these issues, we herein propose our standalone Open-source Variant Analysis Sequencing (OVAS) pipeline, consisting of three key stages of processing that pertain to the separate modes of annotation, filtering, and interpretation. Core annotation performs variant mapping to gene isoforms at the exon/intron level, appends functional data pertaining to the type of variant mutation, and determines hetero-/homozygosity. An extensive inheritance-modelling module, in conjunction with 11 other filtering components, can be used in sequence, ranging from single quality control to multi-file penetrance model specifics such as X-linked recessive or mosaicism. Depending on the type of interpretation required, additional annotation is performed to identify organ specificity through gene expression and protein domains. In the course of this paper we analysed an autosomal recessive case study. OVAS made effective use of the filtering modules to recapitulate the results of the study by identifying the prescribed compound-heterozygous disease pattern from exome-capture sequence input samples.
OVAS is an offline open-source modular-driven analysis environment designed to annotate and extract useful variants from Variant Call Format (VCF) files, and process them under an inheritance context through a top-down filtering schema of
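The compound-heterozygous filter described above can be sketched over minimal parsed variant records: under that model, a candidate gene carries at least two distinct heterozygous variants. The record layout, gene names and variant identifiers below are illustrative, not the OVAS schema:

```python
from collections import defaultdict

# Minimal variant records: (gene, variant_id, genotype), where genotype
# "0/1" is heterozygous and "1/1" homozygous for the alternate allele.
variants = [
    ("GENE_A", "var_1", "0/1"),
    ("GENE_A", "var_2", "0/1"),
    ("GENE_B", "var_3", "0/1"),
]

def compound_het_genes(records):
    """Genes carrying >= 2 distinct heterozygous variants: candidates
    under a compound-heterozygous recessive model (phasing not checked)."""
    per_gene = defaultdict(set)
    for gene, var_id, gt in records:
        if gt == "0/1":
            per_gene[gene].add(var_id)
    return {g for g, vs in per_gene.items() if len(vs) >= 2}

print(compound_het_genes(variants))  # {'GENE_A'}
```

A production filter would additionally check that the two variants are on different parental haplotypes (e.g. via trio data), which is exactly the kind of multi-file penetrance logic the inheritance-modelling module handles.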

  2. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium-Beryllium source

    Energy Technology Data Exchange (ETDEWEB)

    Didi, Abdessamad; Dadouch, Ahmed; Tajmouati, Jaouad; Bekkouri, Hassane [Advanced Technology and Integration System, Dept. of Physics, Faculty of Science Dhar Mehraz, University Sidi Mohamed Ben Abdellah, Fez (Morocco); Jai, Otman [Laboratory of Radiation and Nuclear Systems, Dept. of Physics, Faculty of Sciences, Tetouan (Morocco)

    2017-06-15

    Americium-beryllium (Am-Be) is a neutron-emitting (α, n) source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile source of neutron activity (20 Ci), yielding a small thermal neutron flux that is water moderated. The aim of this study is to develop a model to increase the thermal neutron flux of a source such as Am-Be. This study achieved multiple advantageous results: primarily, it will help us perform neutron activation analysis; next, it will give us the opportunity to produce radioelements with short half-lives. Am-Be single-source and multisource (5 sources) experiments were performed within an irradiation facility with a paraffin moderator. The resulting models mainly increase the thermal neutron flux compared to the traditional method with a water moderator.

  3. Problems of accuracy and sources of error in trace analysis of elements

    International Nuclear Information System (INIS)

    Porat, Ze'ev.

    1995-07-01

    The technological developments in analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy, owing to the high sensitivity of the analytical methods. The sources of error are numerous and fall into three main groups: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.


  5. Source Apportionment and Influencing Factor Analysis of Residential Indoor PM2.5 in Beijing

    Science.gov (United States)

    Yang, Yibing; Liu, Liu; Xu, Chunyu; Li, Na; Liu, Zhe; Wang, Qin; Xu, Dongqun

    2018-01-01

    In order to identify the sources of indoor PM2.5 and to determine which factors influence the concentrations of indoor PM2.5 and its chemical elements, indoor concentrations of PM2.5 and its related elements in residential houses in Beijing were explored. Indoor and outdoor PM2.5 samples, monitored continuously for one week, were collected. Indoor and outdoor concentrations of PM2.5 and 15 elements (Al, As, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, Pb, Se, Tl, V, Zn) were calculated and compared. The median indoor concentration of PM2.5 was 57.64 μg/m3. Among elements in indoor PM2.5, Cd and As may be sensitive to indoor smoking; Zn, Ca and Al may be related to indoor sources other than smoking; and Pb, V and Se may come mainly from outdoors. Five factors were extracted for indoor PM2.5 by factor analysis, explaining 76.8% of the total variance; outdoor sources contributed more than indoor sources. Multiple linear regression analysis for indoor PM2.5, Cd and Pb was performed. Indoor PM2.5 was influenced by outdoor PM2.5, smoking during sampling, outdoor temperature and time of air-conditioner use. Indoor Cd was affected by smoking during sampling, outdoor Cd and building age. Indoor Pb was associated with outdoor Pb, time of window opening per day, building age and relative humidity. In conclusion, indoor PM2.5 comes mainly from outdoor sources, but the contributions of indoor sources cannot be ignored. Factors associated with indoor-outdoor air exchange can influence the concentrations of indoor PM2.5 and its constituents. PMID:29621164
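The factor-analysis step described above can be sketched in a few lines. This is a minimal illustration on synthetic data (the element counts, sample sizes, and latent-source structure are invented; the study's actual data and factor count differ): factor scores per sample and loadings per element are what the source-apportionment interpretation is based on.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic (n_samples x n_elements) matrix of indoor PM2.5 element
# concentrations: two hypothetical latent "sources" (e.g. outdoor
# infiltration, smoking) mixed through random loadings, plus noise.
rng = np.random.default_rng(0)
n_samples, n_elements = 120, 15   # 15 elements as in the study (Al, As, ...)
sources = rng.normal(size=(n_samples, 2))
loadings = rng.normal(size=(2, n_elements))
X = sources @ loadings + 0.1 * rng.normal(size=(n_samples, n_elements))

fa = FactorAnalysis(n_components=5, random_state=0)
scores = fa.fit_transform(X)      # per-sample factor scores
elem_loadings = fa.components_    # per-element loadings for each factor
print(scores.shape, elem_loadings.shape)
```

High loadings of marker elements on a factor (e.g. Cd and As on a "smoking" factor) are what link each extracted factor to a physical source.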

  6. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. Both the relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained by ICP.

  7. Arguments and sources on Italian online forums on childhood vaccinations: Results of a content analysis.

    Science.gov (United States)

    Fadda, Marta; Allam, Ahmed; Schulz, Peter J

    2015-12-16

    Despite being committed to the immunization agenda set by the WHO, Italy is currently experiencing decreasing vaccination rates and increasing incidence of vaccine-preventable diseases. Our aim is to analyze Italian online debates on pediatric immunization through a content-analytic approach, in order to quantitatively evaluate and summarize users' arguments and information sources. Threads were extracted from 3 Italian forums. Threads had to include the keyword Vaccin* in the title, focus on childhood vaccination, and include at least 10 posts. They had to have been started between 2008 and June 2014. High inter-coder reliability was achieved. Exploratory analysis using k-means clustering was performed to identify users' posting patterns for arguments about vaccines and sources. The analysis included 6544 posts mentioning 6223 arguments about pediatric vaccinations and citing 4067 sources. The analysis of argument-posting patterns included users who published a sufficient number of posts; they generated 85% of all arguments on the forums. Three dominant groups were identified: (1) an anti-vaccination group (n=280) posted arguments against vaccination, (2) a general pro-vaccination group (n=222) posted substantially diverse arguments supporting vaccination and (3) a safety-focused pro-vaccination group (n=158) mainly forwarded arguments that questioned the negative side effects of vaccination. The anti-vaccination group was shown to be more active than the others. They cited multiple sources, their own experience and the media as their sources of information. Medical professionals were among the cited sources of all three groups, suggesting that vaccination-adverse professionals are gaining attention. Knowing which information is shared online on the topic of pediatric vaccinations could shed light on why immunization rates have been decreasing and what strategies would be best suited to address parental concerns. This suggests there is a high need for
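The k-means step used to find posting patterns can be sketched as follows. This is a hedged toy example: the per-user counts, group sizes, and the three argument categories are invented to mirror the group sizes reported in the abstract, not the study's actual coding scheme.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic user-by-argument-category count matrix: rows are users,
# columns are counts of posted argument types (anti-vaccination,
# general pro-vaccination, safety-focused). Group sizes follow the
# abstract (280 / 222 / 158); the Poisson rates are illustrative only.
rng = np.random.default_rng(1)
anti = rng.poisson([8, 1, 1], size=(280, 3))
pro = rng.poisson([1, 8, 2], size=(222, 3))
safety = rng.poisson([1, 2, 8], size=(158, 3))
X = np.vstack([anti, pro, safety]).astype(float)

# Cluster users into three posting-pattern groups.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = km.labels_
print(X.shape, len(set(labels)))
```

Each cluster centroid then summarizes the dominant argument mix of that user group, which is how the three patterns in the abstract would be read off.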

  8. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-01-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented
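The MCMC sampling idea can be illustrated with a toy Metropolis sampler. This is not the paper's sampler: it draws a single scalar "source parameter" from an assumed Gaussian posterior (mean 2.0), purely to show the accept/reject mechanics behind sampling a posterior rather than solving for one inverse solution.

```python
import math
import random

# Toy log-posterior over one source parameter r (e.g. an active-region
# radius): proportional to exp(-0.5*(r-2)^2), i.e. Gaussian with mean 2.
def log_post(r):
    return -0.5 * (r - 2.0) ** 2

random.seed(0)
r, samples = 0.0, []
for _ in range(20000):
    prop = r + random.gauss(0.0, 0.5)   # symmetric random-walk proposal
    # Metropolis accept/reject on the log scale
    if math.log(random.random()) < log_post(prop) - log_post(r):
        r = prop
    samples.append(r)

burned = samples[5000:]                 # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 2))                   # posterior mean, expected near 2.0
```

In the actual method the state is a full set of region parameters (number, centers, radii) and the posterior involves the MEG/EEG forward model, but the sampled ensemble is interrogated in the same way: features that persist across samples are the probable ones.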

  9. Sources of Safety Data and Statistical Strategies for Design and Analysis: Clinical Trials.

    Science.gov (United States)

    Zink, Richard C; Marchenko, Olga; Sanchez-Kam, Matilde; Ma, Haijun; Jiang, Qi

    2018-03-01

    There has been an increased emphasis on the proactive and comprehensive evaluation of safety endpoints to ensure patient well-being throughout the medical product life cycle. In fact, depending on the severity of the underlying disease, it is important to plan for a comprehensive safety evaluation at the start of any development program. Statisticians should be intimately involved in this process and contribute their expertise to study design, safety data collection, analysis, reporting (including data visualization), and interpretation. In this manuscript, we review the challenges associated with the analysis of safety endpoints and describe the safety data that are available to influence the design and analysis of premarket clinical trials. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from clinical trials compared to other sources. Clinical trials are an important source of safety data that contribute to the totality of safety information available to generate evidence for regulators, sponsors, payers, physicians, and patients. This work is a result of the efforts of the American Statistical Association Biopharmaceutical Section Safety Working Group.

  10. Automatic multimodal detection for long-term seizure documentation in epilepsy.

    Science.gov (United States)

    Fürbass, F; Kampusch, S; Kaniusas, E; Koren, J; Pirker, S; Hopfengärtner, R; Stefan, H; Kluge, T; Baumgartner, C

    2017-08-01

    This study investigated the sensitivity and false detection rate of a multimodal automatic seizure detection algorithm and its applicability to reduced electrode montages for long-term seizure documentation in epilepsy patients. An automatic seizure detection algorithm based on EEG, EMG, and ECG signals was developed. EEG/ECG recordings of 92 patients from two epilepsy monitoring units, including 494 seizures, were used to assess detection performance. EMG data were extracted by bandpass filtering of EEG signals. Sensitivity and false detection rate were evaluated for each signal modality and for reduced electrode montages. All focal seizures evolving to bilateral tonic-clonic seizures (BTCS, n=50) and 89% of focal seizures (FS, n=139) were detected. Average sensitivity was 94% in temporal lobe epilepsy (TLE) patients and 74% in extratemporal lobe epilepsy (XTLE) patients. Overall detection sensitivity was 86%. Average false detection rate was 12.8 false detections per 24 h (FD/24h) for TLE and 22 FD/24h for XTLE patients. Utilization of 8 frontal and temporal electrodes reduced average sensitivity from 86% to 81%. Our automatic multimodal seizure detection algorithm shows high sensitivity with full and reduced electrode montages. Evaluation of different signal modalities and electrode montages paves the way for semi-automatic seizure documentation systems. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  11. Nuclear microprobe analysis and source apportionment of individual atmospheric aerosol particles

    International Nuclear Information System (INIS)

    Artaxo, P.; Rabello, M.L.C.; Watt, F.; Grime, G.; Swietlicki, E.

    1993-01-01

    In atmospheric aerosol research, one key issue is to determine the sources of the airborne particles. Bulk PIXE analysis coupled with receptor modeling provides a useful, but limited, view of the aerosol sources influencing a particular site or sample. The scanning nuclear microprobe (SNM) technique is a microanalytical technique that gives unique information on individual aerosol particles. In the SNM analyses, a 2.4 MeV proton beam of 1.0 μm size from the Oxford SNM was used. Trace elements with Z>11 were measured by the particle-induced X-ray emission (PIXE) method, with detection limits in the 1-10 ppm range. Carbon, nitrogen and oxygen were measured simultaneously using Rutherford backscattering spectrometry (RBS). Atmospheric aerosol particles were collected at the Brazilian Antarctic Station and at biomass burning sites in the Amazon basin tropical rain forest in Brazil. In the Antarctic samples, sea-salt aerosol particles clearly predominated, with NaCl and CaSO4 as major compounds and several trace elements such as Al, Si, P, K, Mn, Fe, Ni, Cu, Zn, Br, Sr, and Pb. Factor analysis of the elemental data showed the presence of four components: 1) soil dust particles; 2) NaCl particles; 3) CaSO4 with Sr; and 4) Br and Mg. Strontium, observed at 20-100 ppm levels, was always present in the CaSO4 particles. The hierarchical cluster procedure gave results similar to the ones obtained through factor analysis. For the tropical rain forest biomass burning aerosol emissions, biogenic particles with a high organic content dominate the particle population, while K, P, Ca, Mg, Zn, and Si are the dominant elements. Zinc at 10-200 ppm is present in biogenic particles rich in P and K. The quantitative aspects and excellent detection limits make SNM analysis of individual aerosol particles a very powerful analytical tool. (orig.)

  12. Python Materials Genomics (pymatgen): A robust, open-source python library for materials analysis

    OpenAIRE

    Ong, Shyue Ping; Richards, William Davidson; Jain, Anubhav; Hautier, Geoffroy; Kocher, Michael; Cholia, Shreyas; Gunter, Dan; Chevrier, Vincent L.; Persson, Kristin A.; Ceder, Gerbrand

    2012-01-01

    We present the Python Materials Genomics (pymatgen) library, a robust, open-source Python library for materials analysis. A key enabler in high-throughput computational materials science efforts is a robust set of software tools to perform initial setup for the calculations (e.g., generation of structures and necessary input files) and post-calculation analysis to derive useful material properties from raw calculated data. The pymatgen library aims to meet these needs by (1) defining core Pyt...

  13. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    OpenAIRE

    Yang, Bing; Liu, Yan

    2013-01-01

    A ring-plate-type cycloid speed reducer is one of the most important reducers owing to its small volume, compactness, smooth operation, high performance, and high reliability. The vibration and noise tests of the reducer prototype were completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise are obtained based on coherence analysis, and the noise sources are identified. The conclusions provide the bases for further noise research and ...

  14. Modeling and reliability analysis of three phase z-source AC-AC converter

    Directory of Open Access Journals (Sweden)

    Prasad Hanuman

    2017-12-01

    This paper presents small-signal modeling using the state-space averaging technique and a reliability analysis of a three-phase Z-source ac-ac converter. By controlling the shoot-through duty ratio, the converter can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency than the traditional voltage regulator. Small-signal analysis yields the different control transfer functions, which in turn allow the design of a suitable controller for the closed-loop system under supply-voltage variation. The closed-loop system with a PID controller eliminates transients in the output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed on a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using the very-high-speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results during supply-voltage variation of the three-phase Z-source ac-ac converter. A reliability analysis was applied to the converter to determine the failure rates of its different components.

  15. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    Science.gov (United States)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, so their inversion is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis linearly regresses the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, owing to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in this case, homologous gravity and magnetic anomalies can appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed, under the homology condition, into a pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and linearly regressed. The resulting correlation coefficient, slope and intercept indicate the degree of homology, Poisson's ratio and the distribution of remanence, respectively. We test the approach on a synthetic model with complex magnetization; the results show that it can still distinguish a common source under strong remanence and establish Poisson's ratio. Finally, the approach is applied to data from China. The results demonstrate that our approach is feasible.

  16. Performance analyses of Z-source and quasi Z-source inverter for photovoltaic applications

    Science.gov (United States)

    Himabind, S.; Priya, T. Hari; Manjeera, Ch.

    2018-04-01

    This paper presents a comparative analysis of the Z-source and quasi Z-source converters for renewable energy applications. Because renewable energy sources depend on external weather conditions, their output voltage and current change accordingly, which degrades the performance of the traditional voltage-source and current-source inverters (VSI, CSI) connected to them. To overcome the drawbacks of the VSI and CSI, the Z-source inverter (ZSI) and quasi Z-source inverter (QZSI) are used; they can perform multiple conversions (ac-to-dc, dc-to-ac, ac-to-ac, dc-to-dc) and can be used for both buck and boost operation by utilizing the shoot-through zero state. The QZSI is derived from the ZSI topology with a slight change in the impedance network, and it overcomes the drawbacks of the ZSI: in particular, the QZSI draws a constant current from the source. A comparative analysis between the Z-source and quasi Z-source inverters is performed by simulation in the MATLAB/Simulink environment.
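The boost behaviour obtained from the shoot-through zero state follows the standard Z-source relation B = 1/(1 - 2D), where D is the shoot-through duty ratio (valid for D < 0.5). A minimal sketch of this relation (the helper name and sample values are illustrative, not from the paper):

```python
# Z-source inverter boost factor from the shoot-through duty ratio D:
#   B = 1 / (1 - 2*D), valid for 0 <= D < 0.5
def boost_factor(d: float) -> float:
    if not 0.0 <= d < 0.5:
        raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d)

print(boost_factor(0.0))    # 1.0 -> no boost without shoot-through
print(boost_factor(0.25))   # 2.0 -> input voltage effectively doubled
```

Controlling D therefore lets the inverter ride through sags (raise B) and surges (lower B) while keeping the output voltage at its set point.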

  17. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  18. International patent analysis of water source heat pump based on orbit database

    Science.gov (United States)

    Li, Na

    2018-02-01

    Using the Orbit database, this paper analysed the international patents of the water-source heat pump (WSHP) industry with patent analysis methods such as analysis of publication tendency, geographical distribution, technology leaders and top assignees. It is found that the beginning of the 21st century was a period of rapid growth in WSHP patent applications. Germany and the United States researched and developed WSHP technology early on, but Japan and China have now become the leading countries for patent applications. China has been developing faster and faster in recent years, but its patents are concentrated in universities and urgently need to be transferred to industry. Through an objective analysis, this paper aims to provide appropriate decision references for the development of the domestic WSHP industry.

  19. PIXE Analysis and source identification of airborne particulate matter collected in Downtown Havana City

    International Nuclear Information System (INIS)

    Perez, G.; Pinnera, I; Ramos, M; Guibert, R; Molina, E.; Martinez, M.; Fernandez, A.; Aldape, F.; Flores, M.

    2009-01-01

    A set of samples of airborne particulate matter (in two particle size fractions, PM10 and PM2.5) collected during five months, from November 2006 to April 2007, in an urban area of Havana City was analyzed by the particle-induced X-ray emission (PIXE) technique, and the concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were determined consistently in both particle size fractions, with minimum detection limits in the range of ng/m3. A Gent air sampler was used to collect the PM10 and PM2.5 fractions simultaneously, and the PIXE elemental analysis was performed using a 2.5 MeV proton beam from the 2 MV Van de Graaff Tandetron accelerator at the ININ PIXE Laboratory in Mexico. The analytical database provided by PIXE was statistically analyzed in order to determine likely local pollution sources. The statistical techniques of multivariate factor analysis in combination with principal component analysis were applied to these data and allowed the identification of five main pollution sources of airborne particulate matter (PM10 and PM2.5) collected in this area. The main (local) identified sources were: soil dust, sea spray, industry, fossil fuel combustion from motor vehicles, and burning or incineration of diverse materials. A general discussion of these results is presented in this work. (Author)

  20. An Analysis of Air Pollution in Makkah - a View Point of Source Identification

    Directory of Open Access Journals (Sweden)

    Turki M. Habeebullah

    2013-07-01

    Makkah is one of the busiest cities in Saudi Arabia and remains busy all year round, especially during the season of Hajj and the month of Ramadan, when millions of people visit the city. This emphasizes the importance of clean air and of understanding the sources of various air pollutants, which is vital for the management and advanced modeling of air pollution. This study intends to identify the major sources of air pollutants in Makkah, near the Holy Mosque (Al-Haram), using a graphical approach. Air pollutants considered in this study are nitrogen oxides (NOx), nitrogen dioxide (NO2), nitric oxide (NO), carbon monoxide (CO), sulphur dioxide (SO2), ozone (O3) and particulate matter with aerodynamic diameter of 10 μm or less (PM10). Polar plots, time-variation plots and correlation analysis are used to analyse the data and identify the major sources of emissions. Most of the pollutants demonstrate high concentrations during the morning traffic peak hours, suggesting road traffic as the main source of emission. The main sources of pollutant emissions identified in Makkah were road traffic and re-suspended and windblown dust and sand particles. Further investigation on detailed source apportionment is required, which is part of the ongoing project.

  1. Statistical Analysis of the Microvariable AGN Source Mrk 501

    Directory of Open Access Journals (Sweden)

    Alberto C. Sadun

    2018-02-01

    We report on the optical observations and analysis of the high-energy peaked BL Lac object (HBL) Mrk 501, at redshift z = 0.033. We can confirm microvariable behavior over the course of minutes on several occasions per night. As an alternative to the commonly understood dynamical model of random variations in the intensity of the AGN, we develop a relativistic beaming model with a minimum of free parameters, which allows us to infer changes in the line-of-sight angles for the motion of the different relativistic components. We hope our methods can be used in future studies of beamed emission in other active microvariable sources similar to the one we explored.

  2. Feasibility of fissile mass assay of spent nuclear fuel using 252Cf-source-driven frequency-analysis

    International Nuclear Information System (INIS)

    Mattingly, J.K.; Valentine, T.E.; Mihalczo, J.T.

    1996-01-01

    The feasibility was evaluated using MCNP-DSP, an analog Monte Carlo transport code, to simulate source-driven measurements. Models of an isolated Westinghouse 17x17 PWR fuel assembly in a 1500-ppm borated-water storage pool were used. In the models, the fuel burnup profile was represented by seven axial burnup zones, each with isotopics estimated by the PDQ code. Four different fuel assemblies with average burnups from fresh to 32 GWd/MTU were modeled and analyzed. Analysis of the fuel assemblies was simulated by inducing fission in the fuel using a 252Cf source adjacent to the assembly and correlating source fissions with the response of a bank of 3He detectors adjacent to the assembly opposite the source. This analysis was performed at 7 different axial positions on each of the 4 assemblies, and the source-detector cross-spectrum signature was calculated for each of these 28 simulated measurements. The magnitude of the cross-spectrum signature follows a smooth upward trend with increasing fissile material (235U and 239Pu) content, and the signature is independent of the concentration of spontaneously fissioning isotopes (e.g., 244Cm) and (α,n) sources. Furthermore, the cross-spectrum signature is highly sensitive to changes in fissile material content. This feasibility study indicated that the signature would increase by ~100% in response to an increase of only 0.1 g/cm3 of fissile material

  3. Molecular Ionization-Desorption Analysis Source (MIDAS) for Mass Spectrometry: Thin-Layer Chromatography

    Science.gov (United States)

    Winter, Gregory T.; Wilhide, Joshua A.; LaCourse, William R.

    2016-02-01

    Molecular ionization-desorption analysis source (MIDAS), a desorption atmospheric pressure chemical ionization (DAPCI)-type source for mass spectrometry, has been developed as a multi-functional platform for the direct sampling of surfaces. In this article, its utility for the analysis of thin-layer chromatography (TLC) plates is highlighted. Amino acids, which are difficult to visualize without staining reagents or charring, were detected and identified directly from a TLC plate. To demonstrate the full potential of MIDAS, all active ingredients of an analgesic tablet, separated on a TLC plate, were successfully detected using both positive and negative ion modes. The identity of each compound was confirmed from its mass spectrum and compared against standards. Post-separation, the chemical signal (blue permanent marker) from reference marks placed at the origin and solvent front was used to calculate retention factor (Rf) values from the resulting ion chromatogram. The quantitative capabilities of the device were demonstrated by scanning caffeine spots of increasing sample amount on a TLC plate. A linear curve based on peak area, R2 = 0.994, was generated for seven spots ranging from 50 to 1000 ng of caffeine per spot.
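The Rf calculation mentioned above is simply the ratio of the spot's migration distance to the solvent front's, both measured from the origin. A minimal sketch (the distances are hypothetical, not values from the paper):

```python
# Retention factor: distance migrated by the analyte spot divided by
# the distance migrated by the solvent front, measured from the origin.
def retention_factor(spot_mm: float, front_mm: float) -> float:
    if front_mm <= 0 or not 0 <= spot_mm <= front_mm:
        raise ValueError("require 0 <= spot distance <= front distance > 0")
    return spot_mm / front_mm

print(retention_factor(23.0, 50.0))  # 0.46
```

In the MIDAS workflow the two marker signals in the ion chromatogram stand in for the origin and solvent-front positions, so the same ratio can be computed from scan coordinates instead of ruler measurements.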

  4. Prospects for accelerator neutron sources for large volume minerals analysis

    International Nuclear Information System (INIS)

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron linac can be regarded as a practical source of thermal neutrons for activation analysis of large-volume mineral samples. With a suitable target and moderator, a neutron flux of about 10^10 n/cm2/s over 2-3 kg of rock can be generated. The proton linac offers the possibility of a high yield (>10^12 n/s) of fast neutrons at selected energies. For the electron linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and of platinum in chromitite. These elements were selected as they are most likely to justify an accelerator installation at the present time. Errors due to self-shielding of thermal neutrons in gold particles are discussed. The proton linac is considered for neutrons generated from a lithium target through the 7Li(p,n)7Be reaction. The analysis of gold by fast neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton-energy selection, avoids potentially dominant interfering reactions. The analysis of 235U in the presence of 238U and 232Th is also considered. (author)

  5. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    A major challenge facing the prospective deployment of radiation detection systems for homeland security applications is the discrimination of radiological or nuclear 'threat sources' from radioactive, but benign, 'nuisance sources'. Common examples of such nuisance sources include naturally occurring radioactive material (NORM), medical patients who have received radioactive drugs for either diagnostics or treatment, and industrial sources. A sensitive detector that cannot distinguish between 'threat' and 'benign' classes will generate false positives which, if sufficiently frequent, will preclude it from being operationally deployed. In this report, we describe a first-principles physics-based modeling approach that is used to approximate the physical properties and corresponding gamma ray spectral signatures of real nuisance sources. Specific models are proposed for the three nuisance source classes - NORM, medical and industrial. The models can be validated against measured data - that is, energy spectra generated with the model can be compared to actual nuisance source data. We show by example how this is done for NORM and medical sources, using data sets obtained from spectroscopic detector deployments for cargo container screening and urban area traffic screening, respectively. In addition to capturing the range of radioactive signatures of individual nuisance sources, a nuisance source population model must generate sources with a frequency of occurrence consistent with that found in actual movement of goods and people. Measured radiation detection data can indicate these frequencies, but, at present, such data are available only for a very limited set of locations and time periods. In this report, we make more general estimates of frequencies for NORM and medical sources using a range of data sources such as shipping manifests and medical treatment statistics. We also identify potential data sources for industrial

  6. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    Science.gov (United States)

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.
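    The PCA-with-MLR half of the comparison can be sketched in a few lines. The data below are synthetic (two invented source profiles, hypothetical numbers), not the Hamilton Harbour measurements; the sketch keeps components with eigenvalues above 1 and regresses total PAH on the factor scores, mirroring the predicted-versus-measured correlation check described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 26 samples x 6 compounds driven by two hidden sources
strengths = rng.lognormal(size=(26, 2))                  # per-sample source strengths
profiles = np.array([[5.0, 1.0, 4.0, 0.5, 3.0, 1.0],     # hypothetical source profiles
                     [0.5, 4.0, 1.0, 5.0, 1.0, 3.0]])
X = strengths @ profiles + rng.normal(0, 0.1, (26, 6))

# PCA via eigendecomposition of the correlation matrix
Z = (X - X.mean(0)) / X.std(0)
evals, evecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
k = int((evals > 1.0).sum())                             # Kaiser criterion: keep eigenvalues > 1
scores = Z @ evecs[:, :k]                                # factor scores per sample

# MLR step: regress total PAH on the retained factor scores
total = X.sum(axis=1)
A = np.column_stack([np.ones(len(total)), scores])
coef, *_ = np.linalg.lstsq(A, total, rcond=None)
r = np.corrcoef(A @ coef, total)[0, 1]                   # predicted vs. measured correlation
```

PMF differs from this sketch in constraining factor profiles and contributions to be non-negative, which is why it can split a single "vehicular" factor into gasoline and diesel components.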

  7. Analysis of geological material and especially ores by means of a 252Cf source

    International Nuclear Information System (INIS)

    Barrandon, J.N.; Borderie, B.; Melky, S.; Halfon, J.; Marce, A.

    1976-01-01

    Tests were made on the possibilities for analysis by 252Cf activation in the earth sciences and mining research. The results obtained show that while 252Cf activation can only resolve certain very specific geochemical research problems, it does allow the exact and rapid determination of numerous elements whose ores are of great economic importance, such as fluorine, titanium, vanadium, manganese, copper, antimony, barium, and tungsten. The utilization of activation analysis methods in the earth sciences is not a recent phenomenon. It has generally been limited to the analysis of traces in relatively small volumes by means of irradiation in nuclear reactors. Traditional neutron sources were little used and were not very applicable. The development of 252Cf isotopic sources emitting more intense neutron fluxes makes it possible to consider carrying out more sensitive determinations without making use of a nuclear reactor. In addition, this technique can be adapted for in situ analysis in mines and mine borings. Our work, which is centered upon the possibilities of instrumental laboratory analyses of geological materials through 252Cf activation, is oriented in two principal directions: the study of the experimental sensitivities of the various elements in different rocks with the usual compositions; and the study of the possibilities for routine ore analyses

  8. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses worldwide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and use as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate-mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity, and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into its analysis scheme for the screening of solid dosage forms of drugs of abuse.

  9. Radiocarbon Analysis to Calculate New End-Member Values for Biomass Burning Source Samples Specific to the Bay Area

    Science.gov (United States)

    Yoon, S.; Kirchstetter, T.; Fairley, D.; Sheesley, R. J.; Tang, X.

    2017-12-01

    Elemental carbon (EC), also known as black carbon or soot, is an important particulate air pollutant that contributes to climate forcing through absorption of solar radiation and to adverse human health impacts through inhalation. Both fossil fuel combustion and biomass burning, via residential firewood burning, agricultural burning, wild fires, and controlled burns, are significant sources of EC. Our ability to successfully control ambient EC concentrations requires understanding the contribution of these different emission sources. Radiocarbon (14C) analysis has been increasingly used as an apportionment tool to distinguish between EC from fossil fuel and biomass combustion sources. However, there are uncertainties associated with this method, including: 1) uncertainty associated with the isolation of the EC used for radiocarbon analysis (e.g., inclusion of organic carbon, blank contamination, incomplete recovery of EC); and 2) uncertainty associated with the radiocarbon signature of the end member. The objective of this research project is to use laboratory experiments to evaluate some of these uncertainties, particularly for EC sources that significantly impact the San Francisco Bay Area. Source samples of EC only and of mixed EC and organic carbon (OC) were produced for this study to represent known emission sources and to approximate the mixing of EC and OC that would be present in the atmosphere. These samples include a combination of methane flame soot, various wood smoke samples (i.e. cedar, oak, sugar pine, pine of various ages, etc.), meat cooking, and smoldering cellulose smoke. EC fractions were isolated using a Sunset Laboratory thermal-optical transmittance carbon analyzer. For 14C analysis, samples were sent to the Woods Hole Oceanographic Institution for isotope analysis using accelerator mass spectrometry. End-member values and uncertainties for EC isolated using this method will be reported.
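    Once end-member values are in hand, apportionment reduces to a two-end-member isotopic mass balance on fraction modern (F14C). A minimal sketch, assuming a biomass end member of about 1.10 and a 14C-dead fossil end member; both numbers are placeholders (refining the biomass value is precisely the point of the study), and the uncertainty helper is an illustrative first-order propagation, not the authors' method:

```python
def biomass_fraction(f14c_sample, f14c_biomass=1.10, f14c_fossil=0.0):
    """Two-end-member mixing: fraction of EC carbon from biomass burning."""
    return (f14c_sample - f14c_fossil) / (f14c_biomass - f14c_fossil)

def biomass_fraction_sigma(f_s, s_s, f_b=1.10, s_b=0.05):
    """First-order propagation of sample and biomass end-member uncertainties
    (the fossil end member is taken as exactly zero)."""
    return ((s_s / f_b) ** 2 + (f_s * s_b / f_b ** 2) ** 2) ** 0.5
```

For example, a measured F14C of 0.55 against these assumed end members implies that half of the EC carbon is biomass-derived.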

  10. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG-1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  11. "Der unvermeidliche Goethe": Alexander Lernet-Holenia's "Der wahre Werther" in the context of recent "Werther" reception

    OpenAIRE

    Hamacher, Bernd

    2008-01-01

    Lernet-Holenia's "Der wahre Werther" (1959) is a montage: the greater part of the book consists of a reproduction of the first version of Goethe's "Die Leiden des jungen Werthers", published anonymously in 1774. It is preceded by an introduction compiled from Heinrich Gloël's book "Goethes Wetzlarer Zeit" (1911), recounting the biographical background of the novel from Goethe's time in Wetzlar. The montage demonstrates that the counter-reaction against "Werther" continued even in the middle of the 20th c...

  12. A Pilot Study of EEG Source Analysis Based Repetitive Transcranial Magnetic Stimulation for the Treatment of Tinnitus.

    Directory of Open Access Journals (Sweden)

    Hui Wang

    Full Text Available Repetitive Transcranial Magnetic Stimulation (rTMS) is a novel therapeutic tool for inducing suppression of tinnitus. However, the optimal target sites are unknown. We aimed to determine whether low-frequency rTMS induces lasting suppression of tinnitus by decreasing neural activity in the cortex, navigated by high-density electroencephalogram (EEG) source analysis, and to assess the utility of EEG for targeting treatment. In this controlled three-armed trial, seven normal-hearing patients with tonal tinnitus received a 10-day course of 1-Hz rTMS under three conditions: to a cortical target navigated by high-density EEG source analysis, to the left temporoparietal cortex region, and sham stimulation over the left temporoparietal region. The Tinnitus Handicap Inventory (THI) and a visual analog scale (VAS) were used to assess tinnitus severity and loudness. Measurements were taken before, and immediately, 2 weeks, and 4 weeks after the end of the interventions. Low-frequency rTMS decreased tinnitus significantly after active, but not sham, treatment. Responders in the EEG source analysis-based rTMS group, 71.4% (5/7) of patients, experienced a significant reduction in tinnitus loudness, as evidenced by VAS scores. The target site of neuronal generators most consistently associated with a positive response was the frontal lobe of the right hemisphere, localized using high-density EEG. After left temporoparietal rTMS stimulation, 42.8% (3/7) of patients experienced a decrease in tinnitus loudness. Active EEG source analysis-based rTMS resulted in significant suppression of tinnitus loudness, showing the superiority of neuronavigation-guided coil positioning in treating tinnitus. Non-auditory areas should be considered in the pathophysiology of tinnitus; this knowledge can in turn inform further investigation of its mechanisms.

  13. Adapting Controlled-source Coherence Analysis to Dense Array Data in Earthquake Seismology

    Science.gov (United States)

    Schwarz, B.; Sigloch, K.; Nissen-Meyer, T.

    2017-12-01

    Exploration seismology deals with highly coherent wave fields generated by repeatable controlled sources and recorded by dense receiver arrays, whose geometry is tailored to back-scattered energy normally neglected in earthquake seismology. Owing to these favorable conditions, stacking and coherence analysis are routinely employed to suppress incoherent noise and regularize the data, thereby strongly contributing to the success of subsequent processing steps, including migration for the imaging of back-scattering interfaces or waveform tomography for the inversion of velocity structure. Attempts have been made to utilize wave field coherence on the length scales of passive-source seismology, e.g. for the imaging of transition-zone discontinuities or the core-mantle boundary using reflected precursors. Results are, however, often deteriorated by sparse station coverage and by interference of faint back-scattered phases with transmitted phases. USArray sampled wave fields generated by earthquake sources at an unprecedented density, and similar array deployments are ongoing or planned in Alaska, the Alps and Canada. This makes the local coherence of earthquake data an increasingly valuable resource to exploit. Building on the experience of controlled-source surveys, we aim to extend the well-established concept of beam-forming to the richer toolbox that is nowadays used in seismic exploration. We suggest adapted strategies for local data coherence analysis, in which summation is performed with operators that extract the local slope and curvature of wave fronts emerging at the receiver array. Besides estimating wave front properties, we demonstrate that the inherent data summation can also be used to generate virtual station responses at intermediate locations where no actual deployment was performed. Owing to the fact that stacking acts as a directional filter, interfering coherent wave fields can be efficiently separated from each other by means of coherent subtraction. We
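    The local-slope estimation at the heart of this kind of beam-forming can be illustrated with a slant stack over a small synthetic gather; real implementations also fit wave-front curvature, which this sketch omits. All numbers (array geometry, slowness grid, wavelet) are hypothetical:

```python
import numpy as np

dt, nt = 0.01, 400
x = np.linspace(-1.0, 1.0, 21)            # receiver offsets (km) around a central station
t = np.arange(nt) * dt
true_slope = 0.25                         # local slowness of the emerging wave front (s/km)

def wavelet(tau):
    # Gaussian pulse centered at 1.0 s
    return np.exp(-((tau - 1.0) ** 2) / (2 * 0.05 ** 2))

data = np.array([wavelet(t - true_slope * xi) for xi in x])

# Slant stack: for each trial slowness, shift the traces into alignment and sum;
# stack power peaks when the trial slope matches the wave front's local slope.
slownesses = np.linspace(-0.5, 0.5, 101)
power = np.array([
    (np.mean([np.interp(t + p * xi, t, tr) for xi, tr in zip(x, data)], axis=0) ** 2).sum()
    for p in slownesses
])
best_p = slownesses[int(power.argmax())]
```

Because the stack acts as a directional filter, subtracting the best-aligned stack from the gather (coherent subtraction) leaves interfering arrivals with other slopes, as described in the abstract.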

  14. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end, the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources of import-export customs trade data and developed tools supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, among other things: a) search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) selection of items of interest to specific verifications; and c) mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating the ''Nuclear Security Media Monitor'' (NSMM), a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper recalls the trade data sources relevant to non-proliferation and then illustrates the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part presents the main aspects of the NSMM, illustrating some of its uses at JRC. (author)

  15. Noise Source Identification of a Ring-Plate Cycloid Reducer Based on Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Bing Yang

    2013-01-01

    Full Text Available A ring-plate-type cycloid speed reducer is one of the most important reducers owing to its small volume, compact structure, smooth operation, high performance, and high reliability. The vibration and noise tests of the reducer prototype were completed using the HEAD acoustics multichannel noise test and analysis system. The characteristics of the vibration and noise are obtained based on coherence analysis, and the noise sources are identified. The conclusions provide the basis for further noise research and control of the ring-plate-type cycloid reducer.
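    Coherence-based source identification of this kind typically computes the magnitude-squared coherence between a vibration reference on the machine and the microphone signal; frequencies where coherence approaches 1 are attributed to that structural source. A hedged sketch with a synthetic gear-mesh tone (the 350 Hz line and all amplitudes are invented, not measurements from the prototype):

```python
import numpy as np
from scipy.signal import coherence

fs = 4096
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(1)

tone = np.sin(2 * np.pi * 350 * t)                    # hypothetical gear-mesh line
vib = tone + 0.5 * rng.standard_normal(t.size)        # accelerometer on the reducer case
mic = 0.8 * tone + 1.0 * rng.standard_normal(t.size)  # sound-pressure signal

# Magnitude-squared coherence: near 1 where the radiated noise is driven by
# the vibrating structure, near 0 where the two signals are unrelated
f, Cxy = coherence(vib, mic, fs=fs, nperseg=2048)
peak_freq = f[int(Cxy.argmax())]
```

Repeating this against accelerometers at several machine locations ranks candidate sources by how much of the acoustic spectrum each one explains.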

  16. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  17. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in a Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1 h, 2 h, etc.) and the calculation time required is very short. The heating and cooling loads of the building at the chosen time step are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operating characteristic curves of the system's heat pumps, and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground-coupled heat pump installation over a three-year period. (author)
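    The per-time-step energy bookkeeping such a tool performs can be sketched as follows. The loads and COP values below are invented placeholders; the actual tool derives the COP from the heat pumps' characteristic curves and computes ground heat exchange analytically rather than taking these as given:

```python
import numpy as np

def gshp_step(load, cop, mode):
    """One simulation time step of a ground-source heat pump.
    Returns (electricity drawn, heat exchanged with the ground):
    heating -> heat absorbed from the ground, cooling -> heat rejected to it."""
    electricity = load / cop
    if mode == "heating":
        return electricity, load - electricity   # ground supplies the balance
    return electricity, load + electricity       # ground receives load + compressor work

# Hypothetical hourly heating loads (kWh) and COPs read off a characteristic
# curve (COP varies with the ground-loop return temperature)
loads = np.array([10.0, 12.0, 8.0])
cops = np.array([4.2, 4.0, 4.4])
elec = loads / cops
q_ground = loads - elec                          # heat absorbed from the ground
total_electricity = elec.sum()
```

Summing these quantities over a year gives exactly the two outputs the abstract lists: system electricity consumption and heat absorbed from or rejected to the ground.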

  18. Analysis of insertion device magnet measurements for the Advanced Light Source

    International Nuclear Information System (INIS)

    Marks, S.; Humphries, D.; Kincaid, B.M.; Schlueter, R.; Wang, C.

    1993-07-01

    The Advanced Light Source (ALS), which is currently being commissioned at Lawrence Berkeley Laboratory, is a third generation light source designed to produce XUV radiation of unprecedented brightness. To meet the high brightness goal the storage ring has been designed for very small electron beam emittance and the undulators installed in the ALS are built to a high degree of precision. The allowable magnetic field errors are driven by electron beam and radiation requirements. Detailed magnetic measurements and adjustments are performed on each undulator to qualify it for installation in the ALS. The first two ALS undulators, IDA and IDB, have been installed. This paper describes the program of measurements, data analysis, and adjustments carried out for these two devices. Calculations of the radiation spectrum, based upon magnetic measurements, are included. Final field integral distributions are also shown. Good field integral uniformity has been achieved using a novel correction scheme, which is also described

  19. Dust Storms over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis

    Science.gov (United States)

    Moridnejad, A.; Karimi, N.; Ariya, P. A.

    2014-12-01

    The Middle East region is considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms identified on MODIS images during the period between 2001 and 2012, we present a new high-resolution map of the major atmospheric dust source points in this region. To assist environmental managers and decision makers in taking proper, prioritized measures, we then categorize the identified sources by intensity, based on indices extracted for the Deep Blue algorithm, and use a frequency-of-occurrence approach to find the sensitive sources. In the next step, by applying spectral mixture analysis to Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced the dust storm points in the region. Preliminary results of this study indicate for the first time that approximately 39% of all detected source points are located in this newly, anthropogenically desertified area. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate attention at a global scale. Over the next 6 months, further research will be performed to confirm these preliminary results.

  20. Qualitative analysis of precipitation distribution in Poland using different data sources

    Directory of Open Access Journals (Sweden)

    J. Walawender

    2008-04-01

    Full Text Available Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data.

    The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data.

    Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
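    The reclassification and contingency-table step can be sketched outside a GIS with plain arrays. The grids, class edges, and error model below are invented stand-ins for two of the three co-registered layers (gauge-interpolated and radar-derived precipitation):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented stand-ins for two co-registered precipitation grids (mm)
gauge_grid = rng.gamma(2.0, 2.0, size=(50, 50))
radar_grid = gauge_grid * rng.normal(1.0, 0.3, size=(50, 50))

# Reclassification (the raster-algebra step): shared class edges in mm
edges = [0.1, 1.0, 5.0, 10.0]
g = np.digitize(gauge_grid, edges)        # class index 0..4 per cell
r = np.digitize(radar_grid, edges)

# Contingency table: rows = gauge classes, columns = radar classes
n = len(edges) + 1
table = np.zeros((n, n), dtype=int)
np.add.at(table, (g.ravel(), r.ravel()), 1)
agreement = np.trace(table) / table.sum() # fraction of cells in the same class
```

The diagonal fraction plays the role of the agreement statistic behind the pie charts; off-diagonal cells show in which precipitation classes the two data sources disagree.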

  1. A tsunami wave propagation analysis for the Ulchin Nuclear Power Plant considering the tsunami sources of the western part of Japan

    International Nuclear Information System (INIS)

    Rhee, Hyun Me; Kim, Min Kyu; Sheen, Dong Hoon; Choi, In Kil

    2013-01-01

    The accident caused by the Great East Japan earthquake and tsunami of 2011 occurred at the Fukushima Nuclear Power Plant (NPP) site. It is clear that an NPP accident can be caused by a tsunami; therefore, a Probabilistic Tsunami Hazard Analysis (PTHA) for NPP sites should be required in Korea. The PTHA methodology builds on the Probabilistic Seismic Hazard Analysis (PSHA) method and is performed using various tsunami sources and their weights. In this study, fault sources in the northwestern part of Japan, suggested by the Atomic Energy Society of Japan (AESJ), were used as the tsunami sources. Performing a PTHA requires calculating the maximum and minimum wave elevations from tsunami simulations. Thus, in this study, tsunami wave propagation analyses were performed in preparation for the future PTHA study.
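    The way simulated wave elevations and source weights combine in a PTHA can be sketched as a weighted logic tree over candidate fault sources. The elevations, weights, and recurrence rates below are invented placeholders, not AESJ values; the actual PTHA is left to the future study the abstract mentions:

```python
# Invented placeholders: simulated maximum wave elevations (m) at the site,
# logic-tree weights, and annual occurrence rates for three candidate sources
eta_max = {"A": 2.1, "B": 3.4, "C": 1.2}
weight = {"A": 0.5, "B": 0.3, "C": 0.2}
rate = {"A": 1 / 500, "B": 1 / 1000, "C": 1 / 200}

def exceedance_rate(h):
    """Weighted annual rate of a tsunami exceeding height h (crude hazard curve)."""
    return sum(weight[s] * rate[s] for s in eta_max if eta_max[s] > h)

heights = [0.5 * i for i in range(9)]     # 0.0 .. 4.0 m
curve = [exceedance_rate(h) for h in heights]
```

A real PTHA replaces the single elevation per source with a distribution from many propagation runs, but the weighted-sum structure of the hazard curve is the same.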

  2. Top-down regulation of left temporal cortex by hypnotic amusia for rhythm: a pilot study on mismatch negativity.

    Science.gov (United States)

    Facco, Enrico; Ermani, Mario; Rampazzo, Patrizia; Tikhonoff, Valérie; Saladini, Marina; Zanette, Gastone; Casiglia, Edoardo; Spiegel, David

    2014-01-01

    To evaluate the effect of hypnotically induced amusia for rhythm (a condition in which individuals are unable to recognize melodies or rhythms) on mismatch negativity (MMN), 5 highly (HH) and 5 poorly (LH) hypnotizable nonmusician volunteers underwent MMN recording before and during a hypnotic suggestion for amusia. MMN amplitude was recorded using a 19-channel montage and then processed using the low-resolution electromagnetic tomography (LORETA) to localize its sources. MMN amplitude was significantly decreased during hypnotic amusia (p < .04) only in HH, where the LORETA maps of MMN showed a decreased source amplitude in the left temporal lobe, suggesting a hypnotic top-down regulation of activity of these areas and that these changes can be assessed by neurophysiological investigations.

  3. Image acquisition and analysis for beam diagnostics applications of the Taiwan Photon Source

    International Nuclear Information System (INIS)

    Liao, C.Y.; Chen, J.; Cheng, Y.S.; Hsu, K.T.; Hu, K.H.; Kuo, C.H.; Wu, C.Y.

    2012-01-01

    Design and implementation of image acquisition and analysis for Taiwan Photon Source (TPS) diagnostic applications are in progress. The optical system contains a screen, lens, and lighting system. A CCD camera with a Gigabit Ethernet interface (GigE Vision) will be the standard image acquisition device. Image acquisition is done on an EPICS IOC via PV channel access, and image properties are analyzed using Matlab tools to evaluate the beam profile (sigma), beam size, position, tilt angle, etc. The EPICS IOC integrated with Matlab as a data processing system can be used not only for image analysis but also in many other types of equipment data processing applications. Progress of the project is summarized in this report. (authors)
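    Beam size, position, and tilt of a screen image are commonly evaluated from second-order image moments. The sketch below (Python on a synthetic spot, rather than the Matlab tools the TPS system uses) is a hedged illustration of that standard computation:

```python
import numpy as np

# Synthetic screen image: tilted Gaussian beam spot (stand-in for a camera frame)
ny, nx = 120, 160
y, x = np.mgrid[0:ny, 0:nx]
theta, sx, sy = np.deg2rad(10), 12.0, 6.0
xr = (x - 80) * np.cos(theta) + (y - 60) * np.sin(theta)
yr = -(x - 80) * np.sin(theta) + (y - 60) * np.cos(theta)
img = np.exp(-(xr**2 / (2 * sx**2) + yr**2 / (2 * sy**2)))

# Beam position, sigma, and tilt from first- and second-order image moments
w = img / img.sum()
cx, cy = (w * x).sum(), (w * y).sum()                  # centroid (position)
vxx = (w * (x - cx) ** 2).sum()
vyy = (w * (y - cy) ** 2).sum()
vxy = (w * (x - cx) * (y - cy)).sum()
tilt = 0.5 * np.degrees(np.arctan2(2 * vxy, vxx - vyy))
sigma_major = np.sqrt(0.5 * (vxx + vyy)
                      + np.sqrt(0.25 * (vxx - vyy) ** 2 + vxy**2))
```

Background subtraction and a region-of-interest cut around the spot are needed on real camera frames before the moments are trustworthy; the synthetic image skips both.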

  4. Analysis of the image of pion-emitting sources in the source center-of-mass frame

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Yanyu; Feng, Qichun; Huo, Lei; Zhang, Jingbo; Liu, Jianli; Tang, Guixin [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Zhang, Weining [Harbin Institute of Technology, Department of Physics, Harbin, Heilongjiang (China); Dalian University of Technology, School of Physics and Optoelectronic Technology, Dalian, Liaoning (China)

    2017-08-15

    In this paper, we propose a method to extract the image of the pion-emitting source function in the center-of-mass frame of the source (CMFS). We choose identical pion pairs according to the difference of their energies and use these pion pairs to build the correlation function. The purpose is to reduce the effect of ΔEΔt, so that the corresponding imaging result tends toward the real source function. We examine the effect of this method by comparing its results with real source functions extracted directly from models. (orig.)

  5. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-07-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlations, and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71°17'N, 156°47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO2), methane (CH4), carbon monoxide (CO), ozone (O3), the aerosol scattering coefficient, aerosol number concentration, etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between the various chemical species measured at the Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of the species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model is used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure is used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools, such as exploratory data analysis, two-component correlation analysis, trend analysis, and principal components and factor analysis, are used to identify the relationship between the various chemical species and source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on
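    The trajectory-clustering step can be sketched with a minimal k-means on flattened trajectory coordinates. The two synthetic transport pathways below are invented for illustration and are not Barrow climatology:

```python
import numpy as np

rng = np.random.default_rng(3)
n_steps = 10   # positions every 12 h along a 5-day back-trajectory

def make_trajs(bearing_deg, n):
    """Synthetic (lat, lon) back-trajectories from one transport pathway,
    flattened to one feature vector per trajectory."""
    out = []
    for _ in range(n):
        step = 2.0 + 0.3 * rng.standard_normal(n_steps)       # degrees per 12 h
        ang = np.deg2rad(bearing_deg + 10 * rng.standard_normal())
        lat = 71.3 + np.cumsum(step * np.cos(ang))            # receptor near Barrow
        lon = -156.8 + np.cumsum(step * np.sin(ang))
        out.append(np.column_stack([lat, lon]).ravel())
    return np.array(out)

X = np.vstack([make_trajs(200, 15), make_trajs(90, 15)])      # two source regions

# Minimal k-means on the flattened coordinates (deterministic initialization)
k = 2
centroids = X[[0, 15]].copy()
for _ in range(20):
    d = ((X[:, None, :] - centroids[None]) ** 2).sum(-1)      # squared distances
    labels = d.argmin(1)
    centroids = np.array([X[labels == j].mean(0) for j in range(k)])
```

Once trajectories are grouped by pathway, the chemical record can be composited by cluster to relate each species to its likely source region, as the abstract describes.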

  7. Analysis of internal radiation and radiotoxicity sources based on aerosol distribution in RMI

    International Nuclear Information System (INIS)

    Yuwono, I.

    2000-01-01

    Destructive testing of nuclear fuel elements during post-irradiation examination in the Radio Metallurgy Installation (RMI) may cause air contamination in the working area in the form of radioactive aerosol. Inhalation of the radioactive aerosol by a worker makes it an internal radiation source. The potential hazard of a radioactive particle in the body also depends on the particle size. Analysis of the internal radiation source and radiotoxicity showed that in normal operation only natural radioactive materials with high radiotoxicity are found, i.e. Pb-212 and Ac-228. Deposition is highest in the alveolar-interstitial (AI) region, at 95%, and lower in the bronchial (BB) region, at 1%, for particle sizes of 11.7 nm and 350 nm, respectively. (author)

  8. Assessing heavy metal sources in sugarcane Brazilian soils: an approach using multivariate analysis.

    Science.gov (United States)

    da Silva, Fernando Bruno Vieira; do Nascimento, Clístenes Williams Araújo; Araújo, Paula Renata Muniz; da Silva, Luiz Henrique Vieira; da Silva, Roberto Felipe

    2016-08-01

    Brazil is the world's largest sugarcane producer, and soils in the northeastern part of the country have been cultivated with the crop for over 450 years. However, so far, there has been no study on the status of heavy metal accumulation in these long-cultivated soils. To fill the gap, we collected soil samples from 60 sugarcane fields in order to determine the contents of Cd, Cr, Cu, Ni, Pb, and Zn. We used multivariate analysis to distinguish between natural and anthropogenic sources of these metals in soils. Analytical determinations were performed by ICP-OES after microwave-assisted acid digestion. Mean concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were 1.9, 18.8, 6.4, 4.9, 11.2, and 16.2 mg kg(-1), respectively. The first principal component was associated with a lithogenic origin and comprised the metals Cr, Cu, Ni, and Zn. Cluster analysis confirmed that 68 % of the evaluated sites have soil heavy metal concentrations close to the natural background. The Cd concentration (second principal component) was clearly associated with anthropogenic sources, with P fertilization being the most likely source of Cd to soils. On the other hand, the third component (Pb concentration) indicates a mixed origin for this metal (natural and anthropogenic); hence, Pb concentrations are probably related not only to the soil parent material but also to industrial emissions and urbanization in the vicinity of the agricultural areas.
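
    The source-apportionment approach described above (PCA on standardized concentrations, with lithogenic metals loading on one component and an anthropogenic metal on another) can be sketched as follows. This is a minimal illustration on synthetic data; the metal names and factor structure are assumptions for the demo, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Synthetic soil data: a common lithogenic factor drives Cr, Cu, Ni, Zn,
# while Cd varies independently (mimicking a fertilizer-derived input).
litho = rng.normal(size=n)
data = np.column_stack([
    litho + 0.3 * rng.normal(size=n),   # Cr
    litho + 0.3 * rng.normal(size=n),   # Cu
    litho + 0.3 * rng.normal(size=n),   # Ni
    litho + 0.3 * rng.normal(size=n),   # Zn
    rng.normal(size=n),                 # Cd (independent source)
])
metals = ["Cr", "Cu", "Ni", "Zn", "Cd"]

# Standardize, then eigendecompose the correlation matrix (classic PCA).
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = z.T @ z / n
eigval, eigvec = np.linalg.eigh(corr)        # eigh returns ascending order
order = eigval.argsort()[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

pc1 = eigvec[:, 0]                           # loadings of first component
explained = eigval / eigval.sum()            # explained variance ratios
print({m: round(float(l), 2) for m, l in zip(metals, pc1)})
print("PC1 explained variance:", round(float(explained[0]), 2))
```

    On data with this structure, the four lithogenic metals load strongly (and about equally) on PC1 while Cd barely contributes, mirroring how the abstract separates natural from anthropogenic sources.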

  9. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
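
    The core idea of delay-based blind source separation can be illustrated with a simplified second-order method in the AMUSE family (whiten, then diagonalize a time-lagged covariance). This is a generic sketch on synthetic sinusoidal "maternal" and "fetal" rhythms, not the authors' exact algorithm; all signal parameters and the lag are made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur = 250.0, 12.0
t = np.arange(int(fs * dur)) / fs

# Synthetic "maternal" (slower, stronger) and "fetal" (faster, weaker) rhythms.
maternal = np.sin(2 * np.pi * 1.2 * t)
fetal = 0.4 * np.sin(2 * np.pi * 2.9 * t + 0.7)
sources = np.vstack([maternal, fetal])

mixing = np.array([[1.0, 0.6], [0.5, 1.0]])        # unknown channel mixing
x = mixing @ sources + 0.01 * rng.normal(size=(2, t.size))

# Step 1: whiten the observations.
x = x - x.mean(axis=1, keepdims=True)
c0 = x @ x.T / x.shape[1]
d, e = np.linalg.eigh(c0)
z = np.diag(d ** -0.5) @ e.T @ x

# Step 2: diagonalize a time-lagged covariance; the lag would in practice be
# picked from the autocorrelation structure, as the abstract describes.
lag = 25
c_lag = z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)
c_lag = (c_lag + c_lag.T) / 2                      # symmetrize
_, v = np.linalg.eigh(c_lag)
recovered = v.T @ z

# Match recovered components to the true sources by absolute correlation.
corr = np.abs(np.corrcoef(np.vstack([sources, recovered]))[:2, 2:])
print(corr.round(3))
```

    Because the two rhythms have different autocorrelations at the chosen lag, the eigendecomposition separates them; each true source correlates strongly with exactly one recovered component (up to sign and ordering).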

  10. Antioxidants: Characterization, natural sources, extraction and analysis

    OpenAIRE

    OROIAN, MIRCEA; Escriche Roberto, Mª Isabel

    2015-01-01

    [EN] Recently, many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However, none of them contains all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, fo...

  11. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    International Nuclear Information System (INIS)

    Eriksson, E.; Andersen, H. R.; Ledin, A.

    2008-01-01

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.

  12. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, E., E-mail: eve@env.dtu.dk; Andersen, H. R.; Ledin, A. [Technical University of Denmark, Department of Environmental Engineering (Denmark)

    2008-12-15

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.

  13. Dialectical Images in the cinema

    OpenAIRE

    Angélica Antonechen Colombo

    2013-01-01

    This paper aims to discuss cinema as a work of art and its main elements, such as the technical image, montage, and montage's role as an essential factor in aesthetic, perceptive and cognitive variation, from its advent onward, analyzing Eisenstein's intellectual montage on the basis of the theories of Walter Benjamin, Vilém Flusser and Christian Metz.

  14. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
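
    The logic-tree combination step used in such a PSHA can be sketched numerically: each branch yields an annual-exceedance-rate curve, the weighted mean gives the composite hazard curve, and the 475-year PGA is read off by interpolation. All rates, grid values and weights below are illustrative, not the study's results.

```python
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])    # g, illustrative grid

# Annual exceedance rates for three illustrative logic-tree branches
# (e.g., three ground-motion models); the numbers are made up for the sketch.
branches = np.array([
    [2.0e-1, 6.0e-2, 1.0e-2, 3.0e-3, 5.0e-4, 6.0e-5],
    [3.0e-1, 8.0e-2, 1.5e-2, 4.0e-3, 8.0e-4, 1.0e-4],
    [1.5e-1, 5.0e-2, 8.0e-3, 2.0e-3, 4.0e-4, 5.0e-5],
])
weights = np.array([0.4, 0.3, 0.3])                 # branch weights sum to 1

mean_curve = weights @ branches                     # weighted mean hazard curve

# PGA with a 475-year return period (rate = 1/475 per year); interpolation is
# done linearly in log(rate) vs log(PGA), a common PSHA convention.
target = 1.0 / 475.0
pga_475 = np.exp(np.interp(np.log(target),
                           np.log(mean_curve[::-1]), np.log(pga[::-1])))
print(f"475-yr PGA = {pga_475:.3f} g")
```

    The curves are reversed before `np.interp` because it requires increasing abscissae, while hazard curves decrease with PGA.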

  15. The adoption of total cost of ownership for sourcing decisions - a structural equations analysis

    NARCIS (Netherlands)

    Wouters, Marc; Anderson, James C.; Wynstra, Finn

    2005-01-01

    This study investigates the adoption of total cost of ownership (TCO) analysis to improve sourcing decisions. TCO can be seen as an application of activity-based costing (ABC) that quantifies the costs involved in acquiring and using purchased goods or services. TCO supports purchasing

  16. An Analysis of Source Tilting and Sub-cell Opacity Sampling for IMC

    Energy Technology Data Exchange (ETDEWEB)

    Wollaeger, Ryan T. [Los Alamos National Laboratory; Urbatsch, Todd J. [Los Alamos National Laboratory; Wollaber, Allan B. [Los Alamos National Laboratory; Densmore, Jeffery D. [Los Alamos National Laboratory

    2012-08-02

    Implicit Monte Carlo (IMC) is a stochastic method for solving the radiative transfer equations for multiphysics applications with the material in local thermodynamic equilibrium. The IMC method employs a fictitious scattering term that is computed from an implicit discretization of the material temperature equation. Unfortunately, the original histogram representation of the temperature and opacity with respect to the spatial domain leads to nonphysically fast propagation of radiation waves through optically thick material. In the past, heuristic source tilting schemes have been used to mitigate the numerical teleportation error of the radiation particles in IMC that causes this overly rapid radiation wave propagation. While improving the material temperature profile over the simulation time, these tilting schemes alone do not generally reduce the teleportation error to suitable levels. Another means of potentially reducing teleportation error in IMC is to implement continuous sub-cell opacities based on sub-cell temperature profiles. We present here an analysis of source tilting and continuous sub-cell opacity sampling applied to various discretizations of the temperature equation. Through this analysis, we demonstrate that applying both heuristics does not necessarily yield more accurate results if the discretization of the material equation is inconsistent with the Monte Carlo sub-cell transport.

  17. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    Science.gov (United States)

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  18. Security analysis of an untrusted source for quantum key distribution: passive approach

    International Nuclear Information System (INIS)

    Zhao Yi; Qi Bing; Lo, H-K; Qian Li

    2010-01-01

    We present a passive approach to the security analysis of quantum key distribution (QKD) with an untrusted source. A complete proof of its unconditional security is also presented. This scheme has significant advantages in real-life implementations as it does not require fast optical switching or a quantum random number generator. The essential idea is to use a beam splitter to split each input pulse. We show that we can characterize the source using a cross-estimate technique without active routing of each pulse. We have derived analytical expressions for the passive estimation scheme. Moreover, using simulations, we have considered four real-life imperfections: additional loss introduced by the 'plug and play' structure, inefficiency of the intensity monitor, noise of the intensity monitor, and statistical fluctuation introduced by finite data size. Our simulation results show that the passive estimate of an untrusted source remains useful in practice, despite these four imperfections. Also, we have performed preliminary experiments, confirming the utility of our proposal in real-life applications. Our proposal makes it possible to implement the 'plug and play' QKD with the security guaranteed, while keeping the implementation practical.

  19. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API and the rationale for its development are also described. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results show improved hyperspectral analysis and visualization compared to previous literature, we suggest using the PlanetServer approach for such investigations.

  20. Use of the spectral analysis for estimating the intensity of a weak periodic source

    International Nuclear Information System (INIS)

    Marseguerra, M.

    1989-01-01

    This paper deals with the possibility of exploiting spectral methods for the analysis of counting experiments in which one has to estimate the intensity of a weak periodic source of particles buried in a high background. The general theoretical expressions obtained here for the auto- and cross-spectra are applied to three kinds of simulated experiments. In all cases it turns out that the source intensity can actually be estimated with a standard deviation comparable to that obtained in classical experiments in which the source can be moved out. Thus the spectral methods represent an interesting technique, nowadays easy to implement on low-cost computers, which could also be used in many research fields by suitably redesigning classical experiments. The convenience of using these methods in the field of nuclear safeguards is presently being investigated in our Institute. (orig.)
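
    The basic estimation idea can be sketched with a simulated counting experiment: Poisson-distributed counts with a weak sinusoidal modulation on a strong background, and the modulation amplitude read off the Fourier coefficient at the known source frequency. All numbers are illustrative, and this single-bin estimator is only a toy version of the auto-/cross-spectral analysis in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Counting experiment: a weak periodic source (amplitude a counts/bin)
# buried in a strong Poisson background (b counts/bin).
n_bins, b, a, k = 4096, 100.0, 5.0, 64      # k: source frequency in FFT bins
t = np.arange(n_bins)
rate = b + a * np.sin(2 * np.pi * k * t / n_bins)
counts = rng.poisson(rate)

# The periodic source appears as a single spectral line at bin k; for a pure
# sine of amplitude a over N bins, |X[k]| is approximately N*a/2.
spectrum = np.fft.rfft(counts - counts.mean())
a_hat = 2.0 * np.abs(spectrum[k]) / n_bins
print(f"true a = {a}, estimated a = {a_hat:.2f}")
```

    Even though the per-bin background noise (std about 10 counts) swamps the 5-count modulation, the amplitude is recovered well because the Poisson noise spreads over all frequency bins while the source concentrates in one.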

  1. Determination of volatile organic compounds pollution sources in malaysian drinking water using multivariate analysis.

    Science.gov (United States)

    Soh, Shiau-Chian; Abdullah, Md Pauzi

    2007-01-01

    A field investigation was conducted at all water treatment plants throughout 11 states and the Federal Territory in Peninsular Malaysia. The sampling points in this study included treatment plant operations, service reservoir outlets and auxiliary outlet points on the water pipelines. Analysis was performed by solid phase micro-extraction with a 100 microm polydimethylsiloxane fibre using gas chromatography with mass spectrometry detection to analyse 54 volatile organic compounds (VOCs) of different chemical families in drinking water. The concentrations of VOCs ranged from undetectable to 230.2 microg/l. Among all of the VOC species, chloroform had the highest concentration and was detected in all drinking water samples. Average concentrations of total trihalomethanes (THMs) were almost identical among all states, in the range of 28.4-33.0 microg/l. Apart from THMs, other abundant compounds detected were cis- and trans-1,2-dichloroethylene, trichloroethylene, 1,2-dibromoethane, benzene, toluene, ethylbenzene, chlorobenzene, 1,4-dichlorobenzene and 1,2-dichlorobenzene. Principal component analysis (PCA), with the aid of varimax rotation, and the parallel factor analysis (PARAFAC) method were used to statistically verify the correlation between VOCs and the sources of pollution. The multivariate analysis pointed out that the maintenance of auxiliary pipelines in the distribution systems is vital, as they can become a significant point source of pollution to Malaysian drinking water.

  2. Creep analysis of fuel plates for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis including creep, for deformation and stresses due to temperature over a given time span, has been performed and is reported herein.

  3. Neutron activation analysis: Modelling studies to improve the neutron flux of Americium–Beryllium source

    Directory of Open Access Journals (Sweden)

    Abdessamad Didi

    2017-06-01

    Americium-beryllium (Am-Be; n, γ) is a neutron-emitting source used in various research fields such as chemistry, physics, geology, archaeology, medicine, and environmental monitoring, as well as in the forensic sciences. It is a mobile neutron source (20 Ci activity), yielding a small, water-moderated thermal neutron flux. The aim of this study is to develop a model to increase the thermal neutron flux of a source such as Am-Be. This study achieved multiple advantageous results: primarily, it will help us perform neutron activation analysis; next, it will give us the opportunity to produce radio-elements with short half-lives. Experiments with a single Am-Be source and with multiple sources (5 sources) were performed within an irradiation facility with a paraffin moderator. The resulting models markedly increase the thermal neutron flux compared to the traditional method with a water moderator.

  4. Analysis of the Potential of Low-Temperature Heat Pump Energy Sources

    Directory of Open Access Journals (Sweden)

    Pavel Neuberger

    2017-11-01

    The paper deals with an analysis of the temperatures of ground masses in the proximity of linear and slinky-type horizontal ground heat exchangers (HGHEs). It evaluates and compares the potentials of HGHEs and ambient air. The aim of the verification was to gain knowledge of the temperature course of the monitored low-temperature heat pump energy sources during heating periods and periods of stagnation, and to analyse that knowledge in terms of the potential to use those sources for heat pumps. The study was conducted in the years 2012-2015, covering three heating periods and three periods of HGHE stagnation. The results revealed that the linear HGHE had the highest temperature potential of the observed low-temperature heat pump energy sources. The average daily temperatures of the ground mass surrounding the linear HGHE were the highest, ranging from 7.08 °C to 9.20 °C during the heating periods, with the lowest temperature variation range of 12.62-15.14 K; the relative frequency of the average daily temperatures of the ground mass was highest, at 22.64%, in the temperature range containing the mode of all monitored temperatures, the recorded interval of [4.10, 6.00] °C. Ambient air had a lower temperature potential than the monitored HGHEs.

  5. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    InterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  6. Low Intensity Focused tDCS Over the Motor Cortex Shows Inefficacy to Improve Motor Imagery Performance

    Directory of Open Access Journals (Sweden)

    Irma N. Angulo-Sherman

    2017-07-01

    Transcranial direct current stimulation (tDCS) is a brain stimulation technique that can enhance motor activity by stimulating the motor path. Thus, tDCS has the potential to improve the performance of brain-computer interfaces during motor neurorehabilitation. tDCS effects depend on several aspects, including the current density, which usually varies between 0.02 and 0.08 mA/cm², and the location of the stimulation electrodes. Hence, testing tDCS montages at several current levels would allow the selection of current parameters that improve stimulation outcomes and the comparison of montages. In a previous study, we found that cortico-cerebellar tDCS shows potential for enhancing right-hand motor imagery. In this paper, we aim to evaluate the effects of focal stimulation of the motor cortex on motor imagery. In particular, the effect on motor imagery of supplying tDCS with a 4 × 1 ring montage, which consists of placing an anode on the motor cortex and four cathodes around it, was assessed with different current densities. Electroencephalographic (EEG) classification into rest or right-hand/feet motor imagery was evaluated in five healthy subjects for two stimulation schemes: applying tDCS for 10 min on the (1) right-hand or (2) feet motor cortex before EEG recording. Accuracy differences related to the tDCS intensity, as well as μ and β band power changes, were tested for each subject and tDCS modality. In addition, a simulation of the electric field induced by the montage was used to describe its effect on the brain. Results show no improvement trends in classification for the evaluated currents, which is in accordance with the observation of variable EEG band power results despite the focused stimulation. The lack of effects is probably related to underestimation of the current intensity required to apply a particular current density with small electrodes and the relatively short inter-electrode distance. Hence, higher current

  7. A portable measurement system for subcriticality measurements by the CF-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Ragan, G.E.; Blakeman, E.D.

    1988-01-01

    A portable measurement system consisting of a personal computer used as a Fourier analyzer and three detection channels (with associated electronics that provide the signals to analog-to-digital (A/D) converters) has been assembled to measure subcriticality by the 252Cf-source-driven neutron noise analysis method. The 252Cf-source-driven neutron noise analysis method for obtaining the subcritical neutron multiplication factor of a configuration of fissile material requires measurement of the frequency-dependent cross-power spectral density (CPSD) G23(ω) between a pair of detectors (Nos. 2 and 3) located in or near the fissile material, and the CPSDs G12(ω) and G13(ω) between these same detectors and a source of neutrons emanating from an ionization chamber (No. 1) containing 252Cf, also positioned in or near the fissile material. The auto-power spectral density (APSD) G11(ω) of the source is also required. A particular ratio of spectral densities, G12*G13/(G11G23) (* denotes complex conjugation), is then formed. This ratio is related to the subcritical neutron multiplication factor and is independent of detector efficiencies
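
    The efficiency-independence of the spectral-density ratio can be checked numerically with a toy model: detectors that see scaled copies of a common source signal plus independent noise, and a hand-rolled Welch-style averaged cross-spectrum. The gains and signal model below are illustrative assumptions; this sketch shows only the gain-cancellation property, not the reactor-physics interpretation of the ratio.

```python
import numpy as np

def cpsd(x, y, nperseg=256):
    """Welch-style averaged cross-power spectral density, conj(X)*Y."""
    nseg = len(x) // nperseg
    acc = np.zeros(nperseg // 2 + 1, dtype=complex)
    for i in range(nseg):
        sl = slice(i * nperseg, (i + 1) * nperseg)
        acc += np.conj(np.fft.rfft(x[sl])) * np.fft.rfft(y[sl])
    return acc / nseg

rng = np.random.default_rng(1)
n = 256 * 256

# Toy model: channel 1 sees the source directly; detectors 2 and 3 see
# scaled copies of it plus independent noise. Gains 0.8 and 0.5 are arbitrary.
s = rng.normal(size=n)
d1 = s
d2 = 0.8 * s + rng.normal(size=n)
d3 = 0.5 * s + rng.normal(size=n)

g11, g23 = cpsd(d1, d1), cpsd(d2, d3)
g12, g13 = cpsd(d1, d2), cpsd(d1, d3)

# Ratio G12* G13 / (G11 G23), averaged over frequency. The detector gains
# cancel between numerator and denominator, so for this uncorrelated-noise
# toy model the ratio comes out near 1 regardless of the gain values.
ratio = np.mean(np.conj(g12) * g13 / (g11 * g23))
print(f"spectral ratio = {ratio.real:.2f}")
```

    Averaging the cross-spectra over many segments suppresses the uncorrelated-noise terms in G23, which is why the estimate stabilizes near the noise-free value.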

  8. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor, keff, has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, keff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments and the development of theoretical methods to predict the experimental observables

  9. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor keff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, keff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables

  10. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In the implementation of nuclear safeguards, many different techniques are used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, and digital seals to open-source searches and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operational processes. Because these data sets have no or only loose correlations, it could be beneficial to analyze them together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning-based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential value in nuclear safeguards.

  11. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  12. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, with the aim of setting a community-driven gold standard for data handling, reporting and sharing, including arguments for community-wide open source software development and "big data" compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  13. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  14. Fire Hazard Analysis for the Cold Neutron Source System

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-15

    Because the Cold Neutron Source (CNS) system is being designed for installation in HANARO, a fire hazard analysis of the CNS system is required under MOST Notice No. 2003-20, Technical Standard for Fire Hazard Analysis. The hydrogen system of the CNS is filled with highly flammable hydrogen, which serves as the moderator. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated so that appropriate protective precautions can be reflected in the design of the CNS system. For the fire hazard analysis, three accident scenarios were considered: a hydrogen leak while the system is being charged with hydrogen, a hydrogen leak during normal operation of the CNS, and an explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all the hydrogen gas escapes from the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, a buffer tank explosion causes no overpressure damage to the reactor; to guard against flying debris, a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are so few combustibles on the second floor of the CEI that a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be built on the second floor to act as a fire protection barrier against external or internal fires.

  16. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

    While searching for information in scientific databases, we are overwhelmed by the amount of information we encounter. Within this mass of data, extracting information with added value could be strategic for nuclear verification. In our study, we have worked on ''best practices'' for collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). This structured data is analyzed with bibliometric tools in several steps: collecting data related to the paradigm, creating a database to store the data generated by the bibliographic research, and analyzing the data with selected tools. From the analysis of bibliographic data alone, we are able to obtain: a panoramic view of the countries that publish in the paradigm, co-publication networks, the organizations that contribute to scientific publications, the countries with which a given country collaborates, and the areas of interest of a country. We are thus able to identify a target. In a second phase, we can focus on a target (countries, for example). Work with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and requires other tools to be added to the process, as we discuss in this paper. In information analysis, methodology and expert analysis are essential; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness of the use of open source S&T information and of the management of that information over time. Examples are shown. (author)
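    The bibliometric counts described above (publications per country, co-publication networks) reduce to simple tallies over the affiliation sets attached to each record. A minimal sketch, using hypothetical country-code records rather than any real bibliographic data:

```python
from collections import Counter
from itertools import combinations

# hypothetical bibliographic records: the set of country affiliations per publication
records = [
    {"FR", "DE"},
    {"FR", "DE", "US"},
    {"US"},
    {"FR", "JP"},
]

# panoramic view: number of publications per country
pub_counts = Counter(c for rec in records for c in rec)

# co-publication network: each unordered country pair appearing on the same paper
copub_edges = Counter(frozenset(pair)
                      for rec in records
                      for pair in combinations(sorted(rec), 2))
```

The edge weights of `copub_edges` give the strength of each collaboration link, which is the raw material for the co-publication network graphs mentioned in the abstract.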

  17. Determination of sources and analysis of micro-pollutants in drinking water

    International Nuclear Information System (INIS)

    Md Pauzi Abdullah; Soh Shiau Chian

    2005-01-01

    The objectives of the study are to develop and validate selected analytical methods for the analysis of micro-organics and metals in water; to identify, monitor and assess the levels of micro-organics and metals in drinking water supplies; to evaluate the relevance of the guidelines set in the National Standard of Drinking Water Quality 2001; and to identify the sources of pollution and carry out risk assessments of exposure through drinking water. The presentation discusses the progress of the work, including the determination of volatile organic compounds (VOCs) in drinking water using solid phase micro-extraction (SPME) techniques, the analysis of heavy metals in drinking water, the determination of Cr(VI) by inductively coupled plasma emission spectrometry (ICPES), and the detection of trace concentrations of halogenated volatile organic compounds (HVOCs), which are heavily used by the agricultural sector, in waters.

  18. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Directory of Open Access Journals (Sweden)

    Eman Salih Al-Shamery

    2017-02-01

    In this research, the shingle algorithm with the Jaccard method is employed as a new approach to detect source deception in addition to plagiarism. Source deception occurs when a particular text is taken from one source but attributed to another, while plagiarism occurs when a document takes part or all of a text belonging to another research work. The approach is based on the shingle algorithm with the Jaccard coefficient: shingling is an efficient way to compare the sets of shingles in text files, which are used as features to measure the syntactic similarity of documents, and the Jaccard coefficient measures the similarity between sample sets. In the proposed system, a text is checked for syntactic plagiarism and given a percentage of similarity with other documents. Research sources are also checked for source deception by matching them against the available sources from the Turnitin report of the same research, again using the shingle algorithm with the Jaccard coefficient. The motivation of this work is the discovery of literary theft occurring in research, especially in student research, and of deception in cited sources.
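    The shingling-plus-Jaccard comparison described above can be sketched in a few lines. The word-level shingle size k=3 and the sample sentences are illustrative choices, not parameters taken from the paper:

```python
def shingles(text, k=3):
    """Split text into overlapping word-level k-shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard coefficient: |A ∩ B| / |A ∪ B| (1.0 for two empty sets)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog"
doc2 = "the quick brown fox leaps over the lazy dog"
similarity = jaccard(shingles(doc1), shingles(doc2))
```

A single changed word invalidates every shingle that overlaps it, so here the one-word substitution drops the similarity to 4 shared shingles out of 10, i.e. 0.4; reporting this ratio as a percentage matches the "percentage of similarity" the system produces.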

  19. Organic aerosol source apportionment in London 2013 with ME-2: exploring the solution space with annual and seasonal analysis

    Directory of Open Access Journals (Sweden)

    E. Reyes-Villegas

    2016-12-01

    The multilinear engine (ME-2) factorization tool is being widely used following the recent development of the Source Finder (SoFi) interface at the Paul Scherrer Institute. However, the success of this tool, when using the a-value approach, depends largely on the inputs (i.e. target profiles) applied, as well as on the experience of the user. A strategy to explore the solution space is proposed, in which the solution that best describes the organic aerosol (OA) sources is determined according to the systematic application of predefined statistical tests. This includes trilinear regression, which proves to be a useful tool for comparing different ME-2 solutions. Aerosol Chemical Speciation Monitor (ACSM) measurements were carried out at the urban background site of North Kensington, London from March to December 2013, where for the first time the behaviour of OA sources and their possible environmental implications were studied using an ACSM. Five OA sources were identified: biomass burning OA (BBOA), hydrocarbon-like OA (HOA), cooking OA (COA), semivolatile oxygenated OA (SVOOA) and low-volatility oxygenated OA (LVOOA). ME-2 analysis of the seasonal data sets (spring, summer and autumn) showed a higher variability in the OA sources that was not detected in the combined March–December data set; this variability was explored with the triangle plots f44:f43 and f44:f60, in which a high variation of SVOOA relative to LVOOA was observed in the f44:f43 analysis. Hence, it was possible to conclude that important information may be lost when source apportionment is applied to long-term measurements, and that the analysis should instead be carried out over short periods of time, such as seasons. Further analysis of the atmospheric implications of these OA sources was carried out, identifying evidence of a possible contribution of heavy-duty diesel vehicles to air pollution during weekdays, compared with petrol-fuelled vehicles.

  20. Source location in plates based on the multiple sensors array method and wavelet analysis

    International Nuclear Information System (INIS)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on the plate and the distance between the impact position and the sensor must be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained from the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
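    The final localization step described above combines a bearing (from MUSIC) with a range computed as group velocity times time delay. A minimal sketch of that geometry, with hypothetical numbers for the bearing, Lamb-wave group velocity and wavelet-estimated delay (the function name and values are illustrative, not the paper's):

```python
import math

def impact_location(sensor_xy, bearing_deg, group_velocity, time_delay):
    """Locate an impact from one sensor array position: the direction of
    arrival gives the bearing, and range = group_velocity * time_delay."""
    r = group_velocity * time_delay              # distance along the bearing
    theta = math.radians(bearing_deg)
    x0, y0 = sensor_xy
    return (x0 + r * math.cos(theta), y0 + r * math.sin(theta))

# hypothetical values: 2000 m/s group velocity, 0.5 ms delay, 30 degree bearing
x, y = impact_location((0.0, 0.0), 30.0, 2000.0, 0.5e-3)
```

With these numbers the range is 1 m, so the estimate lands at roughly (0.866, 0.5) m relative to the array.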

  2. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to the estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study the containment response during this severe accident. The thermal-hydraulic response of the containment and the radionuclide transport and retention in the containment are also studied. The results are described as transient variations of the source terms, which are then used for studying off-site radiological consequences and health effects in support of the Conceptual Safety Analysis Report for the ANS. The results are also used to examine the effectiveness of subpile room flooding during this type of severe accident.

  3. Application of group analysis to the spatially homogeneous and isotropic Boltzmann equation with source using its Fourier image

    International Nuclear Information System (INIS)

    Grigoriev, Yurii N; Meleshko, Sergey V; Suriyawichitseranee, Amornrat

    2015-01-01

    Group analysis of the spatially homogeneous and molecular-energy-dependent Boltzmann equation with a source term is carried out. The Fourier transform of the Boltzmann equation with respect to the molecular velocity variable is considered. The corresponding determining equation of the admitted Lie group is reduced to a partial differential equation for the admitted source. The latter equation is analyzed by an algebraic method. A complete group classification of the Fourier transform of the Boltzmann equation with respect to the source function is given. The representation of invariant solutions and the corresponding reduced equations for all obtained source functions are also presented. (paper)

  4. Analysis of primary teacher stress' sources

    Directory of Open Access Journals (Sweden)

    Katja Depolli Steiner

    2011-12-01

    Teachers are subject to many different work stressors. This study focused on differences in the intensity and frequency of potential stressors facing primary schoolteachers, with the goal of identifying the most important sources of teacher stress in primary school. The study included 242 primary schoolteachers from different parts of Slovenia. We used the Stress Inventory, which is designed to identify the intensity and frequency of 49 situations that can act as work stressors for teachers. The findings showed that the major sources of stress facing teachers are factors related to work overload, factors stemming from pupils' behaviour and motivation, and factors related to the school system. The results also showed some small differences in the perception of stressors between different groups of teachers (by gender and by teaching level).

  5. Detrended fluctuation analysis for major depressive disorder.

    Science.gov (United States)

    Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah

    2015-01-01

    The clinical utility of electroencephalography (EEG) based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method involves feature extraction, feature selection, classification and validation. The EEG data were acquired under eyes closed (EC) and eyes open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents; DFA assesses the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG. The scaling exponents were used as input features to the proposed system. At the feature selection stage, 3 different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed, and the method was validated by 10-fold cross-validation. We also examined the effect of 3 different reference montages on the computed features. The results show that the DFA analysis performed better on LE data than on IR and AR data, whereas in the Wilcoxon ranking the AR montage performed better than LE and IR. Based on these results, it was concluded that DFA provides useful information for discriminating MDD patients and, with further validation, could be employed in clinics for the diagnosis of MDD.
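    The scaling-exponent feature at the heart of the method can be illustrated with a compact DFA implementation: integrate the demeaned signal, detrend it linearly in windows of several sizes, and fit the slope of log fluctuation versus log window size. The scale list and the white-noise test signal are illustrative, not the study's EEG settings:

```python
import numpy as np

def dfa_exponent(signal, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent, i.e. the
    slope of log F(n) versus log n over the window sizes n in `scales`."""
    x = np.cumsum(signal - np.mean(signal))    # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(x) // n
        segments = x[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        resid = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)       # local linear trend
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(resid)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

# white noise has no long-range temporal correlations, so alpha is near 0.5
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(4096))
```

An exponent near 0.5 indicates an uncorrelated signal, while values approaching 1.0 indicate the long-range temporal correlations (LRTC) the study uses as discriminative features.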

  6. Source-space ICA for MEG source imaging.

    Science.gov (United States)

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (MEG) source imaging is the application of an inverse technique (such as dipole modelling or sLORETA) to the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer high spatial resolution. However, sensor-space ICA + beamformer is not an ideal combination for obtaining both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources. We propose source-space ICA for MEG as a powerful alternative approach that provides the high spatial resolution of the beamformer while handling multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA in both simulated and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in the spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli was also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of the minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  7. Jet flow analysis of liquid poison injection in a CANDU reactor using source term

    International Nuclear Information System (INIS)

    Chae, Kyung Myung; Choi, Hang Bok; Rhee, Bo Wook

    2001-01-01

    For the performance analysis of the Canadian deuterium uranium (CANDU) reactor shutdown system number 2 (SDS2), a computational fluid dynamics model of the poison jet flow has been developed to estimate the flow field and the poison concentration formed inside the CANDU reactor calandria. As the ratio of the calandria shell radius to the injection nozzle hole diameter is very large (1055), it is impractical to develop a full-size model encompassing the whole calandria shell. To reduce the model to a manageable size, a quarter of a one-pitch-length segment of the shell was modeled, exploiting the symmetric nature of the jet, and the injected jet was treated as a source term to avoid the modeling difficulty caused by the large difference in hole sizes. For the analysis of an actual CANDU-6 SDS2 poison injection, the grid structure was determined based on the results of two-dimensional real-jet and source-jet simulations. The maximum injection velocity of the liquid poison is 27.8 m/s and the mass fraction of the poison is 8000 ppm (mg/kg). The simulation results show a well-established jet flow field. In general, the jet develops narrowly at first but then stretches rapidly. The flow recirculates a little in the r-x plane, while it recirculates strongly in the r-θ plane. As time goes on, the adjacent jets contact each other and form a wavy front, such that the whole jet develops into a plate form. This study has shown that the source term model can be used effectively for the analysis of the poison injection, and that the simulation result for the CANDU reactor is consistent with the model currently used for the safety analysis. In future work, it is strongly recommended to analyze the transient (from the helium tank to the injection nozzle hole) of the poison injection by applying the Bernoulli equation with real boundary conditions.

  8. MADAM - An open source meta-analysis toolbox for R and Bioconductor

    Directory of Open Access Journals (Sweden)

    Graber Armin

    2010-03-01

    Background: Meta-analysis is a major theme in biomedical research. In the present paper we introduce a package for R and Bioconductor that provides useful tools for performing this type of work. One idea behind the development of MADAM was that many meta-analysis methods available in R cannot yet exploit parallel computing; in this first version, we implemented one meta-analysis method in such a parallel manner. Additionally, we provide tools for combining the results from a set of methods in an ensemble approach, as well as functionality for visualizing results. Results: The presented package enables meta-analysis either by providing functions directly or by wrapping existing implementations. Overall, five different meta-analysis methods are now usable through MADAM, along with another three methods for combining the corresponding results. Visualizing the results is eased by three included functions, and a mock-up data generator is integrated for developing and testing meta-analysis methods. Conclusions: MADAM enables a user to focus on one package and thereby work with the same data types across a set of methods. By making use of the snow package, MADAM can be made compatible with an existing parallel computing infrastructure. MADAM is open source and freely available within CRAN http://cran.r-project.org.
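    To give a flavour of the kind of result combination such a toolbox performs, here is a sketch of Fisher's method for pooling independent p-values, a standard meta-analysis technique; the abstract does not name MADAM's five methods, so this is a generic illustration rather than one of them:

```python
import math

def fisher_combine(p_values):
    """Fisher's method for meta-analysis: combine independent p-values
    (each in (0, 1]). X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the global null."""
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    # the chi-square survival function with even d.o.f. 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = stat / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

combined = fisher_combine([0.01, 0.03, 0.20])
```

Three moderate studies combine into evidence stronger than any single one (here roughly p = 0.0035), which is precisely the leverage that ensemble combination of meta-analysis results aims for.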

  9. Molecular evolution in court: analysis of a large hepatitis C virus outbreak from an evolving source.

    Science.gov (United States)

    González-Candelas, Fernando; Bracho, María Alma; Wróbel, Borys; Moya, Andrés

    2013-07-19

    Molecular phylogenetic analyses are used increasingly in the epidemiological investigation of outbreaks and transmission cases involving rapidly evolving RNA viruses. Here, we present the results of such an analysis that contributed to the conviction of an anesthetist as being responsible for the infection of 275 of his patients with hepatitis C virus. We obtained sequences of the NS5B and E1-E2 regions in the viral genome for 322 patients suspected to have been infected by the doctor, and for 44 local, unrelated controls. The analysis of 4,184 cloned sequences of the E1-E2 region allowed us to exclude 47 patients from the outbreak. A subset of patients had known dates of infection. We used these data to calibrate a relaxed molecular clock and to determine a rough estimate of the time of infection for each patient. A similar analysis led to an estimate for the time of infection of the source. The date turned out to be 10 years before the detection of the outbreak. The number of patients infected was small at first, but it increased substantially in the months before the detection of the outbreak. We have developed a procedure to integrate molecular phylogenetic reconstructions of rapidly evolving viral populations into a forensic setting adequate for molecular epidemiological analysis of outbreaks and transmission events. We applied this procedure to a large outbreak of hepatitis C virus caused by a single source and the results obtained played a key role in the trial that led to the conviction of the suspected source.

  10. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.

  12. Open source in Finnish software companies

    OpenAIRE

    Seppä, Arto

    2006-01-01

    This paper explores survey data on open source software supply collected from 170 Finnish software firms, using descriptive statistical analysis. The first half of the report contains general data about the software companies and the differences between proprietary and open source firms. The second half focuses on open source firms. The subjects of analysis are copyrights, product and service supply, the firms' relationships with the open source community, and their views on opportunities ...

  13. Updated pipe break analysis for Advanced Neutron Source Reactor conceptual design

    International Nuclear Information System (INIS)

    Wendel, M.W.; Chen, N.C.J.; Yoder, G.L.

    1994-01-01

    The Advanced Neutron Source Reactor (ANSR) is a research reactor to be built at the Oak Ridge National Laboratory that will supply the highest continuous neutron flux levels of any reactor in the world. It uses plate-type fuel with high-mass-flux, highly subcooled heavy water as the primary coolant. The Conceptual Safety Analysis for the ANSR was completed in June 1992. The thermal-hydraulic pipe-break safety analysis (performed with a specialized version of RELAP5/MOD3) focused primarily on double-ended guillotine breaks of the primary piping and on some core-damage mitigation options for such an event. Smaller, instantaneous pipe breaks in the cold- and hot-leg piping were also analyzed to a limited extent. Since the initial analysis for the conceptual design was completed, several important changes to the RELAP5 input model have been made, reflecting improvements in the fuel grading and changes in the elevation of the primary coolant pumps. Also, a new philosophy for pipe-break safety analysis (similar to that adopted for the New Production Reactor) emphasizes instantaneous, limited-flow-area pipe-break accidents in addition to finite-opening-time, double-ended guillotine breaks of the major coolant piping. This paper presents the results of the most recent instantaneous pipe-break calculations.

  14. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  15. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of the plant performance criteria (with R2 ...), providing insight into devising useful ways of reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
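    The sampling-based computation of standardized regression coefficients mentioned above can be sketched as follows: Monte Carlo sample the uncertain inputs, run the model, regress the output on the inputs, and scale each coefficient by the input/output standard deviations. The three-input linear toy model here is a stand-in for the (far more complex) Benchmark Simulation Model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo sample of three hypothetical uncertain model inputs
n = 2000
x = rng.normal(0.0, 1.0, (n, 3))
# toy "plant performance" model: input 0 dominates, input 2 is nearly inert
y = 3.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2] + rng.normal(0.0, 0.1, n)

# linear regression of the output on the inputs (with an intercept column),
# then standardization: SRC_i = beta_i * std(x_i) / std(y)
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)
src = beta[1:] * x.std(axis=0) / y.std()
ranking = np.argsort(-np.abs(src))   # most influential input first
```

The squared SRCs sum to roughly the model's R2, which is why the decomposition of output variance only works well when that R2 is high, as the abstract's caveat implies.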

  16. Preliminary radiation transport analysis for the proposed National Spallation Neutron Source (NSNS)

    International Nuclear Information System (INIS)

    Johnson, J.O.; Lillie, R.A.

    1997-01-01

    The use of neutrons in science and industry has increased continuously during the past 50 years, with applications now widely used in physics, chemistry, biology, engineering, and medicine. Within this history, the relative merits of pulsed accelerator spallation sources versus reactors have been weighed in choosing the preferred neutron source option for the future. To address this future need, the Department of Energy (DOE) has initiated a pre-conceptual design study for the National Spallation Neutron Source (NSNS) and given preliminary approval for the proposed facility to be built at Oak Ridge National Laboratory (ORNL). The DOE directive is to design and build a short pulse spallation source in the 1 MW power range with sufficient design flexibility that it can be upgraded and operated at a significantly higher power at a later stage. The pre-conceptual design of the NSNS initially consists of an accelerator system capable of delivering a 1 to 2 GeV proton beam with 1 MW of beam power in an approximate 0.5 microsecond pulse at a 60 Hz frequency onto a single target station. The NSNS will be upgraded in stages to a 5 MW facility with two target stations (a high power station operating at 60 Hz and a low power station operating at 10 Hz). Each target station will contain four moderators (combinations of cryogenic and ambient temperature) and 18 beam lines for a total of 36 experiment stations. This paper summarizes the radiation transport analysis strategies for the proposed NSNS facility.

  17. Modeling, analysis, and design of stationary reference frame droop controlled parallel three-phase voltage source inverters

    DEFF Research Database (Denmark)

    Vasquez, Juan Carlos; Guerrero, Josep M.; Savaghebi, Mehdi

    2013-01-01

    Power electronics based MicroGrids consist of a number of voltage source inverters (VSIs) operating in parallel. In this paper, the modeling, control design, and stability analysis of parallel connected three-phase VSIs are derived. The proposed voltage and current inner control loops and the mat… control restores the frequency and amplitude deviations produced by the primary control. Also, a synchronization algorithm is presented in order to connect the MicroGrid to the grid. Experimental results are provided to validate the performance and robustness of the parallel VSI system control…

  18. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  19. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems implying an intensive system control variable. (author)

  20. Phenotypic and genotypic analysis of bio-serotypes of Yersinia enterocolitica from various sources in Brazil.

    Science.gov (United States)

    Rusak, Leonardo Alves; dos Reis, Cristhiane Moura Falavina; Barbosa, André Victor; Santos, André Felipe Mercês; Paixão, Renata; Hofer, Ernesto; Vallim, Deyse Christina; Asensi, Marise Dutra

    2014-12-15

    Yersinia enterocolitica is a well-known foodborne pathogen widely distributed in nature with high public health relevance, especially in Europe. This study aimed to analyze the pathogenic potential of Y. enterocolitica strains isolated from human, animal, food, and environmental sources and from different regions of Brazil by detecting the virulence genes inv, ail, ystA, and virF through polymerase chain reaction (PCR), phenotypic tests, and antimicrobial susceptibility analysis. Pulsed-field gel electrophoresis (PFGE) was used for the assessment of phylogenetic diversity. All virulence genes were detected in 11/60 (18%) strains of serotype O:3, biotype 4 isolated from human and animal sources. Ten human strains (4/O:3) presented three chromosomal virulence genes, and nine strains of biotype 1A presented the inv gene. Six (10%) strains were resistant to sulfamethoxazole-trimethoprim, seven (12%) to tetracycline, and one (2%) to amikacin, all of which are used to treat yersiniosis. AMP-CEF-SXT was the predominant resistance profile. PFGE analysis revealed 36 unique pulsotypes, grouped into nine clusters (A to I) with similarity ≥ 85%, generating a diversity discriminatory index of 0.957. Cluster A comprised all bio-serotype 4/O:3 strains isolated from animal and human sources. This study shows the existence of strains with the same genotypic profiles, bearing all virulence genes, from human and animal sources, circulating among several Brazilian states. This supports the hypothesis that swine is likely to serve as a main element in Y. enterocolitica transmission to humans in Brazil, and it could become a potential threat to public health as in Europe.

  1. Dataset on statistical analysis of editorial board composition of Hindawi journals indexed in Emerging sources citation index

    Directory of Open Access Journals (Sweden)

    Hilary I. Okagbue

    2018-04-01

    This data article contains the statistical analysis of the total, percentage and distribution of editorial board composition of 111 Hindawi journals indexed in Emerging Sources Citation Index (ESCI) across the continents. The reliability of the data was shown using correlation, goodness-of-fit tests, analysis of variance and statistical variability tests. Keywords: Hindawi, Bibliometrics, Data analysis, ESCI, Random, Smart campus, Web of science, Ranking analytics, Statistics

  2. Modernisierung der Steuerung einer EDT-Maschine (Modernization of the control system of an EDT machine)

    OpenAIRE

    Rohrer, Thomas; Marcuard, Jean-Daniel

    2009-01-01

    Objective: Due to a lack of spare parts, the control system of an EDT machine at Novelis must be replaced. The new logic control with position regulation is to be implemented on a programmable logic controller (PLC). The requested work comprises programming the PLC and designing the electrical schematics. Assembly will be carried out by the Novelis electricians. As soon as the assembly work is finished, the installation must be commissioned. Results: The electrical schematics made it possible to…

  3. Design of a setup for {sup 252}Cf neutron source for storage and analysis purpose

    Energy Technology Data Exchange (ETDEWEB)

    Hei, Daqian [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Zhuang, Haocheng [Xi’an Middle School of Shanxi Province, Xi’an 710000 (China); Jia, Wenbao, E-mail: jiawenbao@163.com [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China); Cheng, Can; Jiang, Zhou; Wang, Hongtao [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Chen, Da [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China)

    2016-11-01

    {sup 252}Cf is a reliable isotopic neutron source widely used in the prompt gamma ray neutron activation analysis (PGNAA) technique. A cylindrical barrel made of polymethyl methacrylate and filled with boric acid solution was designed for the storage and application of a 5 μg {sup 252}Cf neutron source. The size of the setup was optimized with a Monte Carlo code. Experiments were performed, and the results showed that the doses were reduced by the setup to below the allowable limit. The intensity and collimating radius of the neutron beam can also be adjusted through different collimators.

  4. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  5. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    Science.gov (United States)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the…

  6. Application of radionuclide sources for excitation in energy-dispersive X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Hoffmann, P.

    1986-01-01

    X-ray fluorescence (XRF) analysis finds broad application in many fields of science where elemental determinations are necessary. Solid and liquid samples are analyzed by this method. Solids are introduced as thin or thick samples in the form of melted glass, pellets, powders or original specimens. The excitation of X-ray spectra can be performed by specific and polychromatic radiation from X-ray tubes; by protons, deuterons, α-particles, heavy ions and synchrotron radiation from accelerators; and by α-particles, X- and γ-rays and bremsstrahlung generated by β−-particles from radionuclide sources. The radionuclides are divided into groups with respect to their decay mode and the energy of the emitted radiation. The broad application of radionuclides in XRF excitation is shown in examples such as the semi-quantitative analysis of glasses, the quantitative analysis of coarse ceramics and the quantitative determination of heavy elements (mainly actinides) in solutions. The advantages and disadvantages of radionuclide excitation in XRF analysis are discussed. (orig.) [de

  7. Contents and retentions of free and total purine bases in lamb meat cooked by several household methods

    Directory of Open Access Journals (Sweden)

    P. Anfossi

    2011-03-01

    Concerns about the content of total and free purine bases in muscle foods and their retention upon cooking have long been established (Brulé et al., 1988). Recently, though, an important role has been acknowledged for dietary sources of preformed purines in the growth of tissues with a rapid turnover and in the optimal function of the cellular immune response, to the point that the positive features of these nutrients seem to outweigh by far the negative ones (ILSI, 1998). Scanty information exists about the total purine content of raw ovine meat, the only available sources of data being a survey by Herbel and Montag (1987) on purine and pyrimidine contents of protein-rich foods and the comprehensive collection of food composition tables compiled by Scherz and Senser (1994)...

  8. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  9. The application of x-ray spectrometry to isotopic-source activation analysis of dysprosium and holmium

    International Nuclear Information System (INIS)

    Pillay, A.E.; Mboweni, R.C.M.

    1990-01-01

    A novel aspect of activation analysis is described for the determination of dysprosium and holmium at low concentrations. The method involves the measurement of K x-rays from radionuclides produced by thermal neutron activation using a 1 mg 252 Cf source. The basis for elemental selection depends largely on the demand for analysis and on the existence of favourable nuclear properties for the production of a practicable x-ray yield. A full appraisal of the analytical potential of the method is presented with particular emphasis on its application to geological matrices. The sensitivity was optimised by employing a detector that was particularly effective at photon energies below 150 keV. Analytical conditions are demonstrated for the elements of interest over a wide range of concentrations in small powdered samples. The investigation formed the basis of a feasibility study to establish if the application could be developed for the routine off-line determination of dysprosium and holmium using an isotopic-neutron source. (author)

  10. Neutron activation analysis of essential elements in Multani mitti clay using miniature neutron source reactor

    International Nuclear Information System (INIS)

    Waheed, S.; Rahman, S.; Faiz, Y.; Siddique, N.

    2012-01-01

    Multani mitti clay was studied for 19 essential and other elements. Four different radio-assay schemes were adopted for instrumental neutron activation analysis (INAA) using miniature neutron source reactor. The estimated weekly intakes of Cr and Fe are high for men, women, pregnant and lactating women and children while intake of Co is higher in adult categories and Mn by pregnant women. Comparison of MM clay with other type of clays shows that it is a good source of essential elements. - Highlights: ► Multani mitti clay has been studied for 19 essential elements for human adequacy and safety using INAA and AAS. ► Weekly intakes for different consumer categories have been calculated and compared with DRIs. ► Comparison of MM with other type of clays depict that MM clay is a good source of essential elements.

  11. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is the Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. The Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
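    The point that uncorrelatedness (what PCA enforces) is weaker than independence (what ICA enforces) can be made concrete with a toy blind source separation: whiten two synthetically mixed signals, then search for the residual rotation that maximizes non-Gaussianity. The kurtosis-based rotation search below is a deliberately simple stand-in for the variational Bayesian ICA described in the record, and all signals and mixing weights are invented:

```python
import math
import random

random.seed(1)
n = 2000

# Two independent, strongly non-Gaussian "source" signals (hypothetical
# stand-ins for physically distinct deformation sources).
s1 = [random.uniform(-1.0, 1.0) for _ in range(n)]
s2 = [random.choice([-1.0, 1.0]) + random.gauss(0.0, 0.1) for _ in range(n)]

# Linearly mixed observations, as two "stations" seeing both sources.
x1 = [0.7 * a + 0.3 * b for a, b in zip(s1, s2)]
x2 = [0.4 * a - 0.6 * b for a, b in zip(s1, s2)]

def center(v):
    m = sum(v) / len(v)
    return [u - m for u in v]

x1, x2 = center(x1), center(x2)

# --- PCA stops here: whitening with the 2x2 covariance matrix makes the
# components uncorrelated, but any further rotation stays uncorrelated.
c11 = sum(a * a for a in x1) / n
c12 = sum(a * b for a, b in zip(x1, x2)) / n
c22 = sum(b * b for b in x2) / n
tr, det = c11 + c22, c11 * c22 - c12 * c12
l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
l2 = tr / 2 - math.sqrt(tr * tr / 4 - det)

def unit(vx, vy):
    h = math.hypot(vx, vy)
    return vx / h, vy / h

e1, e2 = unit(c12, l1 - c11), unit(c12, l2 - c11)
z1 = [(e1[0] * a + e1[1] * b) / math.sqrt(l1) for a, b in zip(x1, x2)]
z2 = [(e2[0] * a + e2[1] * b) / math.sqrt(l2) for a, b in zip(x1, x2)]

# --- ICA resolves the leftover rotation by maximizing non-Gaussianity
# (here: the total absolute excess kurtosis of the rotated components).
def kurt(v):
    m2 = sum(u * u for u in v) / len(v)
    m4 = sum(u ** 4 for u in v) / len(v)
    return m4 / (m2 * m2) - 3.0

def rotate(theta):
    c, s = math.cos(theta), math.sin(theta)
    return ([c * a + s * b for a, b in zip(z1, z2)],
            [-s * a + c * b for a, b in zip(z1, z2)])

y1, y2 = max((rotate(math.pi * k / 90.0) for k in range(90)),
             key=lambda y: abs(kurt(y[0])) + abs(kurt(y[1])))

def corr(u, v):
    u, v = center(u), center(v)
    return (sum(a * b for a, b in zip(u, v)) /
            math.sqrt(sum(a * a for a in u) * sum(b * b for b in v)))

# Each recovered component should match one true source up to sign/order.
match1 = max(abs(corr(y1, s1)), abs(corr(y1, s2)))
match2 = max(abs(corr(y2, s1)), abs(corr(y2, s2)))
print(round(match1, 2), round(match2, 2))
```

    Any rotation of the whitened pair would satisfy PCA's uncorrelatedness criterion equally well, which is exactly why the extra independence (non-Gaussianity) criterion is needed to pin down the physical sources.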

  12. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Blind source separation analysis of PET dynamic data: a simple method with exciting MR-PET applications

    Energy Technology Data Exchange (ETDEWEB)

    Oros-Peusquens, Ana-Maria; Silva, Nuno da [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Weiss, Carolin [Department of Neurosurgery, University Hospital Cologne, 50924 Cologne (Germany); Stoffels, Gabrielle; Herzog, Hans; Langen, Karl J [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Shah, N Jon [Institute of Neuroscience and Medicine, Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Jülich-Aachen Research Alliance (JARA) - Section JARA-Brain RWTH Aachen University, 52074 Aachen (Germany)

    2014-07-29

    Denoising of dynamic PET data improves parameter imaging by PET and is gaining momentum. This contribution describes an analysis of dynamic PET data by blind source separation methods and comparison of the results with MR-based brain properties.

  14. Comparative Analysis Study of Open Source GIS in Malaysia

    International Nuclear Information System (INIS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Halim, Mohd Khuizham Abd

    2014-01-01

    Open source software might appear to be a major prospective change, capable of delivering value across various industries and of competing in developing countries. The leading purpose of this research study is to discover the degree of adoption of Open Source Software (OSS) connected with Geographic Information System (GIS) applications within Malaysia, motivated by inadequate awareness of open source concepts and by technical deficiencies in open source tools. This research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS; the survey was conducted among three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to supply measurable and descriptive signals for the final result. The second stage involved an interview session with a major organization that operates an open source web GIS: the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The aim of this preliminary study was to understand the viewpoints of different groups of people on open source, and whether their insufficient awareness of open source concepts and possibilities may be a significant root cause of the level of adoption of open source solutions.

  15. Evaluating source separation of plastic waste using conjoint analysis.

    Science.gov (United States)

    Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke

    2008-11-01

    Using conjoint analysis, we estimated households' willingness to pay (WTP) for source separation of plastic waste and for the improvement of related environmental impacts: the residents' loss of life expectancy (LLE), the landfill capacity, and CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utilities associated with reducing LLE and with landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues.
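    The marginal-rate-of-substitution logic behind conjoint WTP estimates, using the logarithmic utility form the record reports for LLE reduction, can be sketched as follows. The part-worth coefficients and currency scale are purely illustrative assumptions, not the study's estimates:

```python
import math

# Hypothetical part-worths of a household utility function (illustrative
# values, not the paper's estimates): logarithmic in the reduction of
# residents' loss of life expectancy (LLE), linear in the annual fee.
beta_lle = 0.8       # utility per unit of ln(LLE reduction in days)
beta_cost = -0.002   # utility per yen of annual fee

def wtp_for_lle_improvement(days_before, days_after):
    """WTP as a marginal rate of substitution: the fee increase that
    exactly offsets the utility gained from the improvement."""
    gain = beta_lle * (math.log(days_after) - math.log(days_before))
    return gain / (-beta_cost)

# With a logarithmic utility, WTP depends only on the ratio of levels:
wtp = wtp_for_lle_improvement(10.0, 20.0)  # doubling the LLE reduction
print(round(wtp, 2))
```

    The logarithmic form implies diminishing marginal WTP: each further doubling of the LLE reduction is worth the same fixed amount, which matches the saturating preference pattern the record describes.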

  16. Source apportionment of elevated wintertime PAHs by compound-specific radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    R. J. Sheesley

    2009-05-01

    Natural abundance radiocarbon analysis facilitates distinct source apportionment between contemporary biomass/biofuel (14C "alive") and fossil fuel (14C "dead") combustion. Here, the first compound-specific radiocarbon analysis (CSRA) of atmospheric polycyclic aromatic hydrocarbons (PAHs) was demonstrated for a set of samples collected in Lycksele, Sweden, a small town with frequent episodes of severe atmospheric pollution in the winter. Renewed interest in using residential wood combustion (RWC) means that this type of seasonal pollution is of increasing concern in many areas. Five individual/paired PAH isolates from three pooled fortnight-long filter collections were analyzed by CSRA: phenanthrene, fluoranthene, pyrene, benzo[b+k]fluoranthene, and indeno[cd]pyrene plus benzo[ghi]perylene; phenanthrene was the only compound also analyzed in the gas phase. The measured Δ14C for PAHs spanned from −138.3‰ to 58.0‰. A simple isotopic mass balance model was applied to estimate the fraction biomass (fbiomass) contribution, which was constrained to 71–87% for the individual PAHs. Indeno[cd]pyrene plus benzo[ghi]perylene had an fbiomass of 71%, while fluoranthene and phenanthrene (gas phase) had the highest biomass contribution at 87%. The fbiomass of total organic carbon (TOC, defined as carbon remaining after removal of inorganic carbon) was estimated to be 77%, which falls within the range for PAHs. These CSRA data on atmospheric PAHs established that RWC is the dominating source of atmospheric PAHs in this region of the boreal zone, with some variation among RWC contributions to specific PAHs.
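    The "simple isotopic mass balance" mentioned in the record amounts to a two-endmember mixing equation. The sketch below uses conventional endmember assumptions (fossil carbon fully 14C-dead at −1000‰; contemporary biomass near +225‰) that are not taken from the paper, yet reproduce fractions close to the reported 71–87% range:

```python
# Two-endmember 14C mass balance behind the fraction-biomass estimate.
# Endmember values are conventional assumptions, NOT taken from the study:
# fossil carbon is 14C-free (Delta14C = -1000 permil) and contemporary
# biomass carbon is set near +225 permil (wood grown in recent decades).
D14C_FOSSIL = -1000.0
D14C_BIOMASS = 225.0

def fraction_biomass(d14c_sample):
    """Solve d14c_sample = f * D14C_BIOMASS + (1 - f) * D14C_FOSSIL for f."""
    return (d14c_sample - D14C_FOSSIL) / (D14C_BIOMASS - D14C_FOSSIL)

# The record's extreme measured values, -138.3 and +58.0 permil:
f_low = fraction_biomass(-138.3)
f_high = fraction_biomass(58.0)
print(round(f_low, 2), round(f_high, 2))  # ~0.70 and ~0.86 with these endmembers
```

    With these assumed endmembers the measured extremes bracket roughly 70–86% biomass, consistent with the 71–87% reported in the record; the small offset is absorbed by the exact choice of the biomass endmember.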

  17. The development of an automatic sample-changer and control instrumentation for isotope-source neutron-activation analysis

    International Nuclear Information System (INIS)

    Andeweg, A.H.; Watterson, J.I.W.

    1983-01-01

    An automatic sample-changer was developed at the Council for Mineral Technology for use in isotope-source neutron-activation analysis. Tests show that the sample-changer can transfer a sample of up to 3 kg in mass over a distance of 3 m within 5 s. In addition, instrumentation in the form of a three-stage sequential timer was developed to control the sequence of irradiation, transfer, and analysis.

  18. Analysis of Contract Source Selection Strategy

    Science.gov (United States)

    2015-07-07

    The task of understanding the impact of a source selection strategy on resultant contract outcomes is a topic rich for further research.

  19. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
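    Monte Carlo propagation of parameter uncertainty into transport predictions, as described above, can be sketched with a deliberately simple one-dimensional travel-time model. The distance, distributions, and parameter values below are hypothetical illustrations, not UGTA model inputs:

```python
import math
import random
import statistics

random.seed(7)

L = 5000.0  # transport distance in metres (hypothetical)

def travel_time_years(K, i, n):
    """Advective travel time using average linear velocity v = K*i/n."""
    v = K * i / n  # m/day
    return L / v / 365.25

# Parameter uncertainty expressed as sampling distributions (a lognormal
# conductivity is the conventional choice; all values are invented).
times = []
for _ in range(5000):
    K = random.lognormvariate(math.log(1.0), 0.5)  # hydraulic conductivity, m/d
    i = random.uniform(0.001, 0.005)               # hydraulic gradient
    n = random.uniform(0.05, 0.25)                 # effective porosity
    times.append(travel_time_years(K, i, n))

# Summarize the propagated uncertainty as percentiles of the prediction.
cuts = statistics.quantiles(times, n=20)  # 5th ... 95th percentile cut points
print(round(cuts[0]), round(statistics.median(times)), round(cuts[-1]))
```

    In a real application the travel-time function would be replaced by the calibrated flow-and-transport model, and the sampled parameters by its uncertain inputs, but the sample-run-summarize loop is the same.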

  20. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents the analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of the materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes a comparison of the Cs-137/Cs-134 ratio between measurements performed by Soviet authorities and by countries belonging to the Community and the OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil, and agricultural and animal products collected by the Soviets, as presented in their report in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperatures reached, as a way to deduce the mechanisms of the cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accident period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to the RBMK type of reactor, and that in the Western world, basic research on fuel behaviour under reactivity transients has already been carried out

  1. Identifying sources of emerging organic contaminants in a mixed use watershed using principal components analysis.

    Science.gov (United States)

    Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L

    2014-01-01

    Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.
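As a hedged illustration of the approach described above (not the authors' code or data), the way compound concentration variances group onto components can be sketched with scikit-learn's PCA; the sites, sources, and concentration values below are all invented:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sites = 40

# Two hypothetical latent sources: urban wastewater and agricultural runoff
urban = rng.random(n_sites)
agri = rng.random(n_sites)

# Hypothetical analyte concentrations, each driven mostly by one source plus
# measurement noise (columns: carbamazepine, DEET, atrazine, metolachlor)
X = np.column_stack([
    2.0 * urban + 0.1 * rng.normal(size=n_sites),
    1.5 * urban + 0.1 * rng.normal(size=n_sites),
    1.8 * agri + 0.1 * rng.normal(size=n_sites),
    1.2 * agri + 0.1 * rng.normal(size=n_sites),
])

# Standardize, then extract two principal components
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

print(pca.explained_variance_ratio_.sum())  # fraction of variance captured
print(pca.components_)  # loadings: compounds grouping on the same PC share a source
```

Compounds whose loadings cluster on the same component are interpreted as sharing a source, which is the reasoning the abstract applies to distinguish wastewater-derived from agricultural contaminants.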

  2. Source identification of underground fuel spills in a petroleum refinery using fingerprinting techniques and chemometric analysis. A Case Study

    International Nuclear Information System (INIS)

    Kanellopoulou, G.; Gidarakos, E.; Pasadakis, N.

    2005-01-01

    Crude oil and its refined products are the contaminants most frequently found in the environment due to spills. The aim of this work was the identification of the spill source(s) in the subsurface of a petroleum refinery. Free-phase samples were analyzed with gas chromatography, and the analytical results were interpreted using the Principal Component Analysis (PCA) method. Chemical analysis of groundwater samples from the refinery subsurface was also employed to obtain a comprehensive picture of the spill distribution and origin. (authors)

  3. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the free software environment R, developed specifically for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  4. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    The states' environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairment. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limited monetary resources and the large number of waterbodies, monitoring stations are typically sparse, with intermittent periods of data collection. The scarcity of water quality data is therefore a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as dissolved oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed for DO impairment in the 303(d) list of the Louisiana Water Quality Inventory Reports since 2014, owing to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. The model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine the practices and changes that led to low DO concentrations in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.
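The full BME machinery is beyond a short sketch, but the basic task it serves here, estimating DO at unmonitored locations and times from sparse space-time observations, can be illustrated with a much cruder stand-in: inverse-distance weighting over a normalized space-time separation. Everything below (coordinates, scales, DO values) is hypothetical, and this is not BME proper, which additionally carries full probability distributions and soft data:

```python
import numpy as np

def st_idw(obs_xyt, obs_vals, query_xyt, space_scale=1.0, time_scale=1.0, p=2):
    """Crude space-time inverse-distance-weighted estimate (illustrative only).
    obs_xyt: (n, 3) array of (x, y, t); obs_vals: (n,) measured values."""
    d = obs_xyt - np.asarray(query_xyt, dtype=float)
    # combine spatial and temporal separations into one normalized distance
    dist = np.sqrt((d[:, 0] / space_scale) ** 2 + (d[:, 1] / space_scale) ** 2
                   + (d[:, 2] / time_scale) ** 2)
    w = 1.0 / np.maximum(dist, 1e-9) ** p
    return float(np.sum(w * obs_vals) / np.sum(w))

# Hypothetical DO observations: (x km, y km, t days) -> mg/L
obs = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 2.0]])
vals = np.array([6.5, 4.0, 5.0])
estimate = st_idw(obs, vals, [0.1, 0.1, 0.2])
print(estimate)  # pulled toward the nearby 6.5 mg/L reading
```

The query point is closest in space and time to the first station, so its 6.5 mg/L reading dominates the weighted estimate.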

  5. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that can discern and characterize the different sources generating the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. Indeed, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e., in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Uncorrelatedness is usually not a strong enough condition, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the time series with fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows more flexibility in describing the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
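The contrast drawn above between PCA's uncorrelatedness and ICA's independence can be sketched on synthetic data. This uses scikit-learn's FastICA rather than the vbICA method of the abstract, and the sources, mixing matrix, and "station" count are all invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)

# Two synthetic "ground displacement" sources: a seasonal oscillation and a
# transient step (a slow-slip-like offset) -- purely illustrative signals
s1 = np.sin(2 * np.pi * t)
s2 = np.where(t > 5, 1.0, 0.0) + 0.05 * rng.normal(size=t.size)
S = np.column_stack([s1, s2])

# Mix the two sources into four synthetic "station" time series
A = np.array([[1.0, 0.5], [0.7, 1.2], [0.3, 0.9], [1.1, 0.2]])
X = S @ A.T

# ICA seeks statistically independent components; PCA only uncorrelated ones
S_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
S_pca = PCA(n_components=2).fit_transform(X)

def best_corr(est, true):
    """Strongest |correlation| between any estimated component and a true source."""
    return max(abs(np.corrcoef(est[:, i], true)[0, 1]) for i in range(est.shape[1]))

print(best_corr(S_ica, s1), best_corr(S_pca, s1))
```

One ICA component typically tracks the oscillation almost perfectly, illustrating why ICA-family methods are preferred for isolating transient signals mixed across many stations.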

  6. Reported discourse as a montage effect of the quoting discourse and the quoted segment. A contribution to the study of journalistic discourse

    Directory of Open Access Journals (Sweden)

    Biardzka Elżbieta

    2012-07-01

    …rules of syntax (constrained combination). The free and constrained sequences of reported discourse (DR) found in our corpus are catalogued according to the four main positions that the quoting discourse (DC) can take relative to the quotation (Cit): DC preposed (DC+Cit), DC postposed (Cit+DC), DC interpolated within the Cit (Cit+DC+Cit), and DC framing the Cit (DC+Cit+DC). In this paper, we describe selected DR practices falling under the free and constrained montage of the types DR=DC+Cit and DR=Cit+DC.

  7. Off-design performance analysis of organic Rankine cycle using real operation data from a heat source plant

    International Nuclear Information System (INIS)

    Kim, In Seop; Kim, Tong Seop; Lee, Jong Jun

    2017-01-01

    Highlights: • ORC systems driven by waste or residual heat from a combined cycle cogeneration plant were analyzed. • An off-design analysis model was developed and validated with commercial ORC data. • A procedure to predict the actual variation of ORC performance using the off-design model was set up. • The importance of using long-term operation data of the heat source plant was demonstrated. - Abstract: There has been increasing demand for cogeneration power plants, which provide high energy utilization. Research on upgrading power plant performance is also being actively pursued. The organic Rankine cycle (ORC) can operate with mid- and low-temperature heat sources and is suitable for enhancing the performance of existing power plants. In this study, an off-design analysis model was developed for an ORC driven by waste heat or residual heat from a combined cycle cogeneration plant. The applied heat sources are the exhaust gas from the heat recovery steam generator (Case 1) and waste heat from a heat storage unit (Case 2). Optimal design points of the ORC were selected based on the design heat source condition of each case. Then, the available ORC power output for each case was predicted using actual long-term plant operation data and the validated off-design analysis model. The ORC capacity of Case 2 was almost twice that of Case 1. The predicted average electricity generation of both cases was less than the design output. The results of this paper reveal the importance of both predicting electricity generation from actual plant operation data and optimal ORC system sizing.

  8. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Full Text Available Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding statistical independence of signal mixtures and has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA has been proposed to mitigate these problems. The proposed method extracts more stable source signals in valid order through an iterative process that reorders the extracted mixing matrix to reconstruct the finally converged source signals, referring to the magnitudes of the correlation coefficients between the intermediately separated signals and the signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to real problems in complex structures, an experiment was carried out on a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of the conventional ICA technique.
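The reordering step described above, matching intermediately separated signals to reference signals by the magnitude of their correlation coefficients, might look roughly like the following. This is a simplified sketch of that one step, not the authors' full iterative algorithm:

```python
import numpy as np

def reorder_by_correlation(separated, references):
    """Resolve ICA's permutation/sign ambiguity by greedily matching each
    separated signal to the reference it correlates with most strongly.
    `separated`, `references`: arrays of shape (n_samples, n_signals)."""
    n = references.shape[1]
    order, signs, used = [], [], set()
    for j in range(n):
        # correlation of reference j with every not-yet-assigned separated signal
        corrs = [np.corrcoef(references[:, j], separated[:, i])[0, 1]
                 if i not in used else 0.0 for i in range(n)]
        i_best = int(np.argmax(np.abs(corrs)))
        used.add(i_best)
        order.append(i_best)
        signs.append(np.sign(corrs[i_best]))
    return separated[:, order] * np.array(signs)

# Demo: scrambled, sign-flipped copies of two reference signals
t = np.linspace(0, 1, 500)
refs = np.column_stack([np.sin(2 * np.pi * 5 * t), np.cos(2 * np.pi * 3 * t)])
scrambled = np.column_stack([-refs[:, 1], refs[:, 0]])
fixed = reorder_by_correlation(scrambled, refs)
print(np.allclose(fixed, refs))  # → True
```

In the paper's setting, the references would be signals measured on or near the candidate sources, so the permutation and sign returned by ICA become physically interpretable.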

  9. Ion sources for solids isotopic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tyrrell, A. C. [Ministry of Defence, Foulness (UK). Atomic Weapons Research Establishment

    1978-12-15

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material.

  10. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    Science.gov (United States)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. However, the survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle, and biofluidic studies. As a result, a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully enable FEM modeling, an in-house Python script allowed assignment of material properties on an element-by-element basis by performing a weighted interpolation of voxel intensities from the parent medical image, correlated with published relations between image intensity and material properties, such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. The work using Python scripts provides a potential alternative to expensive commercial software and to inadequate, limited open-source freeware programs for the creation of 3D computational models.
More work
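The element-by-element material assignment described above, a weighted interpolation from voxel intensity to a material property such as ash density, can be sketched as follows. The calibration points, image, and element/voxel layout are placeholders invented for illustration, not the project's actual values or a published calibration:

```python
import numpy as np

# Hypothetical calibration points: CT intensity (HU) -> ash density (g/cm^3)
hu_points = np.array([0.0, 400.0, 800.0, 1200.0])
density_points = np.array([0.05, 0.5, 1.0, 1.5])

def element_densities(voxel_hu, element_voxel_ids):
    """Assign a density to each finite element by averaging the intensities of
    its member voxels, then linearly interpolating the calibration curve."""
    densities = []
    for vox_ids in element_voxel_ids:
        mean_hu = voxel_hu.ravel()[vox_ids].mean()
        densities.append(np.interp(mean_hu, hu_points, density_points))
    return np.array(densities)

# Toy image: a 4x4x4 block of voxel intensities rising from 0 to 1200 HU
img = np.linspace(0, 1200, 64).reshape(4, 4, 4)
elements = [np.arange(0, 8), np.arange(56, 64)]  # two "elements" of 8 voxels each
d = element_densities(img, elements)
print(d)  # the second element, in brighter voxels, gets the higher density
```

A real pipeline would take the element-to-voxel mapping from the unstructured mesh (e.g. the FEBio Preview output) rather than from hand-picked index ranges.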

  11. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo G.M.; Van Der Goot, Erik; Verile, Marco; Wolfart, Erik; Rutan Fowler, Marcy; Feldman, Yana; Hammond, William; Schweighardt, John; Ferguson, Mattew

    2013-01-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA’s move towards safeguards implementation based on all safeguards-relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA’s Division of Information Management (SGIM) and the EC’s Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA’s workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC’s Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA’s workflow for the production of the SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which focused i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (the Nuclear Security Media Monitor, NSMM) and ii) on evaluating the NSMM/NewsDesk against the IAEA’s needs. Finally, it presents the current use of NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  12. Analysis of Peak-to-Peak Current Ripple Amplitude in Seven-Phase PWM Voltage Source Inverters

    Directory of Open Access Journals (Sweden)

    Gabriele Grandi

    2013-08-01

    Full Text Available Multiphase systems are nowadays considered for various industrial applications. Numerous pulse width modulation (PWM) schemes for multiphase voltage source inverters with sinusoidal outputs have been developed, but no detailed analysis of the impact of these modulation schemes on the output peak-to-peak current ripple amplitude has been reported. Determination of current ripple in multiphase PWM voltage source inverters is important for both design and control purposes. This paper gives a complete analysis of the peak-to-peak current ripple distribution over a fundamental period for multiphase inverters, with particular reference to seven-phase VSIs. In particular, the peak-to-peak current ripple amplitude is analytically determined as a function of the modulation index, and a simplified expression for its maximum value is derived. Although reference is made to centered symmetrical PWM, the simplest and most effective solution for maximizing DC bus utilization and a nearly optimal modulation for minimizing the RMS of the current ripple, the analysis can be readily extended to discontinuous or asymmetrical modulations, both carrier-based and space vector PWM. A similar approach can be usefully applied to any phase number. The analytical developments for all the different sub-cases are verified by numerical simulations.

  13. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu Lei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators of a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from techniques such as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. As spatial priors, EEG-correlated fMRI, temporally coherent networks, and resting-state fMRI are systematically introduced in the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  14. Ion sources for solids isotopic analysis

    International Nuclear Information System (INIS)

    Tyrrell, A.C.

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material. (Auth.)

  15. Characterization of polar organic compounds and source analysis of fine organic aerosols in Hong Kong

    Science.gov (United States)

    Li, Yunchun

    Organic aerosols, as an important fraction of airborne particulate mass, significantly affect the environment, climate, and human health. Compared with inorganic species, characterization of individual organic compounds is much less complete and comprehensive because they number in the thousands or more and are diverse in chemical structure. The source contributions of organic aerosols are far from well understood because they can be emitted from a variety of sources as well as formed from photochemical reactions of numerous precursors. This thesis work aims to improve the characterization of polar organic compounds and the source apportionment analysis of fine organic carbon (OC) in Hong Kong, and consists of two parts: (1) An improved analytical method to determine monocarboxylic acids, dicarboxylic acids, ketocarboxylic acids, and dicarbonyls collected on filter substrates was established. These oxygenated compounds were determined as their butyl ester or butyl acetal derivatives using gas chromatography-mass spectrometry. The new method improves on the original Kawamura method by eliminating the water extraction and evaporation steps. Aerosol materials were directly mixed with the BF3/BuOH derivatization agent and the extracting solvent hexane. This modification improves recoveries for both the more volatile and the less water-soluble compounds. The improved method was applied to study the abundances and sources of these oxygenated compounds in PM2.5 aerosol samples collected in Hong Kong under different synoptic conditions during 2003-2005. These compounds account for on average 5.2% of OC (range: 1.4%-13.6%) on a carbon basis. Oxalic acid was the most abundant species. Six C2 and C3 oxygenated compounds, namely oxalic, malonic, glyoxylic, and pyruvic acids, glyoxal, and methylglyoxal, dominated this suite of oxygenated compounds. More efforts are therefore suggested to focus on these small compounds in understanding the role of oxygenated

  16. Spallation neutrons pulsed sources

    International Nuclear Information System (INIS)

    Carpenter, J.

    1996-01-01

    This article describes the range of scientific applications that can use pulsed neutron sources: studies of superfluids, and measurements to verify the reptation model of polymer diffusion; these sources are also useful for studying neutron decay and ultracold neutrons. In certain applications that are not accessible by neutron scattering, for example radiation damage, radionuclide production, and activation analysis, spallation sources find their use, and their improvement will bring new possibilities. Among other contributions, one should note the availability of pulsed muon sources and neutrino sources. (N.C.). 3 figs

  17. Red pepper (Capsicum annuum) carotenoids as a source of natural food colors: analysis and stability-a review.

    Science.gov (United States)

    Arimboor, Ranjith; Natarajan, Ramesh Babu; Menon, K Ramakrishna; Chandrasekhar, Lekshmi P; Moorkoth, Vidya

    2015-03-01

    Carotenoids are increasingly drawing the attention of researchers as a major natural food color due to their inherent nutritional characteristics and their possible role in the prevention of and protection against degenerative diseases. In this report, we review the role of red pepper as a source of natural carotenoids. The composition of the carotenoids in red pepper and the application of different methodologies for their analysis are discussed. The stability of red pepper carotenoids during post-harvest processing and storage is also reviewed. This review highlights the potential of red pepper carotenoids as a source of natural food colors and also discusses the need for a standardized approach to the analysis and reporting of carotenoid composition in plant products and for designing model systems for stability studies.

  18. TITANIUM ISOTOPE SOURCE RELATIONS AND THE EXTENT OF MIXING IN THE PROTO-SOLAR NEBULA EXAMINED BY INDEPENDENT COMPONENT ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Steele, Robert C. J.; Boehnke, Patrick [Department of Earth, Planetary, and Space Sciences, University of California, Los Angeles, CA 90095 (United States)

    2015-04-01

    The Ti isotope variations observed in hibonites represent some of the largest isotope anomalies observed in the solar system. Titanium isotope compositions have previously been reported for a wide variety of early solar system materials, including calcium-aluminum-rich inclusions (CAIs) and CM hibonite grains, some of the earliest materials to form in the solar system, as well as bulk meteorites, which formed later. These data have the potential to allow mixing of material to be traced between many different regions of the early solar system. We have used independent component analysis to examine the mixing end-members required to produce the compositions observed in the different data sets. The independent component analysis yields results identical to a linear regression for the bulk meteorites. The components identified for hibonite suggest that most of the grains are consistent with binary mixing from one of three highly anomalous nucleosynthetic sources. Comparison of these end-members shows that the source that dominates the variation of compositions in the meteorite parent-body forming regions was not present in the region in which the hibonites formed; that is, the source that dominates the variation in Ti isotope anomalies among the bulk meteorites was not present when the hibonite grains were forming. One explanation is that the bulk meteorite source may not be a primary nucleosynthetic source but was created by mixing two or more of the hibonite sources. Alternatively, the hibonite sources may have been diluted during subsequent nebular processing and are not dominant solar system signatures.

  19. A novel syngas-fired hybrid heating source for solar-thermal applications: Energy and exergy analysis

    International Nuclear Information System (INIS)

    Pramanik, Santanu; Ravikrishna, R.V.

    2016-01-01

    Highlights: • Biomass-derived syngas as a hybrid energy source for solar thermal power plants. • A novel combustor concept using rich-catalytic and MILD combustion technologies. • Hybrid energy source for a solar-driven supercritical CO2-based Brayton cycle. • Comprehensive energetic and exergetic analysis of the combined system. - Abstract: A hybrid heating source using biomass-derived syngas is proposed to enable continuous operation of standalone solar thermal power generation plants. A novel, two-stage, low temperature combustion system is proposed that has the potential to provide stable combustion of syngas with near-zero NOx emissions. The hybrid heating system consists of a downdraft gasifier, a two-stage combustion system, and other auxiliaries. When integrated with a solar cycle, the entire system can be referred to as the integrated gasification solar combined cycle (IGSCC). The supercritical CO2 Brayton cycle (SCO2) is selected for the solar cycle due to its high efficiency. The thermodynamic performance evaluation of the individual units and the combined system has been conducted from both energy and exergy considerations. The effect of parameters such as gasification temperature, biomass moisture content, equivalence ratio, and pressure ratio is studied. The efficiency of the IGSCC exhibited a non-monotonic behavior. A maximum thermal efficiency of 36.5% was achieved at an overall equivalence ratio of 0.22 and a pressure ratio of 2.75 when the gasifier was operating at Tg = 1073 K with biomass containing 20% moisture. The efficiency increased to 40.8% when dry biomass was gasified at a temperature of 973 K. The exergy analysis revealed that the maximum exergy destruction occurred in the gasification system, followed by the combustion system, SCO2 cycle, and regenerator. The exergy analysis also showed that 8.72% of the total exergy is lost in the exhaust; however, this can be utilized for drying the biomass.

  20. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for images and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open-source library that implements conversion of data stored in commonly used research formats into the standard DICOM representation. The dcmqi source code is distributed under a BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 AACR.

  1. A POSITIVE APPROACH TO THE EXTERNALIZATION OF ASSETS: THE CASE OF SECURITIZATION

    OpenAIRE

    Lejard , Christophe

    2007-01-01

    International audience; The financial scandals of the early 2000s lifted the veil on a number of accounting practices marked by a manifest lack of ethics. Among these practices, so-called deconsolidating arrangements occupied a prominent place; they materialized notably through recourse to various financial operations via a special-purpose vehicle. The objective of this type of arrangement is to embellish the economic situation of the company by conceal...

  2. Exergy analysis of a two-stage ground source heat pump with a vertical bore for residential space conditioning under simulated occupancy

    International Nuclear Information System (INIS)

    Ally, Moonis R.; Munk, Jeffrey D.; Baxter, Van D.; Gehl, Anthony C.

    2015-01-01

Highlights: • Exergy and energy analysis of a vertical-bore ground source heat pump over a 12-month period is presented. • The ground provided more than 75% of the heating energy. • Performance metrics are presented. • Sources of systemic inefficiency are identified and prioritized using exergy analysis. • Understanding performance metrics is vital for judicious use of renewable energy. - Abstract: This twelve-month field study analyzes the performance of a 7.56 kW (2.16-ton) water-to-air ground source heat pump (WA-GSHP) used to satisfy domestic space conditioning loads in a 253 m² house in a mixed-humid climate in the United States. The practical feasibility of using the ground as a source of renewable energy is clearly demonstrated: more than 75% of the energy needed for space heating was extracted from the ground. The average monthly electricity consumption for space conditioning was only 40 kW h at summer and winter thermostat set points of 24.4 °C and 21.7 °C, respectively. The WA-GSHP shared the same 94.5 m vertical-bore ground loop with a separate water-to-water ground source heat pump (WW-GSHP) that met domestic hot water needs in the same house. Sources of systemic irreversibility, the main cause of lost work, are identified using exergy and energy analysis. Quantifying the sources of exergy and energy losses is essential for further systemic improvements. The research findings suggest that WA-GSHPs are a practical and viable technology for reducing primary energy consumption and greenhouse gas emissions under the IECC 2012 Standard, as well as the European Union (EU) 2020 targets for the use of renewable energy resources
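    The exergy bookkeeping behind such a study can be illustrated with a minimal sketch (all numbers are invented for illustration, not taken from the paper): the reversible work needed to deliver heat Q_h to a room at T_room from dead-state temperature T0 bounds the actual compressor work from below, and the shortfall is the lost work T0·S_gen given by the Gouy-Stodola theorem.

    ```python
    # Gouy-Stodola exergy bookkeeping for a heat pump (illustrative
    # numbers only, not values from the study).
    T0, T_room = 280.0, 294.9      # dead state and room temperature, K (21.7 degC)
    Q_h, W = 3000.0, 1000.0        # delivered heat and compressor work, W

    COP = Q_h / W                        # first-law figure of merit
    W_min = Q_h * (1 - T0 / T_room)      # reversible (minimum) work
    exergy_eff = W_min / W               # second-law (exergy) efficiency
    lost_work = W - W_min                # = T0 * S_gen, the irreversibility

    print(round(COP, 1), round(exergy_eff, 3))  # → 3.0 0.152
    ```

    The gap between the first-law COP and the low second-law efficiency is exactly what an exergy analysis uses to locate and prioritize the components responsible for the lost work.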

  3. Identification of sources of heavy metals in the Dutch atmosphere using air filter and lichen analysis

    International Nuclear Information System (INIS)

    de Bruin, M.; Wolterbeek, H.T.

    1984-01-01

    Aerosol samples collected in an industrialized region were analyzed by instrumental neutron activation analysis. Correlation with wind direction and factor analysis were applied to the concentration data to obtain information on the nature and position of the sources. Epiphytic lichens were sampled over the country and analyzed for heavy metals (As, Cd, Sc, Zn, Sb). The data were interpreted by geographically plotting element concentrations and enrichment factors, and by factor analysis. Some pitfalls are discussed which are associated with the use of aerosol and lichen data in studies of heavy metal air pollution. 14 references, 8 figures, 3 tables

  4. A recent source modification for noble gases at the Los Alamos on-line mass analysis facility

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Forman, L.

    1976-01-01

The Los Alamos on-line mass analysis experiment at the Godiva-IV burst reactor facility has been modified to determine independent fission yields of noble gases. The gases are released from a stearate target and ionized by electron bombardment. The distance traveled by the gases from the target to the ionization chamber is 20 cm. The efficiency of the electron bombardment source is lower than that of the surface ionization source that was employed to measure the yields of Rb and Cs, but this is compensated by the larger quantity of target metal that is possible when using a stearate target. (Auth.)

  5. Visualization of NO2 emission sources using temporal and spatial pattern analysis in Asia

    Science.gov (United States)

    Schütt, A. M. N.; Kuhlmann, G.; Zhu, Y.; Lipkowitsch, I.; Wenig, M.

    2016-12-01

Nitrogen dioxide (NO2) is an indicator of population density and level of development, but the contributions of the different emission sources to the overall concentrations remain mostly unknown. In order to allocate fractions of OMI NO2 to emission types, we investigate several temporal cycles and regional patterns. Our analysis is based on daily maps of tropospheric NO2 vertical column densities (VCDs) from the Ozone Monitoring Instrument (OMI). The data set is mapped to a high-resolution grid by a histopolation algorithm based on a continuous parabolic spline, producing more realistic smooth distributions while reproducing the measured OMI values when integrating over ground pixel areas. In the resulting sequence of zoom-in maps, we analyze weekly and annual cycles for cities, countryside and highways in China, Japan and the Republic of Korea, look for patterns and trends, and compare the derived results to emission sources in Central Europe and North America. Exploiting the increased heating in winter compared to summer and the heavier traffic during the week than on Sundays, we separate traffic, heating and power plant contributions and visualize maps of the different sources. We will also look into the influence of emission control measures during big events such as the Olympic Games 2008 and the World Expo 2010 as a possibility to confirm our classification of NO2 emission sources.

  6. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    Science.gov (United States)

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids one can efficiently reconstruct the current sources (CSD) using the inverse Current Source Density method (iCSD). It is then possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
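    The spatial-ICA step described above can be sketched with scikit-learn's FastICA (a toy stand-in for the authors' pipeline, with invented dimensions and sources): a synthetic "CSD" matrix is built from two independent, non-Gaussian spatial maps mixed by arbitrary time courses, and ICA recovers the maps up to sign and order.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # Two functional components: independent, non-Gaussian spatial maps
    # (e.g. a 4 x 5 x 7 grid flattened to 140 sites) driven by arbitrary
    # time courses and mixed linearly, as reconstructed CSD would be.
    n_sites, n_times = 140, 500
    true_maps = rng.laplace(size=(n_sites, 2))     # sparse spatial sources
    time_courses = rng.normal(size=(2, n_times))
    csd = true_maps @ time_courses                 # sites x times

    # Spatial ICA: with rows = sites, the estimated sources are
    # statistically independent *spatial* maps.
    ica = FastICA(n_components=2, random_state=0, max_iter=1000)
    est_maps = ica.fit_transform(csd)              # sites x components

    # Each estimated map should match one true map up to sign and order.
    corr = np.abs(np.corrcoef(est_maps.T, true_maps.T)[:2, 2:])
    print(corr.max(axis=1))
    ```

    Note that ICA relies on the sources being non-Gaussian; here the spatial maps are Laplace-distributed, which is also why spatial (rather than temporal) independence is the quantity being optimized.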

  7. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

Radioisotope-excited XRF systems using annular sources are widely used in view of their simplicity, wide availability, relatively low price for the complete system, and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental-parameter techniques for quantitative analysis is attempted. These problems are due to the fact that such systems operate with large solid angles for the incoming and emerging radiation, so that neither the incident nor the take-off angle is trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and in predicting the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison with experimentally determined values for the incident and take-off angles are also presented. A flexible and user-friendly computer program was developed in order to perform the lengthy calculations involved efficiently. (author). 14 refs. 5 figs
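    A stripped-down version of such a Monte Carlo estimate is sketched below. The geometry and dimensions are invented for illustration, and the real calculation also weights each ray by the excitation-detection efficiency; here we simply average the incidence angle over random source-ring and sample-disc point pairs.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical annular-source geometry (cm): a source ring of radius
    # r_src sits a height h above the sample plane; the sample is a disc
    # of radius r_smp centred on the axis.
    r_src, h, r_smp, n = 2.0, 1.5, 0.5, 200_000

    # Monte Carlo integration: random source points on the ring and
    # random points uniformly distributed over the sample disc.
    phi = rng.uniform(0, 2 * np.pi, n)
    src = np.stack([r_src * np.cos(phi), r_src * np.sin(phi), np.full(n, h)], axis=1)
    u, v = rng.uniform(0, 1, n), rng.uniform(0, 2 * np.pi, n)
    r = r_smp * np.sqrt(u)                       # sqrt for uniform area density
    smp = np.stack([r * np.cos(v), r * np.sin(v), np.zeros(n)], axis=1)

    # Incidence angle of each ray, measured from the sample normal (z).
    d = src - smp
    cos_inc = d[:, 2] / np.linalg.norm(d, axis=1)
    eff_angle = np.degrees(np.arccos(cos_inc)).mean()
    print(f"effective incident angle ~ {eff_angle:.1f} deg")
    ```

    For this geometry the effective angle lands near arctan(r_src/h) ≈ 53°, illustrating why a single nominal angle is a poor substitute when the solid angles are large.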

  8. Speciation of heavy metals in different grain sizes of Jiaozhou Bay sediments: Bioavailability, ecological risk assessment and source analysis on a centennial timescale.

    Science.gov (United States)

    Kang, Xuming; Song, Jinming; Yuan, Huamao; Duan, Liqin; Li, Xuegang; Li, Ning; Liang, Xianmeng; Qu, Baoxiao

    2017-09-01

Heavy metal contamination is an essential indicator of environmental health. In this work, one sediment core was used to analyse the speciation of heavy metals (Cr, Mn, Ni, Cu, Zn, As, Cd, and Pb) in Jiaozhou Bay sediments of different grain sizes. The bioavailability, sources and ecological risk of the heavy metals were also assessed on a centennial timescale. Heavy metals were enriched in the finer grain sizes, in the order Pb > Cd > Zn > Cu > Ni > Cr > As. Enrichment factors (EF) indicated that heavy metals in Jiaozhou Bay ranged from no enrichment to minor enrichment. The potential ecological risk index (RI) indicated that Jiaozhou Bay had been suffering from a low ecological risk, with an increasing trend since the 1940s owing to the increase in anthropogenic activities. The source analysis indicated that natural sources were the primary sources of heavy metals in Jiaozhou Bay and that the anthropogenic contribution has presented an increasing trend since the 1940s. Principal component analysis (PCA) indicated that Cr, Mn, Ni, Cu and Pb were primarily derived from natural sources and that Zn and Cd were influenced by the shipbuilding industry. Mn, Cu, Zn and Pb may originate from both natural and anthropogenic sources, while As may be influenced by agricultural activities. Moreover, heavy metals in the sediments of Jiaozhou Bay were clearly influenced by atmospheric deposition and river input. Copyright © 2017. Published by Elsevier Inc.
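    The enrichment factor used in assessments like this normalizes a metal to a conservative crustal reference element (often Al or Fe): EF = (M/Al)_sample / (M/Al)_background. A minimal sketch with invented concentrations (not values from the study):

    ```python
    # Enrichment factor relative to a crustal reference element (Al).
    # Concentrations are illustrative, in mg/kg dry sediment.
    background = {"Al": 80000.0, "Pb": 20.0, "Cd": 0.1}   # assumed baseline
    sample     = {"Al": 70000.0, "Pb": 45.0, "Cd": 0.3}   # assumed core layer

    def enrichment_factor(metal):
        """EF = (M/Al)_sample / (M/Al)_background."""
        return ((sample[metal] / sample["Al"])
                / (background[metal] / background["Al"]))

    for m in ("Pb", "Cd"):
        print(m, round(enrichment_factor(m), 2))  # → Pb 2.57, Cd 3.43
    ```

    EF near 1 indicates a purely crustal origin; values in the 1-3 range are commonly read as minor enrichment, consistent with the "no enrichment to minor enrichment" finding above.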

  9. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review of the availability and quality of health information data sources in countries, drawing on experience, observations, the literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. In almost all countries of the Region there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  10. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
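    The syndrome idea at the heart of such schemes can be shown with a tiny fixed code rather than the paper's LDPC-style machinery (doping bits and the sum-product decoder are beyond this sketch, which is a hedged stand-in, not the authors' system): with the (7,4) Hamming parity-check matrix, the encoder sends 3 syndrome bits instead of 7 source bits, and a decoder holding side information that differs from the source in at most one bit recovers it exactly.

    ```python
    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code: column i is the
    # binary representation of i+1 (MSB first), so a single-bit error is
    # located by reading its syndrome as a column index.
    H = np.array([[(i >> b) & 1 for i in range(1, 8)] for b in (2, 1, 0)])

    def encode(x):
        """Slepian-Wolf style encoder: transmit only the 3 syndrome bits."""
        return H @ x % 2

    def decode(s, y):
        """Decoder with side information y differing from x in <= 1 bit."""
        diff = (H @ y + s) % 2            # syndrome of the error pattern y^x
        x_hat = y.copy()
        if diff.any():                    # locate and flip the differing bit
            pos = int("".join(map(str, diff)), 2) - 1
            x_hat[pos] ^= 1
        return x_hat

    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, 7)             # source bits
    y = x.copy(); y[3] ^= 1               # correlated side information
    assert np.array_equal(decode(encode(x), y), x)
    print("recovered 7 source bits from 3 syndrome bits")
    ```

    The compression (3 bits for 7) comes entirely from the decoder's side information, which is the essence of the Slepian-Wolf setting; the paper's contribution is handling the case where the correlation itself is hidden and must be estimated jointly.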

  11. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

Journalists and the information they disseminate are essential to promoting health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes, using a validated, self-administered, anonymous psychosocial questionnaire about ODT. Student's t test and the χ² test were applied. The questionnaire completion rate was 98% (n = 126). The medium with the greatest incidence on students was television (TV), followed by the press and magazines/books. In the factor analysis to determine the impact of information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, associations were found between information about organ donation transmitted by friends and family and having spoken about ODT with them; between information from TV, radio and hoardings and not having spoken about it in the family; and between information from TV/radio and the father's and mother's opinion about ODT. The medium with the greatest incidence on students is TV, while the greatest impact comes from conversations with friends, family, and health professionals. This could be useful for society, because the public should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  13. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    Science.gov (United States)

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
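    A phasic reduction of the kind such toolkits automate — converting R-R intervals to binned heart rate — can be sketched as follows (the intervals are illustrative and this is not the LabVIEW toolkit's actual algorithm):

    ```python
    import numpy as np

    # R-R intervals in ms for a short epoch (illustrative values).
    rr = np.array([800, 790, 810, 760, 750, 900, 880, 870], dtype=float)

    beat_times = np.cumsum(rr) / 1000.0    # beat occurrence times, s
    inst_hr = 60000.0 / rr                 # instantaneous HR (bpm) per beat

    # Mean HR in consecutive 2-s bins, a typical phasic reduction for
    # event-locked analysis.
    bins = np.floor(beat_times / 2.0).astype(int)
    bin_means = [round(inst_hr[bins == b].mean(), 1)
                 for b in range(bins.max() + 1)]
    print(bin_means)  # → [75.5, 77.7, 67.4, 69.0]
    ```

    Real data adds the complications the abstract alludes to (ectopic beats, bins containing no beat, beats straddling bin edges), which is exactly where trial rejection tends to occur and where a dedicated toolkit earns its keep.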

  14. Source attribution of Bornean air masses by back trajectory analysis during the OP3 project

    Directory of Open Access Journals (Sweden)

    N. H. Robinson

    2011-09-01

Atmospheric composition affects the radiative balance of the Earth through the creation of greenhouse gases and the formation of aerosols. The latter interact with incoming solar radiation, both directly and indirectly through their effects on cloud formation and lifetime. The tropics have a major influence on incoming sunlight; however, the tropical atmosphere is poorly characterised, especially outside Amazonia. The origins of air masses influencing a measurement site in a protected rainforest in Borneo, South East Asia, were assessed, and the likely sources of a range of trace gases and particles were determined. This was conducted by interpreting in situ measurements made at the site in the context of ECMWF backwards air mass trajectories. Two different but complementary methods were employed to interpret the data: comparison of periods classified by cluster analysis of trajectories, and inspection of the dependence of mean measured values on the geographical history of trajectories. Sources of aerosol particles, carbon monoxide and halocarbons were assessed. The likely source influences include: terrestrial organic biogenic emissions; long-range transport of anthropogenic emissions; biomass burning; sulphurous emissions from marine phytoplankton, with a possible contribution from volcanoes; marine production of inorganic mineral aerosol; and marine production of halocarbons. Aerosol sub- and super-saturated water affinity was found to be dependent on source (and therefore composition), with more hygroscopic aerosol and higher numbers of cloud condensation nuclei measured in air masses of marine origin. The prevailing sector during the majority of measurements was south-easterly, which is from the direction of the coast closest to the site, with a significant influence inland from the south-west. This analysis shows that marine and terrestrial air masses have different dominant chemical sources. Comparison with the AMAZE-08 project in the Amazon

  15. BioXTAS RAW: improvements to a free open-source program for small-angle X-ray scattering data reduction and analysis.

    Science.gov (United States)

    Hopkins, Jesse Bennett; Gillilan, Richard E; Skou, Soren

    2017-10-01

BioXTAS RAW is a graphical-user-interface-based free open-source Python program for reduction and analysis of small-angle X-ray solution scattering (SAXS) data. The software is designed for biological SAXS data and enables creation and plotting of one-dimensional scattering profiles from two-dimensional detector images; standard data operations such as averaging and subtraction; analysis of radius of gyration and molecular weight; and advanced analysis such as calculation of inverse Fourier transforms and envelopes. It also allows easy processing of inline size-exclusion chromatography coupled SAXS data and data deconvolution using the evolving factor analysis method. It provides an alternative to closed-source programs such as Primus and ScÅtter for primary data analysis. Because it can calibrate, mask and integrate images, it also provides an alternative to synchrotron beamline pipelines that scientists can install on their own computers and use both at home and at the beamline.
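    The radius-of-gyration analysis mentioned above rests on the Guinier approximation, I(q) ≈ I0·exp(−q²Rg²/3), valid for qRg ≲ 1.3, so that ln I(q) is linear in q² with slope −Rg²/3. A self-contained sketch on synthetic, noise-free data (not RAW's own implementation):

    ```python
    import numpy as np

    # Synthetic 1D SAXS profile in the Guinier regime, with Rg = 25 A.
    Rg_true, I0 = 25.0, 1000.0
    q = np.linspace(0.005, 0.05, 60)          # 1/A, so q*Rg stays below 1.3
    I = I0 * np.exp(-(q * Rg_true) ** 2 / 3)

    # Guinier analysis: fit ln I(q) against q^2; slope = -Rg^2 / 3.
    slope, intercept = np.polyfit(q ** 2, np.log(I), 1)
    Rg_fit = np.sqrt(-3 * slope)
    print(round(Rg_fit, 1))  # → 25.0
    ```

    On real data the fit range must be restricted to the low-q region where the approximation holds, and the intercept gives I(0), the input to molecular-weight estimation.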

  16. Preparation of a Co-57 Moessbauer source for applications in analysis of iron compounds

    International Nuclear Information System (INIS)

    Gonzalez-Ramirez, R.

    1990-01-01

A report is presented on the preparation of a low-activity ⁵⁷Co Mössbauer source in a stainless steel matrix, which may be used both for demonstration experiments and for some simple analysis work. Sources of this kind are available commercially, but there is no general agreement on the particular conditions of their preparation. Three series of experiments were performed to find the best conditions for electrodepositing ⁵⁹Co, ⁶⁰Co and ⁵⁷Co, respectively, on a stainless steel foil 25 μm thick and 1 cm² in area. The electrolyte contained Co(NO₃)₂ in a buffer solution to control the pH near 8.5. Once the best conditions to electrodeposit ⁵⁷Co were found, it was diffused into the stainless steel matrix by annealing at 1100 °C for three hours and then gradually cooling to room temperature over two hours; all this was done under an argon flow. Finally, a 15 μCi ⁵⁷Co Mössbauer source in a stainless steel matrix was obtained and used to record a series of Mössbauer spectra; the parameters of these spectra were in close agreement with those given in the literature. (Author)

  17. ELATE: an open-source online application for analysis and visualization of elastic tensors

    International Nuclear Information System (INIS)

    Gaillac, Romain; Coudert, François-Xavier; Pullumbi, Pluton

    2016-01-01

We report on the implementation of a tool for the analysis of second-order elastic stiffness tensors, provided as both an open-source Python module and a standalone online application for the visualization of anisotropic mechanical properties. After describing the software features, how we compute the conventional elastic constants and how we represent them graphically, we explain our technical choices for the implementation. In particular, we focus on why a Python module is used to generate the HTML web page with embedded JavaScript for dynamical plots. (paper)
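    The conventional polycrystalline averages such a tool reports follow directly from the 6×6 stiffness matrix in Voigt notation. A sketch for a cubic crystal (elastic constants approximate silicon, in GPa; this is an illustration, not ELATE's source code):

    ```python
    import numpy as np

    # Stiffness matrix of a cubic crystal in Voigt notation
    # (C11, C12, C44 roughly those of silicon, GPa).
    C11, C12, C44 = 166.0, 64.0, 80.0
    C = np.zeros((6, 6))
    C[:3, :3] = C12
    np.fill_diagonal(C[:3, :3], C11)
    C[3:, 3:] = np.diag([C44] * 3)

    # Voigt averages of the bulk and shear moduli from the stiffness matrix.
    K_voigt = (C[0,0]+C[1,1]+C[2,2] + 2*(C[0,1]+C[0,2]+C[1,2])) / 9
    G_voigt = ((C[0,0]+C[1,1]+C[2,2]) - (C[0,1]+C[0,2]+C[1,2])
               + 3*(C[3,3]+C[4,4]+C[5,5])) / 15

    # The Reuss average comes from the compliance matrix S = C^-1.
    S = np.linalg.inv(C)
    K_reuss = 1 / (S[0,0]+S[1,1]+S[2,2] + 2*(S[0,1]+S[0,2]+S[1,2]))

    print(round(K_voigt, 1), round(G_voigt, 1), round(K_reuss, 1))  # → 98.0 68.4 98.0
    ```

    For cubic symmetry the Voigt and Reuss bulk moduli coincide (K = (C11 + 2C12)/3); the spread between the two bounds for lower symmetries is one of the anisotropy measures such tools visualize.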

  18. Preliminary thermal analysis of grids for twin source extraction system

    International Nuclear Information System (INIS)

    Pandey, Ravi; Bandyopadhyay, Mainak; Chakraborty, Arun K.

    2017-01-01

The TWIN (Two driver based Indigenously built Negative ion source) source provides a bridge between ROBIN, the operational single-driver negative ion source test facility at IPR, and an ITER-type multi-driver ion source. The source is designed to be operated in CW mode with 180 kW at 1 MHz on a 5 s ON/600 s OFF duty cycle, and also in 5 Hz modulation mode on a 3 s ON/20 s OFF duty cycle for 3 such cycles. The TWIN source comprises an ion source sub-assembly (consisting of the driver and plasma box) and an extraction system sub-assembly. The extraction system consists of the plasma grid (PG), extraction grid (EG) and ground grid (GG) sub-assemblies. The plasma grid, which faces the plasma side of the ion source and at which the negative ion beams are produced, receives a moderate heat flux, whereas the extraction grid and ground grid receive the majority of the heat flux from the extracted negative ion and co-extracted electron beams. The entire co-extracted electron beam is dumped onto the extraction grid by an electron-deflection magnetic field, which makes the thermal and hydraulic design of the extraction grid critical. All three grids are made of OFHC copper and are actively water cooled, keeping the peak temperature rise of the grid surfaces within the allowable limit with optimum uniformity. All the grids are to be made by a vacuum brazing process, whose joint strength becomes crucial at elevated temperature; the hydraulic design must therefore maintain the peak temperature at the brazing joints within the acceptable limit

  19. Source apportionment analysis of atmospheric particulates in an industrialised urban site in southwestern Spain

    International Nuclear Information System (INIS)

    Querol, X.; Alastuey, A.; Sanchez-de-la-Campa, A.; Plana, F.; Ruiz, C.R.; Rosa, J. de la

    2002-01-01

A detailed physical and chemical characterisation of total suspended particles (TSP) in the highly industrialised city of Huelva (southwestern Spain) was carried out. The results evidenced a coarse grain-size prevalence (PM10 accounting for only 40% of TSP mass: 37 and 91 μg/m³, respectively). PM10 levels are in the usual range for urban background sites in Spain. The crustal, anthropogenic and marine components accounted for a mean of 40%, 24% and 5% of bulk TSP, respectively. As expected from the industrial activities, relatively high PO₄³⁻ and As levels for an urban site were detected. In addition to the crustal and marine components, source apportionment analysis revealed three additional emission sources influencing the levels and composition of TSP: (a) a petrochemical source; (b) a mixed metallurgical-phosphate source; and (c) an unknown source (Sb and NO₃⁻). Owing to the high local emissions, the mean TSP anthropogenic contribution (mostly PM10) obtained for all possible air-mass transport scenarios reached 18-29 μg/m³. The 2010 annual EU PM10 limit value (20 μg/m³) would be exceeded by the anthropogenic load recorded for all the air-mass transport scenarios, with the exception of North Atlantic transport (only 15% of the sampling days). Under African air-mass transport scenarios (20% of sampling days), the TSP crustal contribution reached nearly three times the local crustal contribution. It must be pointed out that this crustal input should diminish when sampling PM10, owing to the dominant coarse size distribution of this type of particle. (author)

  20. Major models and data sources for residential and commercial sector energy conservation analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-09-01

    Major models and data sources are reviewed that can be used for energy-conservation analysis in the residential and commercial sectors to provide an introduction to the information that can or is available to DOE in order to further its efforts in analyzing and quantifying their policy and program requirements. Models and data sources examined in the residential sector are: ORNL Residential Energy Model; BECOM; NEPOOL; MATH/CHRDS; NIECS; Energy Consumption Data Base: Household Sector; Patterns of Energy Use by Electrical Appliances Data Base; Annual Housing Survey; 1970 Census of Housing; AIA Research Corporation Data Base; RECS; Solar Market Development Model; and ORNL Buildings Energy Use Data Book. Models and data sources examined in the commercial sector are: ORNL Commercial Sector Model of Energy Demand; BECOM; NEPOOL; Energy Consumption Data Base: Commercial Sector; F.W. Dodge Data Base; NFIB Energy Report for Small Businesses; ADL Commercial Sector Energy Use Data Base; AIA Research Corporation Data Base; Nonresidential Buildings Surveys of Energy Consumption; General Electric Co: Commercial Sector Data Base; The BOMA Commercial Sector Data Base; The Tishman-Syska and Hennessy Data Base; The NEMA Commercial Sector Data Base; ORNL Buildings Energy Use Data Book; and Solar Market Development Model. Purpose; basis for model structure; policy variables and parameters; level of regional, sectoral, and fuels detail; outputs; input requirements; sources of data; computer accessibility and requirements; and a bibliography are provided for each model and data source.

  1. A portable measurement system for subcriticality measurements by the Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Ragan, G.E.; Blakeman, E.D.

    1987-01-01

A portable measurement system consisting of a personal computer used as a Fourier analyzer and three detection channels (with associated electronics that provide the signals to analog-to-digital (A/D) converters) has been assembled to measure subcriticality by the ²⁵²Cf-source-driven neutron noise analysis method. 8 refs

  2. CHANDRA ACIS SURVEY OF X-RAY POINT SOURCES: THE SOURCE CATALOG

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Song; Liu, Jifeng; Qiu, Yanli; Bai, Yu; Yang, Huiqin; Guo, Jincheng; Zhang, Peng, E-mail: jfliu@bao.ac.cn, E-mail: songw@bao.ac.cn [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China)

    2016-06-01

The Chandra archival data are a valuable resource for various studies on different X-ray astronomy topics. In this paper, we utilize this wealth of information and present a uniformly processed data set, which can be used to address a wide range of scientific questions. The data analysis procedures are applied to 10,029 Advanced CCD Imaging Spectrometer (ACIS) observations, which produces 363,530 source detections belonging to 217,828 distinct X-ray sources. This number is twice the size of the Chandra Source Catalog (Version 1.1). The catalogs in this paper provide abundant estimates of the detected X-ray source properties, including source positions, counts, colors, fluxes, luminosities, variability statistics, etc. Cross-correlation of these objects with galaxies shows that 17,828 sources are located within the D₂₅ isophotes of 1110 galaxies, and 7504 sources are located between the D₂₅ and 2D₂₅ isophotes of 910 galaxies. Contamination analysis with the log N-log S relation indicates that 51.3% of objects within the 2D₂₅ isophotes are truly relevant to galaxies, and the "net" source fraction increases to 58.9%, 67.3%, and 69.1% for sources with luminosities above 10³⁷, 10³⁸, and 10³⁹ erg s⁻¹, respectively. Among the possible scientific uses of this catalog, we discuss the possibility of studying intra-observation variability, inter-observation variability, and supersoft sources (SSSs). About 17,092 detected sources above 10 counts are classified as variable within an individual observation under the Kolmogorov-Smirnov (K-S) criterion (P_K-S < 0.01). There are 99,647 sources observed more than once and 11,843 sources observed 10 times or more, offering us a wealth of data with which to explore long-term variability. There are 1638 individual objects (~2350 detections) classified as SSSs. As a quite interesting subclass, detailed studies of X-ray spectra and optical spectroscopic follow-up are needed to
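    The intra-observation variability criterion (P_K-S < 0.01) amounts to a one-sample Kolmogorov-Smirnov test of photon arrival times against a uniform distribution over the exposure. A sketch on synthetic arrival times (invented exposure and photon lists, not catalog data):

    ```python
    import numpy as np
    from scipy.stats import kstest

    # Photon arrival times over a T = 10 ks exposure (synthetic).  A
    # steady source yields uniformly distributed arrival times; a
    # flaring source concentrates photons in a burst.
    T = 10_000.0
    steady = np.linspace(0, T, 200)
    flare = np.sort(np.concatenate([np.linspace(0, T, 100),
                                    np.linspace(4000, 4500, 100)]))

    # One-sample K-S test against Uniform(0, T); flag a detection as
    # variable when the p-value drops below 0.01.
    for name, times in (("steady", steady), ("flare", flare)):
        p = kstest(times, "uniform", args=(0, T)).pvalue
        print(name, "variable" if p < 0.01 else "not variable")
    ```

    The test is sensitive to any departure of the cumulative arrival-time distribution from a straight line, so it catches flares and eclipses alike without assuming a light-curve shape.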

  3. Vibration analysis of the photon shutter designed for the advanced photon source

    International Nuclear Information System (INIS)

    Wang, Z.; Shu, D.; Kuzay, T.M.

    1992-01-01

The photon shutter is a critical component of the beamline front end for the 7 GeV Advanced Photon Source (APS) project, now under construction at Argonne National Laboratory (ANL). The shutter is designed to close in tens of milliseconds to absorb up to a 10 kW heat load (with high heat flux). Our shutter design uses innovative enhanced-heat-transfer tubes to withstand the high heat load. Although designed to be lightweight and compact, the very fast movement of the shutter gives rise to concern regarding vibration and dynamic sensitivity. To guarantee the long-term functionality and reliability of the shutter, its dynamic behavior should be fully studied. In this paper, the natural frequency and transient dynamic analyses for the shutter during operation are presented. Through analysis of the vibration characteristics, as well as stress and deformation, several design options were developed and compared, including the selection of materials for the shutter and structural details

  4. Physical activity and social support in adolescents: analysis of different types and sources of social support.

    Science.gov (United States)

    Mendonça, Gerfeson; Júnior, José Cazuza de Farias

    2015-01-01

    Little is known about the influence of different types and sources of social support on physical activity in adolescents. The aim of this study was to analyse the association between physical activity and different types and sources of social support in adolescents. The sample consisted of 2,859 adolescents between 14-19 years of age in the city of João Pessoa, in Northeastern Brazil. Physical activity was measured with a questionnaire, and social support from parents and friends with a 10-item scale, five items for each group (types of support: encouragement, joint participation, watching, inviting, positive comments and transportation). Multivariable analysis showed that the types of parental support associated with physical activity in adolescents were encouragement for females and joint participation for both genders (males: P = 0.009). The association between social support and physical activity varies according to its source, as well as the gender and age of the adolescents.

  5. Analysis of the emission characteristics of ion sources for high-value optical coating processes

    International Nuclear Information System (INIS)

    Beermann, Nils

    2009-01-01

    The production of complex high-quality thin-film systems requires a detailed understanding of all partial processes. One of the most relevant of these is the condensation of the coating material on the substrate surface. The optical and mechanical material properties can be adjusted by the well-defined impingement of energetic ions during deposition; thus, in the past, a variety of different ion sources were developed. With respect to present and future challenges in the production of precisely fabricated high-performance optical coatings, however, the ion emission of these sources has so far not been sufficiently characterized. This question is addressed within the framework of this work, which is thematically integrated in the field of process development and control of ion-assisted deposition processes. In a first step, a Faraday cup measurement system was developed which allows the spatially resolved determination of the ion energy distribution as well as the ion current distribution. Subsequently, the ion emission profiles of six ion sources were determined as a function of the relevant operating parameters. A data pool for process planning and supplementary process analysis is thus made available. On the basis of the acquired results, the basic correlations between the operating parameters and the ion emission are demonstrated. The specific properties of the individual sources, as well as the respective control strategies, are pointed out with regard to the thin-film properties and production yield. Finally, a synthesis of the results and perspectives for future activities are given. (orig.)

  6. Everything is choreography / Daniel Linehan ; interviewed by Liina Luhats

    Index Scriptorium Estoniae

    Linehan, Daniel

    2011-01-01

    An interview with Daniel Linehan, choreographer and dancer of the productions "Montage for Three" and "Not About Everything", staged on 21 August at the Kanuti Gild hall as part of the 12th international August Dance Festival

  7. Petrographic Analysis and Geochemical Source Correlation of Pigeon Peak, Sutter Buttes, CA

    Science.gov (United States)

    Novotny, N. M.; Hausback, B. P.

    2013-12-01

    The Sutter Buttes are a volcanic complex located in the center of the Great Valley north of Sacramento. They comprise numerous inter-intruding andesite and rhyolite lava domes of varying compositions surrounded by a shallow rampart of associated tephras. The Pigeon Peak block-and-ash flow sequence is located in the rampart and is made up of a porphyritic biotite-bearing hornblende andesite. The andesite blocks demonstrate a high degree of propylitization in hornblende crystals, highly zoned plagioclase, trace olivine, and a red-to-gray color gradation. DAR is an andesite dome located less than one mile from Pigeon Peak. Of the 15 to 25 andesite lava domes within four miles of Pigeon Peak, only DAR displays trace olivine, red-to-gray color stratification, low biotite content, and propylitized hornblende. These shared characteristics suggest that DAR may be the source for Pigeon Peak. My investigation used microprobe analysis of the DAR and Pigeon Peak feldspar crystals to identify the magmatic history of the magma body before emplacement. Correlation of the anorthite zoning within the feldspars from both locations supports my hypothesis that DAR is the source of the Pigeon Peak block-and-ash flow.

  8. Experimental analysis of a diffusion absorption refrigeration system used alternative energy sources

    International Nuclear Information System (INIS)

    Soezen, A.; Oezbas, E.

    2009-01-01

    The continuous-cycle absorption refrigeration device is widely used in domestic refrigerators and recreational vehicles. It is also used in year-round air conditioning of both homes and larger buildings. The unit consists of four main parts: the boiler, condenser, evaporator, and absorber. When the unit operates on kerosene or gas, the heat is supplied by a burner fitted underneath the central tube. When operating on electricity, the heat is supplied by an element inserted in the pocket. No moving parts are employed. The operation of the refrigerating mechanism is based on Dalton's law. In this study, an experimental analysis was performed of a diffusion absorption refrigeration system (DARS) using alternative energy sources such as solar and liquid petroleum gas (LPG). Two basic DAR cycles were set up and investigated: (i) in the first cycle (DARS-1), the condensate is sub-cooled prior to the evaporator entrance by a coupled evaporator/gas heat exchanger, similar to the design manufactured by Electrolux, Sweden; (ii) in the second cycle (DARS-2), the condensate is not sub-cooled prior to the evaporator entrance, and the gas heat exchanger is separated from the evaporator. (author)

  9. Derivation and analysis of the Feynman-alpha formula for deterministically pulsed sources

    International Nuclear Information System (INIS)

    Wright, J.; Pazsit, I.

    2004-03-01

    The purpose of this report is to give a detailed description of the calculation of the Feynman-alpha formula with deterministically pulsed sources. In contrast to previous calculations, Laplace transform and complex function methods are used to arrive at a compact solution in the form of a Fourier series-like expansion. The advantage of this method is that it is capable of treating various pulse shapes. In particular, in addition to square and Dirac delta pulses, a more realistic Gauss-shaped pulse is also considered here. The final solution for the modified variance-to-mean ratio, that is, the Feynman Y(t) function, can be evaluated quantitatively, quickly, and with little computational effort. The analytical solutions obtained are then analysed quantitatively. The behaviour of the number of neutrons in the system is investigated in detail, together with the transient that follows the switching on of the source. The behaviour of the Feynman Y(t) function was analysed with respect to the pulse width and repetition frequency. Lastly, the possibility of using the formulae for the extraction of the parameter alpha from a simulated measurement is also investigated
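    For orientation, the steady (non-pulsed) source limit that the pulsed-source derivation generalizes is the standard textbook Feynman-alpha variance-to-mean formula:

```latex
Y(T) \;\equiv\; \frac{\sigma^2(T)}{\mu(T)} - 1
     \;=\; Y_{\infty}\left(1 - \frac{1 - e^{-\alpha T}}{\alpha T}\right)
```

    where T is the gate (counting) time, alpha is the prompt neutron decay constant, and Y_infinity is the saturation value. The deterministically pulsed solution of the report replaces this single-exponential form with a Fourier series-like expansion in the pulse repetition frequency.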

  10. Unique effects and moderators of effects of sources on self-efficacy: A model-based meta-analysis.

    Science.gov (United States)

    Byars-Winston, Angela; Diestelmann, Jacob; Savoy, Julia N; Hoyt, William T

    2017-11-01

    Self-efficacy beliefs are strong predictors of academic pursuits, performance, and persistence, and in theory are developed and maintained by 4 classes of experiences Bandura (1986) referred to as sources: performance accomplishments (PA), vicarious learning (VL), social persuasion (SP), and affective arousal (AA). The effects of sources on self-efficacy vary by performance domain and individual difference factors. In this meta-analysis (k = 61 studies of academic self-efficacy; N = 8,965), we employed B. J. Becker's (2009) model-based approach to examine cumulative effects of the sources as a set and unique effects of each source, controlling for the others. Following Becker's recommendations, we used available data to create a correlation matrix for the 4 sources and self-efficacy, then used these meta-analytically derived correlations to test our path model. We further examined moderation of these associations by subject area (STEM vs. non-STEM), grade, sex, and ethnicity. PA showed by far the strongest unique association with self-efficacy beliefs. Subject area was a significant moderator, with sources collectively predicting self-efficacy more strongly in non-STEM (k = 14) compared with STEM (k = 47) subjects (R2 = .37 and .22, respectively). Within studies of STEM subjects, grade level was a significant moderator of the coefficients in our path model, as were 2 continuous study characteristics (percent non-White and percent female). Practical implications of the findings and future research directions are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
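    Becker's model-based approach amounts to fitting standardized regression (path) equations to a meta-analytically pooled correlation matrix. A minimal sketch of that step, with hypothetical pooled correlations (the matrix values below are illustrative, not those of the paper):

```python
import numpy as np

# Hypothetical pooled correlations among the four sources
# (order: PA, VL, SP, AA) -- illustrative values only.
R = np.array([
    [ 1.00,  0.45,  0.50, -0.30],
    [ 0.45,  1.00,  0.55, -0.20],
    [ 0.50,  0.55,  1.00, -0.25],
    [-0.30, -0.20, -0.25,  1.00],
])
# Hypothetical pooled correlations of each source with self-efficacy.
r = np.array([0.55, 0.35, 0.40, -0.35])

beta = np.linalg.solve(R, r)  # standardized path coefficients (unique effects)
r2 = float(r @ beta)          # variance in self-efficacy explained by the set
```

    With these inputs PA retains the largest unique coefficient once the other sources are controlled, mirroring the paper's qualitative finding.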

  11. Quantification of source impact to PM using three-dimensional weighted factor model analysis on multi-site data

    Science.gov (United States)

    Shi, Guoliang; Peng, Xing; Huangfu, Yanqi; Wang, Wei; Xu, Jiao; Tian, Yingze; Feng, Yinchang; Ivey, Cesunica E.; Russell, Armistead G.

    2017-07-01

    Source apportionment technologies are used to understand the impacts of important sources on particulate matter (PM) air quality, and are widely used for both scientific studies and air quality management. Generally, receptor models apportion speciated PM data from a single sampling site. With the development of large-scale monitoring networks, PM speciation is observed at multiple sites in an urban area. For these situations, the models should account for three factors, or dimensions, of the PM data, namely the chemical species concentrations, the sampling periods, and the sampling sites, suggesting the potential power of a three-dimensional source apportionment approach. However, the ordinary three-dimensional Parallel Factor Analysis (PARAFAC) model does not always work well in real environmental situations for multi-site receptor datasets. In this work, a new three-way receptor model, called the "multi-site three-way factor analysis" (multi-site WFA3) model, is proposed to deal with multi-site receptor datasets. Synthetic datasets were developed and introduced into the new model to test its performance: the average absolute error (AAE, between estimated and true contributions) for the extracted sources was less than 50% in all cases. Additionally, three-dimensional ambient datasets from the Chinese mega-city Chengdu were analyzed using the new model to assess its application. Four factors are extracted by the multi-site WFA3 model: secondary sources have the highest contributions (64.73 and 56.24 μg/m3), followed by vehicular exhaust (30.13 and 33.60 μg/m3), crustal dust (26.12 and 29.99 μg/m3), and coal combustion (10.73 and 14.83 μg/m3). The model was also compared to PMF, with general agreement, though PMF suggested a lower crustal contribution.

  12. Bias in calculated keff from subcritical measurements by the 252Cf-source-driven noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; Valentine, T.E.

    1995-01-01

    The development of MCNP-DSP, which allows direct calculation of the measured time and frequency analysis parameters from subcritical measurements using the 252Cf-source-driven noise analysis method, permits the validation of calculational methods for criticality safety with in-plant subcritical measurements. In addition, a method of obtaining the bias in the calculations, which is essential to the criticality safety specialist, is illustrated using the results of measurements with 17.771-cm-diam, enriched (93.15%), unreflected, and unmoderated uranium metal cylinders. For these uranium metal cylinders, the bias obtained using MCNP-DSP and ENDF/B-V cross-section data increased with subcriticality. For a critical experiment [height (h) = 12.629 cm], it was -0.0061 ± 0.0003. For a 10.16-cm-high cylinder (k ~ 0.93), it was 0.0060 ± 0.0016, and for a subcritical cylinder (h = 8.13 cm, k ~ 0.85), the bias was -0.0137 ± 0.0037, more than a factor of 2 larger in magnitude. This method allows the nuclear criticality safety specialist to establish the bias in calculational methods for criticality safety from in-plant subcritical measurements by the 252Cf-source-driven noise analysis method

  13. Developing a source-receptor methodology for the characterization of VOC sources in ambient air

    International Nuclear Information System (INIS)

    Borbon, A.; Badol, C.; Locoge, N.

    2005-01-01

    Since 2001, in France, continuous monitoring of about thirty ozone-precursor non-methane hydrocarbons (NMHC) has been conducted in several urban areas. The automated system for NMHC monitoring consists of sub-ambient preconcentration on a cooled multi-sorbent trap, followed by thermal desorption and two-dimensional gas chromatography/flame ionisation detection analysis. The large number of data collected, and their exploitation, should provide a qualitative and quantitative assessment of hydrocarbon sources. This should help in defining relevant emission-regulation strategies, as stated by the European Directive on ozone in ambient air (2002/3/EC). The purpose of this work is to present the foundations and contributions of an original source-receptor methodology for the characterization of NMHC sources. It is a statistical and diagnostic approach, adaptable and transposable to any urban site, which integrates the spatial and temporal dynamics of the emissions. The methods for source identification combine descriptive and more complex complementary approaches: 1) a univariate approach, through the analysis of NMHC time series and concentration roses; 2) a bivariate approach, through graphical ratio analysis and characterization of the scatterplot distributions of hydrocarbon pairs; 3) a multivariate approach, with principal component analyses on various time bases. A linear regression model is finally developed to estimate the spatial and temporal source contributions. Apart from vehicle exhaust emissions, the sources of interest are: combustion and fossil-fuel-related activities, petrol and/or solvent evaporation, the dual anthropogenic and biogenic origin of isoprene, and other industrial activities depending on local parameters. (author)

  14. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    The toxicity of heavy metals from industrialization poses critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and its sub-regions can provide more instructive information for protecting specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of the various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of a case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are an abandoned lead-acid battery plant (ALP), traffic emission, and agricultural activity. The models and results of this research provide effective spatial information and a useful framework for quantifying the hazards of source categories and human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Physics of the 252Cf-source-driven noise analysis measurement

    International Nuclear Information System (INIS)

    Valentine, T.E.; Mihalczo, J.T.; Perez, R.B.; Mattingly, J.K.

    1997-01-01

    The 252Cf-source-driven noise analysis method is a versatile measurement tool that has been applied to measurements for initial loading of reactors, quality assurance of reactor fuel elements, fuel processing facilities, fuel reprocessing facilities, fuel storage facilities, zero-power testing of reactors, verification of calculational methods, process monitoring, characterization of storage vaults, and nuclear weapons identification. This method's broad range of application is due to the wide variety of time- and frequency-domain signatures, each with unique properties, obtained from the measurement. The following parameters are obtained from this measurement: average detector count rates, detector multiplicities, detector autocorrelations, cross-correlations between detectors, detector autopower spectral densities, cross-power spectral densities between detectors, coherences, and ratios of spectral densities. All of these measured parameters can also be calculated using the MCNP-DSP Monte Carlo code. This paper presents a review of the time-domain signatures obtained from this measurement

  16. The impacts of source structure on geodetic parameters demonstrated by the radio source 3C371

    Science.gov (United States)

    Xu, Ming H.; Heinkelmann, Robert; Anderson, James M.; Mora-Diaz, Julian; Karbon, Maria; Schuh, Harald; Wang, Guang L.

    2017-07-01

    Closure quantities measured by very-long-baseline interferometry (VLBI) observations are independent of instrumental and propagation instabilities and antenna gain factors, but are sensitive to source structure. A new method is proposed to calculate a structure index based on the median values of closure quantities rather than on the brightness distribution of a source. The results are comparable to structure indices based on imaging observations at other epochs and demonstrate the flexibility of deriving structure indices from exactly the same observations as used for geodetic analysis, without imaging analysis. A three-component model for the structure of source 3C371 is developed by model-fitting closure phases. It provides a real case for tracing how the structure effect identified by closure phases in the same observations as the delay observables affects the geodetic analysis, and for investigating which geodetic parameters are corrupted, and to what extent, by the structure effect. Using the structure corrections based on the three-component model of source 3C371, two solutions, with and without correction of the structure effect, are made. With corrections, the overall rms of this source is reduced by 1 ps, and the impacts of the structure effect introduced by this single source are up to 1.4 mm on station positions and up to 4.4 microarcseconds on Earth orientation parameters. This study is a starting point for handling the source structure effect on geodetic VLBI from geodetic sessions themselves.
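    The key property of closure phases, their insensitivity to station-based errors, is easy to verify numerically: summing the baseline phases around a triangle of stations cancels every per-station term exactly. A small sketch with arbitrary illustrative phases (not VLBI data):

```python
import numpy as np

rng = np.random.default_rng(7)

# Structure (true source) phases on baselines 1-2, 2-3, 3-1, in radians.
phi12, phi23, phi31 = 0.4, -0.2, 0.1

# Station-based phase errors (clock, troposphere, instrumentation).
e1, e2, e3 = rng.normal(0.0, 2.0, 3)

# Each observed baseline phase picks up the difference of station errors.
obs12 = phi12 + (e1 - e2)
obs23 = phi23 + (e2 - e3)
obs31 = phi31 + (e3 - e1)

closure = obs12 + obs23 + obs31  # station terms cancel identically
```

    However large the station errors, the closure phase equals the sum of the structure phases, which is why it isolates the structure effect.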

  17. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Background: Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria, and to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific or limited to a particular data format) and they typically accept only gene lists as input. Results: TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful for normalizing data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells.
    Biologically relevant chromosomal segments and gene
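    Classic quantile normalization, of which TRAM's "scaled quantile" is described as a variant, forces every sample (column) onto a common reference distribution: the row-wise mean of the per-sample sorted values. A minimal sketch of the classic algorithm (not TRAM's exact implementation):

```python
import numpy as np

def quantile_normalize(x):
    """Map each column of x onto the mean of the sorted columns,
    preserving within-column ranks (ties broken by order)."""
    order = np.argsort(x, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each entry per column
    reference = np.sort(x, axis=0).mean(axis=1)  # shared target distribution
    return reference[ranks]

expr = np.array([[5.0, 4.0, 3.0],   # rows: genes, columns: samples
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
normed = quantile_normalize(expr)
```

    After normalization every column contains the same set of values, so between-sample comparisons reflect rank differences only.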

  18. Categorization of radiation sources

    International Nuclear Information System (INIS)

    Antonova, M.

    2000-01-01

    Through one-parameter (factor) analysis, the hypothesis is confirmed that the activity value of a radiation source (RS) in an application correlates with the category (rank) assigned to it by the IAEA categorization, although the latter is based on other parameters of the RS applications (practices such as devices with radiation sources in industry, science, medicine, and agriculture). The principles of the new IAEA categorization, which takes into account the potential harm the sources may cause and the necessary regulatory control, are described. (author)

  19. HFIR cold neutron source moderator vessel design analysis

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-04-01

    A cold neutron source capsule made of aluminum alloy is to be installed at the tip of one of the neutron beam tubes of the High Flux Isotope Reactor. Cold liquid hydrogen at a temperature of approximately 20 K and a pressure of 15 bar is designed to flow through the aluminum capsule, which serves to chill and moderate the incoming neutrons produced by the reactor core. The cold, low-energy neutrons thus produced will be used as cold neutron sources for diffraction experiments. The structural design calculation for the aluminum capsule is reported in this paper

  20. Entrepreneurship as re-sourcing

    DEFF Research Database (Denmark)

    Korsgaard, Steffen; Anderson, Alistair; Gaddefors, Johan

    Objectives The purpose of this paper is to re-examine the concept of entrepreneurship in light of the current financial and environmental crisis and its socio-spatial impact. Building on Hudson’s analysis of production in late-capitalist societies, we identify problems inherent in the dominant...... of grounding in material reality, lacking emphasis on environmental externalities and an impoverished conceptualization of spatial relations. Comparing this analysis with the dominant opportunistic image of the entrepreneur, leads us to formulate a critique of this image. In formulating an alternative we build...... The paper presents a “new image” of entrepreneurship as re-sourcing. The concept of re-sourcing emphasizes the dual meaning of the word resource as both a stock of supply and strategy or action adopted in adverse circumstances. Re-sourcing thus signifies a shift in focus from opportunities to resources...

  1. SWOT Analysis and Related Countermeasures for Croatia to Explore the Chinese Tourist Source Market

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2017-08-01

    Croatia is a land endowed with rich and diversified natural and cultural tourist resources. Traveling around Croatia, I was stunned by its beauty. However, I noticed that there were few Chinese tourists in Croatia. How can we bring more Chinese tourists to Croatia? How can we make them happy and comfortable in Croatia? And, at the same time, how can we avoid polluting this tract of pure land? Based on first-hand research work, I make a SWOT analysis of the Chinese tourist source market of Croatia and put forward related countermeasures from the perspective of a native Chinese. The positioning of tourism in Croatia should be ingeniously packaged. I recommend developing diversified and specialized tourist products, various marketing and promotional activities, simple and flexible visa policies and regulations, and other related measures to further explore the Chinese tourist source market of Croatia.

  2. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles still remains unresolved. Given the observed flux, the absence of observations of bright point sources can be explained by the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present here the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
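    The core of a two-point angular correlation analysis is a histogram of pairwise angular separations, compared against the expectation from isotropic (or scrambled) skies. A toy sketch with synthetic events (not IceCube data or its likelihood machinery):

```python
import numpy as np

rng = np.random.default_rng(1)

def isotropic_sky(n):
    """n random unit vectors, uniform on the sphere."""
    z = rng.uniform(-1.0, 1.0, n)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    s = np.sqrt(1.0 - z**2)
    return np.column_stack([s * np.cos(phi), s * np.sin(phi), z])

events = isotropic_sky(300)
cossep = np.clip(events @ events.T, -1.0, 1.0)
iu = np.triu_indices(len(events), k=1)        # each pair counted once
sep_deg = np.degrees(np.arccos(cossep[iu]))

# Toy test statistic: number of pairs closer than one degree; an excess
# over scrambled skies would hint at many weak clustered sources.
n_close = int((sep_deg < 1.0).sum())
```

    In a real analysis the distribution of `n_close` (or of the full separation histogram) under many scrambled skies sets the significance.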

  3. Rational manipulation of digital EEG: pearls and pitfalls.

    Science.gov (United States)

    Seneviratne, Udaya

    2014-12-01

    The advent of digital EEG has provided greater flexibility and more opportunities in data analysis to optimize the diagnostic yield. Changing the filter settings, sensitivity, montages, and time-base are possible rational manipulations to achieve this goal. The options to use polygraphy, video, and quantification are additional useful features. Aliasing and loss of data are potential pitfalls in the use of digital EEG. This review illustrates some common clinical scenarios where rational manipulations can enhance the diagnostic EEG yield and potential pitfalls in the process.

  4. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    International Nuclear Information System (INIS)

    Kerns, J; Yaldo, D

    2016-01-01

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.

  5. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Kerns, J [UT MD Anderson Cancer Center, Houston, TX (United States); Yaldo, D [Advocate Health Care, Park Ridge, IL (United States)

    2016-06-15

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.

  6. Pseudo-dynamic source modelling with 1-point and 2-point statistics of earthquake source parameters

    KAUST Repository

    Song, S. G.

    2013-12-24

    Ground motion prediction is an essential element in seismic hazard and risk analysis. Empirical ground motion prediction approaches have been widely used in the community, but efficient simulation-based ground motion prediction methods are needed to complement empirical approaches, especially in regions with limited data constraints. Recently, dynamic rupture modelling has been successfully adopted in physics-based source and ground motion modelling, but it is still computationally demanding and many input parameters are not well constrained by observational data. Pseudo-dynamic source modelling keeps the form of kinematic modelling, with its computational efficiency, but also tries to emulate the physics of the source process. In this paper, we develop a statistical framework that governs the finite-fault rupture process with 1-point and 2-point statistics of source parameters in order to quantify the variability of finite source models for future scenario events. We test this method by extracting 1-point and 2-point statistics from dynamically derived source models and simulating a number of rupture scenarios, given target 1-point and 2-point statistics. We propose a new rupture model generator for stochastic source modelling with the covariance matrix constructed from target 2-point statistics, that is, auto- and cross-correlations. Our sensitivity analysis of near-source ground motions to 1-point and 2-point statistics of source parameters provides insights into relations between statistical rupture properties and ground motions. We observe that larger standard deviation and stronger correlation produce stronger peak ground motions in general. The proposed source modelling approach will contribute to understanding the effect of the earthquake source on near-source ground motion characteristics in a more quantitative and systematic way.
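    A generator of the kind described, drawing random source models that honor prescribed 1-point (mean, standard deviation) and 2-point (auto-correlation) statistics, can be sketched in one dimension as a Gaussian random field sampled through a Cholesky factor of the target covariance. All numbers below are illustrative, not the paper's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(42)

x = np.arange(0.0, 40.0, 1.0)      # along-strike grid (km)
corr_len = 5.0                      # target correlation length (km)
mean_slip, std_slip = 1.0, 0.5      # target 1-point statistics of slip (m)

# 2-point statistics: exponential auto-correlation -> covariance matrix.
C = std_slip**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))  # jitter for stability

# One random rupture realization with the prescribed statistics.
slip = mean_slip + L @ rng.standard_normal(x.size)
```

    Cross-correlations between source parameters (e.g. slip and rupture velocity) enter the same way, as off-diagonal blocks of a joint covariance matrix.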

  7. Neutron sources and applications

    Energy Technology Data Exchange (ETDEWEB)

    Price, D.L. [ed.] [Argonne National Lab., IL (United States)]; Rush, J.J. [ed.] [National Inst. of Standards and Technology, Gaithersburg, MD (United States)]

    1994-01-01

    A Review of Neutron Sources and Applications was held at Oak Brook, Illinois, during September 8-10, 1992. This review involved some 70 national and international experts in different areas of neutron research, sources, and applications. Separate working groups were asked to (1) review the current status of advanced research reactors and spallation sources; and (2) provide an update on scientific, technological, and medical applications, including neutron scattering research in a number of disciplines, isotope production, materials irradiation, and other important uses of neutron sources such as materials analysis and fundamental neutron physics. This report summarizes the findings and conclusions of the different working groups involved in the review, and contains some of the best current expertise on neutron sources and applications.

  8. Neutron sources and applications

    International Nuclear Information System (INIS)

    Price, D.L.; Rush, J.J.

    1994-01-01

    A Review of Neutron Sources and Applications was held at Oak Brook, Illinois, during September 8-10, 1992. This review involved some 70 national and international experts in different areas of neutron research, sources, and applications. Separate working groups were asked to (1) review the current status of advanced research reactors and spallation sources; and (2) provide an update on scientific, technological, and medical applications, including neutron scattering research in a number of disciplines, isotope production, materials irradiation, and other important uses of neutron sources such as materials analysis and fundamental neutron physics. This report summarizes the findings and conclusions of the different working groups involved in the review, and contains some of the best current expertise on neutron sources and applications.

  9. Transient analysis of the new Cold Source at the FRM-II

    International Nuclear Information System (INIS)

    Gutsmiedl, E.; Posselt, H.; Scheuer, A.

    2003-01-01

    The new Cold Source (CNS) at the FRM-II research reactor is completely installed. This paper reports the results of the transient analysis performed at the design stage of this facility for producing cold neutrons for neutron experiments, the implementation of those results in the design of the mechanical components, the measurements taken during the cold tests, and the comparison with the data of the transient analysis. The important load cases are fixed in the system description and the design data sheet of the CNS. A transient analysis was carried out with the computer program ESATAN; the nodal configuration was identical with the planned system of the CNS, and the boundary conditions were chosen so that conservative results could be expected. The following transient load cases in the piping system behind the in-pile part were calculated: (1) normal storage of D2 in the hydride storage vessel; (2) breakdown of the CNS cooling system and transfer of D2 to the buffer tank; (3) rapid charge of D2 to the buffer tank with break of the insulation vacuum and flooding with neon; (4) reloading of the D2 from the buffer tank to the D2 hydride storage vessel. Additionally, the temperature distributions for these transients in the flanges connecting the systems to the in-pile part were analysed. The temperature distributions in the flange region were taken into account in the strength calculation of the flange construction. The chosen construction shows allowable values and a leak-tight flange connection for the load cases. The piping system was designed for the lowest expected temperatures. The load cases in the moderator tank were taken into account in the stress analysis and the fatigue analysis of the vacuum vessel and the moderator vessel; the results show allowable stresses. Overall, the results show that a transient analysis is necessary and helpful for a good design of the CNS. (author)
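
    The kind of nodal transient calculation performed with ESATAN can be sketched with a toy lumped-parameter thermal network; the node count, heat capacities, conductances and cold boundary temperature below are illustrative assumptions, not values from the CNS model.

```python
import numpy as np

# Hypothetical 3-node pipe section, SI units, explicit Euler time stepping
C = np.array([500.0, 800.0, 1200.0])         # nodal heat capacities (J/K)
G = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 1.5],
              [0.0, 1.5, 0.0]])              # conductive couplings (W/K)
T = np.array([25.0, 25.0, 25.0])             # initial temperatures (degC)
T_boundary = -250.0                          # cold boundary on node 0
G_b = np.array([3.0, 0.0, 0.0])              # coupling to the boundary (W/K)

dt, steps = 1.0, 3600                        # 1 s steps, 1 h of transient
for _ in range(steps):
    # Net heat flow into each node: sum_j G_ij*(T_j - T_i) + G_b*(T_b - T_i)
    Q = G @ T - G.sum(axis=1) * T + G_b * (T_boundary - T)
    T = T + dt * Q / C

# All nodes relax monotonically toward the cold boundary temperature
```

    A production tool like ESATAN adds radiative couplings, temperature-dependent properties and implicit solvers, but the nodal energy balance above is the core of such a transient analysis.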

  10. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    International Nuclear Information System (INIS)

    Kooyman, T.; Buiron, L.; Rimpault, G.

    2017-01-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution to implement minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides to be loaded in the blankets must be increased to maintain acceptable performances. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of neutron source for instance. We propose here to implement an optimization methodology of the blankets design with regards to various parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performances of the blankets. In a first stage, an analysis of the various contributors to long- and short-term neutron and gamma source is carried out whereas in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performances compared to more energetic spectrum that could be achieved using metallic fuel for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing. (authors)

  11. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    Directory of Open Access Journals (Sweden)

    Kooyman Timothée

    2017-01-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution to implement minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides to be loaded in the blankets must be increased to maintain acceptable performances. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of neutron source for instance. We propose here to implement an optimization methodology of the blankets design with regards to various parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performances of the blankets. In a first stage, an analysis of the various contributors to long- and short-term neutron and gamma source is carried out whereas in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performances compared to more energetic spectrum that could be achieved using metallic fuel for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.

  12. Analysis and optimization of minor actinides transmutation blankets with regards to neutron and gamma sources

    Science.gov (United States)

    Kooyman, Timothée; Buiron, Laurent; Rimpault, Gérald

    2017-09-01

    Heterogeneous loading of minor actinides in radial blankets is a potential solution to implement minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides to be loaded in the blankets must be increased to maintain acceptable performances. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of neutron source for instance. We propose here to implement an optimization methodology of the blankets design with regards to various parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performances of the blankets. In a first stage, an analysis of the various contributors to long and short term neutron and gamma source is carried out while in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performances compared to more energetic spectrum that could be achieved using metallic fuel for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.

  13. Source localization analysis using seismic noise data acquired in exploration geophysics

    Science.gov (United States)

    Roux, P.; Corciulo, M.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring using seismic noise data is attracting growing interest at the exploration scale. Recent studies have demonstrated source localization using seismic noise cross-correlation at observation scales ranging from hundreds of kilometers down to meters. In the context of exploration geophysics, classical localization methods based on travel-time picking fail when no clear first arrivals can be detected. Likewise, methods based on the intensity decrease as a function of distance to the source fail when the noise intensity decay is more complicated than the power law expected from geometrical spreading. We propose here an automatic procedure, developed in ocean acoustics, that iteratively locates the dominant and secondary noise sources. The Matched-Field Processing (MFP) technique exploits the spatial coherence of raw noise signals acquired on a dense array of receivers to produce high-resolution source localizations. Standard MFP algorithms locate the dominant noise source by matching the seismic noise Cross-Spectral Density Matrix (CSDM) with the equivalent CSDM calculated from a model and a surrogate source position that scans each position of a 3D grid below the array of seismic sensors. At the exploration scale, however, the background noise is mostly dominated by surface noise sources related to human activities (roads, industrial platforms, ...), whose localization is of no interest for the monitoring of the hydrocarbon reservoir. In other words, the dominant noise sources mask lower-amplitude noise sources associated with the extraction process (in the volume), and their location is therefore difficult to obtain with the standard MFP technique. Multi-Rate Adaptive Beamforming (MRABF) is a further improvement of the MFP technique that locates low-amplitude secondary noise sources using a projector matrix calculated from the eigenvalue decomposition of the CSDM. The MRABF approach aims at cancelling the contributions of the dominant surface noise sources before localizing the weaker secondary sources.
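
    A minimal sketch of the Bartlett matched-field processor described above, assuming noise-free data and simple phase-only replica vectors instead of a full propagation model; the array geometry, wavelength and source position are all hypothetical.

```python
import numpy as np

# Toy setup: m sensors on a line; free-space phase-only replicas
# (real MFP uses a full propagation model; this is only illustrative)
m = 8
sensors = np.column_stack([np.linspace(0, 70, m), np.zeros(m)])
k = 2 * np.pi / 30.0  # wavenumber for a 30 m wavelength

def replica(pos):
    """Unit-norm phase replica for a candidate source position."""
    r = np.linalg.norm(sensors - pos, axis=1)
    w = np.exp(-1j * k * r)
    return w / np.linalg.norm(w)

# Cross-spectral density matrix from a true source at (35, 40)
d = replica(np.array([35.0, 40.0]))
K = np.outer(d, d.conj())  # rank-1, noise-free CSDM

# Bartlett processor: scan a grid of surrogate positions, pick the peak
grid = [(x, y) for x in np.arange(0, 71, 5.0) for y in np.arange(5, 61, 5.0)]
power = [np.real(replica(np.array(g)).conj() @ K @ replica(np.array(g)))
         for g in grid]
best = grid[int(np.argmax(power))]
```

    MRABF would additionally project out the dominant eigenvectors of K before beamforming, so weaker sources become visible in the residual.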

  14. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  15. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.
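
    A minimal sketch of a local derivative importance index of the kind described above: one extra code run per input, with the finite-difference derivative scaled by the input's standard deviation. The three-parameter response surface stands in for a real TH code and is purely hypothetical.

```python
import numpy as np

def th_code(x):
    """Stand-in for one expensive TH code run (hypothetical response)."""
    p1, p2, p3 = x
    return 2.0 * p1 + 0.5 * p2 ** 2 + 0.1 * p3

# Nominal values and standard deviations of the uncertain inputs
x0 = np.array([1.0, 2.0, 3.0])
sigma = np.array([0.1, 0.5, 1.0])

def local_importance(f, x0, sigma, h=1e-4):
    """Derivative index |df/dx_i| * sigma_i via one-at-a-time differences."""
    y0 = f(x0)
    idx = []
    for i in range(len(x0)):
        x = x0.copy()
        x[i] += h
        idx.append(abs((f(x) - y0) / h) * sigma[i])
    return np.array(idx)

scores = local_importance(th_code, x0, sigma)
ranking = np.argsort(scores)[::-1]  # most important input first
```

    Only len(x0) + 1 code executions are needed, which is what makes such an index practical for computationally heavy TH calculations.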

  16. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully on LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role for managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for the Probabilistic Risk Assessment (PRA) applications, are not affordable for computationally demanding calculations of the complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed by considering high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modification of variance-based uncertainty importance method. Important parameters are identified by the modified PIRT approach qualitatively then their uncertainty importance is quantified by a local derivative index. The proposed index is attractive from its practicality point of view on TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on LOFT-LB1 test facility.

  17. MILAGRO OBSERVATIONS OF MULTI-TeV EMISSION FROM GALACTIC SOURCES IN THE FERMI BRIGHT SOURCE LIST

    International Nuclear Information System (INIS)

    Abdo, A. A.; Linnemann, J. T.; Allen, B. T.; Chen, C.; Aune, T.; Berley, D.; Goodman, J. A.; Christopher, G. E.; Kolterman, B. E.; Mincer, A. I.; Nemethy, P.; DeYoung, T.; Dingus, B. L.; Hoffman, C. M.; Ellsworth, R. W.; Gonzalez, M. M.; Hays, E.; McEnery, J. E.; Huentemeyer, P. H.; Morgan, T.

    2009-01-01

    We present the result of a search of the Milagro sky map for spatial correlations with sources from a subset of the recent Fermi Bright Source List (BSL). The BSL consists of the 205 most significant sources detected above 100 MeV by the Fermi Large Area Telescope. We select sources based on their categorization in the BSL, taking all confirmed or possible Galactic sources in the field of view of Milagro. Of the 34 Fermi sources selected, 14 are observed by Milagro at a significance of 3 standard deviations or more. We conduct this search with a new analysis which employs newly optimized gamma-hadron separation and utilizes the full eight-year Milagro data set. Milagro is sensitive to gamma rays with energy from 1 to 100 TeV with a peak sensitivity from 10 to 50 TeV depending on the source spectrum and declination. These results extend the observation of these sources far above the Fermi energy band. With the new analysis and additional data, multi-TeV emission is definitively observed associated with the Fermi pulsar, J2229.0+6114, in the Boomerang pulsar wind nebula (PWN). Furthermore, an extended region of multi-TeV emission is associated with the Fermi pulsar, J0634.0+1745, the Geminga pulsar.

  18. Antioxidants: Characterization, natural sources, extraction and analysis.

    Science.gov (United States)

    Oroian, Mircea; Escriche, Isabel

    2015-08-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, for example preventing cancer and cardiovascular diseases, and lowering the incidence of different diseases. In this paper the main classes of antioxidants are presented: vitamins, carotenoids and polyphenols. Recently, many analytical methodologies involving diverse instrumental techniques have been developed for the extraction, separation, identification and quantification of these compounds. Antioxidants have been quantified by different researchers using one or more of these methods: in vivo, in vitro, electrochemical, chemiluminescent, electron spin resonance, chromatography, capillary electrophoresis, nuclear magnetic resonance, near infrared spectroscopy and mass spectrometry methods. Copyright © 2015. Published by Elsevier Ltd.

  19. Screening of oil sources by using comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry and multivariate statistical analysis.

    Science.gov (United States)

    Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin

    2015-02-06

    Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed for the development of a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of the crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. As a supervised step, the geological characteristics of the crude oils, including thermal maturity, sedimentary environment, etc., were assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds for oil screening, and the relative abundances of individual TAS compounds correlate well with oil sources. In order to correct for the effects of external factors other than oil source, the variables were defined as content ratios of selected target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the actual geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.
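
    The PCA step on fingerprint-derived ratio variables can be sketched as follows; the four samples and three peak ratios are hypothetical stand-ins for real GC×GC/TOFMS data.

```python
import numpy as np

# Hypothetical peak-ratio parameters (rows: oil samples, columns: ratios of
# tri-aromatic steroid peaks to a reference compound)
X = np.array([
    [0.8, 1.2, 0.5],
    [0.9, 1.1, 0.6],   # group A: similar TAS ratios
    [2.1, 0.4, 1.8],
    [2.0, 0.5, 1.7],   # group B
])

# PCA via SVD of the mean-centered, column-scaled data matrix
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # sample coordinates on the PCs
explained = s ** 2 / (s ** 2).sum()  # variance explained per component
```

    For well-separated oil families, the first component dominates and the score plot separates the groups, which is the basis for the screening model.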

  20. Aerosol composition and source apportionment in the Mexico City Metropolitan Area with PIXE/PESA/STIM and multivariate analysis

    Directory of Open Access Journals (Sweden)

    K. S. Johnson

    2006-01-01

    Aerosols play an important role in the atmosphere but are poorly characterized, particularly in urban areas like the Mexico City Metropolitan Area (MCMA). The chemical composition of urban particles must be known to assess their effects on the environment, and specific particulate emissions sources should be identified to establish effective pollution control standards. For these reasons, samples of particulate matter ≤2.5 μm (PM2.5) were collected during the MCMA-2003 Field Campaign for elemental and multivariate analyses. Proton-Induced X-ray Emission (PIXE), Proton-Elastic Scattering Analysis (PESA) and Scanning Transmission Ion Microscopy (STIM) measurements were done to determine concentrations of 19 elements from Na to Pb, hydrogen, and total mass, respectively. The most abundant elements from PIXE analysis were S, Si, K, Fe, Ca, and Al, while the major emissions sources associated with these elements were industry, wind-blown soil, and biomass burning. Wind trajectories suggest that metals associated with industrial emissions came from northern areas of the city, whereas soil aerosols came from the southwest and increased in concentration during dry conditions. Elemental markers for fuel oil combustion, V and Ni, correlated with a large SO2 plume, suggesting an anthropogenic, rather than volcanic, emissions source. By subtracting the major components of soil and sulfates determined by PIXE analysis from the STIM total mass measurements, we estimate that approximately 50% of non-volatile PM2.5 consisted of carbonaceous material.

  1. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use type (such as farmland and forest) was the dominant factor affecting the spatial distribution of nitrogen loss, while precipitation and runoff affected nitrogen loss only in months without fertilization and during several storm-flood events occurring on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.
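
    The identification of critical source areas from per-subbasin TN results can be sketched as a ranking by loss intensity followed by a cumulative-load cutoff; the five-subbasin numbers and the 50% threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical per-subbasin results from a SWAT run
area = np.array([12.0, 8.0, 20.0, 5.0, 15.0])      # subbasin area (km2)
tn_load = np.array([3.0, 1.0, 9.0, 0.5, 2.0])      # TN load (t/yr)

intensity = tn_load / area                  # TN loss intensity (t/km2/yr)
order = np.argsort(intensity)[::-1]         # subbasins, most intensive first

# Critical source areas: smallest set of top-ranked subbasins contributing
# at least 50% of the total TN load
cum = np.cumsum(tn_load[order]) / tn_load.sum()
critical = order[: int(np.searchsorted(cum, 0.5) + 1)]
```

    Repeating this ranking at each time scale (yearly, monthly, per flood event) is what reveals whether the critical source areas are stable across scales.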

  2. Reliability of MEG source imaging of anterior temporal spikes: analysis of an intracranially characterized spike focus.

    Science.gov (United States)

    Wennberg, Richard; Cheyne, Douglas

    2014-05-01

    To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
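
    The benefit of spike averaging reported above follows from the roughly 1/sqrt(n) reduction of uncorrelated sensor noise when n epochs are averaged; a minimal numerical sketch with a synthetic spike template (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical: an identical spike waveform repeated in 64 noisy epochs
t = np.linspace(-0.1, 0.1, 201)
spike = np.exp(-(t / 0.02) ** 2)                  # unit-amplitude template
epochs = spike + 2.0 * rng.standard_normal((64, t.size))

def residual_rms(x):
    """RMS of what remains after subtracting the true waveform."""
    return np.sqrt(np.mean((x - spike) ** 2))

# Averaging n epochs shrinks the noise by ~1/sqrt(n), which is why averaged
# spikes yield tighter, more reliable dipole fits than single spikes
r1 = residual_rms(epochs[0])
r8 = residual_rms(epochs[:8].mean(axis=0))
r64 = residual_rms(epochs.mean(axis=0))
```

    This mirrors the abstract's finding that averaging 8 or more spikes markedly improves source-solution reliability compared with single-spike MSI.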

  3. L'électronique pour les débutants qui sèchent les cours mais soudent sans se brûler les doigts

    CERN Document Server

    Mallard, Rémy

    2012-01-01

    Where should you begin when starting out in electronics? Will I get lost exploring the internet? It is full of schematics, but are they reliable? Do I need a book of simple circuits, or rather a book about components? After thirty years of practice, the author of this book, still the eternal beginner who built his own first circuit at the age of ten, shares here his ever-lively thirst for learning. A skilled teacher, he guides beginners and answers the questions that too many books leave hanging: "What kind of soldering iron should I buy?"... "Is a €5 multimeter good enough?"... "Is an oscilloscope indispensable?"... "Can I put my circuit in a cigar box?"... Rémy Mallard demystifies electronics, using only as much theory as you need to approach the practice cheerfully, without the risk of making big mistakes. You will learn to identify components and their roles (resistors, capacitors, coils, diodes, transistors, rela...

  4. The comparison of four neutron sources for Prompt Gamma Neutron Activation Analysis (PGNAA) in vivo detections of boron.

    Science.gov (United States)

    Fantidis, J G; Nicolaou, G E; Potolias, C; Vordos, N; Bandekas, D V

    A Prompt Gamma Ray Neutron Activation Analysis (PGNAA) system incorporating an isotopic neutron source has been simulated using the MCNPX Monte Carlo code. In order to improve the signal-to-noise ratio, different collimators and a filter were placed between the neutron source and the object. The effect of the positioning of the neutron beam and the detector relative to the object has been studied. In this work the optimisation procedure is demonstrated for boron. Monte Carlo calculations were carried out to compare the performance of the proposed PGNAA system using four different neutron sources (241Am/Be, 252Cf, 241Am/B, and a DT neutron generator). Among the different systems, the 252Cf-based PGNAA system has the best performance.

  5. YouTube as a source of COPD patient education: A social media content analysis

    Science.gov (United States)

    Stellefson, Michael; Chaney, Beth; Ochipa, Kathleen; Chaney, Don; Haider, Zeerak; Hanik, Bruce; Chavarria, Enmanuel; Bernhardt, Jay M.

    2014-01-01

    Objective Conduct a social media content analysis of COPD patient education videos on YouTube. Methods A systematic search protocol was used to locate 223 videos. Two independent coders evaluated each video to determine topics covered, media source(s) of posted videos, information quality as measured by HONcode guidelines for posting trustworthy health information on the Internet, and viewer exposure/engagement metrics. Results Over half the videos (n=113, 50.7%) included information on medication management, with far fewer videos on smoking cessation (n=40, 17.9%). Most videos were posted by a health agency or organization (n=128, 57.4%), and the majority of videos were rated as high quality (n=154, 69.1%). HONcode adherence differed by media source (Fisher's Exact Test=20.52, p=.01), with user-generated content (UGC) receiving the lowest quality scores. Overall level of user engagement as measured by number of "likes," "favorites," "dislikes," and user comments was low (mdn range = 0–3, interquartile range (IQR) = 0–16) across all sources of media. Conclusion Study findings suggest that COPD education via YouTube has the potential to reach and inform patients; however, existing video content and quality vary significantly. Future interventions should help direct individuals with COPD to increase their engagement with high-quality patient education videos on YouTube that are posted by reputable health organizations and qualified medical professionals. Patients should be educated to avoid and/or critically view low-quality videos posted by individual YouTube users who are not health professionals. PMID:24659212
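The HONcode-by-media-source comparison above rests on Fisher's exact test for contingency tables. As a minimal sketch of how the two-sided test works on a 2x2 table (standard library only; the counts below are invented for illustration and are not taken from the study):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one.
    """
    r1, c1, n = a + b, a + c, a + b + c + d

    def p_table(x):  # P(top-left cell = x) under fixed margins
        return comb(r1, x) * comb(n - r1, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# hypothetical: 3 of 4 organizational videos vs 1 of 4 user videos pass a criterion
p = fisher_exact_2x2(3, 1, 1, 3)  # p = 34/70, about 0.486
```

With such small invented counts the test is (correctly) far from significant; the study's reported p=.01 comes from its real, much larger table.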

  6. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Science.gov (United States)

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    Development of lighting technology has made it possible to use LEDs in specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used to solve particular problems: lighting vegetables and fruit (for sorting or growing), textile products (for quality control), minerals (for sorting), etc. LED technology has been actively introduced into different systems, including optical-electronic devices and systems, because of the large choice of emission colors and LED structures, which define the spatial, power, thermal and other parameters. Furthermore, multi-element color lighting devices with adjustable illumination properties can be designed and implemented using LEDs. However, LED-based devices require more attention if a certain energy or color distribution must be provided over the entire work area (the area of analysis or observation) or the surface of the object. This paper proposes a method for theoretical modeling of such lighting devices. The authors present models of an RGB multicomponent light source applied to an optical-electronic system for the color analysis of mineral objects. The possibility of forming uniform, energy- and color-homogeneous illumination of the work area for this system is demonstrated. The authors also show how the parameters and characteristics of the optical radiation receiver of the optical-electronic system affect the energy, spatial, spectral and colorimetric properties of a multicomponent light source.

  7. Nontraditional renewable energy sources

    International Nuclear Information System (INIS)

    Shpil'rajn, Eh.Eh.

    1997-01-01

    The paper considers the application possibilities of nontraditional renewable energy sources for generating electricity, estimates the potential of nontraditional sources using the energy of the Sun, wind and biomass, as well as geothermal energy, and presents the results of an economic analysis of the cost of electricity generated by solar power plants, geothermal power plants and facilities for the power reprocessing of biomass. 1 tab

  8. Polycyclic aromatic hydrocarbons in urban air : concentration levels and patterns and source analysis in Nairobi, Kenya

    Energy Technology Data Exchange (ETDEWEB)

    Muthini, M.; Yoshimichi, H.; Yutaka, K.; Shigeki, M. [Yokohama National Univ., Yokohama (Japan). Graduate School of Environment and Information Sciences

    2005-07-01

    Polycyclic aromatic hydrocarbons (PAHs) present in the environment are often the result of incomplete combustion processes. This paper reported concentration levels and patterns of high molecular weight PAHs in Nairobi, Kenya. Daily air samples for 30 different PAHs were collected at residential, industrial and business sites within the city. Samples were then extracted using deuterated PAH with an automated Soxhlet device. Gas chromatography and mass spectrometry (GC-MS) with a capillary column was used to analyze the extracts using a selected ion monitoring (SIM) mode. Statistical analyses were then performed. PAH concentration levels were reported for average, median, standard deviation, range, and Pearson's correlation coefficients. Data were then analyzed for sources using a principal component analysis (PCA) technique and isomer ratio analysis. Nonparametric testing was then conducted to detect inherent differences in PAH concentration data obtained from the different sites. Results showed that pyrene was the most abundant PAH. Carcinogenic PAHs were higher in high-traffic areas. The correlation coefficient between coronene and benzo(ghi)pyrene was high. The PAH isomer ratio analysis demonstrated that PAHs in Nairobi are the product of traffic emissions and oil combustion. Results also showed that PAH profiles were not well separated. It was concluded that source distinction methods must be improved in order to better evaluate PAH emissions in the city. 9 refs., 2 tabs., 1 fig.

  9. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method. AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  10. Source term analysis for a criticality accident in metal production line glove boxes

    International Nuclear Information System (INIS)

    Nguyen, D.H.

    1991-06-01

    A recent development in criticality accident analysis is the deterministic calculations of the transport of fission products and actinides through the barriers of the physical facility. The knowledge of the redistribution of the materials inside the facility will help determine the reentry and clean-up procedures. The amount of radioactive materials released to the environment is the source term for dispersion calculations. We have used an integrated computer model to determine the release of fission products to the environment from a hypothetical criticality event in a glove box of the metal production line (MPL) at the Lawrence Livermore National Laboratory (LLNL)

  11. Null stream analysis of Pulsar Timing Array data: localisation of resolvable gravitational wave sources

    Science.gov (United States)

    Goldstein, Janna; Veitch, John; Sesana, Alberto; Vecchio, Alberto

    2018-04-01

    Super-massive black hole binaries are expected to produce a gravitational wave (GW) signal in the nano-Hertz frequency band which may be detected by pulsar timing arrays (PTAs) in the coming years. The signal is composed of both stochastic and individually resolvable components. Here we develop a generic Bayesian method for the analysis of resolvable sources based on the construction of `null-streams' which cancel the part of the signal held in common for each pulsar (the Earth-term). For an array of N pulsars there are N - 2 independent null-streams that cancel the GW signal from a particular sky location. This method is applied to the localisation of quasi-circular binaries undergoing adiabatic inspiral. We carry out a systematic investigation of the scaling of the localisation accuracy with signal strength and number of pulsars in the PTA. Additionally, we find that source sky localisation with the International PTA data release one is vastly superior to that achieved by its constituent regional PTAs.

  12. Supercontinuum light sources for food analysis

    DEFF Research Database (Denmark)

    Møller, Uffe Visbech; Petersen, Christian Rosenberg; Kubat, Irnis

    2014-01-01

    One track of Light & Food will target the mid-infrared spectral region. To date, the limitations of mid-infrared light sources, such as thermal emitters, low-power laser diodes, quantum cascade lasers and synchrotron radiation, have precluded mid-IR applications where the spatial coherence, broad bandwidth, high brightness and portability of a supercontinuum laser are all required. DTU Fotonik has now demonstrated the first optical fiber based broadband supercontinuum light source, which covers 1.4-13.3 μm and thereby most of the molecular fingerprint region.

  13. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  14. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  15. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-01-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  16. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.
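ERPLAB itself is a MATLAB toolbox, but its core operations, averaging EEG segments into an ERP and forming difference waves, can be illustrated in a few lines of plain Python. This is only a conceptual sketch with toy numbers; a real pipeline (as ERPLAB provides) also handles baseline correction, artifact rejection, filtering and re-referencing:

```python
def average_erp(epochs):
    """Average a list of equal-length EEG epochs (lists of samples) into an ERP."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def difference_wave(erp_a, erp_b):
    """Pointwise difference between two averaged ERPs (e.g. target - standard)."""
    return [a - b for a, b in zip(erp_a, erp_b)]

# toy example: two 4-sample epochs per condition
target   = average_erp([[0, 2, 4, 2], [0, 4, 6, 2]])  # -> [0.0, 3.0, 5.0, 2.0]
standard = average_erp([[0, 1, 2, 1], [0, 1, 2, 1]])  # -> [0.0, 1.0, 2.0, 1.0]
diff = difference_wave(target, standard)              # -> [0.0, 2.0, 3.0, 1.0]
```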

  17. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at Ports of Entry (POE) is of great importance in the efficient screening of cargo and vehicles using Radiation Portal Monitors (RPM). Currently, the RPMs' ability to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed detectors. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources in secondary inspection and impact commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign and threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
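The PCA idea behind such anomaly metrics can be sketched in two dimensions: fit the principal axis of a benign (NORM-like) cluster, then score new observations by their distance off that axis. This is a toy illustration with invented two-feature data, not the study's actual detector features or algorithm:

```python
import math

def principal_axes_2d(points):
    """Mean and principal axes (eigen-decomposition of the 2x2 sample
    covariance, computed analytically) of a set of 2-D feature vectors."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc  # eigenvalues, l1 >= l2
    v = (l1 - syy, sxy) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    return (mx, my), (v[0] / norm, v[1] / norm), (l1, l2)

def off_axis_score(point, mean, axis):
    """Distance of a point from the benign cluster's principal axis."""
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    return abs(dx * (-axis[1]) + dy * axis[0])  # perpendicular component

# invented benign cluster lying roughly along y = x
benign = [(0, 0.0), (1, 1.1), (2, 1.9), (3, 3.05), (4, 3.95)]
mean, axis, (l1, l2) = principal_axes_2d(benign)
```

A source whose features fall far off the benign axis gets a high score; thresholding that score is the simplest possible version of the alarm metric the abstract describes.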

  18. Reliability and validity analysis of the open-source Chinese Foot and Ankle Outcome Score (FAOS).

    Science.gov (United States)

    Ling, Samuel K K; Chan, Vincent; Ho, Karen; Ling, Fona; Lui, T H

    2017-12-21

    Develop the first reliable and validated open-source outcome scoring system in the Chinese language for foot and ankle problems. The English FAOS was translated into Chinese following standard protocols. First, two forward translations were created separately; these were then combined into a preliminary version by an expert committee and subsequently back-translated into English. The process was repeated until the original and back-translations were congruent. This version was then field-tested on actual patients, who provided feedback for modification. The final Chinese FAOS version was then tested for reliability and validity. Reliability analysis was performed on 20 subjects, while validity analysis was performed on 50 subjects. Tools used to validate the Chinese FAOS were the SF36 and the Pain Numeric Rating Scale (NRS). Internal consistency between the FAOS subgroups was measured using Cronbach's alpha. Spearman's correlation was calculated between each subgroup of the FAOS, the SF36 and the NRS. The Chinese FAOS passed both reliability and validity testing, meaning it is reliable, internally consistent and correlates positively with the SF36 and the NRS. The Chinese FAOS is a free, open-source scoring system that can be used to provide a relatively standardised outcome measure for foot and ankle studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
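The internal-consistency statistic used above, Cronbach's alpha, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with toy scores (not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items, each a list of per-subject scores."""
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(item_scores)
    totals = [sum(subject) for subject in zip(*item_scores)]
    return k / (k - 1) * (1 - sum(var(item) for item in item_scores) / var(totals))

# two perfectly consistent items over three subjects -> alpha = 1.0
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```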

  19. Implementation of inter-unit analysis for C and C++ languages in a source-based static code analyzer

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available The proliferation of automated testing capabilities gives rise to a need for thorough testing of large software systems, including system inter-component interfaces. The objective of this research is to build a method for inter-procedural inter-unit analysis, which allows us to analyse large and complex software systems, including multi-architecture projects (like Android OS), as well as to support complex project build systems. Since the selected Clang Static Analyzer uses source code directly as input data, we need to develop a special technique to enable inter-unit analysis for such an analyzer. This problem is of a special nature because of C and C++ language features that assume and encourage the separate compilation of project files. We describe the build and analysis system that was implemented around Clang Static Analyzer to enable inter-unit analysis, and we consider problems related to the support of complex projects. We also consider the task of merging abstract syntax trees of translation units and its related problems, such as handling conflicting definitions and supporting complex build systems and complex projects, including multi-architecture projects, with examples. We consider both issues related to language design and human-related mistakes (which may be intentional). We describe some heuristics that were used in this work to make the merging process faster. The developed system was tested using Android OS as the input to show that it is applicable even for such complicated projects. This system does not depend on the inter-procedural analysis method and allows the arbitrary change of its algorithm.

  20. Perception and acceptance of technological risk sources. Volume 2. Empirical analysis of risk perception and acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Renn, O

    1981-01-01

    Volume 2 presents a comparative investigation of risk perception and acceptance. It contains the evaluations of the two experiments in social psychology and the analysis of two intensive inquiries concerning risk perception with respect to 12 different risk sources. The data of the two inquiries were acquired from a total of 200 interview partners in two cities in North-Rhine Westphalia.

  1. R-Dimensional ESPRIT-Type Algorithms for Strictly Second-Order Non-Circular Sources and Their Performance Analysis

    Science.gov (United States)

    Steinwandt, Jens; Roemer, Florian; Haardt, Martin; Galdo, Giovanni Del

    2014-09-01

    High-resolution parameter estimation algorithms designed to exploit the prior knowledge about incident signals from strictly second-order (SO) non-circular (NC) sources allow for a lower estimation error and can resolve twice as many sources. In this paper, we derive the R-D NC Standard ESPRIT and the R-D NC Unitary ESPRIT algorithms that provide a significantly better performance compared to their original versions for arbitrary source signals. They are applicable to shift-invariant R-D antenna arrays and do not require a centrosymmetric array structure. Moreover, we present a first-order asymptotic performance analysis of the proposed algorithms, which is based on the error in the signal subspace estimate arising from the noise perturbation. The derived expressions for the resulting parameter estimation error are explicit in the noise realizations and asymptotic in the effective signal-to-noise ratio (SNR), i.e., the results become exact for either high SNRs or a large sample size. We also provide mean squared error (MSE) expressions, where only the assumptions of a zero mean and finite SO moments of the noise are required, but no assumptions about its statistics are necessary. As a main result, we analytically prove that the asymptotic performance of both R-D NC ESPRIT-type algorithms is identical in the high effective SNR regime. Finally, a case study shows that no improvement from strictly non-circular sources can be achieved in the special case of a single source.

  2. Market Analysis and Consumer Impacts Source Document. Part I. The Motor Vehicle Market in the Late 1970's

    Science.gov (United States)

    1980-12-01

    The source document on motor vehicle market analysis and consumer impact consists of three parts. Part I is an integrated overview of the motor vehicle market in the late 1970's, with sections on the structure of the market, motor vehicle trends, con...

  3. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part II: Data Sources from Specific Library Applications

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the second part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  4. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings, restricted to single-particle transverse dynamics. In a first part, tools (Frequency Map Analysis, Hamiltonian, integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators with only positive steps are more precise by an order of magnitude than the standard Forest-Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system. Based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupolar strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupolar-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis and beam decoherence. (author) [fr
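At its core, Frequency Map Analysis extracts the fundamental frequency (tune) of a turn-by-turn signal with precision beyond plain FFT binning. The crudest version of that idea, scanning trial frequencies for the largest Fourier projection, can be sketched as follows; Laskar's NAFF method additionally applies a window function and iterative refinement, so this is only the underlying principle, not the algorithm used in the thesis:

```python
import cmath
import math

def dominant_tune(signal, grid=1000):
    """Crude estimate of the dominant frequency (tune, in [0, 0.5]) of a real
    turn-by-turn signal: scan trial frequencies f for the largest magnitude
    of the projection sum_n x[n] * exp(-2*pi*i*f*n)."""
    best_f, best_amp = 0.0, -1.0
    for k in range(grid + 1):
        f = 0.5 * k / grid
        proj = sum(x * cmath.exp(-2j * math.pi * f * n)
                   for n, x in enumerate(signal))
        if abs(proj) > best_amp:
            best_f, best_amp = f, abs(proj)
    return best_f

# synthetic betatron-like signal with tune 0.23
x = [math.cos(2 * math.pi * 0.23 * n) for n in range(128)]
tune = dominant_tune(x)
```

A frequency map is then built by computing such tunes for many initial amplitudes and plotting horizontal against vertical tune.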

  5. Optimization of a polarized source for in vivo x-ray fluorescence analysis of platinum and other heavy metals

    International Nuclear Information System (INIS)

    Lewis, D.G.

    1994-01-01

    The Monte Carlo method was used to optimize a polarized photon source for the x-ray fluorescence analysis of platinum and other heavy metals in vivo. The source consisted of a 140 kVp, 25 mA x-ray tube with the photons plane-polarized by 90° scattering. The use of plane-polarized photons results in a significant reduction in background when the fluorescent radiation is measured along the direction of polarization. A Monte Carlo computer programme was written to simulate the production and interaction of polarized photons in order to determine the optimal polarizing material and dimensions, together with beam width and geometrical arrangement of source, polarizer and beam collimators. Calculated photon energy distributions are compared with experimental data to test the validity of the model. (author)

  6. Physical performance analysis and progress of the development of the negative ion RF source for the ITER NBI system

    International Nuclear Information System (INIS)

    Fantz, U.; Franzen, P.; Kraus, W.; Berger, M.; Christ-Koch, S.; Falter, H.; Froeschle, M.; Gutser, R.; Heinemann, B.; Martens, C.; McNeely, P.; Riedl, R.; Speth, E.; Staebler, A.; Wuenderlich, D.

    2009-01-01

    For heating and current drive, the neutral beam injection (NBI) system for ITER requires a 1 MeV deuterium beam for up to 1 h pulse length. In order to inject the required 17 MW, the large area source (1.9 m x 0.9 m) has to deliver 40 A of negative ion current at the specified source pressure of 0.3 Pa. In 2007, the IPP RF-driven negative hydrogen ion source was chosen by the ITER board as the new reference source for the ITER NBI system due to its, in principle, maintenance-free operation and the progress in RF source development. The performance analysis of the IPP RF sources is strongly supported by an extensive diagnostic program and by modelling of the source and beam extraction. The control of the plasma chemistry and of the processes in the plasma region near the extraction system are the most critical topics for source optimization, both for long pulse operation and for source homogeneity. Long pulse stability has been demonstrated at the test facility MANITU, which now routinely operates stable pulses of up to 10 min with parameters near the ITER requirements. A quite uniform plasma illumination of a large area source (0.8 m x 0.8 m) has been demonstrated at the ion source test facility RADI. The new test facility ELISE, presently planned at IPP, is being designed for long pulse plasma operation and short pulse, but large-scale, extraction from a half-size ITER source, an important intermediate step towards ITER NBI.

  7. Source-jerk analysis using a semi-explicit inverse kinetic technique

    International Nuclear Information System (INIS)

    Spriggs, G.D.; Pederson, R.A.

    1985-01-01

    A method is proposed for measuring the effective reproduction factor, k, in subcritical systems. The method uses the transient response of a subcritical system to the sudden removal of an extraneous neutron source (i.e., a source jerk). The response is analyzed using an inverse kinetic technique that least-squares fits the exact analytical solution corresponding to a source-jerk transient as derived from the point-reactor model. It has been found that the technique can provide an accurate means of measuring k in systems that are close to critical (i.e., 0.95 < k < 1.0). As a system becomes more subcritical (i.e., k << 1.0) spatial effects can introduce significant biases depending on the source and detector positions. However, methods are available that can correct for these biases and, hence, can allow measuring subcriticality in systems with k as low as 0.5. 12 refs., 3 figs
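As background to the inverse kinetic fit described above, the balance equation of the point-reactor model and the prompt-jump relation a source jerk exploits can be written as follows (a textbook point-kinetics sketch; symbols are generic, not the paper's notation):

```latex
\Lambda \frac{dn}{dt} = (\rho - \beta)\,n + \Lambda \sum_i \lambda_i C_i + \Lambda S,
\qquad
n_1 \simeq \frac{\beta}{\beta - \rho}\, n_0,
\qquad
\frac{\rho}{\beta} = -\,\frac{n_0 - n_1}{n_1},
```

where $n_0$ is the equilibrium neutron level with the source present and $n_1$ the quasi-static level sustained by delayed neutrons just after the source is removed; the last relation reads the subcritical reactivity in dollars directly off the prompt drop, which is the quantity the least-squares fit refines.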

  8. Source-jerk analysis using a semi-explicit inverse kinetic technique

    International Nuclear Information System (INIS)

    Spriggs, G.D.; Pederson, R.A.

    1985-01-01

    A method is proposed for measuring the effective reproduction factor, k, in subcritical systems. The method uses the transient responses of a subcritical system to the sudden removal of an extraneous neutron source (i.e., a source jerk). The response is analyzed using an inverse kinetic technique that least-squares fits the exact analytical solution corresponding to a source-jerk transient as derived from the point-reactor model. It has been found that the technique can provide an accurate means of measuring k in systems that are close to critical (i.e., 0.95 < k < 1.0). As a system becomes more subcritical (i.e., k << 1.0) spatial effects can introduce significant biases depending on the source and detector positions. However, methods are available that can correct for these biases and, hence, can allow measuring subcriticality in systems with k as low as 0.5

  9. Vrancea seismic source analysis using a small-aperture array

    International Nuclear Information System (INIS)

    Popescu, E.; Popa, M.; Radulian, M.; Placinta, A.O.

    2005-01-01

    A small-aperture seismic array (BURAR) was installed in 1999 in the northern part of the Romanian territory (Bucovina area). Since then, the array has been in operation under a joint cooperation programme between Romania and the USA. The array consists of 10 stations installed in boreholes (nine short-period instruments and one broadband instrument) with sensitivity high enough to properly detect earthquakes generated in the Vrancea subcrustal domain (at about 250 km epicentral distance) with magnitude Mw below 3. Our main purpose is to investigate and calibrate the source parameters of the Vrancea intermediate-depth earthquakes using specific techniques provided by the BURAR array data. Forty earthquakes with magnitudes between 2.9 and 6.0 were selected, including the recent events of September 27, 2004 (45.70°N, 26.45°E, h = 166 km, Mw = 4.7), October 27, 2004 (45.84°N, 26.63°E, h = 105 km, Mw = 6.0) and May 14, 2005 (45.66°N, 26.52°E, h = 146 km, Mw = 5.1), which are the best ever recorded earthquakes on the Romanian territory. Empirical Green's function deconvolution and spectral ratio methods are applied for pairs of collocated events with similar focal mechanisms. Stability tests are performed for the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Our study shows the capability of the BURAR array to determine the source parameters of the Vrancea intermediate-depth earthquakes as a stand-alone station and proves that the recordings of this array alone provide reliable and useful tools to efficiently constrain the source parameters and, consequently, source scaling properties. (authors)

  10. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  11. Regularity increases middle latency evoked and late induced beta brain response following proprioceptive stimulation

    DEFF Research Database (Denmark)

    Arnfred, Sidse M.; Hansen, Lars Kai; Parnas, Josef

    2008-01-01

    as an indication of increased readiness. This is achieved through detailed analysis of both evoked and induced responses in the time-frequency domain. Electroencephalography in a 64-channel montage was recorded in fourteen healthy subjects. Two paradigms were explored: a regular alternation between hand... After initial exploration of the AvVVT and Induced collapsed files of all subjects using two-way factor analyses (Non-Negative Matrix Factorization), further data decomposition was performed in restricted windows of interest (WOI). Main effects of side of stimulation, onset or offset, regularity...

  12. Power source roadmaps using bibliometrics and database tomography

    International Nuclear Information System (INIS)

    Kostoff, R.N.; Tshiteya, R.; Pfeil, K.M.; Humenik, J.A.; Karypis, G.

    2005-01-01

    Database Tomography (DT) is a textual database analysis system consisting of two major components: (1) algorithms for extracting multi-word phrase frequencies and phrase proximities (physical closeness of the multi-word technical phrases) from any type of large textual database, to augment (2) interpretative capabilities of the expert human analyst. DT was used to derive technical intelligence from a Power Sources database derived from the Science Citation Index. Phrase frequency analysis by the technical domain experts provided the pervasive technical themes of the Power Sources database, and the phrase proximity analysis provided the relationships among the pervasive technical themes. Bibliometric analysis of the Power Sources literature supplemented the DT results with author/journal/institution/country publication and citation data
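The two DT components described above can be approximated in a few lines: n-gram counting for phrase frequency, and document-level co-occurrence as a crude stand-in for phrase proximity. The toy documents and the simplistic proximity measure below are illustrative assumptions, not the actual DT algorithms.

```python
from collections import Counter

def phrase_frequencies(text, n=2):
    """Multi-word (n-gram) phrase frequencies: the DT phrase-frequency step."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def phrase_cooccurrence(docs, a, b):
    """Number of documents in which two phrases co-occur: a crude proximity proxy."""
    return sum(1 for d in docs if a in d.lower() and b in d.lower())

# Toy "abstracts" standing in for Science Citation Index records.
docs = [
    "Lithium ion battery cathode materials",
    "Fuel cell membrane electrode assembly",
    "Lithium ion battery anode materials",
]
freq = phrase_frequencies(" ".join(docs))
print(freq[("lithium", "ion")])                             # 2
print(phrase_cooccurrence(docs, "lithium ion", "battery"))  # 2
```

Sorting `freq` by count surfaces the pervasive themes; pairwise co-occurrence counts sketch the relationships among them.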

  13. The Chandra Source Catalog: Source Properties and Data Products

    Science.gov (United States)

    Rots, Arnold; Evans, Ian N.; Glotfelty, Kenny J.; Primini, Francis A.; Zografou, Panagoula; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is breaking new ground in several areas. There are two aspects that are of particular interest to the users: its evolution and its contents. The CSC will be a living catalog that becomes richer, bigger, and better in time while still remembering its state at each point in time. This means that users will be able to take full advantage of new additions to the catalog, while retaining the ability to back-track and return to what was extracted in the past. The CSC sheds the limitations of flat-table catalogs. Its sources will be characterized by a large number of properties, as usual, but each source will also be associated with its own specific data products, allowing users to perform mini custom analysis on the sources. Source properties fall in the spatial (position, extent), photometric (fluxes, count rates), spectral (hardness ratios, standard spectral fits), and temporal (variability probabilities) domains, and are all accompanied by error estimates. Data products cover the same coordinate space and include event lists, images, spectra, and light curves. In addition, the catalog contains data products covering complete observations: event lists, background images, exposure maps, etc. This work is supported by NASA contract NAS8-03060 (CXC).

  14. Source Term Analysis for the Nuclear Power Station Goesgen-Daeniken; Quelltermanalysen fuer das Kernkraftwerk Goesgen-Daeniken

    Energy Technology Data Exchange (ETDEWEB)

    Hosemann, J.P.; Megaritis, G.; Guentay, S.; Hirschmann, H.; Luebbesmeyer, D.; Lieber, K.; Jaeckel, B.; Birchley, J.; Duijvestijn, G

    2001-08-01

    Analyses are performed for three accident scenarios postulated to occur in the Goesgen Nuclear Power Plant, a 900 MWe Pressurised Water Reactor of Siemens design. The scenarios investigated comprise a Station Blackout and two separate cases of small-break loss-of-coolant accident which lead, respectively, to high, intermediate and low pressure conditions in the reactor system. In each case the accident assumptions are highly pessimistic, so that the sequences span a large range of plant states and damage phenomena. Thus the plant is evaluated for a diversity of potential safety challenges. A suite of analysis tools is used to examine the reactor coolant system response, the core heat-up, melting, fission product release from the reactor system, the transport and chemical behaviour of those fission products in the containment building, and the release of radioactivity (source term) to the environment. Comparison with reference values used by the licensing authority shows that the use of modern analysis tools and current knowledge can provide a substantial reduction in the estimated source term. Of particular interest are insights gained from the analyses which indicate opportunities for operators to reduce or forestall the release. (author)

  15. Special Analysis for the Disposal of the Materials and Energy Corporation Sealed Sources at the Area 5 Radioactive Waste Management Site

    Energy Technology Data Exchange (ETDEWEB)

    Shott, Gregory [National Security Technologies, LLC. (NSTec), Mercury, NV (United States)

    2017-05-15

    This special analysis (SA) evaluates whether the Materials and Energy Corporation (M&EC) Sealed Source waste stream (PERM000000036, Revision 0) is suitable for shallow land burial (SLB) at the Area 5 Radioactive Waste Management Site (RWMS) on the Nevada National Security Site (NNSS). Disposal of the M&EC Sealed Source waste meets all U.S. Department of Energy (DOE) Manual DOE M 435.1-1, “Radioactive Waste Management Manual,” Chapter IV, Section P performance objectives (DOE 1999). The M&EC Sealed Source waste stream is recommended for acceptance without conditions.

  16. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  17. Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China

    Science.gov (United States)

    Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema

    2018-04-01

    Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combine dual nitrate isotopes, runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land use, and determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis illustrated that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis illustrated that the NO3- concentration was high in the tea plantation and forest areas, and δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, and denitrification hardly occurred in the stream.

  18. Analysis of hard X-ray emission from selected very high energy γ-ray sources observed with INTEGRAL

    International Nuclear Information System (INIS)

    Hoffmann, Agnes Irene Dorothee

    2009-01-01

    A few years ago, the era of very high energy γ-ray astronomy started, when the latest generation of Imaging Atmospheric Cherenkov Telescopes (IACT) like H.E.S.S. began to operate and to resolve the sources of TeV emission. Identifications via multi-wavelength studies reveal that the detected sources are supernova remnants and active galactic nuclei, but also pulsar wind nebulae and a few binaries. One widely discussed open question is how these sources are able to accelerate particles to such high energies. The understanding of the underlying particle distribution, the acceleration processes taking place, and the knowledge of the radiation processes which produce the observed emission is, therefore, of crucial interest. Observations in the hard X-ray domain can be a key to get information on these particle distributions and processes. Important for this thesis are the TeV and the hard X-ray range. The two instruments, H.E.S.S. and INTEGRAL, whose data were used, are, therefore, described in detail. The main part of this thesis is focused on the X-ray binary system LS 5039/RX J1826.2-1450. It was observed in several energy ranges. The nature of the compact object is still not known, and it was proposed either to be a microquasar system or a non-accreting pulsar system. The observed TeV emission is modulated with the orbital cycle. Several explanations for this variability have been discussed in recent years. The observations with INTEGRAL presented in this thesis have provided new information to solve this question. Therefore, a search for a detection in the hard X-ray range and for its orbital dependence was worthwhile. Since LS 5039 is a faint source and the sky region where it is located is crowded, a very careful, non-standard handling of the INTEGRAL data was necessary, and a cross-checking with other analysis methods was essential to provide reliable results. We found that LS 5039 is emitting in the hard X-ray energy range. A flux rate and an upper flux

  19. Analysis of hard X-ray emission from selected very high energy γ-ray sources observed with INTEGRAL

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, Agnes Irene Dorothee

    2009-11-13

    A few years ago, the era of very high energy γ-ray astronomy started, when the latest generation of Imaging Atmospheric Cherenkov Telescopes (IACT) like H.E.S.S. began to operate and to resolve the sources of TeV emission. Identifications via multi-wavelength studies reveal that the detected sources are supernova remnants and active galactic nuclei, but also pulsar wind nebulae and a few binaries. One widely discussed open question is how these sources are able to accelerate particles to such high energies. The understanding of the underlying particle distribution, the acceleration processes taking place, and the knowledge of the radiation processes which produce the observed emission is, therefore, of crucial interest. Observations in the hard X-ray domain can be a key to get information on these particle distributions and processes. Important for this thesis are the TeV and the hard X-ray range. The two instruments, H.E.S.S. and INTEGRAL, whose data were used, are, therefore, described in detail. The main part of this thesis is focused on the X-ray binary system LS 5039/RX J1826.2-1450. It was observed in several energy ranges. The nature of the compact object is still not known, and it was proposed either to be a microquasar system or a non-accreting pulsar system. The observed TeV emission is modulated with the orbital cycle. Several explanations for this variability have been discussed in recent years. The observations with INTEGRAL presented in this thesis have provided new information to solve this question. Therefore, a search for a detection in the hard X-ray range and for its orbital dependence was worthwhile. Since LS 5039 is a faint source and the sky region where it is located is crowded, a very careful, non-standard handling of the INTEGRAL data was necessary, and a cross-checking with other analysis methods was essential to provide reliable results. We found that LS 5039 is emitting in the hard X-ray energy range. A flux rate and an upper

  20. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the availability of automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Abstract Background There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies), focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion model illustrated the complex interplay of spatial extent definitions, emission rates
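An illustrative dispersion calculation of the kind referenced above can be sketched with a steady-state Gaussian plume: the ground-level centerline concentration from a ground-level point source falls off with downwind distance as the dispersion coefficients grow. The power-law coefficients below are hypothetical placeholders, not fitted stability-class values.

```python
import numpy as np

def centerline_conc(x, Q=1.0, u=3.0, ay=0.08, by=0.9, az=0.06, bz=0.8):
    """Ground-level centerline concentration downwind of a ground-level
    point source (steady-state Gaussian plume).

    Q: emission rate (g/s); u: wind speed (m/s). sigma_y and sigma_z use
    hypothetical power-law coefficients, not fitted stability-class values.
    """
    sigma_y = ay * x ** by
    sigma_z = az * x ** bz
    return Q / (np.pi * u * sigma_y * sigma_z)

x = np.array([50.0, 100.0, 200.0, 400.0])  # downwind distance, m
c = centerline_conc(x)
# Concentration decays roughly as x**-(by + bz); the distance at which it
# nears the background level is one operational definition of spatial extent.
print(c / c[0])
```

Varying `Q`, `u`, or the dispersion exponents reproduces the qualitative dependence of spatial extent on emission rate and meteorology discussed in the review.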

  2. The Human Face of Health News: A Multi-Method Analysis of Sourcing Practices in Health-Related News in Belgian Magazines.

    Science.gov (United States)

    De Dobbelaer, Rebeca; Van Leuven, Sarah; Raeymaeckers, Karin

    2018-05-01

    Health journalists are central gatekeepers who select, frame, and communicate health news to a broad audience, but the selection and content of health news are also influenced by the sources journalists rely on (Hinnant, Len-Rios, & Oh, 2012). In this paper, we examine whether the traditional elitist sourcing practices (e.g., research institutions, government) are still important in a digitalized news environment where bottom-up non-elite actors (e.g., patients, civil society organizations) can act as producers (Bruns, 2003). Our main goal, therefore, is to detect whether sourcing practices in health journalism can be linked with strategies of empowerment. We use a multi-method approach combining quantitative and qualitative research methods. First, two content analyses are developed to examine health-related news in Belgian magazines (popular weeklies, health magazines, general interest magazines, and women's magazines). The analyses highlight sourcing practices as visible in the texts and give an overview of the different stakeholders represented as sources. In the first wave, the content analysis includes 1047 health-related news items in 19 different Belgian magazines (March-June 2013). In the second wave, a smaller sample of 202 health-related items in 10 magazines was studied for follow-up reasons (February 2015). Second, to contextualize the findings of the quantitative analysis, we interviewed 16 health journalists and editors-in-chief. The results illustrate that journalists consider patients and blogs as relevant sources for health news; nonetheless, elitist sourcing practices still prevail at the cost of bottom-up communication. However, the in-depth interviews demonstrate that journalists increasingly consult patients and civil society actors to give health issues a more "human" face. Importantly, the study reveals that this strategy is differently applied by the various types of magazines.
While popular weeklies and women's magazines give a voice to

  3. Optimization of H.E.S.S. instrumental performances for the analysis of weak gamma-ray sources: Application to the study of HESS J1832-092

    International Nuclear Information System (INIS)

    Laffon, H.

    2012-01-01

    H.E.S.S. (High Energy Stereoscopic System) is an array of very-high-energy gamma-ray telescopes located in Namibia. These telescopes exploit the atmospheric Cherenkov technique with stereoscopy, allowing the detection of gamma-rays between 100 GeV and a few tens of TeV. The location of the H.E.S.S. telescopes in the Southern hemisphere allows observation of the central parts of our galaxy, the Milky Way. Tens of new gamma-ray sources were thereby discovered thanks to the galactic plane survey strategy. After ten years of fruitful observations with many detections, it is now necessary to improve the detector performance in order to detect new sources by increasing the sensitivity and improving the angular resolution. The aim of this thesis is the development of advanced techniques allowing sharper analyses. An automatic tool to search for new sources and to improve the subtraction of the background noise is presented. It is optimized for the study of weak sources, which require a very rigorous analysis. A combined reconstruction method is built in order to improve the angular resolution without reducing the statistics, which is critical for weak sources. These advanced methods are applied to the analysis of a complex region of the galactic plane near the supernova remnant G22.7-0.2, leading to the detection of a new source, HESS J1832-092. Multi-wavelength counterparts are shown and several scenarios are considered to explain the origin of the gamma-ray signal of this astrophysical object. (author)

  4. Using open source accelerometer analysis to assess physical activity and sedentary behaviour in overweight and obese adults.

    Science.gov (United States)

    Innerd, Paul; Harrison, Rory; Coulson, Morc

    2018-04-23

    Physical activity and sedentary behaviour are difficult to assess in overweight and obese adults. However, the use of open-source, raw accelerometer data analysis could overcome this. This study compared raw accelerometer and questionnaire-assessed moderate-to-vigorous physical activity (MVPA), walking and sedentary behaviour in normal, overweight and obese adults, and determined the effect of using different methods to categorise overweight and obesity, namely body mass index (BMI), bioelectrical impedance analysis (BIA) and waist-to-hip ratio (WHR). One hundred twenty adults, aged 24-60 years, wore a tri-axial accelerometer (Actigraph GT3X+) recording raw data for 3 days and completed a physical activity questionnaire (IPAQ-S). We used open-source accelerometer analyses to estimate MVPA, walking and sedentary behaviour from a single raw accelerometer signal. Accelerometer and questionnaire-assessed measures were compared in normal, overweight and obese adults categorised using BMI, BIA and WHR. Relationships between accelerometer and questionnaire-assessed MVPA (Rs = 0.30 to 0.48) and walking (Rs = 0.43 to 0.58) were stronger in normal and overweight groups, whilst those for sedentary behaviour were modest (Rs = 0.22 to 0.38) in normal, overweight and obese groups. The use of WHR resulted in stronger agreement between the questionnaire and accelerometer than BMI and BIA. Finally, accelerometer data showed stronger associations with BMI, BIA and WHR (Rs = 0.40 to 0.77) than questionnaire data (Rs = 0.24 to 0.37). Open-source, raw accelerometer data analysis can be used to estimate MVPA, walking and sedentary behaviour from a single acceleration signal in normal, overweight and obese adults. Our data support the use of WHR to categorise overweight and obese adults. This evidence helps researchers obtain more accurate measures of physical activity and sedentary behaviour in overweight and obese populations.
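The accelerometer-questionnaire comparisons above rest on Spearman rank correlations (Rs). A minimal sketch with scipy, using synthetic MVPA data in place of the real Actigraph and IPAQ-S measurements (sample size, distributions and noise model are assumptions for illustration):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Hypothetical minutes/day of MVPA for 120 adults: an accelerometer-derived
# value and a noisier self-reported (questionnaire) value.
accel_mvpa = rng.gamma(shape=2.0, scale=20.0, size=120)
quest_mvpa = accel_mvpa * rng.lognormal(0.0, 0.5, size=120)

# Rank correlation is robust to the monotone, multiplicative reporting bias
# modelled above, which is why it suits method-agreement comparisons.
rs, p = spearmanr(accel_mvpa, quest_mvpa)
print(f"Spearman Rs = {rs:.2f}, p = {p:.2g}")
```

Repeating the calculation within each BMI, BIA or WHR category would reproduce the group-wise comparisons reported in the study.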

  5. Impedance Source Power Electronic Converters

    DEFF Research Database (Denmark)

    Liu, Yushan; Abu-Rub, Haitham; Ge, Baoming

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable...... and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key...... features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding...

  6. Design-Oriented Analysis of Resonance Damping and Harmonic Compensation for LCL-Filtered Voltage Source Converters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Loh, Poh Chiang

    2014-01-01

    This paper addresses the interaction between harmonic resonant controllers and active damping of LCL resonance in voltage source converters. A virtual series R-C damper in parallel with the filter capacitor is proposed with the capacitor current feedback loop. The phase lag resulting from...... crossover frequency defined by the proportional gain of current controller. This is of particular interest for high-performance active harmonic filtering applications and low-pulse-ratio converters. Case studies in experiments validate the theoretical analysis....

  7. Source-Drain Punch-Through Analysis of High Voltage Off-State AlGaN/GaN HEMT Breakdown

    Science.gov (United States)

    Jiang, H.; Li, X.; Wang, J.; Zhu, L.; Wang, H.; Liu, J.; Wang, M.; Yu, M.; Wu, W.; Zhou, Y.; Dai, G.

    2017-06-01

    The off-state breakdown of AlGaN/GaN high-electron mobility transistors (HEMTs) is investigated using conventional three-terminal off-state breakdown I-V measurement. Competition between gate leakage and source-injection buffer leakage (SIBL) is discussed in detail. It is found that the breakdown is dominated by source injection, which is sensitive to gate voltage and gate length, at large gate-to-drain spacing (Lgd > 7 μm): a threshold drain voltage exists for the onset of the SIBL current in the GaN buffer, beyond which the SIBL current increases continually until the buffer breaks down. Our analysis showed that, due to the punch-through effect in the buffer, a potential barrier between the 2DEG and the GaN buffer at the source side, controlled mainly by the drain voltage, determines the buffer leakage current and the occurrence of the subsequent buffer breakdown, which explains the experimentally observed breakdown phenomenon.

  8. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
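The Brune-type stress drops cited above follow from seismic moment and corner frequency via the Brune (1970) source radius. A small helper sketching that calculation, assuming a generic crustal S-wave speed rather than the velocity model used in the study:

```python
import math

def brune_stress_drop(moment, fc, beta=3500.0):
    """Brune-type stress drop (Pa) from seismic moment (N*m) and corner
    frequency fc (Hz); beta is an assumed S-wave speed (m/s).

    Brune (1970) source radius: r = 2.34 * beta / (2 * pi * fc);
    stress drop = 7 * moment / (16 * r**3).
    """
    r = 2.34 * beta / (2.0 * math.pi * fc)
    return 7.0 * moment / (16.0 * r ** 3)

# An M_L ~ 2 event: moment ~ 1e12 N*m with a 10 Hz corner frequency
# lands near the low end of the 0.2-20 MPa range reported above.
print(round(brune_stress_drop(1e12, 10.0) / 1e6, 2), "MPa")
```

Because stress drop scales with fc cubed, attenuation-corrected corner frequencies (the empirical Green's function step in the abstract) are the critical input.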

  9. Fine particulates over South Asia: Review and meta-analysis of PM2.5 source apportionment through receptor model.

    Science.gov (United States)

    Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar

    2017-04-01

    Fine particulates (PM2.5) constitute a dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. The intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM2.5 sources over South Asia, which in turn may be valuable for developing emission control strategies. Particulate source apportionment (SA) through receptor models is one of the existing tools to quantify the contribution of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. Over half of the SA studies (55%) were concentrated on a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of a local particulate source profile and emission inventory, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while the use of organic molecular markers and gas-to-particle conversion was minimal. Among all the SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the dominant PM2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM2.5 > 100 μg m-3, n = 15), while site-specific influences of industrial, secondary aerosol and natural sources, alone or in combination, were recognized. Source-specific trends varied considerably in terms of region and seasonality. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic plain, vehicular, natural and industrial emissions appeared dominant. Influence of vehicular emission was

  10. Analysis of carbon monoxide production in multihundred-watt heat sources

    International Nuclear Information System (INIS)

    Peterson, D.E.; Mulford, R.N.R.

    1976-05-01

    The production of carbon monoxide observed within Multihundred Watt heat sources placed under storage conditions was analyzed. Results of compositional and isotopic analyses of gas taps performed on eight heat sources are summarized and interpreted. Several proposed CO generation mechanisms are examined theoretically and assessed by applying thermodynamic principles. Outgassing of the heat source graphite followed by oxygen isotopic exchange through the vent assemblies appears to explain the CO production at storage temperatures. Reduction of the plutonia fuel sphere by the CO is examined as a function of temperature and stoichiometry. Experiments that could be performed to investigate possible CO generation mechanisms are discussed

  11. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is a part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey has been performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  12. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  13. Source-Independent Quantum Random Number Generation

    Science.gov (United States)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10^3 bit/s.
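
    The final step in schemes of this kind is typically a seeded randomness extractor such as Toeplitz hashing, which compresses weakly random raw bits into nearly uniform output bits. The sketch below illustrates only that generic extraction step, not the authors' implementation; the bit lengths are arbitrary.

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, n_out):
    """Toeplitz-hashing extractor: multiply the raw bit vector by a random
    Toeplitz matrix over GF(2). The matrix is defined by its diagonals,
    which are read directly from the uniform seed."""
    n_in = len(raw_bits)
    assert len(seed_bits) == n_in + n_out - 1
    T = np.empty((n_out, n_in), dtype=np.uint8)
    for i in range(n_out):
        for j in range(n_in):
            T[i, j] = seed_bits[i - j + n_in - 1]
    return (T @ np.asarray(raw_bits, dtype=np.uint8)) % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, 64)              # raw bits from the untrusted source
seed = rng.integers(0, 2, 64 + 16 - 1)    # short uniform seed
out = toeplitz_extract(raw, seed, 16)     # 16 extracted output bits
```

    The output length would in practice be set by the certified min-entropy of the raw data, which is what the paper's analysis bounds.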

  14. The Use of Principal Component Analysis for Source Identification of PM2.5 from Selected Urban and Regional Background Sites in Poland

    Science.gov (United States)

    Błaszczak, Barbara

    2018-01-01

    The paper reports the results of the measurements of water-soluble ions and carbonaceous matter content in the fine particulate matter (PM2.5), as well as the contributions of major sources to PM2.5. Daily PM2.5 samples were collected during the heating and non-heating seasons of the year 2013 in three different locations in Poland: Szczecin (urban background), Trzebinia (urban background) and Złoty Potok (regional background). The concentrations of PM2.5, and its related components, exhibited clear spatiotemporal variability with higher levels during the heating period. The share of the total carbon (TC) in PM2.5 exceeded 40% and was primarily determined by fluctuations in the share of OC. Sulfates (SO42-), nitrates (NO3-) and ammonium (NH4+) dominated the ionic composition of PM2.5 and together accounted for 34% (Szczecin), 30% (Trzebinia) and 18% (Złoty Potok) of the PM2.5 mass. Source apportionment analysis, performed with the PCA-MLRA model (Principal Component Analysis - Multilinear Regression Analysis), revealed that secondary aerosol, whose presence is related to the oxidation of gaseous precursors emitted from fuel combustion and biomass burning, made the largest contribution to the observed PM2.5 concentrations. In addition, a contribution from traffic sources, together with road dust resuspension, was observed. The share of natural sources (sea spray, crustal dust) was generally lower.
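
    The PCA-MLRA chain can be sketched as: standardize the species matrix, extract the leading principal components as source factors, then regress PM2.5 mass on the factor scores to apportion mass among sources. A minimal numpy sketch on synthetic data; the two "sources" and their species profiles are invented for illustration.

```python
import numpy as np

def pca_mlra(X, pm_mass, n_factors):
    """PCA step: factor scores from the species correlation matrix.
    MLRA step: regress PM2.5 mass on the scores (plus intercept)."""
    Z = (X - X.mean(0)) / X.std(0)
    evals, evecs = np.linalg.eigh(np.corrcoef(Z.T))
    top = np.argsort(evals)[::-1][:n_factors]
    scores = Z @ evecs[:, top]
    A = np.column_stack([np.ones(len(pm_mass)), scores])
    coef, *_ = np.linalg.lstsq(A, pm_mass, rcond=None)
    return coef, A @ coef

# Synthetic data: two hidden sources with fixed chemical profiles.
rng = np.random.default_rng(1)
s1, s2 = rng.lognormal(size=(2, 200))
X = (np.outer(s1, [5.0, 1.0, 0.2, 0.1]) + np.outer(s2, [0.3, 0.1, 4.0, 2.0])
     + 0.05 * rng.normal(size=(200, 4)))
mass = 2.0 * s1 + 3.0 * s2
coef, pred = pca_mlra(X, mass, n_factors=2)
```

    In the real analysis each retained component is first interpreted (secondary aerosol, traffic, crustal dust, ...) from its species loadings before the regression coefficients are read as source contributions.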

  15. The Use of Principal Component Analysis for Source Identification of PM2.5 from Selected Urban and Regional Background Sites in Poland

    Directory of Open Access Journals (Sweden)

    Błaszczak Barbara

    2018-01-01

    Full Text Available The paper reports the results of the measurements of water-soluble ions and carbonaceous matter content in the fine particulate matter (PM2.5), as well as the contributions of major sources to PM2.5. Daily PM2.5 samples were collected during the heating and non-heating seasons of the year 2013 in three different locations in Poland: Szczecin (urban background), Trzebinia (urban background) and Złoty Potok (regional background). The concentrations of PM2.5, and its related components, exhibited clear spatiotemporal variability with higher levels during the heating period. The share of the total carbon (TC) in PM2.5 exceeded 40% and was primarily determined by fluctuations in the share of OC. Sulfates (SO42-), nitrates (NO3-) and ammonium (NH4+) dominated the ionic composition of PM2.5 and together accounted for ~34% (Szczecin), ~30% (Trzebinia) and ~18% (Złoty Potok) of the PM2.5 mass. Source apportionment analysis, performed with the PCA-MLRA model (Principal Component Analysis – Multilinear Regression Analysis), revealed that secondary aerosol, whose presence is related to the oxidation of gaseous precursors emitted from fuel combustion and biomass burning, made the largest contribution to the observed PM2.5 concentrations. In addition, a contribution from traffic sources, together with road dust resuspension, was observed. The share of natural sources (sea spray, crustal dust) was generally lower.

  16. 252Cf-source-driven noise analysis measurements for characterization of concrete highly enriched uranium (HEU) storage vaults

    International Nuclear Information System (INIS)

    Valentine, T.E.; Mihalczo, J.T.

    1993-01-01

    The 252 Cf-source-driven noise analysis method has been used in measurements for subcritical configurations of fissile systems for a variety of applications. Measurements of 25 fissile systems have been performed with a wide variety of materials and configurations. This method has been applied to measurements for (1) initial fuel loading of reactors, (2) quality assurance of reactor fuel elements, (3) fuel preparation facilities, (4) fuel processing facilities, (5) fuel storage facilities, (6) zero-power testing of reactors, and (7) verification of calculational methods for subcritical assemblies. The measurements described here were performed to determine whether a measurement with a 252 Cf source and commercially available detectors was feasible and whether it could characterize the ability of the concrete to isolate the fissile material

  17. Advanced neutron source reactor conceptual safety analysis report, three-element-core design: Chapter 15, accident analysis

    International Nuclear Information System (INIS)

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L.; Harrington, R.M.

    1996-02-01

    In order to utilize reduced enrichment fuel, the three-element-core design for the Advanced Neutron Source has been proposed. The proposed core configuration consists of inner, middle, and outer elements, with the middle element offset axially beneath the inner and outer elements, which are axially aligned. The three-element-core RELAP5 model assumes that the reactor hardware is changed only within the core region, so that the loop piping, heat exchangers, and pumps remain as assumed for the two-element-core configuration. To assess the impact of changes in the core region configuration and the thermal-hydraulic steady-state conditions, the safety analysis has been updated. This report gives the safety margins for the loss-of-off-site power and pressure-boundary fault accidents based on the RELAP5 results. All margins are greater for the three-element-core simulations than those calculated for the two-element core

  18. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
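
    The Spearman partial rank correlation used in the penultimate chapter controls a two-variable correlation for a third variable by rank-transforming all three series and applying the usual partial-correlation formula. A minimal sketch (ties are not averaged here, which the full statistic would require):

```python
import numpy as np

def ranks(a):
    """Ordinal ranks 1..n (no tie handling, for simplicity)."""
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

def spearman_partial(x, y, z):
    """Spearman rank correlation of x and y, controlling for z:
    r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))."""
    rx, ry, rz = ranks(x), ranks(y), ranks(z)
    r = lambda a, b: np.corrcoef(a, b)[0, 1]
    rxy, rxz, ryz = r(rx, ry), r(rx, rz), r(ry, rz)
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

# x and y both track z (as, say, luminosity and size both track redshift),
# so their raw rank correlation is high but vanishes once z is controlled for.
rng = np.random.default_rng(3)
z = rng.normal(size=500)
x = z + 0.5 * rng.normal(size=500)
y = z + 0.5 * rng.normal(size=500)
```

    Here spearman_partial(x, y, z) is near zero while the plain rank correlation of x and y is strongly positive; this is exactly the confounding-by-redshift effect the dissertation's test is designed to remove.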

  19. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    O' Leary, Patrick [Kitware, Inc., Clifton Park, NY (United States)

    2017-01-30

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits, creating a graphical end-to-end umbrella that guides end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  20. Analysis of Paralleling Limited Capacity Voltage Sources by Projective Geometry Method

    Directory of Open Access Journals (Sweden)

    Alexandr Penin

    2014-01-01

    Full Text Available The droop current-sharing method for voltage sources of limited capacity is considered. The influence of the equalizing resistors and the load resistor on the uniform distribution of relative current values is investigated for the case where the actual loading corresponds to the capacity of a particular source. Novel concepts for the quantitative representation of the operating regimes of the sources are introduced using the projective geometry method.
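
    The circuit in this abstract reduces to a single node equation: each source feeds the common bus through its internal plus equalizing resistance, and the bus feeds the load. A minimal sketch of that balance (component values are arbitrary illustrations, not from the paper):

```python
import numpy as np

def parallel_currents(emfs, r_int, r_eq, r_load):
    """Droop sharing of paralleled voltage sources.
    Node equation at the bus: sum_i (E_i - V) * g_i = V / r_load,
    where g_i = 1 / (r_int_i + r_eq_i) is the branch conductance."""
    emfs = np.asarray(emfs, dtype=float)
    g = 1.0 / (np.asarray(r_int, dtype=float) + np.asarray(r_eq, dtype=float))
    v_bus = (g @ emfs) / (g.sum() + 1.0 / r_load)
    return (emfs - v_bus) * g, v_bus

# Two 12 V sources; the second has a larger equalizing resistor, so it droops more.
currents, v_bus = parallel_currents([12.0, 12.0], [0.05, 0.05], [0.1, 0.2], r_load=2.0)
```

    Increasing an equalizing resistor lowers that source's share of the load current, which is how the droop method protects a limited-capacity source from overload.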

  1. An Earthquake Source Sensitivity Analysis for Tsunami Propagation in the Eastern Mediterranean

    Science.gov (United States)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2013-04-01

    An earthquake source parameter sensitivity analysis for tsunami propagation in the Eastern Mediterranean has been performed based on the 8 August 1303 Crete and Dodecanese Islands earthquake, which resulted in destructive inundation in the Eastern Mediterranean. The analysis involves 23 cases describing different sets of strike, dip, rake and focal depth, while keeping the fault area and displacement, and thus the magnitude, the same. The main conclusions of the evaluation are drawn from the investigation of the wave height distributions at Tsunami Forecast Points (TFP). The earthquake vs. initial tsunami source parameters comparison indicated that the maximum initial wave height values correspond in general to the changes in rake angle. No clear depth dependency is observed within the depth range considered, and no strike angle dependency is observed in terms of amplitude change. Directivity sensitivity analysis indicated that for the same strike and dip, a 180° shift in rake may lead to a 20% change in the calculated tsunami wave height. Moreover, an approximately 10 min difference in the arrival time of the initial wave has been observed. These differences are, however, greatly reduced in the far field. The dip sensitivity analysis, performed separately for thrust and normal faulting, indicated in both cases that an increase in the dip angle results in a decrease of the tsunami wave amplitude in the near field by approximately 40%. While a positive phase shift is observed, the period and the shape of the initial wave stay nearly the same for all dip angles at the respective TFPs. These effects are, however, not observed in the far field. The resolution of the bathymetry, on the other hand, is a limiting factor for further evaluation. Four different cases were considered for the depth sensitivity, indicating that within the depth ranges considered (15-60 km), the increase of the depth has only a smoothing effect on the synthetic tsunami wave height measurements at the selected TFPs.

  2. Source inversion in the full-wave tomography; Full wave tomography ni okeru source inversion

    Energy Technology Data Exchange (ETDEWEB)

    Tsuchiya, T [DIA Consultants Co. Ltd., Tokyo (Japan)

    1997-10-22

    In order to consider the effects of vibration-source characteristics in full-wave tomography (FWT), a method has been studied to invert vibration source parameters together with the V(p)/V(s) distribution. The study extended an analysis method based on the gradient method of Tarantola and the subspace method of Sambridge, and conducted numerical experiments. Experiment No. 1 performed inversion of only the vibration source parameters, and experiment No. 2 executed simultaneous inversion of the V(p)/V(s) distribution and the vibration source parameters. The discussions revealed that an effective analytical procedure would be as follows: in order to predict maximum stress, the average vibration source parameters and the property parameters are first inverted simultaneously; in order to estimate each vibration source parameter at high accuracy, the property parameters are fixed and each vibration source parameter is inverted individually; and the derived vibration source parameters are fixed and the property parameters are inverted again from the initial values. 5 figs., 2 tabs.
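
    The staged procedure recommended above (fix the property parameters and invert the source parameters, and vice versa) can be illustrated with a toy variable-projection problem. The forward model d = s·exp(-L/v) is invented for the sketch: s plays the role of a source parameter and v a property parameter.

```python
import numpy as np

L = np.linspace(1.0, 5.0, 20)            # source-receiver distances (toy)
d_obs = 3.0 * np.exp(-L / 2.0)           # synthetic data: true s = 3, true v = 2

def best_s(v):
    """With the property parameter v fixed, the source strength s is linear
    in the model and has a closed-form least-squares solution."""
    g = np.exp(-L / v)
    return (d_obs @ g) / (g @ g)

# Invert v by a 1-D search, re-solving for s at each candidate v.
v_grid = np.linspace(0.5, 5.0, 451)
misfits = [((d_obs - best_s(vv) * np.exp(-L / vv))**2).sum() for vv in v_grid]
v_hat = v_grid[int(np.argmin(misfits))]
s_hat = best_s(v_hat)
```

    Splitting the problem this way recovers the true pair exactly on noiseless data; the paper's procedure applies the same fix-and-refine idea with gradient-based updates on full waveforms.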

  3. Prompt-gamma neutron activation analysis system design: Effects of D-T versus D-D neutron generator source selection

    Science.gov (United States)

    Prompt-gamma neutron activation (PGNA) analysis is used for the non-invasive measurement of human body composition. Advancements in portable, compact neutron generator design have made those devices attractive as neutron sources. Two distinct generators are available: D-D with 2.5 MeV and D-T with...

  4. Source analysis of spaceborne microwave radiometer interference over land

    Science.gov (United States)

    Guan, Li; Zhang, Sibo

    2016-03-01

    Satellite microwave thermal emissions mixed with signals from active sensors are referred to as radiofrequency interference (RFI). Based on Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) observations from June 1 to 16, 2011, RFI over Europe was identified and analyzed using the modified principal component analysis algorithm in this paper. The X band AMSR-E measurements in England and Italy are mostly affected by the stable, persistent, active microwave transmitters on the surface, while the RFI source for other European countries is the interference of reflected geostationary TV satellite downlink signals with the measurements of spaceborne microwave radiometers. The locations and intensities of the RFI induced by the geostationary TV and communication satellites changed with time within the observed period. The observations of spaceborne microwave radiometers in ascending portions of orbits are usually interfered with over European land, while no RFI was detected in descending passes. The RFI locations and intensities from the reflection of downlink radiation are highly dependent upon the relative geometry between the geostationary satellite and the measuring passive sensor. Only those fields of view of a spaceborne instrument whose scan azimuths are close to the azimuth toward the geostationary satellite are likely to be affected by RFI.

  5. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    Full Text Available

    This study presents the latest developments of an approach called ‘flash sourcing’, which provides information on the effects of an earthquake within minutes of its occurrence. Information is derived from an analysis of the website traffic surges of the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake's occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information; beyond seismology, we consider what it can teach us about public responses to experiencing an earthquake. Future developments should improve the description of earthquake effects and potentially contribute to more efficient earthquake response by filling the information gap after the occurrence of an earthquake.
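
    A caricature of the surge detector at the heart of flash sourcing: compare each minute's hit count against a trailing baseline and flag large excursions. The threshold rule and numbers below are invented for illustration; EMSC's production detector is more elaborate.

```python
def detect_surges(hits_per_min, baseline_window=30, factor=5.0):
    """Flag minute indices whose hit count exceeds `factor` times the
    mean of the preceding `baseline_window` minutes."""
    surges = []
    for i in range(baseline_window, len(hits_per_min)):
        base = sum(hits_per_min[i - baseline_window:i]) / baseline_window
        if hits_per_min[i] > factor * max(base, 1.0):
            surges.append(i)
    return surges

traffic = [20] * 60
traffic[45] = 150     # eyewitnesses rush to the website after a felt earthquake
```

    Geolocating the IP addresses behind such a surge is what turns the detection into a map of the felt area.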

  6. Low-level waste disposal performance assessments - Total source-term analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wilhite, E.L.

    1995-12-31

    Disposal of low-level radioactive waste at Department of Energy (DOE) facilities is regulated by DOE. DOE Order 5820.2A establishes policies, guidelines, and minimum requirements for managing radioactive waste. Requirements for disposal of low-level waste emplaced after September 1988 include providing reasonable assurance of meeting stated performance objectives by completing a radiological performance assessment. Recently, the Defense Nuclear Facilities Safety Board issued Recommendation 94-2, "Conformance with Safety Standards at Department of Energy Low-Level Nuclear Waste and Disposal Sites." One of the elements of the recommendation is that low-level waste performance assessments do not include the entire source term because low-level waste emplaced prior to September 1988, as well as other DOE sources of radioactivity in the ground, are excluded. DOE has developed and issued guidance for preliminary assessments of the impact of including the total source term in performance assessments. This paper will present issues resulting from the inclusion of all DOE sources of radioactivity in performance assessments of low-level waste disposal facilities.

  7. Activation analysis of stainless steel flux monitors using 252Cf neutron sources

    International Nuclear Information System (INIS)

    Williams, J.G.; Newton, T.H. Jr.; Cogburn, C.O.

    1984-01-01

    Activation analysis was performed on stainless steel beads from a chain which is used in reactor pressure vessel surveillance experiments at the Arkansas Power and Light Company reactors. The beads allow monitoring of two fast and three thermal neutron induced reactions: 58 Ni(n,p) 58 Co, 54 Fe(n,p) 54 Mn, 58 Fe(n,γ) 59 Fe, 59 Co(n,γ) 60 Co and 50 Cr(n,γ) 51 Cr. The analysis was performed using 12 beads from various positions along 5 different batches of chain, together with standard materials, in an H 2 O moderator tank using two intense californium sources which had a total neutron emission rate of 3.97 x 10^10 /s. Semiconductor gamma spectrometers were used to count the products of the above reactions in the specimens. The percentages by weight of iron, chromium and cobalt in the beads were found to be 62.1%, 20.2% and 0.120%, respectively. The excellent uniformity found in the bead compositions demonstrates the reproducibility of the experimental techniques and considerably enhances the value of the beads as neutron flux monitors
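
    Quantification in this kind of measurement typically uses the comparator method: sample and standard are irradiated together and counted in the same geometry, and the known standard content is scaled by the ratio of specific activities. A generic sketch of that relation; the counts and masses below are hypothetical, not the paper's data.

```python
def weight_percent(counts_sample, counts_std, mass_sample, mass_std, wpct_std):
    """Comparator method for activation analysis:
    w_sample = w_std * (counts_sample / m_sample) / (counts_std / m_std),
    valid when sample and standard share flux, decay and counting geometry."""
    return wpct_std * (counts_sample / mass_sample) / (counts_std / mass_std)

# Hypothetical 59Fe photopeak counts for a bead and an iron standard.
fe_in_bead = weight_percent(counts_sample=1000.0, counts_std=800.0,
                            mass_sample=2.0, mass_std=1.0, wpct_std=50.0)
```

    Corrections for decay between the end of irradiation and counting cancel only if both items are counted after the same cooling time; otherwise each activity must first be decay-corrected.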

  8. Calibrate the aerial surveying instrument by the limited surface source and the single point source that replace the unlimited surface source

    CERN Document Server

    Lu Cun Heng

    1999-01-01

    It is described how a calculation formula and survey results are obtained on the basis of the stacking principle of gamma rays and the features of a hexagonal surface source, when a limited surface source replaces the unlimited surface source to calibrate the aerial survey instrument on the ground; and, in light of the reciprocity principle of gamma rays, when a single point source replaces the unlimited surface source to calibrate the aerial survey instrument in the air. Meanwhile, through theoretical analysis, the receiving rate of the crystal bottom and side surfaces is calculated for the aerial survey instrument receiving gamma rays. The mathematical expression of the gamma-ray decay with height, following the Jinge function regularity, is obtained. According to this regularity, the coefficient of absorption of gamma rays by air and the detection efficiency coefficient of the crystal are calculated based on the ground and air measured values of the bottom surface receiving cou...
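
    Over a limited altitude range, the height dependence described above is often approximated by a single exponential N(h) = N0·exp(-mu·h), so that two calibration heights fix the air absorption coefficient mu. The sketch below uses this single-exponential simplification rather than the record's Jinge-function regularity, and the count rates are hypothetical.

```python
import math

def absorption_coeff(count1, h1, count2, h2):
    """Air absorption coefficient mu from count rates at two heights,
    assuming N(h) = N0 * exp(-mu * h)."""
    return math.log(count1 / count2) / (h2 - h1)

def to_ground(count, h, mu):
    """Extrapolate an airborne count rate back to ground level."""
    return count * math.exp(mu * h)

mu = absorption_coeff(1000.0, 0.0, 368.0, 100.0)   # hypothetical calibration pair
```

    With mu known, any survey count rate recorded at altitude h can be corrected back to its ground-level equivalent before mapping.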

  9. [Bibliometric analysis of literature regarding integrated schistosomiasis control strategy with emphasis on infectious source control].

    Science.gov (United States)

    Qian, Yi-Li; Wang, Wei; Hong, Qing-Biao; Liang, You-Sheng

    2014-12-01

    To evaluate the outcomes of implementation of the integrated schistosomiasis control strategy with emphasis on infectious source control using a bibliometric method. The literature pertaining to the integrated schistosomiasis control strategy with emphasis on infectious source control was retrieved from CNKI, Wanfangdata, VIP, PubMed, Web of Science, BIOSIS and Google Scholar, and a bibliometric analysis of the literature captured was performed. During the period from January 1, 2004 through September 30, 2014, a total of 94 publications regarding the integrated schistosomiasis control strategy with emphasis on infectious source control were captured, including 78 Chinese articles (82.98%) and 16 English papers (17.02%). The Chinese literature was published in 21 national journals, and Chinese Journal of Schistosomiasis Control had the largest number of publications, accounting for 37.23% of total publications; the 16 English papers were published in 12 international journals, and PLoS Neglected Tropical Diseases had the largest number of publications (3 publications). There were 37 affiliations publishing these 94 articles, and National Institute of Parasitic Diseases, Chinese Center for Disease Control and Prevention (16 publications), Anhui Institute of Schistosomiasis Control (12 publications) and Hunan Institute of Schistosomiasis Control (9 publications) ranked as the top three affiliations by number of publications. A total of 157 persons co-authored these 94 publications, and Wang, Zhou and Zhang ranked as the top 3 authors by number of publications. The integrated schistosomiasis control strategy with emphasis on infectious source control has been widely implemented in China, and the achievements obtained from the implementation of this strategy should be summarized and disseminated internationally.

  10. Sources of political violence, political and psychological analysis

    Directory of Open Access Journals (Sweden)

    O. B. Balatska

    2015-05-01

    We also consider the following approaches to determining the nature and sources of aggression and violence: instinctivism (K. Lorenz) and behaviorism (J. B. Watson, B. F. Skinner et al.). Special attention is paid to theories of frustration–aggression (J. Dollard, N. E. Miller, L. Berkowitz et al.), according to which the causes of aggression and violence are hidden in a particular mental state – frustration. The particular importance of the theory of T. R. Gurr, in which the sources of aggression and political violence are defined through the concept of relative deprivation, is underlined. Another approach described in the article is the concept of aggression as a learned reaction (A. Bandura, G. Levin, B. Fleischmann et al.). Supporters of this approach believe that aggressive behavior is formed in the process of social learning.

  11. Neutron activation analysis on sediments from Victoria Land, Antarctica. Multi-elemental characterization of potential atmospheric dust sources

    International Nuclear Information System (INIS)

    Baccolo, G.; Maggi, V.; Baroni, C.; Clemenza, M.; Motta, A.; Nastasi, M.; Previtali, E.; University of Milano-Bicocca, Milan; Delmonte, B.; Salvatore, M.C.

    2014-01-01

    The elemental composition of 40 samples of mineral sediments collected at ice-free sites in Victoria Land, Antarctica, is presented. The concentration of 36 elements was determined by instrumental neutron activation analysis (INAA). The selection of 6 standard reference materials and the development of a specific analytical procedure made it possible to reduce measurement uncertainties and to verify the reproducibility of the results. The decision to analyze sediment samples from Victoria Land ice-free areas is related to recent investigations of the mineral dust content in the TALos Dome ICE core (159deg11'E, 72deg49'S, East Antarctica, Victoria Land), in which a coarse local fraction of dust was recognized. The characterization of Antarctic potential source areas of atmospheric mineral dust is the first step towards identifying the active dust sources for the Talos Dome area and reconstructing the atmospheric pathways followed by air masses in this region during different climatic periods. Principal components analysis was used to identify correlations among elements and samples; particular attention was paid to the rare earth elements (REE) and to incompatible/compatible elements (ICE) with respect to iron, which proved to be the most discriminating elemental groups. The analysis of REE and ICE concentration profiles supported evidence of chemical weathering in the ice-free areas of Victoria Land, despite the cold and dry climate conditions of the Talos Dome area and, in general, of East Antarctica. (author)

  12. Sources to the landscape - detailed spatiotemporal analysis of 200 years Danish landscape dynamics using unexploited historical maps and aerial photos

    DEFF Research Database (Denmark)

    Svenningsen, Stig Roar; Christensen, Andreas Aagaard; Dupont, Henrik

    to declassification of military maps and aerial photos from the cold war, only relatively few sources have been made available to researchers due to a lack of digitization efforts and related services. And even though the digitization of cartographic material has been accelerated, the digitally available materials...... or to the commercial photo series from the last 20 years. This poster outlines a new research project focusing on the potential of unexploited cartographic sources for detailed analysis of the dynamics of the Danish landscape between 1800 and 2000. The project draws on cartographic sources available in Danish archives...... of material in landscape change studies, giving a high temporal and spatial resolution. The project also deals with the opportunities and constraints of comparing different cartographic sources with diverse purposes and times of production, e.g. different scales and quality of aerial photos, or the difference between...

  13. The error sources appearing for the gamma radioactive source measurement in dynamic condition

    International Nuclear Information System (INIS)

    Sirbu, M.

    1977-01-01

    An error analysis for the measurement, from a helicopter, of gamma radioactive sources placed on the soil is presented. The analysis is based on a new formula that takes into account the gamma-ray attenuation factor of the helicopter walls. A complete error formula and an error diagram are given. (author)

  14. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    Science.gov (United States)

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
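
    The core ICA step can be sketched from scratch on a toy two-channel mixture. The minimal FastICA variant below (tanh nonlinearity, deflation) is for illustration only: EEGLAB's runica implements infomax ICA, a different algorithm, and real EEG has many more channels.

```python
import numpy as np

def fastica_2(X, iters=200, seed=0):
    """Minimal two-component FastICA (tanh nonlinearity, deflation)
    on X with shape (channels, samples). Illustrative sketch only."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))          # whiten: decorrelate, unit variance
    Z = (E / np.sqrt(d)).T @ X
    rng = np.random.default_rng(seed)
    W = np.zeros((2, 2))
    for k in range(2):
        w = rng.normal(size=2)
        w /= np.linalg.norm(w)
        for _ in range(iters):
            wx = w @ Z
            w_new = (Z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx)**2).mean() * w
            for j in range(k):                # deflation: stay orthogonal to found rows
                w_new -= (w_new @ W[j]) * W[j]
            w_new /= np.linalg.norm(w_new)
            w = w_new
        W[k] = w
    return W @ Z                              # estimated source activations

# Demo: unmix a square wave and Laplacian noise mixed by a 2x2 matrix.
rng = np.random.default_rng(4)
s_true = np.vstack([np.sign(np.sin(0.05 * np.arange(2000))),
                    rng.laplace(size=2000)])
S_est = fastica_2(np.array([[1.0, 0.5], [0.5, 1.0]]) @ s_true)
```

    ICA recovers sources only up to ordering, sign, and scale, which is why the recovered components must be inspected (as EEGLAB's scalp-map and ERP-image tools support) before interpretation.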

  15. Objective and expert-independent validation of retinal image registration algorithms by a projective imaging distortion model.

    Science.gov (United States)

    Lee, Sangyeol; Reinhardt, Joseph M; Cattin, Philippe C; Abràmoff, Michael D

    2010-08-01

    Fundus camera imaging of the retina is widely used to diagnose and manage ophthalmologic disorders including diabetic retinopathy, glaucoma, and age-related macular degeneration. Retinal images typically have a limited field of view, and multiple images can be joined together using an image registration technique to form a montage with a larger field of view. A variety of methods for retinal image registration have been proposed, but evaluating such methods objectively is difficult due to the lack of a reference standard for the true alignment of the individual images that make up the montage. A method of generating simulated retinal images by modeling the geometric distortions due to the eye geometry and the image acquisition process is described in this paper. We also present a validation process that can be used for any retinal image registration method by tracing through the distortion path and assessing the geometric misalignment in the coordinate system of the reference standard. The proposed method can be used to perform an accuracy evaluation over the whole image, so that distortion in the non-overlapping regions of the montage components can be easily assessed. We demonstrate the technique by generating test image sets with a variety of overlap conditions and compare the accuracy of several retinal image registration models. Copyright 2010 Elsevier B.V. All rights reserved.
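
    The validation idea (trace test points through the known distortion and measure misalignment in the reference frame) can be sketched with projective transforms. The specific matrices below are invented examples, not the paper's eye-geometry model.

```python
import numpy as np

def apply_h(H, pts):
    """Apply a 3x3 projective transform to an (N, 2) array of points."""
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]

def misalignment(H_true, H_est, pts):
    """Mean geometric error between the reference-standard and estimated
    alignments, evaluated over a grid of test points."""
    return np.linalg.norm(apply_h(H_true, pts) - apply_h(H_est, pts), axis=1).mean()

grid = np.array([[x, y] for x in range(0, 101, 25) for y in range(0, 101, 25)], float)
H_true = np.array([[1.0, 0.02, 5.0], [-0.01, 1.0, -3.0], [1e-4, 0.0, 1.0]])
H_shift = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]) @ H_true
```

    Because the error is evaluated on a full grid rather than only on matched features, misalignment in regions where the montage components do not overlap is captured as well.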

  16. Classifier models and architectures for EEG-based neonatal seizure detection

    International Nuclear Information System (INIS)

    Greene, B R; Marnane, W P; Lightbody, G; Reilly, R B; Boylan, G B

    2008-01-01

    Neonatal seizures are the most common neurological emergency in the neonatal period and are associated with a poor long-term outcome. Early detection and treatment may improve prognosis. This paper aims to develop an optimal set of parameters and a comprehensive scheme for patient-independent multi-channel EEG-based neonatal seizure detection. We employed a dataset containing 411 neonatal seizures. The dataset consists of multi-channel EEG recordings with a mean duration of 14.8 h from 17 neonatal patients. Early-integration and late-integration classifier architectures were considered for the combination of information across EEG channels. Three classifier models based on linear discriminants, quadratic discriminants and regularized discriminants were employed. Furthermore, the effect of electrode montage was considered. The best performing seizure detection system was found to be an early integration configuration employing a regularized discriminant classifier model. A referential EEG montage was found to outperform the more standard bipolar electrode montage for automated neonatal seizure detection. A cross-fold validation estimate of the classifier performance for the best performing system yielded 81.03% of seizures correctly detected with a false detection rate of 3.82%. With post-processing, the false detection rate was reduced to 1.30% with 59.49% of seizures correctly detected. These results represent a comprehensive illustration that robust reliable patient-independent neonatal seizure detection is possible using multi-channel EEG
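
    Seizure-level scoring of the kind reported above can be sketched as event matching between annotated and detected seizure times. The tolerance and scoring rule here are simplifications of the paper's actual evaluation, and the event times are invented.

```python
def seizure_metrics(true_events, detections, tol=30.0):
    """Fraction of true seizures matched by a detection within `tol` seconds,
    plus the count of detections matching no true seizure (false detections)."""
    hit = sum(any(abs(d - t) <= tol for d in detections) for t in true_events)
    false_det = sum(all(abs(d - t) > tol for t in true_events) for d in detections)
    return hit / len(true_events), false_det

# Hypothetical annotation and detector output, in seconds.
rate, fd = seizure_metrics([100.0, 500.0, 900.0], [110.0, 905.0, 2000.0])
```

    Post-processing of the kind described in the abstract (e.g. requiring several consecutive positive epochs) trades detected seizures against false detections, which is exactly the shift visible in the reported numbers.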

  17. Aluminium and copper analysis in metallic alloys by neutron activation analysis from an 241Am-Be source

    International Nuclear Information System (INIS)

    Carvalho, J. de.

    1980-01-01

    Aluminium and copper have been determined in aluminium alloys by the method of activation with neutrons from an 241Am-Be source of intensity 9.8 x 10^6 n/s. The activities induced by the reactions 27Al(n,γ)28Al and 63Cu(n,γ)64Cu were measured with a NaI(Tl) detector coupled to a single-channel system. To obtain samples and standards of about the same composition, the material to be irradiated was powdered. In view of the low intensity of the neutron source it was necessary to use samples of up to 50 g. A series of preliminary irradiations was carried out to ensure that the geometries for the irradiation and for the counting were reproducible. The results have been compared with those obtained by chemical methods. Assuming that the results obtained by the chemical method are exact, a maximum relative error of 3.6% is obtained by this method. The method has good reproducibility. The times needed for the analysis of aluminium and copper are 18 min and 2 hours 40 minutes, respectively. Four different samples were analysed. The average of five measurements for one of the samples was 88.0% for aluminium and 10.0% for copper. The standard deviation and coefficient of variation were 0.8 and 1.0% for aluminium and 0.2 and 2.0% for copper. (author)
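    The precision figures quoted above can be cross-checked directly: the coefficient of variation is the standard deviation divided by the mean, expressed in percent. The helper name is ours; the numbers are the abstract's.

    ```python
    # Coefficient of variation: CV(%) = 100 * std / mean.
    def cv_percent(std, mean):
        return 100.0 * std / mean

    al = cv_percent(0.8, 88.0)   # aluminium: sd 0.8 on a mean of 88.0%
    cu = cv_percent(0.2, 10.0)   # copper:    sd 0.2 on a mean of 10.0%
    print(f"CV aluminium ~ {al:.1f}%, CV copper ~ {cu:.1f}%")
    ```

    This gives about 0.9% for aluminium and 2.0% for copper, consistent (after rounding) with the ~1.0% and 2.0% quoted in the abstract.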

  18. Analysis of the environmental behavior of farmers for non-point source pollution control and management in a water source protection area in China.

    Science.gov (United States)

    Wang, Yandong; Yang, Jun; Liang, Jiping; Qiang, Yanfang; Fang, Shanqi; Gao, Minxue; Fan, Xiaoyu; Yang, Gaihe; Zhang, Baowen; Feng, Yongzhong

    2018-08-15

    The environmental behavior of farmers plays an important role in exploring the causes of non-point source pollution and in taking scientific control and management measures. Based on the theory of planned behavior (TPB), the present study investigated the environmental behavior of farmers in the Water Source Area of the Middle Route of the South-to-North Water Diversion Project in China. Results showed that TPB explained farmers' environmental behavior (SMC=0.26) and intention (SMC=0.36) well. Furthermore, the farmers' attitude towards behavior (AB), subjective norm (SN), and perceived behavioral control (PBC) positively and significantly influenced their environmental intention; their environmental intention in turn affected their behavior. SN proved to be the key factor indirectly influencing the farmers' environmental behavior, while PBC had no significant direct effect. Moreover, a moderated mediation analysis of environmental knowledge on the TPB constructs was conducted, with environmental knowledge as the moderator and gender and age as control variables. It demonstrated that gender had a significant effect on environmental behavior; that is, males engaged in more environmentally friendly behaviors. However, age showed a significant negative effect on pro-environmental intention and an opposite effect on pro-environmental behavior. In addition, environmental knowledge negatively moderated the relationship between PBC and environmental intention: PBC had a greater impact on the environmental intention of farmers with poor environmental knowledge than on that of farmers with ample environmental knowledge. Altogether, the present study could provide a theoretical basis for non-point source pollution control and management. Copyright © 2018 Elsevier B.V. All rights reserved.
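    The model layout behind such a moderated mediation analysis — intention regressed on AB, SN, PBC plus a PBC x knowledge interaction, and behavior regressed on intention — can be sketched with ordinary least squares on synthetic data. The coefficients and data below are invented for illustration; they are not the paper's estimates, and the study itself fits a structural equation model rather than separate regressions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 500
    AB, SN, PBC, know = (rng.normal(size=n) for _ in range(4))

    # Synthetic data-generating process: a negative PBC x knowledge interaction
    # mimics knowledge weakening the PBC -> intention path.
    intention = (0.3 * AB + 0.4 * SN + 0.2 * PBC
                 - 0.15 * PBC * know + rng.normal(scale=0.5, size=n))
    behavior = 0.5 * intention + rng.normal(scale=0.5, size=n)

    def ols(predictors, y):
        """OLS with intercept; returns (const, beta_1, ..., beta_k)."""
        X = np.column_stack([np.ones(len(y)), *predictors])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    b_int = ols([AB, SN, PBC, PBC * know], intention)
    b_beh = ols([intention], behavior)
    print("intention model (const, AB, SN, PBC, PBCxKnow):", np.round(b_int, 2))
    print("behavior model (const, intention):", np.round(b_beh, 2))
    ```

    A negative estimate on the interaction term corresponds to the reported pattern that PBC matters more for farmers with poor environmental knowledge.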

  19. Analysis of particle sources by interferometry in a three-body final state

    International Nuclear Information System (INIS)

    Humbert, P.

    1984-01-01

    This work presents the set-up of an original interferometric method whose aim is to access the intrinsic parameters (lifetime or natural width) of intermediate resonances created during nuclear collisions. The technique is based on the overlap of two events in the same detection, and shows some analogies with interferometric measurements based on the Hanbury Brown-Twiss effect. It applies to reactions leading to a three-particle final state in which at least two particles are identical. The reactions considered are 11B(α,7Li)αα, 12C(16O,α)12C 12C, and 11B(p,α)αα, in which the intermediate source is a level of 11B*, 16O*, and 8Be*, respectively. The results are in qualitative agreement with such an analysis.
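    For orientation, the Hanbury Brown-Twiss analogy invoked above is usually expressed through a two-particle correlation function whose width encodes the source's space-time extent. The Gaussian parameterization below is the textbook form, shown here only as background; it is an assumption and not necessarily the formalism of this work.

    ```python
    import numpy as np

    def correlation(q, R, lam=1.0):
        """Textbook intensity-interferometry form C(q) = 1 + lambda * exp(-q^2 R^2),
        where q is the relative momentum and R the source size parameter."""
        return 1.0 + lam * np.exp(-(q ** 2) * (R ** 2))

    q = np.linspace(0.0, 3.0, 7)
    print(np.round(correlation(q, R=1.0), 3))  # falls from 2 at q=0 toward 1
    ```

    The larger the source (or the longer-lived the intermediate state), the narrower the correlation peak in q, which is what lets interferometry constrain lifetimes and natural widths.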

  20. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches strive towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are implemented in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land-surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding an optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good
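    Two of the pipeline steps above — deriving a land-surface parameter (slope) from a DTM, then classifying cells by k-means thresholding — can be sketched as follows. The synthetic tilted-plane DTM with a steeper "scarp" strip and the two-cluster setup are assumptions for illustration; the study itself works in R with SAGA, GRASS, and TauDEM.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic 50x50 DTM (metres): a gently tilted plane with a steep strip.
    x, y = np.meshgrid(np.arange(50), np.arange(50))
    dtm = (0.1 * x
           + np.where((x > 20) & (x < 25), 2.0 * (x - 20), 0)
           + rng.normal(scale=0.01, size=x.shape))

    # Step (1): land-surface parameter -- slope in degrees (1 m cell size assumed).
    gy, gx = np.gradient(dtm)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))

    # Step (4): k-means on one attribute acts as an automatic threshold.
    def kmeans_1d(values, k=2, iters=20):
        centers = np.quantile(values, np.linspace(0.1, 0.9, k))
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            centers = np.array([values[labels == j].mean() for j in range(k)])
        return labels, centers

    labels, centers = kmeans_1d(slope.ravel())
    steep = labels.reshape(slope.shape) == np.argmax(centers)
    print(f"cells flagged as steep: {steep.mean():.0%}")
    ```

    In the full GEOBIA workflow this thresholding is applied per segment (after multi-scale segmentation) and over several derivatives, not per cell over slope alone.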